The curse of dimensionality. Or is it a blessing?

7,008 views

AI Coffee Break with Letitia


1 day ago

Comments: 21
@martinl3260 3 years ago
I was racking my brain trying to understand the curse; I was ready to make a custom 10-dimensional dataset and do the computations by hand to get the idea. You saved me, thanks.
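For anyone who wants to run that experiment instead of doing it by hand, here is a minimal sketch (my own illustration, not from the video): it samples random points in 2, 10, and 100 dimensions and shows how pairwise distances grow while their relative spread shrinks.

```python
import numpy as np

rng = np.random.default_rng(0)

def pairwise_distance_stats(n_points=500, n_dims=10):
    """Sample uniform points in [0, 1]^n_dims and summarise their pairwise distances."""
    X = rng.uniform(size=(n_points, n_dims))
    # Pairwise Euclidean distances via broadcasting (n_points x n_points matrix).
    diffs = X[:, None, :] - X[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    upper = dists[np.triu_indices(n_points, k=1)]  # keep each unique pair once
    return upper.mean(), upper.std()

for d in (2, 10, 100):
    mean, std = pairwise_distance_stats(n_dims=d)
    # The mean distance grows with d, while the relative spread shrinks:
    print(f"d={d:3d}  mean={mean:.2f}  std/mean={std / mean:.2f}")
```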
@Kevin.Kawchak 4 months ago
Thank you.
@Elivery 4 years ago
Thanks a lot for your video! Enjoyed every second of it, especially now that I am learning methods for dimensionality reduction in transcription biology.
@AICoffeeBreak 4 years ago
The next video is coming soon. Working on it right now! 😊
@thegoru0106 2 years ago
Best explanation I have found so far
@joseanealmeida5781 2 years ago
Very nice, thank you 😃
@Youkouleleh 4 years ago
Thanks for the video
@EkShunya 1 year ago
Wow, you're so sweet for explaining it so kindly and concisely, bless you.
@ANTIMONcom 4 years ago
"with millions of dimentions, there is always a way down". Gaute Einevoll and Terrence Sejnowski had a good AI podcast episode that touched on how high dimentionallity is a benefit for training neural networks and how working in high dimentions is something that was seen as "bad/unmanagable" by the early neural network scientists that had their background from physics. Fun to see how paradigms in AI have changed so much, and also keep changing really fast even today
@AICoffeeBreak 4 years ago
Interesting, what you report. Only I do not understand why physicists would be uncomfortable with high dimensions. Physicists are very comfortable with high dimensionality; think about string theory with 10-11 dimensions, or even space-time, where time is in a sense the fourth dimension.
@ANTIMONcom 4 years ago
@@AICoffeeBreak It has been a while since I listened to the episode. Could be I remember some of the points a bit inaccurately 🤔. I think the views in that field have also changed over the years. They were talking about 40-50ish years ago, so I am sure physics paradigms have also changed during all that time 🤷‍♂️ If you are interested and like podcasts, the episode is on Spotify. I think YouTube auto-deletes comments with links, but you can search for "on ai and the underlying algorithms" and you should find it. It is kept at an approachable level for mortals, so maybe most of it is a bit too easy for a professor 😆. But I thought the two old professors had some interesting insights into the ANN history and how people were thinking in the "way back before time".
@ANTIMONcom 4 years ago
@@AICoffeeBreak Forgot to mention it in my original comment, but really good video by the way 😃
@AICoffeeBreak 4 years ago
@@ANTIMONcom Thanks, found the podcast, I'll listen to it. You made me curious. 😀
@AICoffeeBreak 4 years ago
I listened to parts of the podcast, especially the part about the relationship between physicists and dimensionality (around minute 42). It interests me, especially because I am (like the speakers) a non-practicing physicist (I studied physics in my Bachelor's and Master's). But what they say is more of the anecdotal, short kind, with pointers coming up in a bigger discussion that can be misinterpreted. It also depends a lot on which area of physics you are talking about: they say that biology (unlike physics) is when you are computing without knowing all the numbers. But this is also the case in many areas of physics such as environmental physics, big parts of astrophysics, biophysics, etc... An example of a short and easy-to-misinterpret statement is "we come from a field where there are very few parameters". This is just not true, or perhaps true in very few cases, in the most simple (or simplified) problems. My first course in theoretical physics had functions like f(x1, x2, ..., xn) from the very beginning, and this was the status quo in more advanced lectures as well. 😅 I think physicists are usually comfortable with many dimensions. But humans are not, and physicists are also humans, so sometimes these two sides collide. 😁
@DerPylz 4 years ago
Good video!
@younique9710 1 year ago
So what's the problem if the distance between points increases?
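One common way to see why growing distances are a problem (my own hedged sketch, not the video's explanation): as the dimensionality grows, the nearest and the farthest neighbour of a query point end up at almost the same distance, so distance-based methods such as k-nearest neighbours lose their contrast.

```python
import numpy as np

rng = np.random.default_rng(0)

def nearest_vs_farthest_ratio(n_points=1000, n_dims=2):
    """Ratio of nearest to farthest neighbour distance from a random query point."""
    X = rng.uniform(size=(n_points, n_dims))
    query = rng.uniform(size=n_dims)
    dists = np.linalg.norm(X - query, axis=1)
    return dists.min() / dists.max()

for d in (2, 10, 100, 1000):
    # As d grows the ratio approaches 1: the "nearest" neighbour is barely
    # closer than the farthest one, so nearness stops being informative.
    print(f"d={d:4d}  nearest/farthest = {nearest_vs_farthest_ratio(n_dims=d):.2f}")
```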
@lintingwei 1 year ago
this is gold!
@dionbridger5944 1 year ago
About your example of the case where higher dimensions are a blessing - isn't this actually a case of dimensionality reduction? :) You are discarding the original two dimensions and using only the third to separate the data. I guess having a higher number of dimensions is a blessing insofar as it gives you a lot of freedom to pick different ways to perform dimensionality reduction?
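A minimal sketch of the kind of lift I think the comment is referring to (my own assumption about the video's example, not taken from it): 2-D points labelled by whether they lie inside a circle are not linearly separable in the plane, but adding a third feature z = x² + y² makes them separable by a single threshold on that new dimension alone.

```python
import numpy as np

rng = np.random.default_rng(0)

# 2-D points labelled by whether they fall inside the unit circle:
# no straight line in the plane separates the two classes.
X = rng.uniform(-2, 2, size=(1000, 2))
labels = X[:, 0] ** 2 + X[:, 1] ** 2 < 1.0

# Lift to a third dimension z = x^2 + y^2: now a single flat cut at z = 1
# separates the classes perfectly, using only the new coordinate.
z = X[:, 0] ** 2 + X[:, 1] ** 2
predictions = z < 1.0

print("accuracy of the cut z < 1:", (predictions == labels).mean())  # 1.0
```

In that sense the commenter's point holds: the classifier ends up ignoring the original two coordinates and keeping only the engineered third one, so the "blessing" is really the freedom to choose a useful low-dimensional view.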
@wasima4463 2 years ago
Towards the end, it is not well explained, which is a bit disappointing, as the video seemed interesting in the beginning and in the middle.