Eigendecomposition : Data Science Basics

  75,038 views

ritvikmath

A day ago

Comments
@zigzagarmchair2367
@zigzagarmchair2367 10 months ago
omg, you are really great at explaining things using only a pen and a whiteboard, without the need for fancy digital animation. This is definitely what I call REAL "education"!!!
@nizogos
@nizogos 17 days ago
Does fancy digital animation make education worse for you? It offers the insights of experts in those subjects directly to you, without your having to study the subject for 20 years to understand it in depth.
@Vinladar
@Vinladar 4 years ago
This is definitely a great explanation of eigendecomposition. I kind of got into this rabbit hole trying to understand singular value decomposition, and this video helped me understand that as well. Thanks for helping me understand this.
@vinceb8041
@vinceb8041 4 years ago
Lmao I'm in the exact same rabbithole :D
@tachimegun
@tachimegun 3 years ago
holy shit I guess I'm not alone lmao
@n1984ster
@n1984ster 2 years ago
+1
@vikramm1115
@vikramm1115 2 years ago
same here bro
@informatiktechmathscience4246
@informatiktechmathscience4246 2 years ago
haha me too
@lenaso4555
@lenaso4555 3 years ago
Holy shit, you're literally blowing my mind (in a positive way) with your videos. I've never understood eigendecomposition (and many more of the topics you're explaining), but now it all makes sense. Please never stop making videos!
@verule3928
@verule3928 26 days ago
Thank you so much for explaining this so clearly. I was struggling to understand this for so long and you just made it so much easier. You are an excellent teacher!!
@dalisabe62
@dalisabe62 A month ago
And it is affordable. Thirty lectures like this could be an entire course in linear algebra, priced at a fraction of what universities charge in tuition, saving money, time, classroom space, and energy on campus commutes. However, we can only go so far doing matrices by hand, so the course would need a software package like Mathematica, Matlab or Maple to crunch the numbers. Thanks a great deal for the quality presentation.
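As a sketch of what such number-crunching looks like in practice — using numpy here rather than Mathematica/Matlab/Maple, and a made-up 2x2 matrix for illustration (not the one from the video):

```python
import numpy as np

# Hypothetical 2x2 matrix for illustration (not the one from the video).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix U whose columns
# are unit-norm eigenvectors.
eigvals, U = np.linalg.eig(A)
Lam = np.diag(eigvals)

# Reconstruct A from its eigendecomposition: A = U @ Lam @ U^{-1}
A_rebuilt = U @ Lam @ np.linalg.inv(U)
print(np.allclose(A, A_rebuilt))  # True
```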
@martinpign868
@martinpign868 4 years ago
Finally, someone that shows it simply and clearly and answers the most important question: why? Thank you!
@ritvikmath
@ritvikmath 4 years ago
No problem!
@tojewel
@tojewel 4 years ago
Wish I could give more than one like. This channel is so underrated.
@andreipitkevich1088
@andreipitkevich1088 A year ago
Surprisingly good explanation. Thanks a lot! I especially liked that all the information goes in order without gaps, and an example of practical application is given.
@zz-9463
@zz-9463 4 years ago
Never seen such a clear explanation! Thank you so much!
@dimidoff5431
@dimidoff5431 4 years ago
This is a great explanation; I've been stuck trying to understand PCA and this really helps.
@amisha065
@amisha065 9 months ago
I'm just learning these basics and your videos are very comprehensive and highly informative. Looking forward to completing all the videos in the playlist!!
@365HockeyGirl
@365HockeyGirl 4 years ago
Watched this video as a refresher for my ML class and it was super helpful. Thanks!!!
@derrickagyemang1259
@derrickagyemang1259 2 months ago
Great video, love the clarity of the explanation
@sainandankandikattu9077
@sainandankandikattu9077 4 years ago
Honestly... you deserve at least a million subscribers... A moron professor in our Econometrics class didn't even try to do this in his class! Thanks, professor ritvik!
@saraaltamirano
@saraaltamirano 4 years ago
While Ritvik is indeed A-MA-ZING, perhaps you should be a bit nicer to your econometrics professor :-)
@蔡小宣-l8e
@蔡小宣-l8e 2 years ago
Brief and clear! Thank you.
@RiteshSingh-ru1sk
@RiteshSingh-ru1sk 3 years ago
Wow, this is the best video on eigendecomposition. Thanks a lot man!
@ImolaS3
@ImolaS3 3 years ago
A superb explanation that I got the first time through. Liked and subscribed!
@tarunbhatia8652
@tarunbhatia8652 3 years ago
best video on eigenvalue decomposition on any platform. Thanks man!
@ritvikmath
@ritvikmath 3 years ago
Wow, thanks!
@olz6928
@olz6928 17 days ago
Hey! This video is great and it has helped me a lot. As feedback, I will tell you that when the video began, everything was already on the whiteboard. This felt really overwhelming to me. This might be something you want to think about in the future.
@yanwang248
@yanwang248 4 years ago
This channel is extremely useful, thank you very much
@himanshu1179
@himanshu1179 A month ago
Beautifully explained, Ritvik. 👍
@souravdey1227
@souravdey1227 3 years ago
Such a succinct explanation.. can you just explain why we normalised the eigenvectors?
@luca7x689
@luca7x689 3 years ago
Thank you so much. I always love to learn why things are important. Makes studying much more interesting :)
@usama57926
@usama57926 3 years ago
Beautiful explanation... Thanks!
@amritpalsingh6440
@amritpalsingh6440 3 years ago
Best help I found online. Thanks :)
@ritvikmath
@ritvikmath 3 years ago
You're welcome!
@JosephRivera517
@JosephRivera517 4 years ago
This gives a lot of information about the process of doing it and its value in data science. Thanks.
@kally3432
@kally3432 3 years ago
I really love your explanations, really helpful
@ritvikmath
@ritvikmath 3 years ago
Appreciated!
@jambulingamlogababu8914
@jambulingamlogababu8914 A year ago
Thank you very much for your detailed answer, with appropriate examples and their benefits.
@sanjeetwalia5077
@sanjeetwalia5077 4 years ago
I liked the video, very explanatory and understandable
@madhamj
@madhamj 3 years ago
Love it, bro! This explanation was so clear
@ritvikmath
@ritvikmath 3 years ago
Glad to hear it!
@abrhk96
@abrhk96 4 years ago
You made it so easy to understand! Thank you!
@ritvikmath
@ritvikmath 4 years ago
Glad it helped!
@robertovolpi
@robertovolpi 9 months ago
Outstanding explanation! It is very difficult to find this subject in a linear algebra college textbook.
@yingma6770
@yingma6770 3 years ago
Great explanation! Can you please give an example in machine learning or data science where we need to do the same linear transformation again and again?
@langwen8685
@langwen8685 4 years ago
Amazingly clear explanation! Love u dude! Thx a million!
@rahulvansh2390
@rahulvansh2390 2 years ago
Only one doubt: what's the reason behind normalizing eigenvectors? Btw, your content and your way of explaining these scary concepts taught me something that even MIT lectures couldn't. Thank you so much sir, please keep making such videos! More power to you sir :)
@TheRohit901
@TheRohit901 2 years ago
Because any scalar multiple of an eigenvector remains an eigenvector, we generally take the unit vector
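The point in this reply can be checked numerically. A minimal sketch with a made-up matrix and eigenvector (not from the video): any nonzero scalar multiple of an eigenvector is still an eigenvector with the same eigenvalue, so normalizing just picks a canonical unit-length representative.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # hypothetical example matrix

v = np.array([1.0, 1.0])     # eigenvector of A with eigenvalue 5: A @ v = 5 v

# Scaling v by any nonzero constant keeps it an eigenvector for eigenvalue 5:
for c in (2.0, -3.5, 0.1):
    assert np.allclose(A @ (c * v), 5 * (c * v))

# Normalizing selects the unit-length member of this family:
v_unit = v / np.linalg.norm(v)
print(np.isclose(np.linalg.norm(v_unit), 1.0))  # True
```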
@moritz4150
@moritz4150 A year ago
thanks, very easy to follow your thought process. Helped me very much!
@ritvikmath
@ritvikmath A year ago
Glad it helped!
@deepak_kori
@deepak_kori A year ago
OMG the application part was amazing😍
@ritvikdhupkar5861
@ritvikdhupkar5861 4 years ago
Great clear explanations... Thanks a lot!
@enozeren
@enozeren A year ago
Great short explanation! Thanks!
@baraaa.2338
@baraaa.2338 4 years ago
Awesome explanation.. Keep it up!
@ritvikmath
@ritvikmath 4 years ago
Thanks a lot!
@bungercolumbus
@bungercolumbus 24 days ago
Really well explained, good job.
@y031962
@y031962 4 years ago
thanks for posting it; it would have been nicer to show how a matrix raised to a power is used in data science.
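The matrix-power trick the comment asks about follows directly from the decomposition: if A = U Λ U⁻¹, then A^p = U Λ^p U⁻¹, and powering the diagonal Λ is just powering each eigenvalue. A sketch with a made-up matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # hypothetical example matrix

eigvals, U = np.linalg.eig(A)
p = 8

# A^p = U @ Lam^p @ U^{-1}; Lam^p just raises each eigenvalue to the p-th power,
# so one decomposition replaces p-1 full matrix multiplications.
A_pow = U @ np.diag(eigvals ** p) @ np.linalg.inv(U)

# Same answer as multiplying A by itself p times:
print(np.allclose(A_pow, np.linalg.matrix_power(A, p)))  # True
```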
@ramsaini3745
@ramsaini3745 4 months ago
your videos are helpful and concise at the same time; that's rare on today's YouTube
@younesarabnedjadi4419
@younesarabnedjadi4419 2 years ago
OMG, literally understood the eigen shit in 8 minutes, thank you so much
@ritvikmath
@ritvikmath 2 years ago
Awesome!
@YTGiomar
@YTGiomar 10 months ago
Damn, just a good video. Thank you very much for explaining
@mahdijavadi2747
@mahdijavadi2747 3 years ago
Thanks a lot for this clear explanation!
@abhinavmishra9401
@abhinavmishra9401 4 years ago
Thanks a lot. This was sublime.
@ritvikmath
@ritvikmath 4 years ago
You're very welcome!
@ekaterinaburakova8629
@ekaterinaburakova8629 11 months ago
Wow, such a good explanation!
@ritvikmath
@ritvikmath 11 months ago
Glad it was helpful!
@CStrik3r
@CStrik3r 4 years ago
Great explanation !
@jatinkumar4410
@jatinkumar4410 3 years ago
Thanks... Very nice explanation...
@ritvikmath
@ritvikmath 3 years ago
You are welcome
@suvikarhu4627
@suvikarhu4627 2 years ago
7:54 Shouldn't you do the rightmost multiplication first? Lambda * U inverse.
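On the question above: the grouping doesn't actually matter, because matrix multiplication is associative — (UΛ)U⁻¹ = U(ΛU⁻¹). A quick check with a made-up matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # hypothetical example matrix
eigvals, U = np.linalg.eig(A)
Lam = np.diag(eigvals)
U_inv = np.linalg.inv(U)

# Matrix multiplication is associative: either grouping gives the same product.
left_first = (U @ Lam) @ U_inv
right_first = U @ (Lam @ U_inv)
print(np.allclose(left_first, right_first))  # True
```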
@tanmaygupta8288
@tanmaygupta8288 11 months ago
thank you so much, you are a saviour
@보라색사과-l1r
@보라색사과-l1r 4 years ago
Thank you for this amazingly simple explanation! Could you give me an example of that kind of multiplication used in machine learning?
@sgifford1000
@sgifford1000 4 years ago
You have a great channel! Thanks for the insight, which is hard to come by. Just one area that confused me at the time was the definition of the 2x2 matrices for u1 and u2. They look like 3x2 matrices with values 1 & u1 (or u2). I did figure it out though. Thanks!
@ritvikmath
@ritvikmath 4 years ago
Thank you!
@Galmion
@Galmion 2 years ago
can you elaborate on this? I still don't get how it isn't a 3x2 matrix.
@Arycke
@Arycke A year ago
@Galmion it shouldn't have been written the way it was, in my opinion, as it causes confusion. Those "1's" are just dots, ..., meant to be arbitrary entries
@Arycke
@Arycke A year ago
@Galmion the matrix U is the 2 eigenvectors, u1 and u2, put next to each other in one matrix. And since u1 and u2 are 2x1 vectors, putting them together in a matrix makes it 2x2
@Arycke
@Arycke A year ago
@Galmion I would have chosen an example with no square roots as the first example, personally. Say your eigenvectors are u1 = [2, 3]^T and u2 = [4, 5]^T. Then U, the eigenvector matrix, is:
U = [2 4]
    [3 5]
Hope this helps.
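The construction in the reply above can be checked directly (same made-up vectors): stacking two 2x1 eigenvectors side by side as columns gives a 2x2 matrix, not 3x2.

```python
import numpy as np

u1 = np.array([2.0, 3.0])   # first eigenvector (2x1)
u2 = np.array([4.0, 5.0])   # second eigenvector (2x1)

# Stack the two column vectors side by side to form the eigenvector matrix U:
U = np.column_stack([u1, u2])
print(U.shape)   # (2, 2)
print(U)
# [[2. 4.]
#  [3. 5.]]
```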
@shashankelsalvador
@shashankelsalvador 2 years ago
Best intro ever
@user-wr4yl7tx3w
@user-wr4yl7tx3w 2 years ago
But do most matrices admit an eigendecomposition? If not, doesn't that mean limited use?
@香港地舖購物
@香港地舖購物 2 years ago
Great video ! Can you also touch on the topics of LU decomposition, Jordan canonical form, the Rayleigh quotient, etc. ?
@bashiruddin3891
@bashiruddin3891 3 years ago
very nice explanation
@ritvikmath
@ritvikmath 3 years ago
Thanks for liking
@balazsbaranyai8115
@balazsbaranyai8115 4 years ago
Man, this rocks! thank you!
@boryanasvetichkova8458
@boryanasvetichkova8458 A year ago
Great video, thanks!
@rozhanmirzaei3512
@rozhanmirzaei3512 3 months ago
Love this. Thank u❤
@amirhosseinmirkazemi765
@amirhosseinmirkazemi765 4 years ago
You are AWESOME! thank you!
@UsmanAbbas-k2c
@UsmanAbbas-k2c 11 months ago
Super helpful. Thanks
@heejuneAhn
@heejuneAhn 4 years ago
Your explanation is the best I have ever seen. But it doesn't explain what each component really means: first U^-1 maps/rotates the input vector, then the result is stretched along each eigenvector direction, and finally the vector is rotated back into the original axes.
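The three-step reading in this comment can be verified numerically. A sketch with a made-up symmetric matrix (symmetric so that its eigenvector matrix U is orthogonal and really does act as a rotation/reflection):

```python
import numpy as np

# Hypothetical symmetric matrix, so its eigenvector matrix U is orthogonal.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
eigvals, U = np.linalg.eig(A)

x = np.array([1.0, 2.0])

step1 = np.linalg.inv(U) @ x   # express x in the eigenbasis (rotate)
step2 = eigvals * step1        # stretch along each eigenvector direction
step3 = U @ step2              # rotate back to the original axes

# The three steps compose to the original transformation A:
print(np.allclose(step3, A @ x))  # True
```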
@rimshasardarsardarrimsha3209
@rimshasardarsardarrimsha3209 3 years ago
Can you tell me what the pros of this topic are?
@yangwang9688
@yangwang9688 3 years ago
What is the difference between decomposition and factorisation?
@ritvikmath
@ritvikmath 3 years ago
I think they're often used interchangeably
@hassanshahzad3922
@hassanshahzad3922 3 years ago
this is awesome!
@rohanchess8332
@rohanchess8332 A year ago
Why do we need normalized eigenvectors? Won't any eigenvector from the family of eigenvectors suffice?
@elahedastan4945
@elahedastan4945 2 years ago
great explanation
@thomasstiglich3484
@thomasstiglich3484 3 years ago
Great video!
@michael8899aspen
@michael8899aspen 3 years ago
If p=6 or p=7, is this arbitrary? What about p=8?
@gvbvwockee
@gvbvwockee 4 years ago
Thank you. Thank you. Thank you.
@ritvikmath
@ritvikmath 4 years ago
Any time!
@DRmrTG
@DRmrTG A year ago
Thanks for your help!
@NinjaAdorable
@NinjaAdorable 4 years ago
That was beautiful !!!! :')
@ritvikmath
@ritvikmath 4 years ago
thanks!
@Fat_Cat_Fly
@Fat_Cat_Fly 4 years ago
Fantastic!!!!!!!!!!!!!!!!!!
@tehminakakar8753
@tehminakakar8753 A year ago
hey, did anyone solve for the eigenvectors? Maybe I am wrong, but I got x1 = -2/3 x2 (equivalently, x2 = -3/2 x1) when solving the equations for lambda = -5. If anyone got the answer, please let me know.
@rugahun
@rugahun 2 years ago
pff, great video. I feel bad I didn't know about this guy earlier; it saves a lot of time.
@yunkkim1159
@yunkkim1159 A year ago
great job, I had no idea before the video; now I know everything
@williamjmccartan8879
@williamjmccartan8879 3 months ago
Great job, peace
@jneal4154
@jneal4154 10 months ago
Excellent. I was struggling to understand how the form A = UΛU^-1 is reached from the definition of an eigenvalue (Au = λu) as explained in my textbook, but the way you explained it made it all click for me. Thanks!
@ritvikmath
@ritvikmath 10 months ago
Excellent!
@alejandropalaciosgarcia2767
@alejandropalaciosgarcia2767 3 years ago
Excellent
@marc2752
@marc2752 2 years ago
awesome thanks
@zddmsmify
@zddmsmify 4 years ago
I wish someone had explained it to me this simply when I was studying it almost 30 years ago
@zddmsmify
@zddmsmify 4 years ago
Exceptional explanation
@ritvikmath
@ritvikmath 4 years ago
Thanks for the kind words!
@davidfield5295
@davidfield5295 A year ago
Good video
@satyamgupta4808
@satyamgupta4808 A year ago
very nice
@svengunther7653
@svengunther7653 3 years ago
Thanks man!
@thedailyepochs338
@thedailyepochs338 3 years ago
Beautiful
@mydodethailung395
@mydodethailung395 A year ago
Amazing
@NeoZondix
@NeoZondix 2 years ago
Thanks
@klingefjord
@klingefjord 4 years ago
Hang on, if a matrix times its inverse is the identity matrix, why can't the formula for eigendecomposition (U * Lambda * U^-1) be simplified to just Lambda?
@Pukimaxim
@Pukimaxim 4 years ago
You cannot rearrange matrix multiplications as you would with numbers/variables
@Rudolf-ul1zh
@Rudolf-ul1zh 4 years ago
Exactly, matrix multiplication is not commutative!
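Concretely, the cancellation fails because the factors can't be reordered: to collapse U...U⁻¹ to the identity you would need to swap U past Λ, and they don't commute. A quick sketch with a made-up matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # hypothetical example matrix
eigvals, U = np.linalg.eig(A)
Lam = np.diag(eigvals)
U_inv = np.linalg.inv(U)

# U @ Lam @ U_inv reproduces A, which is generally NOT the diagonal Lam:
print(np.allclose(U @ Lam @ U_inv, A))    # True
print(np.allclose(U @ Lam @ U_inv, Lam))  # False

# Cancelling U with U_inv would require U and Lam to commute, and they don't:
print(np.allclose(U @ Lam, Lam @ U))      # False
```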
@gelvis11.11
@gelvis11.11 A year ago
Thank you
@ritvikmath
@ritvikmath A year ago
Of course!
@kiracite
@kiracite A year ago
10/10 ty
@krishnachauhan2850
@krishnachauhan2850 3 years ago
Awesome
@yashjain6372
@yashjain6372 2 years ago
nice
@Shayan7755
@Shayan7755 2 years ago
damn, I like you, good job
@Arycke
@Arycke A year ago
SVD is superior imo
@abdelrahmanelbeltagy3942
@abdelrahmanelbeltagy3942 2 years ago
thanks*10^10000
@indianmovierecaps3192
@indianmovierecaps3192 3 years ago
thanksssssssssssssssssssssssssssssssssssssssss
@marcoloya
@marcoloya 3 years ago
Coool
@newmanokereafor2368
@newmanokereafor2368 3 years ago
Beautiful and handsome and pretty and