omg, you are really great at explaining things by using only a pen and a whiteboard, without the need for fancy digital animation, this is definitely what I call a REAL "Education"!!!
@nizogos17 күн бұрын
Does fancy digital animation make education worse for you? It offers the insights of experts in those subjects directly to you, without your having to study a subject for 20 years to understand it in depth.
@Vinladar4 жыл бұрын
This is definitely a great explanation of eigendecomposition. I kind of got into this rabbit hole trying to understand singular value decomposition, and this video helped me understand that as well. Thanks for your help understanding this.
@vinceb80414 жыл бұрын
Lmao I'm in the exact same rabbithole :D
@tachimegun3 жыл бұрын
holy shit I guess I'm not alone lmao
@n1984ster2 жыл бұрын
+1
@vikramm11152 жыл бұрын
same here bro
@informatiktechmathscience42462 жыл бұрын
haha me too
@lenaso45553 жыл бұрын
Holy shit you're literally blowing my mind (in a positive way) with your videos. I've never understood Eigendecomposition (and many more of the topics you're explaining) but now it all makes sense. Please never stop with your videos!
@verule392826 күн бұрын
Thank you so much for explaining this so clearly. I was struggling to understand this for so long and you just made it so much easier. You are an excellent teacher!!
@dalisabe62Ай бұрын
And it is affordable. Thirty lectures like this could form an entire linear algebra course priced at a fraction of what universities charge in tuition, saving money, time, classroom space, and the energy of the campus commute. However, we can only go so far doing matrices by hand, so the course would need a software package like Mathematica, Matlab or Maple to crunch the numbers. Thanks a great deal for the quality presentation.
@martinpign8684 жыл бұрын
Finally, someone that shows it simple and clear and answers the most important question: why? Thank you!
@ritvikmath4 жыл бұрын
No problem!
@tojewel4 жыл бұрын
Wish I could give more than one like. This channel is so underrated.
@andreipitkevich1088 Жыл бұрын
Surprisingly good explanation. Thanks a lot! I especially liked that all the information goes in order without gaps and an example of practical application is given.
@zz-94634 жыл бұрын
Never seen such a clear explanation! Thank you so much!
@dimidoff54314 жыл бұрын
This is a great explanation, been stuck trying to understand PCA and this really helps
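Since PCA comes up repeatedly in this thread as the motivation for learning eigendecomposition, here is a minimal sketch of the connection (synthetic data and all values are illustrative, not from the video): PCA is the eigendecomposition of the data's covariance matrix, and the variance captured by each component is exactly its eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D data (synthetic, for illustration only)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 1.0],
                                          [0.0, 1.0]])
X = X - X.mean(axis=0)

# PCA = eigendecomposition of the covariance matrix
C = np.cov(X, rowvar=False)
lam, U = np.linalg.eigh(C)          # eigh, since C is symmetric
order = np.argsort(lam)[::-1]       # sort components by variance
components = U[:, order]

# Variance of the projection onto a unit eigenvector equals its eigenvalue
scores = X @ components[:, 0]
print(np.isclose(scores.var(ddof=1), lam[order[0]]))  # True
```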
@amisha0659 ай бұрын
I'm just learning these basics and your videos are very comprehensive and highly informative. Looking forward to completing all the videos in the playlist!!
@365HockeyGirl4 жыл бұрын
Watched this video as a refresher for my ML class and it was super helpful. Thanks!!!
@derrickagyemang12592 ай бұрын
Great video, love the clarity of the explanation
@sainandankandikattu90774 жыл бұрын
Honestly... you deserve at least a million subscribers... A moron professor in our Econometrics class didn't even try to do this in his class! Thanks, professor ritvik!
@saraaltamirano4 жыл бұрын
While Ritvik is indeed A-MA-ZING, perhaps you should be a bit nicer to your econometrics professor :-)
@蔡小宣-l8e2 жыл бұрын
Brief and clear! Thank you.
@RiteshSingh-ru1sk3 жыл бұрын
Wow this is the best video on Eigen Decomposition. Thanks a lot man!
@ImolaS33 жыл бұрын
A superb explanation that I got the first time through. Liked and subscribed!
@tarunbhatia86523 жыл бұрын
best video on eigen val decomposition on any platform. Thanks man!
@ritvikmath3 жыл бұрын
Wow, thanks!
@olz692817 күн бұрын
Hey! This video is great and it has helped me a lot. As feedback: when the video began, everything was already on the whiteboard, which felt really overwhelming to me. This might be something you want to think about in the future.
@yanwang2484 жыл бұрын
This channel is extremely useful, thank you very much
@himanshu1179Ай бұрын
Beautifully explained Ritvik. 👍
@souravdey12273 жыл бұрын
Such a succinct explanation. Can you just explain why we normalized the eigenvectors?
@luca7x6893 жыл бұрын
Thank you so much. I always love to learn why things are important. Makes studying much more interesting :)
@usama579263 жыл бұрын
Beautiful explanation... Thanks!
@amritpalsingh64403 жыл бұрын
Best help I found online. Thanks :)
@ritvikmath3 жыл бұрын
You're welcome!
@JosephRivera5174 жыл бұрын
This gives a lot of information about the process of doing it and its value in data science. Thanks.
@kally34323 жыл бұрын
I really love your explanations, really helpful
@ritvikmath3 жыл бұрын
Appreciated!
@jambulingamlogababu8914 Жыл бұрын
Thank you very much for your detailed answer with appropriate examples and its benefit
@sanjeetwalia50774 жыл бұрын
I liked the video, very explanatory and understandable
@madhamj3 жыл бұрын
Love bro! This explanation was so clear
@ritvikmath3 жыл бұрын
Glad to hear it!
@abrhk964 жыл бұрын
You made it so easy to understand! Thank you!
@ritvikmath4 жыл бұрын
Glad it helped!
@robertovolpi9 ай бұрын
Outstanding explanation! It is very difficult to find that subject in a linear algebra college textbook.
@yingma67703 жыл бұрын
Great explanation! Can you please give an example in machine learning or data science when we need to do the same linear transformation again and again?
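One standard answer to this question (a common textbook illustration, not from the video itself) is a Markov chain: the state distribution after n steps is the same transition matrix applied n times, and its long-run behavior is governed by the eigenvector with eigenvalue 1. The transition probabilities below are arbitrary.

```python
import numpy as np

# Column-stochastic transition matrix of a toy 2-state Markov chain
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

x = np.array([1.0, 0.0])                   # start fully in state 0
x50 = np.linalg.matrix_power(P, 50) @ x    # same transformation, 50 times

# The long-run distribution is the eigenvector of P with eigenvalue 1
lam, U = np.linalg.eig(P)
k = np.argmin(np.abs(lam - 1.0))
steady = U[:, k] / U[:, k].sum()           # rescale to sum to 1

print(np.allclose(x50, steady, atol=1e-6))  # True
```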
@langwen86854 жыл бұрын
Amazing clear explanation! Love u dude! Thx a million!
@rahulvansh23902 жыл бұрын
Only one doubt, what's the reason behind normalizing eigenvectors? Btw, your content, the way of explaining these scary concepts taught me something that even MIT lectures couldn't. Thank you so much sir, please keep making such videos! More power to you sir :)
@TheRohit9012 жыл бұрын
Because any scalar multiple of an eigenvector is still an eigenvector, so by convention we take the unit vector.
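A quick numpy check of this point (the matrix here is arbitrary, chosen just for illustration): any nonzero scalar multiple of an eigenvector satisfies the same equation, and normalization only fixes the length.

```python
import numpy as np

# Arbitrary symmetric matrix, just for illustration (not from the video)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, U = np.linalg.eig(A)
u = U[:, 0]        # numpy already returns unit-length eigenvectors
v = 5.0 * u        # any nonzero scalar multiple is also an eigenvector

# Both satisfy A x = lambda x for the same eigenvalue
print(np.allclose(A @ u, lam[0] * u))   # True
print(np.allclose(A @ v, lam[0] * v))   # True

# Normalization just fixes the scale: ||u|| = 1
print(np.isclose(np.linalg.norm(u), 1.0))  # True
```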
@moritz4150 Жыл бұрын
thanks, very easy to follow you in your thought process. Helped me very much!
@ritvikmath Жыл бұрын
Glad it helped!
@deepak_kori Жыл бұрын
OMG the application part was amazing😍
@ritvikdhupkar58614 жыл бұрын
Great Clear explanations... Thanks a lot!
@enozeren Жыл бұрын
Great short explanation! Thanks!
@baraaa.23384 жыл бұрын
Awesome Explanation.. Keep it up!
@ritvikmath4 жыл бұрын
Thanks a lot!
@bungercolumbus24 күн бұрын
really well explained good job.
@y0319624 жыл бұрын
Thanks for posting it. It would have been nicer to show how a matrix raised to a power is used in data science.
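For what it's worth, here is the matrix-power trick the video's decomposition enables, sketched with an arbitrary example matrix: since A = U Λ U⁻¹, we have Aⁿ = U Λⁿ U⁻¹, and raising a diagonal matrix to a power is just raising each entry.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # arbitrary example matrix

lam, U = np.linalg.eig(A)

n = 10
# With A = U @ diag(lam) @ inv(U), only the diagonal needs raising to n:
A_n = U @ np.diag(lam ** n) @ np.linalg.inv(U)

print(np.allclose(A_n, np.linalg.matrix_power(A, n)))  # True
```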
@ramsaini37454 ай бұрын
Your videos are helpful and concise at the same time; that's rare on today's YouTube.
@younesarabnedjadi44192 жыл бұрын
OMG, literally understood the eigen shit in 8 minutes, thank you so much
@ritvikmath2 жыл бұрын
Awesome!
@YTGiomar10 ай бұрын
Damn, just a good video. Thank you very much for explaining
@mahdijavadi27473 жыл бұрын
Thanks a lot for this clear explanation!
@abhinavmishra94014 жыл бұрын
Thanks a lot. This was sublime.
@ritvikmath4 жыл бұрын
You're very welcome!
@ekaterinaburakova862911 ай бұрын
Wow, such a good explanation!
@ritvikmath11 ай бұрын
Glad it was helpful!
@CStrik3r4 жыл бұрын
Great explanation !
@jatinkumar44103 жыл бұрын
Thanks...Very nice explanation...
@ritvikmath3 жыл бұрын
You are welcome
@suvikarhu46272 жыл бұрын
7:54 Shouldn't you do the rightmost multiplication first? Lambda * U inverse.
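Either grouping works: matrix multiplication is associative, so (UΛ)U⁻¹ and U(ΛU⁻¹) give the same product. A quick check with arbitrary values:

```python
import numpy as np

U = np.array([[1.0, 2.0],
              [3.0, 5.0]])          # arbitrary invertible matrix
Lam = np.diag([4.0, -1.0])          # arbitrary diagonal matrix
U_inv = np.linalg.inv(U)

# Matrix multiplication is associative: the grouping doesn't matter
left_first = (U @ Lam) @ U_inv
right_first = U @ (Lam @ U_inv)

print(np.allclose(left_first, right_first))  # True
```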
@tanmaygupta828811 ай бұрын
Thank you so much, you are a saviour.
@보라색사과-l1r4 жыл бұрын
Thank you for this amazingly simple explanation! Could you give me an example of that kind of multiplication used in Machine Learning?
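One example of "the same multiplication over and over" in machine learning (my own illustration, not from the video) is power iteration: repeatedly applying a matrix and renormalizing converges to the eigenvector of the largest eigenvalue, which is essentially how PageRank-style computations work. The matrix below is arbitrary.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])   # arbitrary example matrix

x = np.array([1.0, 0.0])
# Apply the same linear transformation again and again, renormalizing
for _ in range(100):
    x = A @ x
    x = x / np.linalg.norm(x)

# x converges to the eigenvector of the largest eigenvalue (up to sign)
lam, U = np.linalg.eig(A)
top = U[:, np.argmax(lam)]
print(np.allclose(np.abs(x), np.abs(top), atol=1e-6))  # True
```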
@sgifford10004 жыл бұрын
You have a great channel! Thanks for the insight which is hard to come by. Just one confusing area to me at the time was the definition of the 2x2 matrices for u1 and u2. They look like 3x2 matrices with values 1 & u1 (or u2). I did figure it out though. Thanks!
@ritvikmath4 жыл бұрын
Thank you!
@Galmion2 жыл бұрын
can you elaborate on this? I still don't get how it isn't a 3x2 matrix.
@Arycke Жыл бұрын
@@Galmion It shouldn't have been written the way it was, in my opinion, as it causes confusion. Those "1's" are just dot-dot-dots (...), meant to be arbitrary entries.
@Arycke Жыл бұрын
@@Galmion The matrix U is the two eigenvectors, u1 and u2, put next to each other in one matrix. And since u1 and u2 are 2x1 vectors, putting them together makes U a 2x2 matrix.
@Arycke Жыл бұрын
@@Galmion I would have chosen an example with no square roots as the first example, personally. Say your eigenvectors are

u1 = [2]    u2 = [4]
     [3]         [5]

Then U, the eigenvector matrix, is

U = [2 4]
    [3 5]

Hope this helps.
@shashankelsalvador2 жыл бұрын
Best intro ever
@user-wr4yl7tx3w2 жыл бұрын
But do most matrices admit an eigendecomposition? If not, doesn't that mean limited use?
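Not every matrix is diagonalizable, and a quick numpy sketch shows the classic counterexample (a shear), along with the reassuring fact that symmetric matrices always work; the matrices here are standard textbook examples, not from the video.

```python
import numpy as np

# A shear has a repeated eigenvalue but only one independent eigenvector,
# so it has no eigendecomposition
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam, U = np.linalg.eig(A)
print(np.allclose(lam, [1.0, 1.0]))   # True: eigenvalue 1, twice
print(abs(np.linalg.det(U)) < 1e-8)   # True: U is singular, no inverse

# But every symmetric matrix does have an eigendecomposition
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam_s, U_s = np.linalg.eig(S)
print(np.allclose(U_s @ np.diag(lam_s) @ np.linalg.inv(U_s), S))  # True
```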
@香港地舖購物2 жыл бұрын
Great video ! Can you also touch on the topic of LU Decomposition, Jordan Canonical Form, Rayleigh quotient, etc. ?
@bashiruddin38913 жыл бұрын
very nice explanation
@ritvikmath3 жыл бұрын
Thanks for liking
@balazsbaranyai81154 жыл бұрын
Man, this rocks! thank you!
@boryanasvetichkova8458 Жыл бұрын
Great video, thanks!
@rozhanmirzaei35123 ай бұрын
Love this. Thank u❤
@amirhosseinmirkazemi7654 жыл бұрын
You are AWESOME! thank you!
@UsmanAbbas-k2c11 ай бұрын
Super helpful. Thanks
@heejuneAhn4 жыл бұрын
Your explanation is the best I have ever seen. But it does not explain what each component really means, i.e., first U^-1 maps/rotates the input vector into the eigenvector basis, then the result is stretched along each eigenvector direction, and finally U rotates the vector back into the original axes.
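This rotate-stretch-rotate-back reading can be checked step by step in numpy (arbitrary symmetric example matrix, so the eigenvector basis is orthogonal):

```python
import numpy as np

# Symmetric example so the eigenvector change of basis is a pure rotation
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, U = np.linalg.eig(A)

x = np.array([1.0, 0.5])

coords = np.linalg.inv(U) @ x   # step 1: express x in the eigenvector basis
stretched = lam * coords        # step 2: stretch each coordinate by its eigenvalue
result = U @ stretched          # step 3: map back to the original basis

print(np.allclose(result, A @ x))  # True: same as applying A directly
```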
@rimshasardarsardarrimsha32093 жыл бұрын
Can you tell me what the pros of this topic are?
@yangwang96883 жыл бұрын
What is the difference between decomposition and factorisation?
@ritvikmath3 жыл бұрын
I think they're often used interchangeably
@hassanshahzad39223 жыл бұрын
this is awesome!
@rohanchess8332 Жыл бұрын
Why do we need normalized eigenvectors? won't any eigenvectors from the family of eigenvectors suffice
@elahedastan49452 жыл бұрын
great explanation
@thomasstiglich34843 жыл бұрын
Great video!
@michael8899aspen3 жыл бұрын
If P=6 or p=7, is this arbitrary p=8?
@gvbvwockee4 жыл бұрын
Thank you. Thank you. Thank you.
@ritvikmath4 жыл бұрын
Any time!
@DRmrTG Жыл бұрын
Thanks for your help!
@NinjaAdorable4 жыл бұрын
That was beautiful !!!! :')
@ritvikmath4 жыл бұрын
thanks!
@Fat_Cat_Fly4 жыл бұрын
Fantastic!!!!!!!!!!!!!!!!!!
@tehminakakar8753 Жыл бұрын
Hey, did anyone solve for the eigenvectors? Maybe I am wrong, but I got x1 = -2/3 x2 and x2 = -3/2 x1 when solving the equations for lambda = -5. If anyone got the answer, please let me know.
@rugahun2 жыл бұрын
Pff, great video. I feel bad I didn't know about this guy earlier; he saves a lot of time.
@yunkkim1159 Жыл бұрын
great job , I had no idea before the video now I know everything
@williamjmccartan88793 ай бұрын
Great job, peace
@jneal415410 ай бұрын
Excellent. I was struggling to understand how the form A=ULU^-1 is reached from the definition of an eigenvalue (Au=lu) as explained in my textbook, but the way you explained it made it all click for me. Thanks!
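For anyone else making the same connection, the step from the eigenvalue definition to the decomposition is just stacking the eigenvector equations as columns (same notation as the video):

```latex
A u_i = \lambda_i u_i \quad (i = 1, \dots, n)
\quad\Longrightarrow\quad
A U = A\,[\,u_1 \;\cdots\; u_n\,]
    = [\,\lambda_1 u_1 \;\cdots\; \lambda_n u_n\,]
    = U \Lambda
\quad\Longrightarrow\quad
A = U \Lambda U^{-1},
```

where the last step requires the eigenvectors to be linearly independent, so that U is invertible.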
@ritvikmath10 ай бұрын
Excellent!
@alejandropalaciosgarcia27673 жыл бұрын
Excellent
@marc27522 жыл бұрын
awesome thanks
@zddmsmify4 жыл бұрын
I wish it had been explained to me this simply when I was studying it almost 30 years ago.
@zddmsmify4 жыл бұрын
Exceptional explanation
@ritvikmath4 жыл бұрын
Thanks for the kind words!
@davidfield5295 Жыл бұрын
Good video
@satyamgupta4808 Жыл бұрын
very nice
@svengunther76533 жыл бұрын
Thanks man!
@thedailyepochs3383 жыл бұрын
Beautiful
@mydodethailung395 Жыл бұрын
Amazing
@NeoZondix2 жыл бұрын
Thanks
@klingefjord4 жыл бұрын
Hang on, if a matrix times its inverse is the identity matrix, why can't the formula for eigendecomposition (U * lambda * U^-1) be simplified to just lambda?
@Pukimaxim4 жыл бұрын
You cannot rearrange matrix products the way you would with numbers/variables.
@Rudolf-ul1zh4 жыл бұрын
Exactly, matrix multiplication is not commutative!
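A concrete check of this (matrix values arbitrary): the factors in U Λ U⁻¹ cannot be reordered so that U and U⁻¹ cancel, so the product is generally not Λ, even though it does keep Λ's eigenvalues.

```python
import numpy as np

U = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # arbitrary invertible matrix
Lam = np.diag([2.0, 3.0])
A = U @ Lam @ np.linalg.inv(U)

# The factors can't be reordered to cancel U with U^-1:
print(np.allclose(A, Lam))                                      # False
# What does survive the sandwich: A has the same eigenvalues as Lam
print(np.allclose(np.sort(np.linalg.eigvals(A)), [2.0, 3.0]))   # True
```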