Absolutely marvellous lecture! I have never heard a lecture of this quality and clarity, delivered at exactly the right speed. It has taken me five decades to understand these terms! Thank you!
@sowmyas4213 · 1 month ago
I am enjoying your lectures, starting from the basics. Many things I failed to understand as a student finally make sense. You are a marvelous teacher. Thank you so much!
@cypherecon5989 · 2 months ago
Hands down the best video on eigenvalues and eigenvectors on YouTube.
@RS-hs6ni · 1 month ago
Some people are born to teach. Thanks a lot for this amazing series of videos about multivariate analysis.
@Nikita-uv7zk · 1 month ago
Thank you very much for making this video. I don't have a statistics background, yet it was easy to comprehend. The pace of teaching was perfect, and the concept was explained with great clarity.
@tilestatsАй бұрын
Glad you enjoyed it!
@Vickdeem_007 · 7 months ago
I wish all YouTube channels would teach as concisely and understandably as you do.
@shark-p4o · 7 months ago
This channel should have over 1 million subscribers. Its content is far better than 99% of the maths and ML content on YouTube.
@VKjkd · 2 months ago
I truly feel this is the best explanation I've read or seen. I knew the details were lurking in plain sight; thank you. This channel deserves millions of views.
@mrbilalkhan · 4 months ago
Best tutorial ever on eigenvectors and eigenvalues. Thanks!
@iamgrinhausgases · 1 month ago
I just wanted to say THANK YOU for making such a wonderful tutorial! I finally have a proper understanding of this topic.
@burburchacha · 1 year ago
Straightforward explanation. Thank you! I don't know why all the other videos on this topic make things so overcomplicated.
@sakkariyaibrahim2650 · 5 months ago
Great. After wandering through hundreds of videos, I now know what an eigenvector and an eigenvalue are 👍
@kuleeeet1 · 2 months ago
Thank you very much for this channel; it is like a treasure.
@thenumbersguy8288 · 10 months ago
Excellent, the approach, the pace, the details!
@v.vidhya6752 · 1 year ago
This channel gave me the best explanation of eigenvectors, eigenvalues, and orthogonal eigenvectors. Thanks a ton.
@baduraldeenali2411 · 2 years ago
This is the clearest and simplest demo I have seen. Thank you very much.
@Lucyferandtheson003 · 1 year ago
This is the best lecture on eigenvectors and eigenvalues. I now have a good basic understanding of what these are. Thank you so much, sir.
@DhanasekaranT-de4wz · 1 year ago
This important topic has eluded me for several years, but this video made the concept clear with appropriate examples. ❤❤❤❤
@sefatergbashi · 1 year ago
I absolutely love this channel. The explanations are so clear that I understand every bit of this complex topic
@hadidavardoust7893 · 9 months ago
The best training on eigenvectors.
@Frdy12345 · 9 months ago
You are an amazing teacher, thanks.
@travelingmenagerie · 1 year ago
You are amazing. All my life I've been fearful of this topic, but I totally get it now. What I really like is the fact that you use a very simple data table to show things in concrete terms, and you work through the example using that table. Thank you so very much!!
@elenapopova4569 · 2 years ago
Great explanation of eigenvectors and eigenvalues! My favorite one! Thank you very much! :))))
@jillurrahman6913 · 7 months ago
Awesome teaching skills. Thank you so much, boss!
@manjusha7526 · 1 year ago
A simple and easy video on eigenvalues and eigenvectors.
@MrDavidaslv · 7 months ago
Awesome explanation 👍 Eigenvectors vs. eigenvalues.
@debasiskar4662 · 2 years ago
What an excellent description of things. Hats off to you.
@tilestats · 2 years ago
Thank you!
@thegtdoc · 9 months ago
Excellent series.
@aindrilasaha1592 · 3 years ago
Great explanation, really helpful for beginners. Thank you so much!!
@tilestats · 3 years ago
Thank you!
@mahdi1594 · 2 years ago
Bro, great job. I love the way you explain things. You might see this comment copied and pasted across a few of your other videos; I am just doing this for the algorithm.
@tilestats · 2 years ago
Thank you!
@mrbilalkhan · 4 months ago
At 08:20, I wonder how you would normalize the eigenvector if there were three rows in the vector instead of two?
@tilestats · 4 months ago
Just extend the equation by the third value: sum the three squared values and then take the square root of the sum.
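The rule in the reply above can be sketched in a few lines of numpy (the three-element vector here is made up for illustration, not taken from the video):

```python
import numpy as np

v = np.array([2.0, 1.0, 2.0])  # hypothetical 3-element eigenvector

# Sum the three squared values, then take the square root of the sum
length = np.sqrt(np.sum(v ** 2))  # sqrt(4 + 1 + 4) = 3.0

# Dividing by the length gives the normalized (unit) eigenvector
unit_v = v / length

print(length)                  # 3.0
print(np.linalg.norm(unit_v))  # 1.0
```

The same formula extends to any number of elements; `np.linalg.norm(v)` computes it directly.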
@vgreddysaragada · 1 year ago
Very well explained.
@HaithamAhmed-kr8yl · 10 months ago
Good explanation
@jaypople8885 · 2 months ago
Thanks Man ❤❤❤
@Mathclub63 · 5 months ago
Absolutely ❤❤❤❤
@HeavenlyGodlyAngelic · 3 months ago
Thank you
@romanemul1 · 3 years ago
This explanation is super sweet, and I have seen it a couple of times on other sites and channels. But when I see applications of SVD, PCA, or face recognition, that intuition about "stretched but not rotated" or "it's just a multiple of the vector" gets lost somewhere. And how does that relate to the determinant being 0?
@tilestats · 3 years ago
Yes, the math behind PCA is not easy. However, the purpose of this video is to introduce students to eigenvectors before they watch my videos about PCA. I would recommend that you watch my videos about multivariate statistics in order, on my homepage: www.tilestats.com/
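The two pieces of intuition asked about above can be checked numerically. A minimal sketch (the matrix is a made-up symmetric example): for an eigenpair, multiplying by the matrix only stretches the eigenvector, and the determinant of A minus lambda times I is zero because that shifted matrix must be singular to allow a nonzero solution.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # hypothetical symmetric 2x2 matrix

eigvals, eigvecs = np.linalg.eig(A)
lam, v = eigvals[0], eigvecs[:, 0]  # one eigenvalue/eigenvector pair

# "Stretched but not rotated": A @ v is just a multiple of v
print(np.allclose(A @ v, lam * v))  # True

# det(A - lambda*I) = 0: the shifted matrix is singular, which is
# exactly what permits a nonzero v with (A - lambda*I) @ v = 0
print(np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0))  # True
```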
@danielchimezie8680 · 2 years ago
Splendid. Thank you.
@tilestats · 2 years ago
Thank you!
@dailyenglishforyou9119 · 8 months ago
Thanks
@A.Safwat8 · 10 months ago
Thanks!!
@ilhomsadriddinov3627 · 2 years ago
great
@shanzou2548 · 2 years ago
Hi, I'm just confused about why an n×n matrix has n eigenvectors and eigenvalues.
@tilestats · 2 years ago
It has to do with the degree of the polynomial in the calculations. For example, a two-by-two matrix results in a polynomial of degree 2, which can have a maximum of two roots. In the next video, I show how the polynomial is generated in the calculations: kzbin.info/www/bejne/gKXGf5hjYsumr6M
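The degree-2 polynomial mentioned in the reply is the characteristic polynomial. A quick sketch with a made-up 2x2 matrix (for 2x2, the polynomial works out to lambda^2 - trace(A)*lambda + det(A)); its two roots match the eigenvalues numpy reports:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # hypothetical 2x2 example

# Characteristic polynomial det(A - lambda*I) for a 2x2 matrix:
# lambda^2 - trace(A)*lambda + det(A), a degree-2 polynomial
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.roots(coeffs)  # at most two roots = the eigenvalues

print(np.sort(roots))             # [1. 3.]
print(np.sort(np.linalg.eigvals(A)))  # [1. 3.], matches
```

An n×n matrix gives a degree-n polynomial, hence at most n eigenvalues.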
@datascience-vf9kx · 2 years ago
How can I download the slides?
@tilestats · 2 years ago
www.tilestats.com/shop/
@younique9710 · 1 year ago
Thank you for the great video! I wonder if we can calculate eigenvalues and eigenvectors for a 3×2 matrix.
@tilestats · 1 year ago
No, it has to be a square matrix.
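This restriction is easy to see in practice: numpy's `eig` rejects a rectangular matrix outright (the 3x2 matrix below is made up). For rectangular matrices, the SVD plays the analogous role:

```python
import numpy as np

M = np.arange(6.0).reshape(3, 2)  # hypothetical 3x2 (non-square) matrix

try:
    np.linalg.eig(M)
    is_square_ok = True
except np.linalg.LinAlgError:
    is_square_ok = False  # eig refuses: eigendecomposition needs a square matrix

print(is_square_ok)  # False

# For a rectangular matrix, the SVD is the analogous decomposition:
U, s, Vt = np.linalg.svd(M)  # singular values instead of eigenvalues
print(s.shape)  # (2,) -- min(3, 2) singular values
```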
@younique9710 · 1 year ago
@tilestats Thank you for your response. So, can we not use canonical correlation analysis for an n-by-5 matrix?
@tilestats · 1 year ago
Sure, because you compute the eigenvectors on the covariance matrix, which is a square matrix: kzbin.info/www/bejne/aKW4pqyNidmDp68
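The point of the reply above, sketched with random made-up data: an n-by-5 data matrix yields a 5x5 covariance matrix, which is square (and symmetric), so its eigendecomposition is well defined.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))  # hypothetical n-by-5 data matrix (n = 100)

C = np.cov(X, rowvar=False)    # 5x5 covariance matrix: square and symmetric
eigvals, eigvecs = np.linalg.eigh(C)  # eigh is the routine for symmetric matrices

print(C.shape)        # (5, 5)
print(eigvals.shape)  # (5,) -- one eigenvalue per variable
```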