Great clarity. You clearly understand the material at a deep level, which makes it easy for you to teach.
@patyyyou · 4 years ago
Nicely done. It hit the right level for someone who understands the linear algebra behind eigenvectors and eigenvalues but still needed to connect a dot or two in applying PCA to a problem. Again, thank you!
@tusharkush7 · 4 years ago
This video needs a golden buzzer.
@simonala7090 · 1 year ago
Agreed!!!
@MathematicsMadeSimple1 · 4 years ago
Clear explanation. Thank you for shedding more light, especially on the application of eigenvalues and eigenvectors.
@sebgrootus · 11 months ago
Incredible video. Genuinely exactly what I needed.
@riazali-vi8tu · 5 years ago
Well explained, you should do more videos
@davestaggers2981 · 3 years ago
Graphical interpretation of covariance is very intuitive and useful for me. Thank you.
@roshinroy5129 · 2 years ago
Awesome explanation!! Nobody did it better!
@f0xn0v4 · 1 year ago
I have always dreaded statistics, but this video made these concepts so simple while connecting it to Linear algebra. Thank you so much ❤
@blackshadowofmysoul · 3 years ago
Best PCA Visual Explanation! Thank You!!!
@apesnajnin · 4 years ago
Really amazing lecture! It made my understanding of eigenvalues and eigenvectors clear. Thanks a lot!
@eturkoz · 5 years ago
Your explanations are awesome! Thank you!
@anuraratnasiri5516 · 5 years ago
Beautifully explained! Thank you so much!
@Muuip · 2 years ago
Great concise presentation, much appreciated! 👍
@Trubripes · 9 months ago
Thanks for concisely explaining that PCA is just SVD on the covariance matrix.
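For anyone who wants to check that claim numerically, here is a minimal numpy sketch (the data matrix X is made up; for a symmetric covariance matrix, the SVD and the eigendecomposition give the same directions):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))      # toy data: 200 samples, 3 features
    Xc = X - X.mean(axis=0)            # center each feature
    C = Xc.T @ Xc / (len(X) - 1)       # sample covariance matrix

    U, S, Vt = np.linalg.svd(C)        # SVD of the covariance matrix
    w, V = np.linalg.eigh(C)           # eigendecomposition (ascending order)

    # Singular values equal eigenvalues, and singular vectors match the
    # eigenvectors up to sign, because C is symmetric positive semi-definite.
    print(np.allclose(S, w[::-1]))                     # True
    print(np.allclose(np.abs(U), np.abs(V[:, ::-1])))  # True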
@bottom297 · 5 months ago
Extremely helpful. Thank you!
@simonala7090 · 1 year ago
Would love to request an in-person version.
@TheSyntaxerror1 · 5 years ago
Love this video, great work!
@danielheckel2755 · 5 years ago
Nice visual explanation of covariance!
@softpeachhy8967 · 3 years ago
1:37 shouldn’t the covariance be divided by (n-1)?
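(For reference: dividing by n gives the population covariance, while the unbiased sample estimate divides by n − 1, i.e. cov(x, y) = Σᵢ (xᵢ − x̄)(yᵢ − ȳ) / (n − 1); for large n the difference is negligible.)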
@DanielDa2 · 3 years ago
great explanation
@basavg1 · 2 years ago
Very nice... please keep posting.
@Pedritox0953 · 3 years ago
Good explanation
@sakkariyaibrahim2650 · 3 years ago
Good lecture
@Timbochop · 2 years ago
Good job, no wasted time
@matato2932 · 2 years ago
thank you for this amazing and simple explanation
@Lapelu9 · 1 year ago
I thought PCA was a hard concept. Your video is so great!
@prof.laurenzwiskott · 2 years ago
Very nice video. I plan to use it for my teaching. What puzzles me a bit is that the PCs you give as an example are not orthogonal to each other.
@nickweimer6126 · 5 years ago
Great job explaining this
@adriantorresnunez · 6 years ago
Best explanation of PCA I have heard. Thank you.
@vietdaoquoc7629 · 1 year ago
thank you for this amazing video
@Agastya007 · 3 years ago
Plz do more videos
@skewbinge6157 · 3 years ago
thanks for this simple yet very clear explanation
@123arskas · 2 years ago
Thank you. It was beautiful
@VivekTR · 4 years ago
Hello Emma, Great job! Very nicely explained.
@Darkev77 · 3 years ago
I do understand that eigenvalues represent the factor by which the eigenvectors are scaled, but how do they signify “the importance of certain behaviors in a system”? What other information do eigenvalues tell us besides a scaling factor? Also, why do eigenvectors point towards the spread of the data?
@malstroemphi1096 · 1 year ago
If you consider a raw matrix or purely geometric examples, eigenvalues are indeed just scaling factors, and you cannot say much more. But here we have additional context: we are doing statistics, and we have put data into a covariance matrix, which lets us add more interpretation. The eigenvector is not just the eigenvector of some arbitrary matrix; it is the eigenvector of a *covariance matrix*, whose entries measure every pairwise spread of the data. That is why we can say an eigenvector points towards a direction of spread in the data, and its length (eigenvalue) relates to the importance of that spread.
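A quick numerical illustration of that point, as a sketch (toy 2D data stretched along a known 45° direction; everything here is made up for the example):

    import numpy as np

    rng = np.random.default_rng(1)
    t = rng.normal(scale=3.0, size=500)      # large spread along [1, 1]
    s = rng.normal(scale=0.3, size=500)      # small spread along [-1, 1]
    X = np.outer(t, [1, 1]) + np.outer(s, [-1, 1])

    C = np.cov(X, rowvar=False)              # 2x2 covariance matrix
    w, V = np.linalg.eigh(C)                 # eigenvalues in ascending order

    print(V[:, -1])         # top eigenvector: ~[0.707, 0.707] up to sign
    print(w[-1] / w.sum())  # share of total variance along that direction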
@1291jes · 4 years ago
This is excellent, Emma... I will subscribe to your videos!
@zendanmoko5005 · 2 years ago
Thank you! Very nice video, well explained!
@Agastya007 · 3 years ago
I love the way you pronounce "data" at 3:34 😁😁
@Rami_Zaki-k2b · 1 year ago
PS: Video is targeted at people who already have a deep knowledge of what the video is trying to explain.
@latanezimbardo7129 · 3 years ago
1:28 I personally visualise covariance like this. I always thought I was wrong, since I have never seen others do it. How come??
@stephenaloia6695 · 3 years ago
Thank you, Ma'am!
@TechLord79 · 5 years ago
Very well done!
@szilike_10 · 3 years ago
Believe it or not, I've been wondering a lot about the concept of covariance, because every video seems to miss the reason behind the idea. But I think I kind of figured it out today before watching this video, and I drew exactly the same thing that is in the thumbnail. So I guess I was thinking correctly :))
@jordigomeztorreguitart · 4 years ago
Great explanation. Thank you.
@abdulrahmanmohamed8800 · 5 years ago
A very good explanation.
@vitokonte · 4 years ago
Very nice explanation!
@Matt-bq9fi · 4 years ago
Great explanation!
@user-or7ji5hv8y · 3 years ago
Wow, that was quite a good explanation.
@liviyabags · 4 years ago
I LOVE YOU !!!!! What an explanation... thank you so much!
@skshahid5565 · 1 year ago
Why did you stop making videos?
@m.y.s4260 · 5 years ago
awesome explanation! thx!
@tusharpandey6584 · 4 years ago
awesome explanation! make more vids pls
@haroldsu · 4 years ago
Thank you for this great lecture.
@arjunbemarkar7414 · 4 years ago
How do you find eigenvalues and eigenvectors from the covariance matrix?
@Eta_Carinae__ · 4 years ago
Same as usual, right? Find λ using det(Σ − λI) = 0: subtract λ from the main diagonal of the covariance matrix and take the determinant, and you're left with a polynomial in λ, which you then solve; each distinct root is an eigenvalue.
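A small sketch of both routes for a made-up 2×2 covariance matrix (in practice you would just call a library routine like numpy's eigh):

    import numpy as np

    C = np.array([[2.0, 0.8],
                  [0.8, 0.6]])   # made-up 2x2 covariance matrix

    # By hand: det(C - lambda*I) = 0 for a 2x2 matrix reduces to
    # lambda^2 - trace(C)*lambda + det(C) = 0.
    tr, det = np.trace(C), np.linalg.det(C)
    print(np.sort(np.roots([1.0, -tr, det])))   # the two eigenvalues

    # With a library: eigh is the right call for symmetric matrices.
    w, V = np.linalg.eigh(C)
    print(w)   # same eigenvalues, ascending
    print(V)   # columns are the corresponding eigenvectors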
@saDikus1 · 2 years ago
Great video! Can anyone tell how she decided that PC1 is spine length and PC2 is body mass? Should we guess (hypothesize) this in real-world scenarios?
@crispinfoli9448 · 4 years ago
Great video, thank you!
@콘충이 · 4 years ago
Awesome!
@subinnair3835 · 6 years ago
Dear ma'am, how did you obtain the matrix at 5:30?
@emfreedman3905 · 6 years ago
Find the covariance matrix of these variables, like at 2:15, and compute its eigendecomposition (find its two dominant eigenvectors). The matrix at 5:30 consists of the two dominant eigenvectors; each column is an eigenvector.
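A sketch of that recipe end to end (data and sizes are placeholders, not the video's actual numbers):

    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.normal(size=(100, 4))     # placeholder data: 100 samples, 4 features

    Xc = X - X.mean(axis=0)           # center the data
    C = np.cov(Xc, rowvar=False)      # 4x4 covariance matrix

    w, V = np.linalg.eigh(C)          # eigenvalues ascending
    W = V[:, ::-1][:, :2]             # two dominant eigenvectors as columns

    scores = Xc @ W                   # project the data onto PC1 and PC2
    print(W.shape, scores.shape)      # (4, 2) (100, 2)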
@subinnair3835 · 6 years ago
Emma Freedman, thank you! The video's explanation was great and covered all the fundamentals required to fully understand PCA!! 😃
@DoFlamingo_1P · 4 years ago
AWESOMEEE 🤘🤘🤘
@EdeYOlorDSZs · 3 years ago
Poggers explanation, thank you!
@SpeakerSparkTalks · 5 years ago
nicely explained
@nbr2737 · 3 years ago
beautiful, thanks a lot!
@raghav5520 · 6 years ago
Well explained
@AEARArg · 4 years ago
Congratulations Emma, your work is excellent!
@siliencea9362 · 4 years ago
thank you so much!! :)
@mahmoudreda1083 · 5 years ago
thanks A LOT
@spyhunter0066 · 2 years ago
Around 1:36 you said "we divide by n for covariance", but we divide by n − 1 instead. Please do check on that. Thanks for the video. Maybe I should say that the estimated (sample) covariance uses the n − 1 division.
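For what it's worth, numpy's np.cov already uses the n − 1 (sample) convention by default, so the two versions are easy to compare:

    import numpy as np

    x = np.array([1.0, 2.0, 4.0, 7.0])
    y = np.array([1.0, 3.0, 3.0, 8.0])
    n = len(x)

    xc, yc = x - x.mean(), y - y.mean()
    print((xc @ yc) / (n - 1))   # sample covariance: 7.5
    print((xc @ yc) / n)         # population covariance: 5.625
    print(np.cov(x, y)[0, 1])    # 7.5 (matches the n-1 version by default)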
@astiksachan8135 · 4 years ago
5:12 was very good
@thryce82 · 4 years ago
Nice job, I was always kinda confused by this.
@KimJennie-fl3sg · 4 years ago
I just love the voice🙄😸
@ivandda00 · 7 months ago
ty
@checkout8352 · 6 years ago
Thanks
@Shrek-pooh · 1 month ago
“Eigenvectors perform the same transformations as matrices”: what does this even mean? “They project the behaviour of the system and also signify the relative dominance of certain behaviours within the system”: what does this even mean?
@tractatusviii7465 · 4 years ago
investigate hedge/hogs
@getmotivated3619 · 5 years ago
You are awesome... you make a mediocre out of a know-nothing.
@astiksachan8135 · 4 years ago
4:35
@ABC-hi3fy · 3 years ago
No one explains why they use the covariance matrix. Why not use the actual data and find its eigenvectors/eigenvalues? I have watched hundreds of videos and books and no one explains that. It just doesn't make sense to me to use the covariance matrix. Covariance is a very useless parameter; it doesn't tell you much at all.
@malstroemphi1096 · 1 year ago
It does, especially with PCA. But you are right that you need actual data. Say the data are the 3D points of some 3D object: if you use this technique (build a covariance matrix from the 3D points and take its PCA), you will find a vector aligned with the overall direction of the shape; for instance, you will find the main axis of a 3D cylinder. That is quite useful information.
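A toy version of that example, as a sketch (synthetic cylinder points along a known axis, so you can see PCA recover it; all numbers are made up):

    import numpy as np

    rng = np.random.default_rng(3)
    axis = np.array([1.0, 2.0, 2.0]) / 3.0   # known unit axis of the cylinder

    # Synthetic cylinder: height h along the axis, radius 0.5 around it.
    h = rng.uniform(-5, 5, size=1000)
    theta = rng.uniform(0, 2 * np.pi, size=1000)
    u = np.cross(axis, [0.0, 0.0, 1.0])
    u /= np.linalg.norm(u)
    v = np.cross(axis, u)                    # u, v: unit vectors normal to axis
    P = (np.outer(h, axis)
         + 0.5 * np.outer(np.cos(theta), u)
         + 0.5 * np.outer(np.sin(theta), v))

    C = np.cov(P, rowvar=False)              # 3x3 covariance of the point cloud
    w, V = np.linalg.eigh(C)
    main_axis = V[:, -1]                     # top eigenvector
    print(np.abs(main_axis @ axis))          # ~1.0: the cylinder axis is recovered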
@2894031 · 3 years ago
Babe, var(x, x) makes no sense; either you say var(x) or cov(x, x).