Visual Explanation of Principal Component Analysis, Covariance, SVD

  95,737 views

Em Freedman


Comments: 88
@zacmac · 3 years ago
Great clarity. You clearly understand your stuff at a deep level, so it's easy for you to teach.
@patyyyou · 4 years ago
Nicely done. It hit the right level for someone who understands the linear algebra behind eigenvectors and eigenvalues but still needed to connect a dot or two in applying PCA to a problem. Again, thank you!
@tusharkush7 · 4 years ago
This video needs a golden buzzer.
@simonala7090 · 1 year ago
Agreed!!!
@MathematicsMadeSimple1 · 4 years ago
Clear explanation. Thank you for shedding more light, especially on the application of eigenvalues and eigenvectors.
@sebgrootus · 11 months ago
Incredible video. Genuinely exactly what I needed.
@riazali-vi8tu · 5 years ago
Well explained; you should do more videos.
@davestaggers2981 · 3 years ago
The graphical interpretation of covariance is very intuitive and useful for me. Thank you.
@roshinroy5129 · 2 years ago
Awesome explanation!! Nobody did it better!
@f0xn0v4 · 1 year ago
I have always dreaded statistics, but this video made these concepts so simple while connecting them to linear algebra. Thank you so much ❤
@blackshadowofmysoul · 3 years ago
Best PCA visual explanation! Thank you!!!
@apesnajnin · 4 years ago
Really amazing lecture! It made my understanding of eigenvalues and eigenvectors clear. Thanks a lot!
@eturkoz · 5 years ago
Your explanations are awesome! Thank you!
@anuraratnasiri5516 · 5 years ago
Beautifully explained! Thank you so much!
@Muuip · 2 years ago
Great, concise presentation, much appreciated! 👍
@Trubripes · 9 months ago
Thanks for concisely explaining that PCA is just SVD on the covariance matrix.
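The relationship this comment mentions can be sketched in a few lines of numpy (an illustrative example with made-up data, not from the video): the singular values of the symmetric covariance matrix coincide with its eigenvalues.

```python
import numpy as np

# Small made-up 2-D dataset (rows = samples, columns = variables)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[2.0, 0.0], [1.2, 0.5]])

# Center the data and form the covariance matrix
Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / (len(X) - 1)

# SVD of the (symmetric) covariance matrix: columns of U are the
# principal directions, S holds the variances along them
U, S, Vt = np.linalg.svd(C)

# The same quantities fall out of an eigendecomposition
vals, vecs = np.linalg.eigh(C)   # eigh returns eigenvalues in ascending order
assert np.allclose(S, vals[::-1])
```

For a symmetric positive semi-definite matrix the two factorizations agree up to ordering and sign, which is why "PCA via SVD" and "PCA via eigendecomposition" are the same computation here.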
@bottom297 · 5 months ago
Extremely helpful. Thank you!
@simonala7090 · 1 year ago
Would love to request an in-person version.
@TheSyntaxerror1 · 5 years ago
Love this video, great work!
@danielheckel2755 · 5 years ago
Nice visual explanation of covariance!
@softpeachhy8967 · 3 years ago
1:37 shouldn't the covariance be divided by (n-1)?
@DanielDa2 · 3 years ago
Great explanation.
@basavg1 · 2 years ago
Very nice, please keep posting.
@Pedritox0953 · 3 years ago
Good explanation.
@sakkariyaibrahim2650 · 3 years ago
Good lecture.
@Timbochop · 2 years ago
Good job, no wasted time.
@matato2932 · 2 years ago
Thank you for this amazing and simple explanation.
@Lapelu9 · 1 year ago
I thought PCA was a hard concept. Your video is so great!
@prof.laurenzwiskott · 2 years ago
Very nice video. I plan to use it for my teaching. What puzzles me a bit is that the PCs you give as an example are not orthogonal to each other.
@nickweimer6126 · 5 years ago
Great job explaining this.
@adriantorresnunez · 6 years ago
Best explanation I have heard of PCA. Thank you.
@vietdaoquoc7629 · 1 year ago
Thank you for this amazing video.
@Agastya007 · 3 years ago
Please do more videos.
@skewbinge6157 · 3 years ago
Thanks for this simple yet very clear explanation.
@123arskas · 2 years ago
Thank you. It was beautiful.
@VivekTR · 4 years ago
Hello Emma, great job! Very nicely explained.
@Darkev77 · 3 years ago
I understand that eigenvalues represent the factor by which the eigenvectors are scaled, but how do they signify "the importance of certain behaviors in a system"? What other information do eigenvalues tell us besides a scaling factor? Also, why do eigenvectors point toward the spread of the data?
@malstroemphi1096 · 1 year ago
If you consider a raw matrix or just geometric examples, eigenvalues are indeed just scaling factors, and you cannot say much more. But here we have additional context: we are doing statistics and putting data into a covariance matrix, so we can add more interpretation. The eigenvector is not just some eigenvector of some matrix; it is an eigenvector of a *covariance matrix*, whose elements measure all the possible spreads of the data. That is why we can say an eigenvector points toward the spread of the data, and its eigenvalue relates to the importance of that spread.
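This point can be checked numerically. In the sketch below (made-up data, stretched along a direction chosen for illustration), the dominant eigenvector of the covariance matrix recovers the stretch direction:

```python
import numpy as np

# Synthetic data with most of its variance along [1, 1] / sqrt(2)
# (the direction is an assumption chosen for this illustration)
rng = np.random.default_rng(1)
t = rng.normal(scale=3.0, size=500)          # large spread along [1, 1]
noise = rng.normal(scale=0.3, size=(500, 2)) # small isotropic noise
X = np.outer(t, [1.0, 1.0]) / np.sqrt(2) + noise

C = np.cov(X, rowvar=False)
vals, vecs = np.linalg.eigh(C)
dominant = vecs[:, np.argmax(vals)]   # eigenvector of the largest eigenvalue

# Up to sign, it points along the direction the data was stretched
print(np.abs(dominant))               # ≈ [0.707, 0.707]
```

The largest eigenvalue here is close to the variance of the stretch (about 9), while the other stays near the noise variance, which is the "importance" reading of eigenvalues in this context.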
@1291jes · 4 years ago
This is excellent, Emma... I will subscribe to your videos!
@zendanmoko5005 · 2 years ago
Thank you! Very nice video, well explained!
@Agastya007 · 3 years ago
I love the way you spelled "data" at 3:34 😁😁
@Rami_Zaki-k2b · 1 year ago
PS: This video is targeted at people who already have a deep knowledge of what it is trying to explain.
@latanezimbardo7129 · 3 years ago
1:28 I personally visualize covariance like this. I always thought I was wrong, and I have never seen others doing this. How come?
@stephenaloia6695 · 3 years ago
Thank you, Ma'am!
@TechLord79 · 5 years ago
Very well done!
@szilike_10 · 3 years ago
Believe it or not, I've been wondering a lot about the concept of covariance, because every video seems to miss the reasoning behind the idea. But I think I kind of figured it out today before watching this video, and I drew exactly the same thing that is in the thumbnail. So I guess I was thinking correctly :))
@jordigomeztorreguitart · 4 years ago
Great explanation. Thank you.
@abdulrahmanmohamed8800 · 5 years ago
A very good explanation.
@vitokonte · 4 years ago
Very nice explanation!
@Matt-bq9fi · 4 years ago
Great explanation!
@user-or7ji5hv8y · 3 years ago
Wow, that was quite a good explanation.
@liviyabags · 4 years ago
I LOVE YOU !!!!! What an explanation... thank you so much.
@skshahid5565 · 1 year ago
Why did you stop making videos?
@m.y.s4260 · 5 years ago
Awesome explanation! Thanks!
@tusharpandey6584 · 4 years ago
Awesome explanation! Make more vids please.
@haroldsu · 4 years ago
Thank you for this great lecture.
@arjunbemarkar7414 · 4 years ago
How do you find eigenvalues and eigenvectors from the covariance matrix?
@Eta_Carinae__ · 4 years ago
Same as usual, right? Find λ using det(Σ - λI) = 0: subtract λ from the main diagonal of the covariance matrix, take the determinant, and you're left with a polynomial in λ, which you then solve; each root is an eigenvalue.
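For a 2x2 covariance matrix this recipe can be checked numerically. The matrix below is a made-up example; the characteristic polynomial of a 2x2 matrix is λ² - trace(Σ)·λ + det(Σ):

```python
import numpy as np

# Hypothetical 2x2 covariance matrix, for illustration only
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

# Coefficients of det(Sigma - lambda*I) = lambda^2 - tr(Sigma)*lambda + det(Sigma)
coeffs = [1.0, -np.trace(Sigma), np.linalg.det(Sigma)]
lam = np.roots(coeffs)            # roots of the characteristic polynomial

# The roots match the library's eigenvalues
print(np.sort(lam))
print(np.sort(np.linalg.eigvalsh(Sigma)))
```

For larger matrices one would call `np.linalg.eigh` directly rather than form the polynomial, but the 2x2 case makes the det(Σ - λI) = 0 recipe concrete.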
@saDikus1 · 2 years ago
Great video! Can anyone tell me how she decided that PC1 is spine length and PC2 is body mass? Should we guess (hypothesize) this in real-world scenarios?
@crispinfoli9448 · 4 years ago
Great video, thank you!
@콘충이 · 4 years ago
Awesome!
@subinnair3835 · 6 years ago
Dear ma'am, how did you obtain the matrix at 5:30?
@emfreedman3905 · 6 years ago
Find the covariance matrix of these variables, as at 2:15, and find its eigendecomposition (its two dominant eigenvectors). The matrix at 5:30 is the two dominant eigenvectors; each column is an eigenvector.
@subinnair3835 · 6 years ago
Emma Freedman, thank you! The video's explanation was great and covered all the fundamentals required to fully understand PCA!! 😃
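The procedure described in that reply might look like this in numpy, with random stand-in data instead of the video's measurements (the shapes and variable count are assumptions for illustration):

```python
import numpy as np

# Hypothetical dataset: 50 samples of 4 measured variables
rng = np.random.default_rng(2)
X = rng.normal(size=(50, 4))

# Center, then form the covariance matrix
Xc = X - X.mean(axis=0)
C = np.cov(Xc, rowvar=False)

# Eigendecomposition; keep the two dominant eigenvectors
vals, vecs = np.linalg.eigh(C)
order = np.argsort(vals)[::-1]
W = vecs[:, order[:2]]            # 4x2 matrix: each column is an eigenvector

# Project the data onto the two principal components
scores = Xc @ W                   # 50x2: PC1 and PC2 coordinates
print(scores.shape)               # (50, 2)
```

`W` plays the role of the matrix shown at 5:30: one column per dominant eigenvector, and multiplying by it projects each sample onto the principal components.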
@DoFlamingo_1P · 4 years ago
AWESOMEEE 🤘🤘🤘
@EdeYOlorDSZs · 3 years ago
Poggers explanation, thank you.
@SpeakerSparkTalks · 5 years ago
Nicely explained.
@nbr2737 · 3 years ago
Beautiful, thanks a lot!
@raghav5520 · 6 years ago
Well explained.
@AEARArg · 4 years ago
Congratulations Emma, your work is excellent!
@siliencea9362 · 4 years ago
Thank you so much!! :)
@mahmoudreda1083 · 5 years ago
Thanks A LOT.
@spyhunter0066 · 2 years ago
Around 1:36 you said "we divide by n" for covariance, but we divide by n-1 instead. Please do check on that. Thanks for the video. Maybe I should say the estimated (sample) covariance uses the n-1 division.
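The two conventions differ only in the normalizing constant: dividing by n gives the population covariance, dividing by n-1 the unbiased sample estimate. A tiny numpy check (with made-up numbers) confirms that `np.cov` defaults to the n-1 version:

```python
import numpy as np

# Toy sample, values chosen only for illustration
x = np.array([1.0, 2.0, 4.0])
y = np.array([0.0, 1.0, 5.0])

n = len(x)
dx, dy = x - x.mean(), y - y.mean()

pop_cov = (dx @ dy) / n           # divide by n: population covariance
sample_cov = (dx @ dy) / (n - 1)  # divide by n-1: unbiased sample estimate

# numpy's default matches the n-1 convention (ddof=1 / bias=False)
print(np.isclose(np.cov(x, y)[0, 1], sample_cov))   # True
```

For large n the two differ negligibly, which is why explanations often gloss over the distinction.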
@astiksachan8135 · 4 years ago
5:12 was very good.
@thryce82 · 4 years ago
Nice job, I was always kind of confused by this.
@KimJennie-fl3sg · 4 years ago
I just love the voice 🙄😸
@ivandda00 · 7 months ago
Thank you.
@checkout8352 · 6 years ago
Thanks.
@Shrek-pooh · 1 month ago
"Eigenvectors perform the same transformations as matrices": what does this even mean? "They project the behaviour of the system and also signify the relative dominance of certain behaviours within the system": what does this even mean?
@tractatusviii7465 · 4 years ago
Investigate hedge/hogs.
@getmotivated3619 · 5 years ago
You are awesome... you make a mediocre out of a know-nothing.
@astiksachan8135 · 4 years ago
4:35
@ABC-hi3fy · 3 years ago
No one explains why they use the covariance matrix. Why not use the actual data and find its eigenvectors/eigenvalues? I have been watching hundreds of videos and books, and no one explains that. It just doesn't make sense to me to use the covariance matrix. Covariance seems like a useless parameter; it doesn't tell you much at all.
@malstroemphi1096 · 1 year ago
It does, especially in PCA. But you are right that you need actual data. Say the data are the 3D points of some 3D object: if you build a covariance matrix from the 3D points and do the PCA of it, you will find a vector aligned with the overall direction of the shape. For instance, you will find the main axis of a 3D cylinder, which is quite useful information.
@2894031 · 3 years ago
Babe, var(x,x) makes no sense. Either you say var(x) or cov(x,x).