Let's explore the math behind principal component analysis! --- Like, Subscribe, and Hit that Bell to get all the latest videos from ritvikmath ~ --- Check out my Medium: / ritvikmathematics
Finally, a video that explains the math behind PCA so clearly. Went through all the other videos and it helped a lot! Thank you!
@qaarloshilaal2778 3 years ago
Thanks infinitely for all your videos, you're literally the best at explaining these concepts in a clear and excellent way so we can continue with what we have to study/do! Huge respect, man.
@ritvikmath 3 years ago
You're very welcome!
@pigtowndanzee 5 years ago
Love your teaching style. Keep these videos coming!
@TamNguyen-qi8di 4 years ago
Dear ritvikmath, Thank you so much, sir, for your clear explanation. Even in my last year of college, I am still struggling with the basics of statistics. With your help, I have been thriving in class and am looking to graduate this semester. Your videos have been so, so helpful, and I wish you great health so you can continue making your content. I wish you could have been my professor in college. Thank you for putting out such high-quality content. Words can't describe how much I appreciate you, sir. Thank you. You have changed my life.
@ritvikmath 4 years ago
Thanks for the kind words. Wishing you much success!
@joachimguth6226 4 years ago
Very well presented. You are a great teacher. Hopefully you are going to cover the entire AI space.
@ritvikmath 4 years ago
That is the goal!
@warrenbaker4124 3 years ago
@@ritvikmath Oh wow!!! I'm so happy to see you're taking this on. I'm a huge fan and this is a real highlight for me. Thanks for all you do!!
@Moiez101 1 year ago
@@ritvikmath I fully support that goal! I just started with data science, bro. Loving your videos, you're a great teacher.
@clxdyy.luveditxs 7 months ago
Thank god I found your channel. I am studying for a master's degree in computer science at a prestigious university, which costs me a lot of money, but your channel is very useful for digging deeper and understanding many things. Keep up the good work!
@silverstone-z2d 18 days ago
You're a brilliant teacher! Thanks especially for sharing the Lagrange multipliers trick to derive the solution to the optimization problem. Most of the resources I referred to before this skip that part entirely and only state the solution to be the eigenvector corresponding to the max eigenvalue of the covariance matrix.
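For readers who want to check that result numerically, here is a minimal NumPy sketch (the data and variable names are synthetic illustrations, not from the video): among unit vectors, the eigenvector of the covariance matrix with the largest eigenvalue gives the largest projected variance, and that variance equals the eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.multivariate_normal([0, 0], [[3.0, 1.2], [1.2, 1.0]], size=500)  # synthetic data

Xc = X - X.mean(axis=0)                  # center the data
S = Xc.T @ Xc / len(Xc)                  # covariance matrix (divide by N, as in the video)

eigvals, eigvecs = np.linalg.eigh(S)     # eigh handles symmetric matrices, eigenvalues ascending
u_star = eigvecs[:, -1]                  # eigenvector with the largest eigenvalue

def projected_variance(u):
    return float(u @ S @ u)              # u^T S u = variance of the data projected onto unit u

# the top eigenvector should beat any random unit direction
random_us = rng.normal(size=(1000, 2))
random_us /= np.linalg.norm(random_us, axis=1, keepdims=True)
best_random = max(projected_variance(u) for u in random_us)

print(projected_variance(u_star), eigvals[-1])    # the two numbers match
print(projected_variance(u_star) >= best_random)  # True
```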
@resoluation345 9 months ago
The best series to explain the maths behind PCA
@bhajman123 3 years ago
By far the most accessible description of PCA... I was finally able to clearly connect the covariance matrix and the eigenvalues to variance maximization
@amaramar4969 9 months ago
I had to go through the prerequisite videos to clarify my concepts first, but after that this PCA explanation is amazing! I think you are equivalent to 10 college professors out there in terms of teaching skills. I hope you get paid in that proportion, and that the college professors feel ashamed and work harder to catch up to your standards. Again, amazing!
@vinceb8041 3 years ago
I've been wrestling for a while to get all the intuitive and computational components for doing PCA, and seeing it all come together here helps tremendously! Great as always, 10/10 video :)
@paulbrown5839 4 years ago
This is a very strong video. It requires proper study. I hope you do more of this great stuff. Thank You!
@alphar85 4 years ago
I stopped at 01:33 and I am going to watch the other 5 videos. You are such a blessing, mate.
@shivamkak7981 1 year ago
Such a well curated explanation of PCA, thanks so much!
@jaivratsingh9966 2 years ago
Simply excellent!
@worldpulse365 2 years ago
Simple and straight to the point. Absolutely well done!
@133839297 1 year ago
You have a gift for teaching.
@thinkingAutomata 3 years ago
Thanks Ritvik. Excellent explanation of PCA. Good job, well done!
@BleachWizz 4 years ago
I'm loving your content; you're showing a part of math that is not usually shown: the part where you actually use it, where you make your choices and explain why you choose them. It's nice to understand the equations and why the sweet spot gives you a zero, but it's also nice to be reminded that it not only works but was built to work with that intention. So in the end you still need to figure out how to get your problem to fit into one of these setups, and what you can choose in these big generic operations to fit them to your problem.
@ritvikmath 4 years ago
Thanks for the feedback! I do try to focus a lot more on the "why" questions rather than the "how" questions.
@_arkadij 9 months ago
Very appreciative of the explanation of why we end up using the vectors corresponding to the biggest eigenvalues. Thanks so much
@Chill_Magma 1 year ago
Straight to the point and thorough; you deserve to be subscribed to from my 3 accounts
@robertbillette4671 3 years ago
Like everyone else has mentioned, amazing clarity and style.
@mashakozlovtseva4378 4 years ago
Everything was clearly understood from the math side! Thank you for the link to your Medium account!
@zilezile4942 4 years ago
Good morning. If you have difficulty understanding statistical models and programming them with the R software; if you have difficulty understanding where the principal components come from when you do principal component analysis; if you need to discover statistics for functional data, in particular functional principal component analysis; if you have no idea how to model with a functional linear model... and if you like clear and detailed explanations, click on this link: amikour.wordpress.com/nos-formations/
@riteshsaha6881 4 months ago
This is super helpful. Way better than my professor's explanation
@DeRocks1607 5 months ago
You are a great teacher... I finally understood it
@volsurf1274 4 years ago
Concise, clear and superbly explained. Thanks!
@ritvikmath 4 years ago
Glad it was helpful!
@christinejiang6386 8 months ago
Wow! Thank you! I watched all the videos before watching this one; they really help a lot!
@vinceb8041 3 years ago
12:20 Quick note on why going down the list of eigenvalues is legit: the covariance matrix is symmetric, and it can be shown that eigenvectors of a symmetric matrix corresponding to distinct eigenvalues are orthogonal.
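A quick numerical illustration of that point (synthetic data, NumPy only, names are illustrative): because the covariance matrix is symmetric, its eigenvectors come out mutually orthogonal, which is what justifies using the top few of them as a new orthogonal basis.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4)) @ rng.normal(size=(4, 4))  # synthetic correlated data
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / len(Xc)                                   # symmetric covariance matrix

eigvals, U = np.linalg.eigh(S)                            # columns of U are eigenvectors
print(np.allclose(U.T @ U, np.eye(4)))                    # True: the eigenvectors are orthonormal
```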
@nahidakhter8646 3 years ago
Beautifully explained! Thanks so much!
@ShubhamYadav-ut9ho 6 months ago
Amazing explanation as always
@543phi 4 years ago
Thanks for this video! As a Data Science student, your lecture helped to clarify a lot... I appreciate your teaching style.
@cll2598 5 months ago
Epic explanation
@MaxDavidsonArgentina 4 years ago
Thanks for sharing your knowledge. It's great to have people like you helping out!
@arun_kanthali 2 years ago
Great explanation.. Thank you 👍
@subhabhadra619 2 years ago
Awesomely represented.
@berkoec 3 years ago
Such a well-explained video - keep up the great work!
@ritvikmath 3 years ago
Thanks a ton!
@sidddddddddddddd 1 year ago
What you've called the closed form of the covariance matrix is actually the biased estimator of the covariance matrix \Sigma. If you divide by (N-1) instead of N, you get the unbiased estimator of \Sigma. Awesome video! Thanks :D
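To make the N versus N-1 point concrete, a small sketch (hypothetical data; NumPy's np.cov exposes both conventions). Note the two estimates differ only by the scalar factor N/(N-1), so they share the same eigenvectors and the PCA directions are unaffected by the choice.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))            # N = 100 samples, 3 features
Xc = X - X.mean(axis=0)
N = len(X)

S_biased = Xc.T @ Xc / N                 # divide by N: the "closed form" in the video
S_unbiased = Xc.T @ Xc / (N - 1)         # divide by N-1: unbiased estimator of Sigma

print(np.allclose(S_biased, np.cov(X, rowvar=False, bias=True)))     # True
print(np.allclose(S_unbiased, np.cov(X, rowvar=False, bias=False)))  # True
```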
@sandeepc2833 4 years ago
Cleared most of my doubts. Thanks a lot.
@ahmadawad4782 4 years ago
Watched many videos about linear algebra and PCA. You're the one who made it clear for me. Thanks!
@jhonportella5618 3 years ago
Great, great video. I really appreciate your effort and good teaching methodology. I have a question on the projection math: in your projection video you obtained P = (x.u)u, but here you used P = (u^T x)u. Maybe this is a silly question, but I would really appreciate it if you could tell me why this equivalence is possible. Many thanks
@martinw.9786 2 years ago
Thank you very much for the explanations, very very well done. Your references to the mathematical background are key!
@paulntalo1425 4 years ago
You have made it clear. Thank you
@aravindsaraswatula2561 5 months ago
Awesome video
@nuamaaniqbal6373 2 years ago
Can't thank you enough!! You are truly the boss!
@kakabudi 2 years ago
Really great video! Thanks for explaining this concept wonderfully!
@Chill_Magma 1 year ago
Seeing your videos increases my confidence in math stuff :DDD
@cameronbaird5658 2 years ago
Phenomenal video, thank you for the hard work 👏
@yarenlerler67 1 year ago
Ahh, such a clean explanation. I really appreciate it! I will have a practical statistics for astrophysics exam soon, and I was having some problems with the theory part. All your videos were very helpful! I hope I am going to get a good grade on the exam. :)
@simranjoharle4220 1 year ago
Your videos are extremely helpful! Thank you!
@ritvikmath 1 year ago
Glad you like them!
@proxyme3628 2 years ago
Brilliant explanation of why the eigenvector is the one that comes out of the maximization; I never saw such a great explanation before. I wish your course were on Coursera. I do not think any textbook explains the eigenvalue as the Lagrange multiplier and the eigenvector as the variance maximizer. Thanks so much.
@erfanbayat3974 6 months ago
This video is amazing
@wandering-byte 2 days ago
Just what I was looking for
@Rockyzach88 1 year ago
Just finished the LA section in the Deep Learning book, and I can tell this is going to help supplement it and fill in the gaps in my understanding. Good vid.
@ritvikmath 1 year ago
I hope so!
@pratik.patil87 11 months ago
Thanks Ritvik, I went through multiple resources to figure out this exact question: why do the eigenvectors and eigenvalues of a covariance matrix represent the directions and strengths of the biggest increase in variance? Your video clarifies it beautifully. One question still, though: I understand the equation we maximize, but why do we need the constraint (u^T u = 1)?
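On the constraint: without u^T u = 1 the objective u^T S u has no maximum, because scaling u by a constant c multiplies it by c^2; fixing the length to 1 makes the problem purely about direction. A tiny sketch (an illustrative covariance matrix, not from the video):

```python
import numpy as np

S = np.array([[3.0, 1.2],
              [1.2, 1.0]])            # an example covariance matrix (symmetric, PSD)
u = np.array([1.0, 0.5])

for c in [1, 10, 100]:
    v = c * u                         # same direction, longer vector
    print(v @ S @ v)                  # grows like c^2, so the unconstrained problem is unbounded

u_hat = u / np.linalg.norm(u)         # the unit-length constraint removes the scale freedom
print(u_hat @ S @ u_hat)              # now the objective depends on the direction only
```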
@ajanasoufiane3903 5 years ago
Great video, it would be nice if you could show the big picture through the SVD decomposition :)
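In the meantime, here is one way to see the SVD connection (a sketch with synthetic data; the variable names are illustrative): the right singular vectors of the centered data matrix are the eigenvectors of its covariance matrix, and the squared singular values divided by N are the eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 3))   # synthetic data
Xc = X - X.mean(axis=0)
N = len(Xc)

S = Xc.T @ Xc / N
eigvals, eigvecs = np.linalg.eigh(S)                       # ascending eigenvalues

U_svd, svals, Vt = np.linalg.svd(Xc, full_matrices=False)  # singular values come out descending

print(np.allclose(svals[::-1] ** 2 / N, eigvals))          # eigenvalues from singular values
print(np.allclose(np.abs(Vt[::-1].T), np.abs(eigvecs)))    # same directions, up to sign
```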
@shashanksundi5669 3 years ago
Just perfect!! Thank you :)
@bilalbayrakdar7100 4 months ago
Bro you are the best, thanks for your effort
@akrylic_ 5 years ago
There's a property of transposes around 6:45 that you could have mentioned, and I got tripped up for a second. The reason why you can write u^T (xi - xbar) as (xi - xbar)^T u is that the product is a scalar and (AB)^T = (B^T)(A^T). It's a cool trick, but not obvious
@ritvikmath 4 years ago
Very true, thanks for filling in the missing step!
@zechengchang3444 3 years ago
Can you explain more? How does (AB)^T = (B^T)(A^T) have anything to do with u^T (xi - xbar)? Thanks.
@ItahangLimbu 2 months ago
@@zechengchang3444 With A = u and B = (xi - xbar), the rule gives (u^T (xi - xbar))^T = (xi - xbar)^T u. And since u^T (xi - xbar) is a 1x1 scalar, it equals its own transpose, so the two expressions have the same value.
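Spelled out in code (arbitrary example vectors): a dot product is a scalar, so it does not matter which vector gets transposed.

```python
import numpy as np

u = np.array([0.6, 0.8])
d = np.array([2.0, -1.0])      # stands in for (xi - xbar)
print(u @ d, d @ u)            # same scalar: u^T d == d^T u
```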
@Tankwell-cq5ky 2 years ago
Very well presented - well done!😊😊
@knp4356 5 years ago
Hey Ritvik, it would be great if you could generate some problems for viewers to solve. Watching is great, but supplementing with actual problems would drive the points into viewers' heads. You could then post solutions on your Medium site, hopefully at least 4-5 problems per video. I've watched many videos on DS subjects, but something in your teaching method makes it simpler to understand. Thanks.
@ritvikmath 5 years ago
I honestly really appreciate that you're trying to help me be more effective at what I do. I think it's a great idea and I'll look into it. Thanks :)
@fahimfaisal4660 3 years ago
Excellent
@nandhinin799 4 years ago
Clearly explained, helped me greatly in understanding the basis of PCA.
@かいじゅ-y3j 1 year ago
This video is super great! I was wondering why the covariance matrix is used to compute PCA, but this video cleared up my doubts!!
@ritvikmath 1 year ago
Glad it was helpful!
@mwave3388 2 years ago
I'm preparing for a job interview. Thanks, the best PCA video I found.
@mmarva3597 3 years ago
Thank you very much!! Really helpful
@Sriram-kj6kl 2 years ago
Your videos help a lot, man. Thank you 👍
@rajathjain314 4 years ago
Very intuitive, great job Ritvik!
@Cybrean1 3 years ago
Excellent presentation and delivery … wish you all the success!
@ritvikmath 3 years ago
Thank you! You too!
@georgegkenios486 3 years ago
Amazing work mate!
@ritvikmath 3 years ago
Thanks a lot!
@seetaramdantu3190 3 years ago
Excellent... well explained
@ritvikmath 3 years ago
Glad it was helpful!
@herberthubert6828 3 years ago
You rock, thank you
@gc6327 4 years ago
Hi Ritvik, can you do a video on factor analysis? That would be huge! Thanks buddy!
@ОлегЗалесский-й1б 8 months ago
Great explanation. Really appreciate it. Thanks.
@ritvikmath 8 months ago
Glad it was helpful!
@kisholoymukherjee 2 years ago
Hi ritvik, thanks for the video. Can you please tell me how the vector projection formula is being used to calculate the projection of xi onto u here? The formulas in the two videos seem to be quite different. I would really appreciate it if you could help me understand the underlying math
@ArpitAnand-yd7tr 1 year ago
That's just the dot product between the candidate u1 and xi. It gives the magnitude of the projection in the direction of the unit vector u
@ernestanonde3218 2 years ago
Great video
@mainakmukherjee3444 1 year ago
We find the expression for the variance of the data projected onto the vector, and then maximize it, because the vector for which the projected variance is highest (the max eigenvalue) retains the most information about the data after dimensionality reduction.
@suvikarhu4627 2 years ago
@ritvikmath 5:02 I don't understand where this projection formula (proj(xi) = (u^T xi) u) is coming from. The projection video does not say that. What the projection video says exactly is that proj(xi) = (xi . u) u. No transpose there! Where did you get that transpose from? And isn't the dot product missing? Another question: at 5:50, why do you take only the magnitude of the vector?
@ItahangLimbu 2 months ago
He wrote (x.u)u, and the dot product x.u is just x^T u, so he writes (x^T u) u. Here u is a vector and (x^T u) is a constant, or say a scaling factor.
@ItahangLimbu 2 months ago
Second question: he is calculating the variance of the projections. Say I project vectors v1 and v2 onto a vector v; we are calculating the variance of the projected amounts, and the projected amount (the coordinate along u) is the scaling factor (x^T u), so he only used (x^T u).
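For anyone still unsure, a small sketch (hypothetical vectors) showing that the (u^T x) u written in this video is exactly the (x.u) u formula from the projection video, and that the general formula's 1/||u||^2 factor simply equals 1 once u is a unit vector:

```python
import numpy as np

x = np.array([3.0, 1.0])
u = np.array([2.0, 1.0])
u = u / np.linalg.norm(u)                 # PCA constrains u to be a unit vector

proj_dot = (x @ u) * u                    # (x . u) u  from the projection video
proj_T   = (u @ x) * u                    # (u^T x) u  as written in this video
proj_gen = (u @ x) / (u @ u) * u          # general projection formula; u^T u = 1 here

print(np.allclose(proj_dot, proj_T), np.allclose(proj_dot, proj_gen))  # True True
```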
@muhammadghazy9941 2 years ago
Thank you man, appreciate it
@MohamedMostafa-kg6gk 3 years ago
Thank you for this great explanation.
@ritvikmath 3 years ago
You are welcome!
@GeoffryGifari 5 months ago
Hmmm, I noticed that if two categories are strongly correlated, the plot will look close to a straight line. Going to multidimensional space, that "line" looks like the vector u1 in the video, onto which the data are projected. Does that mean PCA will perform better the more correlated two (or more) categories are?
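That intuition can be checked numerically: the more correlated the features, the larger the share of the total variance captured by the first principal component. A sketch with two synthetic datasets (illustrative numbers, not from the video):

```python
import numpy as np

def first_pc_share(X):
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / len(Xc)
    eigvals = np.linalg.eigvalsh(S)       # ascending
    return eigvals[-1] / eigvals.sum()    # fraction of total variance on the first PC

rng = np.random.default_rng(4)
low_corr  = rng.multivariate_normal([0, 0], [[1.0, 0.1], [0.1, 1.0]], size=1000)
high_corr = rng.multivariate_normal([0, 0], [[1.0, 0.9], [0.9, 1.0]], size=1000)

print(first_pc_share(low_corr))    # roughly 0.55: both directions still matter
print(first_pc_share(high_corr))   # roughly 0.95: one direction carries almost everything
```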
@santiagolicea3814 2 years ago
This is a great explanation, thanks a lot. It'd be great if you could also make a video showing a practical example with some data set, showing how you use the eigenvector projection matrix to transform the initial data set.
@deplo 4 years ago
Hi Ritvikmath, thank you for your super informative videos! I took all the courses on this topic, but I was wondering if you could expand on it with factor analysis and correspondence analysis. It would be interesting to know how the different methods work and relate to each other, because it would provide a deeper perspective. Thanks
@thirumurthym7980 3 years ago
At 4:54 you refer to the projection video for how you arrive at the projection formula. There is no mention of u transpose in that projection video.
@quark37 1 year ago
Fun video. Thank you. And thanks for all the pre-req videos. Question: I've seen other videos that describe PCA vectors as orthogonal, but using eigenvectors they would not necessarily be orthogonal, right? What is the correct way to think about the orthogonality of PCA vectors? Thanks. * I think I answered my own question. The eigenvectors in question are of the covariance matrix of the related variables. This matrix is symmetric, so the eigenvectors will be orthogonal. Correct?
@yurongluo447 9 months ago
Your video is helpful for us. Can you create one video to explain Independent Component Analysis in detail? Thanks.
@rabiizahir2885 3 years ago
Thanks a lot.
@brofessorsbooks3352 5 years ago
Good!
@Markks100 1 year ago
I don't understand why the projected form of xi on u1 is (u1^T xi) u1. From your lecture on vector projections, P = (x.u)u, so why the change?
@XXZSaikou 7 months ago
Nicely explained! But I noticed you didn't mention the need to standardize the original data for PCA. Is standardization a little trick to make things faster, or is it needed in the underlying math?
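The math in the video only requires mean-centering, but in practice scaling matters: without standardizing, a feature measured in large units dominates the covariance matrix and therefore the first principal component. A sketch of the effect (synthetic features with made-up units):

```python
import numpy as np

rng = np.random.default_rng(5)
z = rng.normal(size=500)
height_m = 1.70 + 0.10 * z                                            # metres: tiny variance
income   = 50_000 + 10_000 * (0.8 * z + 0.6 * rng.normal(size=500))   # dollars: huge variance
X = np.column_stack([height_m, income])

def top_direction(X):
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / len(Xc)
    return np.linalg.eigh(S)[1][:, -1]        # eigenvector of the largest eigenvalue

print(top_direction(X))                        # close to [0, 1]: income alone dominates
X_std = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize: mean 0, variance 1 per feature
print(top_direction(X_std))                    # now both features contribute to the direction
```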
@AshishKGor 2 years ago
Thanks sir.
@poornanagasai262 1 year ago
It's really a great explanation, and I have one question. From the vector projection video it is clear that the projection onto a vector has the value (u.x)u, where (u.x) is the magnitude and u is the unit vector. Here comes my question: in this video (math behind PCA) you used (u^T x)u for the projection. What is the difference between using u.x and u^T x? Can you please answer me?
@PR-ud4fp 2 years ago
Thanks 😊
@brianogrady37 5 months ago
I wish you had specified earlier on which values represent the principal components. But great video regardless.
@alejandropalaciosgarcia2767 3 years ago
Bro, you are awesome
@404nohandlefound 2 years ago
Could you please explain how this links to SVD?
@DarkShadow-tm2dk 4 years ago
I HOPE YOU WILL REPLY 🛑🛑🛑 Aren't we supposed to standardize the data before applying PCA? And if we do standardize the data, then the mean = 0, so at 8:12 the second part of the equation will cancel, right? So the equation changes.
@darshansolanki5535 4 years ago
Best video!!
@iOSGamingDynasties 3 years ago
Great video. However, I am not quite understanding why the projection can be written as (u_1^T x_i) u_1. What's the relationship between u and u_1? They have to have the same direction, right? Also, it seems that the projection formula should equal (u_1^T x_i) / ||u_1||^2 * u_1, but it seems there has to be an extra 1/||u_1||^2 factor in the projection formula.