Principal Component Analysis (The Math) : Data Science Concepts

96,038 views

ritvikmath

1 day ago

Let's explore the math behind principal component analysis!
---
Like, Subscribe, and Hit that Bell to get all the latest videos from ritvikmath ~
---
Check out my Medium:
/ ritvikmathematics

Comments: 175
@kaeruuuu_ 3 years ago
Necessary videos:
1. kzbin.info/www/bejne/jmibpX94jph1g80 (Vector Projections)
2. kzbin.info/www/bejne/nZ3EmoNoZ5d9jaM (Eigenvalues & Eigenvectors)
3. kzbin.info/www/bejne/bKC9hWpoYtOhr6s (Lagrange Multipliers)
4. kzbin.info/www/bejne/m2iWYWZpn7-Heas (Derivative of a Matrix)
5. kzbin.info/www/bejne/Z2aVpYaPqc6EmNk (Covariance Matrix)
@loveena419 3 years ago
Finally, a video that explains the math behind PCA so clearly. Went through all the other videos and it helped a lot! Thank you!
@qaarloshilaal2778 3 years ago
Thanks infinitely for all your videos; you're literally the best at explaining these concepts in a clear and excellent way so we can continue with what we have to study and do! Huge respect, man.
@ritvikmath 3 years ago
You're very welcome!
@pigtowndanzee 5 years ago
Love your teaching style. Keep these videos coming!
@TamNguyen-qi8di 4 years ago
Dear ritvikmath, thank you so much, sir, for your clear explanation. Even in my last year of college, I am still struggling with the basics of statistics. With your help, I have been thriving in class and am looking to graduate from college this semester. Your videos have been so, so helpful, and I wish you amazing health so you can continue making your content. I wish you could have been my professor in college. Thank you for putting out such high-quality content. Words can't describe how much I appreciate you, sir. Thank you. You have changed my life.
@ritvikmath 4 years ago
Thanks for the kind words. Wishing you much success!
@joachimguth6226 4 years ago
Very well presented. You are a great teacher. Hopefully you are going to cover the entire AI space.
@ritvikmath 4 years ago
That is the goal!
@warrenbaker4124 3 years ago
@@ritvikmath Oh wow!!! I'm so happy to see you're taking this on. I'm a huge fan and this is a real highlight for me. Thanks for all you do!!
@Moiez101 1 year ago
@@ritvikmath I fully support that goal! I just started with data science, bro. Loving your videos, you're a great teacher.
@clxdyy.luveditxs 7 months ago
Thank god I found your channel. I am studying for a master's degree in computer science at a prestigious university, which costs me a lot of money, but your channel is very useful for digging deeper and understanding many things. Keep up the good work!
@silverstone-z2d 18 days ago
You're a brilliant teacher! Thanks for sharing the Lagrange multipliers trick, especially to derive the solution to the optimization problem. Most of the resources I referred to before this skip that part entirely and only state the solution to be the eigenvector corresponding to the max eigenvalue of the covariance matrix.
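For reference, the derivation that comment is pointing at, written out as a sketch following the video's setup (Σ is the sample covariance matrix, u a candidate direction):

```latex
\max_{u}\; u^{\top}\Sigma u \quad \text{subject to} \quad u^{\top}u = 1,
\qquad
\mathcal{L}(u,\lambda) = u^{\top}\Sigma u - \lambda\,(u^{\top}u - 1)

\frac{\partial\mathcal{L}}{\partial u} = 2\Sigma u - 2\lambda u = 0
\quad\Longrightarrow\quad \Sigma u = \lambda u,
\qquad u^{\top}\Sigma u = \lambda\,u^{\top}u = \lambda
```

So every stationary point is an eigenvector of Σ, the variance attained there equals the corresponding eigenvalue, and the maximum is achieved by the eigenvector with the largest eigenvalue.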
@resoluation345 9 months ago
The best series to explain the maths behind PCA
@bhajman123 3 years ago
By far the most accessible description of PCA... I was finally able to clearly connect the covariance matrix and the eigenvalues to variance maximization.
@amaramar4969 9 months ago
I had to go through the prerequisite videos to clarify my concepts first, but after that this PCA explanation is amazing! I think you are equivalent to 10 college professors out there in terms of teaching skills. I hope you get paid in that proportion, and the college professors feel ashamed and work harder to catch up to your standards. Again, amazing!
@vinceb8041 3 years ago
I've been wrestling for a while to get all the intuitive and computational components for doing PCA, and seeing it all come together here helps tremendously! Great as always, 10/10 video :)
@paulbrown5839 4 years ago
This is a very strong video. It requires proper study. I hope you do more of this great stuff. Thank You!
@alphar85 4 years ago
I stopped at 01:33 and I am going to watch the other 5 videos. You are such a blessing, mate.
@shivamkak7981 1 year ago
Such a well curated explanation of PCA, thanks so much!
@jaivratsingh9966 2 years ago
Simply excellent!
@worldpulse365 2 years ago
Simple and straight to the point. Absolutely well done!
@133839297 1 year ago
You have a gift for teaching.
@thinkingAutomata 3 years ago
Thanks Ritvik. Excellent explanation of PCA. Good job, well done!
@BleachWizz 4 years ago
I'm loving your content; you're showing a part of math that is not usually shown: the part where you actually use it, where you make your choices and why you choose them. It's nice to understand the equations and why they give you a 0 at the sweet spot, but it's also nice to be reminded that it not only works but was built to work with that intention. So in the end you still need to figure out how to get your problem to fit into one of those forms, and what you can choose in these big generic operations to fit them to your problem.
@ritvikmath 4 years ago
Thanks for the feedback! I do try to focus a lot more on the "why" questions rather than the "how" questions.
@_arkadij 9 months ago
Very appreciative of the explanation of why we end up using the vectors corresponding to the biggest eigenvalues. Thanks so much.
@Chill_Magma 1 year ago
Straight to the point and thorough. You deserve to be subscribed to from my 3 accounts.
@robertbillette4671 3 years ago
Like everyone else has mentioned, amazing clarity and style.
@mashakozlovtseva4378 4 years ago
Everything was clearly understood from the math side! Thank you for the link to your Medium account!
@riteshsaha6881 4 months ago
This is super helpful. Way better than my professor's explanation
@DeRocks1607 5 months ago
You are a great teacher... I finally understood.
@volsurf1274 4 years ago
Concise, clear and superbly explained. Thanks!
@ritvikmath 4 years ago
Glad it was helpful!
@christinejiang6386 8 months ago
Wow! Thank you! I watched all the videos before watching this one; they really help a lot!
@vinceb8041 3 years ago
12:20 Quick note on why going down the list of eigenvalues is legit: the covariance matrix is symmetric, and it can be shown that eigenvectors of a symmetric matrix corresponding to distinct eigenvalues are orthogonal.
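A quick numerical check of that claim, as a sketch (NumPy, synthetic data):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))   # synthetic data: 100 samples, 3 features
S = np.cov(X, rowvar=False)     # covariance matrix, symmetric by construction

# eigh is for symmetric matrices; eigenvalues come back in ascending order
eigvals, eigvecs = np.linalg.eigh(S)

# For a symmetric matrix the eigenvectors (columns) are orthonormal,
# so eigvecs.T @ eigvecs should be the identity matrix.
print(np.allclose(eigvecs.T @ eigvecs, np.eye(3)))  # True
```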
@nahidakhter8646 3 years ago
Beautifully explained! Thanks so much!
@ShubhamYadav-ut9ho 6 months ago
Amazing explanation as always
@543phi 4 years ago
Thanks for this video! As a Data Science student, I found your lecture helped to clarify a lot... I appreciate your teaching style.
@cll2598 5 months ago
Epic explanation
@MaxDavidsonArgentina 4 years ago
Thanks for sharing your knowledge. It's great to have people like you helping out!
@arun_kanthali 2 years ago
Great explanation. Thank you 👍
@subhabhadra619 2 years ago
Awesomely presented.
@berkoec 3 years ago
Such a well-explained video - keep up the great work!
@ritvikmath 3 years ago
Thanks a ton!
@sidddddddddddddd 1 year ago
What you've called the closed form of the covariance matrix is actually the biased estimator of the covariance matrix \Sigma. If you divide by (N-1) instead of N, you get the unbiased estimator of \Sigma. Awesome video! Thanks :D
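A quick NumPy illustration of the two estimators (a sketch; np.cov divides by N-1 by default, and bias=True switches to N):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))   # synthetic data: 50 samples, 2 features
N = X.shape[0]

S_unbiased = np.cov(X, rowvar=False)            # divides by N - 1 (default)
S_biased = np.cov(X, rowvar=False, bias=True)   # divides by N, as in the video

# The two estimators differ only by the factor N / (N - 1)
print(np.allclose(S_biased * N / (N - 1), S_unbiased))  # True
```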
@sandeepc2833 4 years ago
Cleared most of my doubts. Thanks a lot.
@ahmadawad4782 4 years ago
Watched many videos about linear algebra and PCA. You're the one who made it clear for me. Thanks!
@jhonportella5618 3 years ago
Great, great video. I really appreciate your effort and good teaching methodology. I have a question on the projection math: in your projection video you obtained P = (X·U)U, but here you used P = (U^T X)U. Maybe this is a silly question, but I would really appreciate it if you could tell me why this equivalence is possible. Many thanks.
@martinw.9786 2 years ago
Thank you very much for the explanations - very, very well done. Your references to the mathematical background are key!
@paulntalo1425 4 years ago
You have made it clear. Thank you
@aravindsaraswatula2561 5 months ago
Awesome video
@nuamaaniqbal6373 2 years ago
Can't thank you enough!! You are truly the boss!
@kakabudi 2 years ago
Really great video! Thanks for explaining this concept wonderfully!
@Chill_Magma 1 year ago
Seeing your videos increases my confidence in math stuff :DDD
@cameronbaird5658 2 years ago
Phenomenal video, thank you for the hard work 👏
@yarenlerler67 1 year ago
Ahh, such a clean explanation. I really appreciate it! I will have a practical statistics for astrophysics exam soon, and I was having some problems with the theory part. All your videos were very helpful! I hope I'm going to get a good grade on the exam. :)
@simranjoharle4220 1 year ago
Your videos are extremely helpful! Thank you!
@ritvikmath 1 year ago
Glad you like them!
@proxyme3628 2 years ago
Brilliant explanation of why the eigenvector is the one that comes out of the maximization; never saw such a great explanation before. Wish your course were on Coursera. I do not think any textbook explains the eigenvalue as the Lagrange multiplier and the eigenvector as maximizing variance. Thanks so much.
@erfanbayat3974 6 months ago
This video is amazing.
@wandering-byte 2 days ago
Just what I was looking for
@Rockyzach88 1 year ago
Just finished the LA section in the Deep Learning book, and I can tell this is going to help supplement and fill in these gaps of understanding. Good vid.
@ritvikmath 1 year ago
I hope so!
@pratik.patil87 11 months ago
Thanks Ritvik, I went through multiple resources to figure out this exact question: why do the eigenvectors and eigenvalues of a covariance matrix represent the direction and strength of the biggest increase in variance? Your video clarifies it beautifully. One question still, though: I understand the equation we maximize, but why do we need the constraint (u^T u = 1)?
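On that last question, the standard answer (an editorial sketch, not from the thread): without the constraint the objective has no finite maximum, because the projected variance scales with the squared length of u,

```latex
(c\,u)^{\top}\Sigma\,(c\,u) = c^{2}\,u^{\top}\Sigma u \;\to\; \infty \quad \text{as } c \to \infty,
```

so fixing u^T u = 1 removes the scale freedom and makes the problem purely about the direction of u.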
@ajanasoufiane3903 5 years ago
Great video, it would be nice if you could show the big picture through the SVD decomposition :)
@shashanksundi5669 3 years ago
Just perfect!! Thank you :)
@bilalbayrakdar7100 4 months ago
Bro, you are the best; thanks for your effort.
@akrylic_ 5 years ago
There's a property of transposes around 6:45 that you could have mentioned, and I got tripped up for a second. The reason why you can write u^T(xi - xbar) as (xi - xbar)^T u is that (AB)^T = (B^T)(A^T), and a scalar equals its own transpose. It's a cool trick, but not obvious.
@ritvikmath 4 years ago
Very true, thanks for filling in the missing step!
@zechengchang3444 3 years ago
Can you explain more? How does (AB)^T =(B^T)(A^T) have anything to do with u^T*(xi-xbar)? Thanks.
@ItahangLimbu 2 months ago
@@zechengchang3444 (A^T B)^T = B^T A. Since u^T (xi - xbar) is a 1x1 scalar, it equals its own transpose, which is (xi - xbar)^T u.
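A two-line numerical sanity check of that identity (a sketch with arbitrary vectors):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
d = np.array([4.0, 5.0, 6.0])   # stands in for (x_i - x_bar)

# u^T d is a 1x1 scalar, and a scalar equals its own transpose,
# so u . d and d . u are the same number.
print(u @ d == d @ u)  # True
```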
@Tankwell-cq5ky 2 years ago
Very well presented - well done!😊😊
@knp4356 5 years ago
Hey Ritvik, it would be great if you could generate some problems for viewers to solve. Watching is great, but supplementing with actual problems would drive the points into viewers' heads. You could then post solutions on your Medium site. Hopefully at least 4-5 problems per video. I've watched many videos on DS subjects, but something in your teaching method makes it simpler to understand. Thanks.
@ritvikmath 5 years ago
I honestly really appreciate that you're trying to help me be more effective at what I do. I think it's a great idea and I'll look into it. Thanks :)
@fahimfaisal4660 3 years ago
Excellent
@nandhinin799 4 years ago
Clearly explained, helped me greatly in understanding the basis of PCA.
@かいじゅ-y3j 1 year ago
This video is super great! I was wondering why the covariance matrix is used to compute PCA, and this video made my doubts clear!!
@ritvikmath 1 year ago
Glad it was helpful!
@mwave3388 2 years ago
I'm preparing for a job interview. Thanks, the best PCA video I found.
@mmarva3597 3 years ago
Thank you very much!! Really helpful.
@Sriram-kj6kl 2 years ago
Your videos help a lot, man. Thank you 👍
@rajathjain314 4 years ago
Very Intuitive, Great Job Ritvik!
@Cybrean1 3 years ago
Excellent presentation and delivery … wish you all the success!
@ritvikmath 3 years ago
Thank you! You too!
@georgegkenios486 3 years ago
Amazing work mate!
@ritvikmath 3 years ago
Thanks a lot!
@seetaramdantu3190 3 years ago
Excellent... well explained.
@ritvikmath 3 years ago
Glad it was helpful!
@herberthubert6828 3 years ago
You rock, thank you.
@gc6327 4 years ago
Hi Ritvik - can you do a video on factor analysis? That would be huge! Thanks buddy!
@ОлегЗалесский-й1б 8 months ago
Great explanation. Really appreciate it. Thanks.
@ritvikmath 8 months ago
Glad it was helpful!
@kisholoymukherjee 2 years ago
Hi ritvik, thanks for the video. Can you please tell me how the vector projection formula is being used to calculate the projection of xi on u here? The formulas in the two videos seem quite different. Would really appreciate it if you could help me understand the underlying math.
@ArpitAnand-yd7tr 1 year ago
That's just a dot product between the potential u1 and Xi. It gives the magnitude of the projection in the direction of the unit vector u.
@ernestanonde3218 2 years ago
great video
@mainakmukherjee3444 1 year ago
We find the expression for the variance of the data projected onto the vector, and then maximize it, because the vector for which the variance is highest (max eigenvalue) retains the most information about the data after dimensionality reduction.
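Putting the steps from the video together, a minimal end-to-end sketch in NumPy (synthetic data; the choice k = 2 is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 5))   # synthetic data: 200 samples, 5 features

# 1. Center the data
Xc = X - X.mean(axis=0)

# 2. Covariance matrix (the closed form from the video, dividing by N)
S = (Xc.T @ Xc) / X.shape[0]

# 3. Eigendecomposition; sort descending by eigenvalue (variance)
eigvals, eigvecs = np.linalg.eigh(S)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# 4. Project onto the top-k eigenvectors
k = 2
U = eigvecs[:, :k]   # d x k matrix of principal directions
Z = Xc @ U           # n x k reduced representation

# Fraction of total variance retained by the first k components
print(eigvals[:k].sum() / eigvals.sum())
```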
@suvikarhu4627 2 years ago
@ritvikmath 5:02 I don't understand where this projection formula (proj(xi) = (u^T xi)u) is coming from. The projection video does not say that. What the projection video says is proj(xi) = (xi . u)u. No transpose there! Where did you get that transpose from? And where did the dot product go? Another question: at 5:50, why do you take only the magnitude of the vector?
@ItahangLimbu 2 months ago
He wrote (x . u)u, and the dot product x . u is just x^T u, so he writes (x^T u)u. Here u is a vector and (x^T u) is a constant, or say a scaling factor.
@ItahangLimbu 2 months ago
Second question: he is calculating the variance of the projections. Say we project vectors v1 and v2 onto a vector v: we are calculating the amount of variance in the projection, and the projected amount is the scalar factor (x^T u), so he only used (x^T u).
@muhammadghazy9941 2 years ago
Thank you, man, appreciate it.
@MohamedMostafa-kg6gk 3 years ago
Thank you for this great explanation .
@ritvikmath 3 years ago
You are welcome!
@GeoffryGifari 5 months ago
Hmmm, I noticed that if two categories are strongly correlated, the plot will look close to a straight line. Going to multidimensional space, that "line" looks like the vector u1 in the video, onto which the data are projected. Does that mean PCA will perform better the more correlated two (or more) categories are?
@santiagolicea3814 2 years ago
This is a great explanation, thanks a lot. It'd be great if you could also make a video showing a practical example with some dataset, showing how you use the eigenvector projection matrix to transform the initial dataset.
@deplo 4 years ago
Hi Ritvikmath, thank you for your super informative videos! I took all courses on this topic but I was wondering if you could expand it with factor analysis and correspondence analysis. It would be interesting to know how different methods work and relate to each other because it would provide a deeper perspective. Thanks
@thirumurthym7980 3 years ago
@4:54 - you are referring to the projection video, about how you arrive at the projection formula. There is no mention of U transpose in that projection video.
@quark37 1 year ago
Fun video. Thank you. And thanks for all the prerequisite videos. Question: I've seen other videos that describe PCA vectors as orthogonal, but using eigenvectors they would not necessarily be orthogonal, right? What is the correct way to think about the orthogonality of PCA vectors? Thanks. *I think I answered my own question: the eigenvectors in question are of the covariance matrix of the related variables, and this matrix is symmetric, so the eigenvectors will be orthogonal. Correct?
@yurongluo447 9 months ago
Your video is helpful for us. Can you create one video to explain Independent Component Analysis in detail? Thanks.
@rabiizahir2885 3 years ago
Thanks a lot.
@brofessorsbooks3352 5 years ago
Good!
@Markks100 1 year ago
I don't understand why the projected form of Xi on U1 is (U1^T Xi)U1. From your lecture on vector projections, P = (X . U)U, so why the change?
@XXZSaikou 7 months ago
Nicely explained! But I noticed you didn't mention the need to standardize the original data for PCA. Is standardization a little trick to make things faster, or is it needed in the underlying math?
@AshishKGor 2 years ago
Thanks sir.
@poornanagasai262 1 year ago
It's really a great explanation. One question I got: from the video on vector projection, it is clear that the projection onto a vector is (u . x)u, where (u . x) is the magnitude and u is the unit vector. Here comes my question: in this video (the math behind PCA) you used (u^T x)u. What is the difference between using u and u^T (u transpose)? Can you please answer me?
@PR-ud4fp 2 years ago
Thanks 😊
@brianogrady37 5 months ago
I wish you had specified earlier on which values represent the principal components. But great video regardless.
@alejandropalaciosgarcia2767 3 years ago
Bro, you are awesome.
@404nohandlefound 2 years ago
Could you please explain how this links to SVD?
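For readers with the same question, the standard connection (an editorial sketch, not covered in the video): if X is the centered n x d data matrix with SVD X = W S V^T, then

```latex
\Sigma = \frac{1}{n}\,X^{\top}X = \frac{1}{n}\,V S^{2} V^{\top},
```

so the right singular vectors (the columns of V) are exactly the eigenvectors of the covariance matrix, with eigenvalues s_i^2 / n. Many libraries compute PCA via the SVD rather than forming Σ explicitly, since it is numerically more stable.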
@DarkShadow-tm2dk 4 years ago
I HOPE YOU WILL REPLY 🛑🛑🛑 Aren't we supposed to standardize the data before applying PCA? And if we do standardize, then the mean = 0, so at 8:12 the second part of the equation will get cancelled, right? So the equation changes.
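For readers with the same question: with centered (or standardized) data the mean term does vanish, and whether to standardize depends on whether the features share comparable units. A minimal sketch of the common preprocessing, assuming scikit-learn is available (the data here is synthetic):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# Synthetic features on very different scales
X = rng.normal(size=(100, 3)) * np.array([1.0, 10.0, 100.0])

# Standardize (zero mean, unit variance per feature), then fit PCA.
# Without standardization, the large-scale feature would dominate the components.
Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(Z)
print(pca.explained_variance_ratio_)
```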
@darshansolanki5535 4 years ago
Best video!!
@iOSGamingDynasties 3 years ago
Great video. However, I am not quite understanding why the projection can be written as u_1^T x_i u. What's the relationship between u and u_1? They have to have the same direction, right? Also, for the projection formula, it seems the projection should equal (u_1^T x_i) / ||u_1||^2 * u_1, but it seems there has to be an extra 1/||u_1|| in the projection formula.
@ahmad3823 8 months ago
Amazing
@ritvikmath 7 months ago
Thank you! Cheers!