8   Orthogonal Projections
8:24
3 years ago
7   Orthogonal Matrix
10:20
3 years ago
5   A More Complete Example
12:16
3 years ago
6   Orthonormal Basis
8:13
3 years ago
11   EigenFacts
8:52
3 years ago
12   Introduction to PCA
10:51
3 years ago
14   Eigenvalues Give Variance
13:35
3 years ago
13   2D Data in a 3D World
4:24
3 years ago
15   Loadings and Scores
5:30
3 years ago
16   Covariance vs. Correlation PCA
20:24
17   The BiPlot (ukfood)
10:44
3 years ago
18   FIFA Players PCA
10:00
3 years ago
19   PCA Recap
7:33
3 years ago
20   Factor Analysis
23:33
3 years ago
21   Biased Regression and PCR
7:26
3 years ago
23   PCR in SAS (BaseballSalary)
10:33
3 years ago
27   Noise Reduction via SVD
5:38
3 years ago
25   Partial Least Squares w/ Example
5:18
26   The Singular Value Decomposition
10:06
24   PCR Example with Big Data
6:01
3 years ago
9   Solving Systems Part Four
13:12
3 years ago
7   Solving Systems Part Two
8:20
3 years ago
13   Advanced Arithmetic
11:59
3 years ago
10   Extra Examples
11:54
3 years ago
8   Solving Systems Part Three
13:13
3 years ago
Comments
@BelghachiKhaledwalid 2 months ago
I can't thank you enough for this playlist. You are the best!
@Hermanubis1 2 months ago
You are so sad, you probably don't have a husband because you have been brainwashed by feminism.
@Hermanubis1 2 months ago
Maths = Mathematics is plural. Be consistent.
@Hermanubis1 2 months ago
Social science is pure communism. Evolutionary psychology at Mankind Quarterly is not.
@tanzhouqingTami 3 months ago
@GregorianWater 3 months ago
Why do we begin at y sub zero and not y sub 1?
@Hermanubis1 4 months ago
Why say five-year-old words like "cool fact"? C'mon, you're an adult with good explanations, yet you resort to "cool".
@Hermanubis1 4 months ago
Head circumference does correlate with IQ at .3 because of brain size.
@Hermanubis1 4 months ago
I'm here to learn PCA too. I research IQ on my own. Thanks.
@Hermanubis1 4 months ago
Being left-handed is a sign of genetic mutation and brain maladaptation. You sound like a far-left extremist.
@Hermanubis1 4 months ago
Maths. Math is incorrect. Mathematics is plural.
@Hermanubis1 4 months ago
Americans speak English with bad grammar. It's "different to", not "different than".
@ChamodPerera-p3q 4 months ago
PCA excellently simplified. Thanks!
@lonemaven 7 months ago
I can't thank you enough for making this lecture series available for free here. I'm a big-picture, visual learner, and your teaching approach fits my learning style really well. Any chance you could do a lecture on an intro to operations research/optimization (or some more lectures in statistics and/or machine learning), or recommend outside resources on those topics with a similar pedagogical style? Thanks again!
@chiarasacchetti8284 7 months ago
This video saved my life.
@ahmdhmd5561 8 months ago
Thanks.
@PraveenKumar-ex1iw 8 months ago
A hyperplane need not always be one dimension lower, right? A 2D hyperplane is also possible in a 4D ambient space? Basically, the dimension of the hyperplane just has to be less than the dimension of the ambient space, is that right?
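For what it's worth: strictly speaking, a hyperplane in an n-dimensional space has dimension n - 1 (codimension one), so a 2D plane inside R^4 is a lower-dimensional subspace rather than a hyperplane in the strict sense. Orthogonal projection works the same way either way. A minimal NumPy sketch with made-up vectors:

```python
import numpy as np

# The columns of A span a 2-D plane in R^4. (A hyperplane in R^4 would be
# 3-D; this lower-dimensional flat is a subspace, not a hyperplane.)
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.0, 2.0]])          # made-up spanning vectors
y = np.array([1.0, 2.0, 3.0, 4.0])  # made-up point to project

coef, *_ = np.linalg.lstsq(A, y, rcond=None)
proj = A @ coef                     # orthogonal projection of y onto the plane
resid = y - proj
print(np.round(A.T @ resid, 10))    # residual is orthogonal to the plane
```

The zero products `A.T @ resid` are exactly the normal equations: the residual is perpendicular to every spanning vector.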
@ahmdhmd5561 9 months ago
Thanks.
@ahmdhmd5561 9 months ago
Thanks.
@mostafamohammed5684 9 months ago
Really appreciate your efforts for this great explanation ❤
@imadyTech 10 months ago
The "wee..." alone deserves a million likes! 😂
@antonlinares2866 a year ago
Thank you so much, you made algebra and linear regression click for me.
@nikosalexoudis8874 a year ago
Excuse me, I really like your work. Could you also provide solutions for the practicals?
@ashtonronald a year ago
Thanks a ton! 🙏
@sum1sw a year ago
I'm not sure this is what I'm looking for; if it is, then I missed it. I have an implicit function f(x, y, z) = 0 (it is actually a model with adjustable parameters) and an experimental data point (Xexp, Yexp, Zexp). You can probably see where I am heading with this: I want to know where a line orthogonal/perpendicular to the surface will intersect the surface. I'm calling this point of intersection (Xcalc, Ycalc, Zcalc). How do I proceed? Based on other videos I watched, it looks like the first step is to linearize the surface using a Taylor series. So now I have a plane (in terms of the partial derivatives and the still-unknown (Xcalc, Ycalc, Zcalc)), and I want the point where the orthogonal line from (Xexp, Yexp, Zexp) meets it. At first I thought it was a trial-and-error iterative procedure (guess (Xcalc, Ycalc, Zcalc)), so I programmed that, but the answers I am getting do not seem correct. I'm also beginning to suspect that the solution can be direct, not iterative. Any thoughts?
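One common first-order scheme for this kind of question (a hedged sketch, assuming f is smooth; the unit sphere below is just a stand-in for the commenter's model): starting from the experimental point, repeatedly step along the local gradient by the Newton amount for the linearized (tangent-plane) surface. At convergence the point lies on the surface, and the segment back to the start runs along the surface normal there.

```python
import numpy as np

def foot_of_perpendicular(f, grad, q, iters=50):
    """Project point q onto the surface f(x) = 0 (first-order scheme).

    Each step moves along the local gradient by f(x)/|grad f(x)|^2,
    i.e. a Newton step for the tangent-plane linearization of f.
    """
    x = np.asarray(q, dtype=float).copy()
    for _ in range(iters):
        g = grad(x)
        x -= (f(x) / np.dot(g, g)) * g
    return x

# Stand-in surface: the unit sphere x^2 + y^2 + z^2 - 1 = 0.
f = lambda x: np.dot(x, x) - 1.0
grad = lambda x: 2.0 * x

q = np.array([2.0, 0.5, 0.0])            # made-up "experimental" point
p = foot_of_perpendicular(f, grad, q)    # candidate (Xcalc, Ycalc, Zcalc)
print(np.round(p, 6))
```

For the sphere this lands exactly on the nearest point, q/|q|. For a strongly curved general surface, this finds a nearby surface point but not necessarily the exact foot of the perpendicular; an outer loop enforcing that q - p is parallel to grad f(p) (the Lagrange condition) refines it. There is generally no direct, non-iterative solution unless f is linear or quadratic.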
@AceOnBase1 a year ago
Thank you for doing this.
@juliocardenas4485 a year ago
Choosing the objective of the model resonates with me. That is a problem/need I face regularly in my job as a data scientist in healthcare.
@Jacquesds a year ago
This is the best intuitive explanation of eigenvalues and eigenvectors I've ever seen. And I've seen a lot of them :D
@AlphansoEric a year ago
That's an amazing video. Beautiful explanation of linear regression in terms of linear algebra.
@breathemath4757 a year ago
This is just way too good. Thanks a lot!
@MrOndra31 a year ago
Great content! This was the missing link between my linear algebra and econometrics courses :D
@LayneSadler a year ago
🤘
@teklehaimanotaman3150 a year ago
Amazing lecture, thank you very much for your efforts. Is the line from the origin to the point y_hat the regression line, please?
@meyouanddata9338 a year ago
How do we come up with the prototypes for each class? I understand the part where we get a similarity measure between the CNN features and the prototypes, but where do these prototypes come from in the first place? As mentioned in the paper, they are also learned, but I am failing to understand that part. Can you kindly explain that in the comments too?
@asifzahir7512 a year ago
Amazing! Cleared up lots of confusion.
@lucynowacki3327 a year ago
Very informative.
@gohilramdevrajeshbhai5254 2 years ago
Hello. Thank you for the explanation. Is there any rule/paper/reference which suggests how many singular values to keep or throw away for the best noise reduction? Thank you.
@frankieeatstrash 2 years ago
The answer to all data science questions like this? It depends!!! I talk about this in a later lecture. There are many rules, and they will all disagree. Roughly speaking, you want to retain the "majority" of your signal, and this means different things in different applications and industries. Some social scientists say something like 70-90% of variance captured. We copy that rule in industry, unless that rule produces too many components 😂 then to hell with it, we'll pick a number that we can live with moving forward in our pipeline. 🤷‍♀️ You can probably find a paper or methodology to defend any number you pick, unless it's too big!!! The worst thing you could do is use all your components, because that is a complicated way to do nothing 😂. It just rewrites your data...
@gohilramdevrajeshbhai5254 2 years ago
@@frankieeatstrash Thank you, Ma'am! So there is always a trade-off between noise reduction and losing actual data, which has to be handled smartly. This video helped me a lot with an assignment. Thanks a ton. :)
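The variance-captured rule from the reply above can be sketched in a few lines of NumPy (made-up random data; the 0.90 threshold is just one choice from the quoted 70-90% range):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))            # made-up data matrix
X = X - X.mean(axis=0)                    # center before SVD/PCA

s = np.linalg.svd(X, compute_uv=False)    # singular values, descending
var_explained = s**2 / np.sum(s**2)       # variance share per component
cum = np.cumsum(var_explained)

k = int(np.searchsorted(cum, 0.90)) + 1   # smallest k capturing >= 90%
print(k, round(cum[k - 1], 3))
```

The squared singular values are proportional to the variance each component captures, so `k` is the smallest number of components whose cumulative share clears the chosen threshold.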
@zodder13 2 years ago
Thank you so much for your channel.
@silentlessons4221 2 years ago
I just completed lesson 2 and am finding it much easier to follow than the 3Blue1Brown videos. I also love the practice exercises at the end. Thank you for this.
@silentlessons4221 2 years ago
Is this OK for those starting out in linear algebra?
@frankieeatstrash 2 years ago
No, if you are brand new, then you should start with the "linear algebra primer" playlist. This "boot camp" is that entire playlist condensed into six mini sessions.
@silentlessons4221 2 years ago
@@frankieeatstrash Thanks, Shaina. Let me start off with the primer as you say. I am preparing for machine learning in my second semester and I hope this will help me. Let me start right away.
@MrSyncope 2 years ago
Is the max/min ratio plot based on draws from a multivariate Gaussian with increasing dimensions? Background: I tried to replicate this for a class of mine with a multivariate Gaussian, but somehow it doesn't converge as nicely as your plot. Could be a plotting issue, though. After 500 iterations I have a ratio of 2.711633 based on Euclidean distance.
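One way to attempt the replication (an assumption about the original plot's setup: n points drawn from a standard multivariate Gaussian in each dimension, with the ratio taken over pairwise Euclidean distances). Note that the ratio depends on the number of points as well as the dimension, which may explain a discrepancy like the 2.71 above:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500                                   # number of points (affects the ratio!)
ratios = {}
for d in (2, 10, 100, 1000):
    pts = rng.normal(size=(n, d))         # n draws from N(0, I_d)
    sq = np.sum(pts**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * pts @ pts.T   # squared distances
    iu = np.triu_indices(n, k=1)          # distinct pairs only
    dist = np.sqrt(np.clip(d2[iu], 0.0, None))
    ratios[d] = dist.max() / dist.min()
print(ratios)                             # ratio shrinks toward 1 as d grows
```

With more points, the minimum pairwise distance shrinks and the ratio grows, so a plot made with a different n will sit at a different level even though the trend toward 1 is the same.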
@MrSyncope 2 years ago
Such a great explanation! Thanks a lot.
@vm3552 2 years ago
Thank you 😀
@NathanRichan 2 years ago
Thank you for making these, really well done. Some of the best videos I've seen on linear algebra.
@neoneo1503 2 years ago
As the dimension increases, more of the volume concentrates near the surface or in the corners of the total space, and most sample vectors become nearly orthogonal to each other (10:34). Thanks for your great explanation!
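The near-orthogonality observation is easy to check numerically (a small simulation with made-up parameters; for random unit vectors in d dimensions, the expected |cos θ| is approximately sqrt(2/(πd)), so typical angles approach 90° as d grows):

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_abs_cosine(d, n=200):
    """Average |cos(angle)| over all pairs of n random directions in R^d."""
    v = rng.normal(size=(n, d))
    v /= np.linalg.norm(v, axis=1, keepdims=True)   # normalize to unit vectors
    cos = v @ v.T                                   # pairwise cosines
    return float(np.abs(cos[np.triu_indices(n, k=1)]).mean())

for d in (2, 10, 100, 1000):
    print(d, round(mean_abs_cosine(d), 3))          # shrinks toward 0
```

A mean |cos θ| near zero means the typical pair of random high-dimensional vectors is nearly perpendicular, matching the comment.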
@zhigall1 2 years ago
The best explanation! Thanks!
@leukosnanos 2 years ago
Great video. Could you explain more about why almost all of the volume of the hypercube is contained in the corners as the dimension grows?
@alessandrogaeta9343 2 years ago
Very useful video!
@alinabarnett8574 2 years ago
The slides and code are publicly available here: drive.google.com/drive/folders/1skrvdCc581TN1VL3Cvx2-E4Nt0sZb8sO
@netheraxe4811 2 years ago
Nice explanation, really well done. I have one question: given a real variance-covariance matrix, which of the following is true? (A) Its eigenvalues can be positive and negative. (B) It is always non-singular. (C) Its eigenvectors are mutually orthogonal. (D) Its size depends both on image size and number of bands. (Multiple-select question.)
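Not an official answer key, but the linear-algebra facts behind that question can be checked numerically: a real variance-covariance matrix is symmetric positive semi-definite, so its eigenvalues are real and non-negative (ruling out A), it can be singular (ruling out B), and its eigenvectors can be chosen mutually orthogonal (C). A quick NumPy check on made-up data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 5))  # made-up correlated data
S = np.cov(X, rowvar=False)               # 5x5 sample covariance matrix

vals, vecs = np.linalg.eigh(S)            # eigh: S is symmetric
print(np.all(vals >= -1e-10))             # True: eigenvalues non-negative (PSD)
print(np.allclose(vecs.T @ vecs, np.eye(5)))  # True: orthogonal eigenvectors
```

Option D is about the application (the covariance matrix of image bands is bands-by-bands, independent of image size), so it is a modeling statement rather than a linear-algebra fact.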