I can't thank you enough for this playlist, you are the best!
@Hermanubis1 · 2 months ago
You are so sad, you probably don't have a husband because you have been brainwashed by feminism.
@Hermanubis1 · 2 months ago
Maths = Mathematics is plural. Be consistent.
@Hermanubis1 · 2 months ago
Social science is pure communism. Evolutionary psychology at Mankind Quarterly is not.
@tanzhouqingTami · 3 months ago
@GregorianWater · 3 months ago
Why do we begin at y_0 and not y_1?
@Hermanubis1 · 4 months ago
Why say 5-year-old words like 'cool fact'? C'mon, you're an adult with good explanations, yet you resort to 'cool'.
@Hermanubis1 · 4 months ago
Head circumference does correlate with IQ at .3 because of brain size.
@Hermanubis1 · 4 months ago
I'm here to learn PCA too. I research IQ on my own. Thanks.
@Hermanubis1 · 4 months ago
Being left-handed is a sign of genetic mutation and brain maladaptation. You sound like a far-left extremist.
@Hermanubis1 · 4 months ago
Maths. Math is incorrect. Mathematics is plural.
@Hermanubis1 · 4 months ago
Americans speak English with bad grammar. It's 'different to', not 'different than'.
@ChamodPerera-p3q · 4 months ago
PCA excellently simplified. Thanks
@lonemaven · 7 months ago
I can't thank you enough for making this lecture series available for free here. I'm a big-picture and a visual learner and your teaching approach fits my learning style really well. Any chance you can do a lecture on, probably, intro to operations research/optimization (or some more lectures in statistics and/or machine learning) or any outside resources you would recommend on the topic that have a similar pedagogical style? Thanks again!
@chiarasacchetti8284 · 7 months ago
This video saved my life
@ahmdhmd5561 · 8 months ago
thanks
@PraveenKumar-ex1iw · 8 months ago
A hyperplane need not always be one dimension lower? A 2D hyperplane is also possible in a 4D ambient space, right? Basically, the dimension of the hyperplane just has to be less than the dimension of the ambient space, is that right?
@ahmdhmd5561 · 9 months ago
thanks
@ahmdhmd5561 · 9 months ago
thanks
@mostafamohammed5684 · 9 months ago
Really appreciate your efforts for this great explanation ❤
@imadyTech · 10 months ago
The "wee..." alone deserves a million likes!😂
@antonlinares2866 · a year ago
Thank you so much, you made algebra and linear regression click for me
@nikosalexoudis8874 · a year ago
Excuse me, I really like your work. Could you also provide solutions for the practicals?
@ashtonronald · a year ago
Thanks a ton!🙏
@sum1sw · a year ago
I'm not sure this is what I am looking for; if it is, then I missed it. I have an implicit function f(x,y,z) = 0 (it is actually a model with adjustable parameters), and I have an experimental data point (Xexp, Yexp, Zexp). You can probably see where I am heading with this: I want to know where a line orthogonal/perpendicular to the surface will intersect the surface. I'm calling this point of intersection (Xcalc, Ycalc, Zcalc). How do I proceed? Based on other videos I watched, it looks like the first step is to linearize the surface using a Taylor series. So now I have a plane (in terms of partial derivatives and (Xcalc, Ycalc, Zcalc)), which is still unknown. I want to know the point of intersection (Xcalc, Ycalc, Zcalc) of the orthogonal line from (Xexp, Yexp, Zexp). At first I thought it was a trial-and-error iterative procedure (I have to guess Xcalc, Ycalc, Zcalc), so I programmed that, but the answers I am getting do not seem to be correct. I'm also beginning to suspect that the solution can be direct, not iterative. Any thoughts?
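One way to set up the iteration described above, as a rough sketch: linearize f at the current guess (first-order Taylor) and jump to the closest point on that tangent plane, repeating until the estimate settles on the surface. The unit sphere below is a hypothetical stand-in for the actual model, and the gradient is taken by finite differences.

```python
import numpy as np

def grad(f, x, h=1e-6):
    """Central-difference gradient of scalar f at point x."""
    g = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def project_onto_surface(f, p, tol=1e-10, max_iter=100):
    """Iteratively project point p onto the surface f(x) = 0.

    Each step linearizes f at the current estimate (first-order Taylor)
    and jumps to the closest point on that tangent plane. Assumes the
    gradient does not vanish along the way."""
    x = np.asarray(p, dtype=float)
    for _ in range(max_iter):
        g = grad(f, x)
        step = f(x) / np.dot(g, g) * g   # Newton-style step along the normal
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Hypothetical example surface: unit sphere, f(x, y, z) = x^2 + y^2 + z^2 - 1
f = lambda v: v[0]**2 + v[1]**2 + v[2]**2 - 1.0
p_exp = np.array([1.3, 0.4, -0.2])        # (Xexp, Yexp, Zexp)
p_calc = project_onto_surface(f, p_exp)   # (Xcalc, Ycalc, Zcalc)
print(p_calc, f(p_calc))                  # f(p_calc) ~ 0: lies on the surface
```

Strictly speaking this converges to a nearby surface point; the exact foot of the perpendicular additionally requires ∇f to be parallel to (p − x), which this scheme approximates well when the point is close to the surface.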
@AceOnBase1 · a year ago
Thank you for doing this
@juliocardenas4485 · a year ago
Choosing the objective of the model resonates with me. That is a problem / need I face regularly during my job as a data scientist in healthcare.
@Jacquesds · a year ago
This is the best intuitive explanation of eigenvalues and eigenvectors I've ever seen. And I've seen a lot of them :D
@AlphansoEric · a year ago
That's an amazing video. Beautiful explanation of linear regression in terms of linear algebra.
@breathemath4757 · a year ago
This is just way too good. Thanks a lot!
@MrOndra31 · a year ago
Great content! This was the missing link between my linear algebra and econometrics courses :D
@LayneSadler · a year ago
🤘
@teklehaimanotaman3150 · a year ago
Amazing lecture! Thank you very much for your efforts. Is the line from the origin to the point y_hat the regression line?
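For what it's worth, in the standard least-squares picture y_hat is the orthogonal projection of y onto the column space of the design matrix X, a vector in sample space rather than the fitted line in the x-y plot. A minimal numpy sketch of that projection (the data here is made up for illustration, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
x = rng.uniform(0, 10, n)
y = 2.0 * x + 1.0 + rng.normal(0, 1, n)   # noisy line, slope 2, intercept 1

X = np.column_stack([np.ones(n), x])      # design matrix [1, x]
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat                      # projection of y onto col(X)

residual = y - y_hat
print(X.T @ residual)                     # ~0: residual is orthogonal to col(X)
```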
@meyouanddata9338 · a year ago
How do we come up with the prototypes for each class? I understand the part where we get a similarity measure between the CNN features and the prototypes, but where do these prototypes come from in the first place? As mentioned in the paper, they are also learned, but I am failing to understand that part. Can you kindly explain that in the comments too?
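Without knowing the exact paper, a common pattern is that the prototypes are simply trainable parameters: randomly initialized vectors updated by backprop alongside the rest of the network. A hypothetical PyTorch sketch of that idea (PrototypeHead and all shapes are made up for illustration):

```python
import torch
import torch.nn as nn

class PrototypeHead(nn.Module):
    """Toy classification head with one learnable prototype per class.

    The prototypes start as random vectors and are updated by gradient
    descent along with every other weight in the network."""
    def __init__(self, feature_dim: int, num_classes: int):
        super().__init__()
        self.prototypes = nn.Parameter(torch.randn(num_classes, feature_dim))

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # Negative squared distance to each prototype acts as a logit:
        # the closer a feature is to a class prototype, the higher the score.
        return -torch.cdist(features, self.prototypes) ** 2

head = PrototypeHead(feature_dim=128, num_classes=10)
feats = torch.randn(4, 128)               # pretend CNN features for 4 images
logits = head(feats)                      # shape (4, 10)
loss = nn.functional.cross_entropy(logits, torch.tensor([0, 1, 2, 3]))
loss.backward()                           # gradients flow into the prototypes
```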
@asifzahir7512 · a year ago
Amazing! Cleared up lots of confusion.
@lucynowacki3327 · a year ago
Very informative.
@gohilramdevrajeshbhai5254 · 2 years ago
Hello. Thank you for the explanation. Is there any rule/paper/reference which suggests how many singular values to keep or throw away for the best noise reduction? Thank you.
@frankieeatstrash · 2 years ago
The answer to all Data Science questions like this?! It depends!!! I talk about this in a later lecture. There are many. They will all disagree. Roughly speaking you want to retain the "majority" of your signal - this means different things in different applications and industries. Some social scientists say something like 70-90% of variance captured. We copy that rule in industry, unless that rule produces too many components 😂 then to hell with it, we'll pick a number that we can live with moving forward in our pipeline. 🤷♀️ you can probably find a paper or methodology to defend any number you pick, unless it's too big!!! The worst thing you could do is use all your components, because that is a complicated way to do nothing 😂. Just rewrite your data ...
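As a concrete version of the "fraction of variance" rule mentioned above: scikit-learn's PCA accepts a float between 0 and 1 as n_components and keeps just enough components to reach that fraction. A minimal sketch on made-up data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 30))            # made-up data: 500 rows, 30 features

pca = PCA(n_components=0.90)              # keep >= 90% of the variance
X_reduced = pca.fit_transform(X)

print(pca.n_components_)                  # how many components that took
print(pca.explained_variance_ratio_.sum())  # fraction of variance retained
```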
@gohilramdevrajeshbhai5254 · 2 years ago
@@frankieeatstrash Thank you, Ma'am! So there is always a trade-off between noise reduction and losing actual data, which has to be handled smartly! This video helped me a lot with my assignment. Thanks a ton. :)
@zodder13 · 2 years ago
Thank you so much for your channel
@silentlessons4221 · 2 years ago
I just completed lesson 2 and am finding it much easier to follow than 3Blue1Brown's videos. I also love the practice exercises at the end. Thank you for this.
@silentlessons4221 · 2 years ago
Is this OK for those starting out in linear algebra?
@frankieeatstrash · 2 years ago
No, if you are brand new, then you should start with the "linear algebra primer" playlist. This "boot camp" is that entire playlist condensed into six mini sessions.
@silentlessons4221 · 2 years ago
@@frankieeatstrash Thanks Shaina. Let me start off with the primer as you say. I am preparing for machine learning in my second semester and I hope this will help me. Let me start right away.
@MrSyncope · 2 years ago
Is the max/min ratio plot based on draws from a multivariate Gaussian with increasing dimensions? Background: I tried to replicate this for a class of mine with a multivariate Gaussian, but somehow it doesn't converge as nicely as your plot. Could be a plotting issue, though. After 500 iterations I have a ratio of 2.711633 based on Euclidean distance.
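For the replication attempt, one plausible setup (assumptions mine - standard-normal draws and Euclidean pairwise distances; the lecture's plot may be constructed differently):

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
n_points = 500

for d in [2, 10, 100, 1000, 10000]:
    X = rng.standard_normal((n_points, d))   # n_points draws from N(0, I_d)
    dists = pdist(X)                         # all pairwise Euclidean distances
    print(d, dists.max() / dists.min())      # ratio shrinks toward 1 as d grows
```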
@MrSyncope · 2 years ago
Such a great explanation! Thanks a lot
@vm3552 · 2 years ago
Thank you 😀
@NathanRichan · 2 years ago
Thank you for making these, really well done. Some of the best videos I've seen on linear algebra.
@neoneo1503 · 2 years ago
As the dimension increases, more of the space (or volume) concentrates on the surface or corners of the total space, and most sample vectors become orthogonal to each other (10:34). Thanks for your great explanation!
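A quick numerical check of the near-orthogonality part (my own sketch, not from the lecture): the average |cosine| between pairs of random Gaussian vectors shrinks toward zero as the dimension grows.

```python
import numpy as np

rng = np.random.default_rng(1)

for d in [2, 10, 100, 1000, 10000]:
    u = rng.standard_normal((1000, d))    # 1000 random vector pairs in R^d
    v = rng.standard_normal((1000, d))
    cos = np.sum(u * v, axis=1) / (
        np.linalg.norm(u, axis=1) * np.linalg.norm(v, axis=1))
    print(d, np.mean(np.abs(cos)))        # -> 0: pairs become ~orthogonal
```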
@zhigall1 · 2 years ago
The best explanation! Thanks!
@leukosnanos · 2 years ago
Great video. Could you explain more why all of the volume of the hypercube is contained in the corners as the dimension grows?
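One standard way to see it, sketched below (not necessarily how the lecture argues it): compare the ball inscribed in the cube, which occupies the "middle", with the cube itself. Their volume ratio collapses to zero as the dimension grows, so almost all of the cube's volume ends up outside the ball, out in the corners.

```python
import numpy as np
from scipy.special import gammaln

# Volume of the unit-radius ball inscribed in the cube [-1, 1]^d,
# divided by the cube's volume 2^d:
#   ratio(d) = pi^(d/2) / Gamma(d/2 + 1) / 2^d
for d in [2, 5, 10, 20, 50]:
    log_ratio = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1) - d * np.log(2)
    print(d, np.exp(log_ratio))           # -> 0: the corners take over
```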
@alessandrogaeta9343 · 2 years ago
very useful video!!!
@alinabarnett8574 · 2 years ago
The slides and code are publicly available here: drive.google.com/drive/folders/1skrvdCc581TN1VL3Cvx2-E4Nt0sZb8sO
@netheraxe4811 · 2 years ago
Nice explanation, really well done. I have one question: given a real variance-covariance matrix, which of the following is true? A) Its eigenvalues can be positive and negative. B) It is always non-singular. C) Its eigenvectors are mutually orthogonal. D) Its size depends on both the image size and the number of bands. (MSQ)
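Not an answer key, but the underlying facts are easy to check numerically: a real variance-covariance matrix is symmetric and positive semi-definite, so its eigenvalues are nonnegative (and it can be singular, e.g. with fewer samples than variables), while its eigenvectors can be chosen mutually orthogonal. A quick numpy check on made-up data:

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(100, 5))             # 100 samples, 5 variables
S = np.cov(X, rowvar=False)               # 5x5 variance-covariance matrix

eigvals, eigvecs = np.linalg.eigh(S)      # eigh: solver for symmetric matrices
print(eigvals)                            # all >= 0 (up to round-off)
print(np.allclose(eigvecs.T @ eigvecs, np.eye(5)))  # eigenvectors orthonormal
```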