Why are comments disabled on other videos?! We all want to say thank you to this dude
@m.harrisonbaker462 • 4 years ago
this man is such a great teacher
@romanemul1 • 4 years ago
a legend
@craigcollings5568 • 4 years ago
You are so right. He skips straight through things, but he brings you along. I have met some good teachers, but SB is incredible!
@johnwt7333 • 4 years ago
He also works as a teacher (the two things are not related, just saying he also works as a teacher).
@abeke5523 • 3 years ago
@@romanemul1 a god
@romanemul1 • 3 years ago
@@abeke5523 A myth
@serkansenturk6945 • 2 years ago
Steve Brunton is saving my masters as he saved my undergraduate. What a guy!
@issamassafi • 4 years ago
I don't know how they do it, but Steve Brunton and 3Blue1Brown can explain stuff in a very impressive, comprehensible way
@ABC-hi3fy • 3 years ago
Relating SVD to Fourier series is the most enlightening sentence I have ever heard. Thank you.
@PunmasterSTP • 2 years ago
Yeah that blew my mind too!
@lucyhaddant1303 • 4 years ago
Some people are born teachers. A great lecturer, and I am buying their book to show my appreciation and thanks to Mr Brunton and his colleague. From downtown London, UK.
@waddahramzialhajar1471 • a year ago
Every time I see a tutorial video from professors at US universities I envy their students; they don't have to learn everything twice, once in the classroom from a bad teacher and then again on YouTube from amazing teachers like Steve or others
@saltrocklamp199 • a year ago
@@robensonlarokulu4963 Even at great schools, not all professors are equally good at teaching.
@jdormaar • 2 years ago
Amazing! Well explained! I've had to watch a few times tbh, because I kept getting distracted by how impressively he writes backwards. He doesn't even get distracted by the effort!! Just keeps on talking, teaching, and drawing while writing tidy, informative, concise notes... backwards! Amazing!! Thank you!!
@icybrain8943 • 5 years ago
You’ve now become the only channel I’m subscribed to where I’ve also hit the bell - I really appreciate your approach
@Eigensteve • 4 years ago
Awesome!
@LucyHealthy97 • 2 years ago
I feel so lucky that I can see your videos and your channel. Your lessons are amazing; they make me deeply happy. I can feel that you love to share your knowledge with everyone. Thank you so much Steve Brunton.
@dragoncurveenthusiast • 4 years ago
I've used PCA, but have never heard of SVD. After this video I can see how they are related and wonder how I never heard of it. Looking forward to the rest of the series! Thank you so much, you are great at explaining!
@MrHaggyy • 2 years ago
Same for me, it's like you're told to use some math (PCA) but they left out the foundation (SVD). The lectures and the book together are really, really good, and that you get the material for free is awesome.
@douglasespindola5185 • 2 years ago
The fact that I'm seeing this video 2 years after it was posted makes me feel that I'm about two years behind the best of what exists in machine learning and data science. =( Well, better late than never! Haha! Great class, Steve! You're an awesome teacher! Greetings from Brazil!
@easa2008 • 4 months ago
The instructor's machine learning basics course is clear and makes complex concepts easy to understand. The structure is well-organized, making it very beginner-friendly. Thank you for the thoughtful teaching!
@srikanthsridharan6062 • 4 years ago
Mr. Brunton certainly understands how to take the students along in the lecture and not leave them at sea. Also, I believe the video is a mirror reflection of Mr Brunton's actions, which would produce the same effect.
@AdarshPathak • 4 years ago
The professor has so patiently explained every individual part of the whole equation with so much attention and beauty, he literally made me "feel" the entire concept! Sir, kindly teach others how to teach as well. We need a lot more people like you in this world.
@princeebenezeradjei3008 • 4 years ago
This guy is like the greatest teacher ever!!
@ryanciminski4695 • a year ago
You obviously can see this through your viewership, but holy smokes you have an amazing delivery style. Thank you
@davipenha6340 • 4 years ago
I am doing an elective about this and you are practically saving my life
@ShaunakDe • 3 years ago
I think I have found a corner of YouTube that brings me true joy. Thank you.
@wiseguym • 2 years ago
A good YouTube rule of thumb is to never read the comments. A caveat to that rule is if the poster is a skilled educator. Thank you so much for your wonderful video!
@ksitaraman • a year ago
Outstanding and brilliant with the intuition. A great teacher, and the best set of videos, bar none, on the topic.
@adityaroshan1688 • a year ago
Such smooth and intuitive understanding saving me from perplexity of the topics. Thanks Sir!! May God bless you! Love from students from Bharat!
@adrianl6811 • a year ago
This man is the Guru of so many topics. His explanation is so good even I can understand the material.
@Eigensteve • a year ago
Thanks!
@yayunhuang1132 • 2 years ago
This is the first time I have learned about SVD, and I could fully understand it. Thank you!!!
@technosapien330 • a year ago
In my first semester of graduate school for ML, I was starting to think this wasn't for me based on how lost I have been on this topic. You saved me from imposter syndrome, thank you
@Eigensteve • a year ago
Happy to help and thanks for watching!
@angelzarate7480 • 2 months ago
Great videos! Attending UW at the moment as a junior in ACMS and I have been referencing your videos for just about every class.
@mustafizurrahman5699 • 8 months ago
I cannot thank you more for the splendid elucidation of SVD
@GiuseppeRomagnuolo • 3 years ago
I haven't finished the video, but it is so great to have a preamble explaining what is what, especially something very simple yet sometimes ambiguous like the direction of the matrix vectors! It might seem obvious to some, but in my experience this is often confusing: I tend to make an assumption, and often halfway through I have to backtrack the whole calculation to evaluate the alternative interpretation.
@nazniamul • 3 years ago
This is the best video on SVD I have ever come across. Just wow.
@jorisbergman9815 • 3 years ago
Great stuff! You explain the story the mathematics is trying to tell in such a clear and understandable way!
@navalgupta5807 • a year ago
Steve, I feel a correction is required (I might be wrong). Starting at 06:14, you explain that U U^T = U^T U = I (n×n). Each product will be an identity matrix, but will their dimensions be the same? Suppose U is an n×r matrix; then U^T is r×n, so U U^T will be n×n while U^T U will be r×r. Please guide
@srivatsagrama5377 • a year ago
In the full SVD, U and V are unitary (orthogonal) matrices, so they have to be square, and both products are identities of the same size.
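The distinction raised in this exchange can be checked directly with numpy (my sketch, not from the video; the sizes n = 5, m = 3 are purely illustrative): in the full SVD both products are the identity, while in the economy SVD only the smaller product is.

```python
import numpy as np

# Hypothetical data matrix: n = 5 rows, m = 3 columns (sizes are illustrative).
X = np.random.default_rng(0).standard_normal((5, 3))

# Full SVD: U is n x n, so both products are the n x n identity.
U, s, Vt = np.linalg.svd(X, full_matrices=True)
assert U.shape == (5, 5)
assert np.allclose(U @ U.T, np.eye(5)) and np.allclose(U.T @ U, np.eye(5))

# Economy SVD: Uhat is n x m. Uhat^T Uhat = I (m x m),
# but Uhat Uhat^T is an n x n projection matrix, not the identity.
Uhat, s, Vt = np.linalg.svd(X, full_matrices=False)
assert Uhat.shape == (5, 3)
assert np.allclose(Uhat.T @ Uhat, np.eye(3))
assert not np.allclose(Uhat @ Uhat.T, np.eye(5))
```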
@bernhardneubuser8163 • 3 years ago
The SVD, in general, is not unique. Contrary to what the author said at 12:10, the non-uniqueness is not merely a question of the signs of the vectors in the unitary matrices. Rather, when singular values appear more than once, the corresponding singular vectors can be multiplied by any unitary transformation of the subspace corresponding to the singular value with higher multiplicity.
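This non-uniqueness can be illustrated with a small numpy sketch (my example, not from the video): for a matrix with repeated singular values, rotating the singular vectors inside the degenerate block gives a different, equally valid SVD.

```python
import numpy as np

# A matrix with repeated singular values (here all equal to 1).
X = np.eye(3)
U, s, Vt = np.linalg.svd(X)

# Rotate both U and V by the same rotation Q acting inside the degenerate
# block: (U Q) diag(s) (Q^T V^T) is another perfectly valid SVD of X.
t = 0.7
Q = np.array([[np.cos(t), -np.sin(t), 0.0],
              [np.sin(t),  np.cos(t), 0.0],
              [0.0,        0.0,       1.0]])
U2, Vt2 = U @ Q, Q.T @ Vt

assert np.allclose(U2 @ np.diag(s) @ Vt2, X)   # still reconstructs X exactly
assert np.allclose(U2.T @ U2, np.eye(3))       # U2 is still unitary
assert not np.allclose(U2, U)                  # but the factors changed
```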
@polares01 • 3 years ago
Dude, you and some other YouTube teachers have been making my life great and helping me love studying again. Thank you, truly
@meghaldarji598 • 2 years ago
Amazing content, amazing explanation, amazing videography. Thank you so much for your work. I am truly grateful and I wish I could be your student in my lifetime. I watched one video and immediately knew that I have to hit the subscribe button. Thank you once again
@gonzalocordova5934 • 3 years ago
Incredible production! Congrats from Spain!
@rizwanmuhammad6468 • 3 years ago
Oh my God. What a teacher. Thank you sir. I needed to learn this mid-career and you are a godsend...
@kjaanishji2595 • 3 years ago
This is the best explanation I have seen so far! Great explanation!!
@sarathk8183 • 3 months ago
Buying Steve's book after watching a few of the videos. I am an experienced aerospace engineering PM and write Python code for automation. I want to learn SVD using Python. Thanks for the videos
@RajeshSharma-bd5zo • 4 years ago
Awesome explanation. Before coming to this channel I watched other videos and got a bit confused, but here the concept is explained so smoothly. Thanks!!
@albertjtyeh53 • 4 years ago
@steve brunton or anyone else: what does Steve mean at 9:43 when he says that big Sigma captures the "energy" of the flows? How is Sigma sorted? Is it done manually? What mathematical process does he use to get big Sigma from U?
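One common reading of this (an illustration, not necessarily exactly what the video means): the singular values come out of the SVD routine already sorted in decreasing order, and the "energy" captured by the first r modes is the fraction of the squared Frobenius norm they account for.

```python
import numpy as np

# Hypothetical snapshot matrix: 100 "pixels" by 20 "snapshots", built to have
# rank 5 (all names and sizes here are illustrative, not from the video).
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 20))

# np.linalg.svd returns the singular values already sorted in decreasing
# order; no manual sorting is involved.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
assert np.all(np.diff(s) <= 0)

# "Energy" captured by the first r modes: the fraction of the squared
# Frobenius norm of X that those modes account for.
energy = np.cumsum(s**2) / np.sum(s**2)
assert np.isclose(energy[4], 1.0)  # rank-5 data: 5 modes hold ~all the energy
```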
@ammarshahzad9627 • 5 months ago
At 3:46 it is said that matrix U has shape n×n. Given that U is a collection of eigenvectors of the data, how can the total number of eigenvectors exceed the number of examples (m)? Why isn't the shape n×m, where the 1st column represents the eigenvector of the 1st example in the data and the mth column the eigenvector of the mth example? What do the eigenvectors from the mth column to the nth column represent?
@IkramShah-s8j • 2 months ago
Precise, comprehensive, tangible. Great ❤
@shubhamnayak5398 • 4 years ago
Wow. When you started describing the meaning of U, Sigma and V, I could see how it is similar in concept to the Fourier series and transform
@JaswinKasi • 4 years ago
In this pandemic, online teaching is a must, and this style of presentation is excellent, as the student can face the teacher all the time. This method of teaching should be incorporated across the world.
@K-Viz • 2 years ago
Wow I can clearly visualise it in my head how the multidimensional data is stacked and matched across time.
@mohamed-mkh15 • a year ago
You are amazing. Thank you... The explanation was really clear and your way of teaching is great.
@bm-ub6zc • 4 years ago
Looking like a nerd, teaching like a BOSS. Big like for that man
@quocphantruong • a year ago
Hello Prof. Steve, it is such an interesting series. I think for A (n×m) at 0:45 it should be n faces and m pixels, shouldn't it? Because people normally show data in vertical format rather than in horizontal format
@tiff_intheclouds • a month ago
This guy is amazing!! What a fantastic teacher 🙏
@nicholasjaramillo9561 • a year ago
This guy is awesome. FYI, this is the full SVD, not the reduced SVD.
@fabriciot4166 • a year ago
Thank you so much. Simple, clear and with examples, it's nice 👌
@user-wc7em8kf9d • 4 years ago
WOOOOWW ! Amazing teacher! Thanks Professor, I'll get the book.
@saurabhdamle4176 • 4 years ago
OMG this is exactly what I was looking for! And you have explained it in the clearest way possible. Thank you! Instantly subscribed.
@prelude2752 • 2 years ago
Thank you for all the awesome lectures. We wish you all the best.
@softpeachhy8967 • 3 years ago
Thank you so much for this series! I just started and have a stupid question: what does it mean when you say eigen-something? I know eigenvectors are the vectors that keep the same direction after a matrix transformation, but eigen-flows/faces?
@Eigensteve • 3 years ago
Good question. I use this to mean "characteristic" or "latent" (which is what it means for eigenvalues/vectors too). But eigenflows and eigenfaces are eigenvectors of a large data correlation matrix. So in a sense, they are eigenvectors of a particular linear algebra problem. Specifically, if I stack images as column vectors in a matrix X, then the "eigen-images" of X are the eigenvectors of X * X^T.
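A small numpy check of this relationship (my sketch; sizes are illustrative): with images stacked as columns of X, the "eigen-images" (columns of U) are eigenvectors of the n×n correlation matrix X X^T, with eigenvalues sigma_i^2.

```python
import numpy as np

# Hypothetical image stack: each of the m = 4 columns is one flattened image
# with n = 6 pixels (sizes are illustrative).
rng = np.random.default_rng(2)
X = rng.standard_normal((6, 4))

U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Each "eigen-image" (column of U) is an eigenvector of the n x n
# correlation matrix X X^T, with eigenvalue sigma_i^2.
C = X @ X.T
for i in range(len(s)):
    assert np.allclose(C @ U[:, i], s[i]**2 * U[:, i])
```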
@zack_120 • 6 months ago
Right-to-left writing behind glass is revolutionizing online teaching; the first instance seems to be on Nancy's calculus channel. Super cerebrum-cerebellum-hand axis 👍
@GeorgeRon • 2 years ago
You're a star, Mr Brunton. Thanks!
@stekim • 4 years ago
Blew my mind with the explanation of each of the three elements. Thanks!
@Eigensteve • 4 years ago
Awesome, great to hear!
@ZhengQu • 2 years ago
I cannot thank you enough for the awesome explanation! Thank you!
@booma2 • a year ago
Much respect from a mathematics teacher
@RajKumar-ob8wk • a year ago
Thank you very much for explaining these in very simple words
@navalgupta5807 • a year ago
At 10:32, the eigen-mixtures concept of U w.r.t. V^T is explained
@patrice_ojie • a year ago
Quick question: What software do you use in order to write on the screen?
@edwardsung63 • 4 months ago
I'm also wondering about the same thing
@hcam93 • 3 years ago
At 7:11 you stated that sigma_m is greater than or equal to 0; I was reading through my linear algebra notes and I thought that sigma_m was strictly greater than zero.
@ayuanyang3092 • 3 years ago
Math needs to be reviewed again and again, and now I am back in this class to pick up the SVD... Orz. By the way, if you want to know the framework of this method, this class is excellent; and if you want to go deep into the math, maybe you should buy a book, try some exercises, and work out the solutions yourself, and then you can handle the concepts (or methods).
@chiragagrawal7104 • 3 years ago
Thanks a lot, teacher. I didn't know SVD is so simple to understand.
@abhishekranjan320 • a year ago
I am still impressed with how the professor can mirror-write
@atalwar00 • 3 years ago
Thank god you made this video, sir. Awesome!
@rs6392 • 4 years ago
Jesus started to teach math. Super clear and comprehensive
@mybean1096 • 5 years ago
Awesome!! I'm not really good at it, but learning it or listening enhances and adds to other skills that are relevant to this... if that makes sense...
@murraypatterson9190 • a year ago
SVD does not only solve Ax = b, but also (and more directly) Ax = 0, or more accurately Ax + e = 0. The least-squares solution of the simultaneous equations Ax + e = 0 is the right singular vector associated with the smallest singular value, i.e. the last column of V (the last row of V transposed).
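A sketch of this use of the SVD (my example, with a random matrix for illustration): the unit vector minimizing ||Ax|| is the right singular vector for the smallest singular value, and the minimum residual equals that singular value.

```python
import numpy as np

# Homogeneous least squares: minimize ||A x|| subject to ||x|| = 1.
# The minimizer is the right singular vector for the *smallest* singular
# value, i.e. the last row of V^T.
rng = np.random.default_rng(3)
A = rng.standard_normal((10, 4))

U, s, Vt = np.linalg.svd(A)
x = Vt[-1]                                        # last column of V

assert np.isclose(np.linalg.norm(A @ x), s[-1])   # residual = smallest sigma
for _ in range(200):                              # no unit vector does better
    y = rng.standard_normal(4)
    y /= np.linalg.norm(y)
    assert np.linalg.norm(A @ y) >= s[-1] - 1e-9
```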
@hiranabe • 4 years ago
I've been looking around the topics of linear algebra and its application to data science, and I found the best path is from Gilbert Strang to Steve Brunton! Your videos are awesomely inspiring, and they align with Gil's fundamental lectures. A nice baton pass from basics to contemporary applications. Sir, are you doing this intentionally, or influenced by his work?
@andreacervantes2485 • 2 years ago
actually he mentions that he follows Brunton's lecture
@just-ask-why • a year ago
My linear algebra teacher used Strang's textbook and it was a great experience for learning linear algebra
@shawheennaderi8970 • 4 years ago
Keep up the great work! Excellent channel!
@hermanusscholtz • 8 months ago
@steve Brunton! Amazingly explained!! It's super clear now in my head
@sharonfong1746 • a month ago
Thank you so much for the clear explanation. I'm so touched
@dcadfdfbdf2181 • 2 years ago
4:49 how come there are n U vectors? Shouldn't there be m of them?
@u2coldplay844 • 3 years ago
This is the best SVD lecture I have ever seen! And I have a question: why does SVD give first-come-first-served importance to each vector? If X is a random matrix, how does SVD decide/know that the first vector is the most important one and that the importance of the other vectors decreases in order? Thank you
@prabalghosh2954 • 3 years ago
Very, very nice explanation and presentation. Thank you!
@GabrielleYadventuremore • 3 years ago
Hi Professor Brunton, really appreciate the video. I am a bit confused about the 'columns' and 'rows' you refer to for U and V, and I was wondering if you could clarify a bit more. If I heard you correctly, the original X matrix is an n×m matrix where n represents the 'features' of a table and m represents the observations. If the above is correct: U has shape n×n, where each column represents a feature of the table, aka each row of the original X matrix. Is that right? And similarly: V has shape m×m, where each column of V represents an observation of the table, aka a column of the original X matrix. I think in your video you mentioned U represents each column of the X matrix, which is different from the above. Can you let me know where I went wrong?
@Eigensteve • 3 years ago
Good question. This can be a bit tricky. The columns of "U" are linear combinations of the columns of "X" (and vice versa: the columns of "X" are linear combinations of the columns of "U"), so that is why I'm saying that U represents the column information in X. Similar for V and the rows of X
@GabrielleYadventuremore • 3 years ago
@@Eigensteve Thank you so much for the response!! Just to make sure: the dimension of the linear combination of columns of X is n×n... (which means the column count is m, but after the linear combination it becomes n?) Is that right?
@Eigensteve • 3 years ago
@@GabrielleYadventuremore I think of it a little differently. There will usually be r
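The "linear combination" statement in this thread can be verified numerically (my sketch; sizes are illustrative): with X = U S V^T, each column of X is U times the corresponding column of S V^T.

```python
import numpy as np

# X = U S V^T means every column of X is a linear combination of the columns
# of U, with mixing coefficients taken from S V^T (sizes are illustrative).
rng = np.random.default_rng(4)
X = rng.standard_normal((5, 3))

U, s, Vt = np.linalg.svd(X, full_matrices=False)
coeffs = np.diag(s) @ Vt        # shape (3, 3): one coefficient column per x_j

for j in range(X.shape[1]):
    assert np.allclose(X[:, j], U @ coeffs[:, j])
```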
@ivlev_channel • 4 years ago
Probably the best explanation I've seen. At least I understood it))
@macmos1 • 7 months ago
Why is V transpose represented differently @10:30?
@tjemath • 8 months ago
Why does the data matrix necessarily have rank m? In general, isn't it possible for X to have rank less than both n and m?
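Indeed it can; a tiny numpy example (mine, not from the video) with two identical columns shows rank(X) = 2 < min(n, m), with one singular value numerically zero.

```python
import numpy as np

# X need not have full column rank: here the 2nd and 3rd columns are equal,
# so rank(X) = 2 < min(n, m) = 3 and one singular value is numerically zero.
X = np.array([[1.0, 2.0, 2.0],
              [0.0, 1.0, 1.0],
              [3.0, 0.0, 0.0],
              [1.0, 1.0, 1.0]])

s = np.linalg.svd(X, compute_uv=False)
assert np.linalg.matrix_rank(X) == 2
assert s[-1] < 1e-10
```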
@Eggleweb97 • 3 years ago
That's a great lecture on SVD. Hoping for more intuitive videos
@inspiredbynature8970 • 2 years ago
Sir, you are the best; such a good explanation and material
@ravipratapmishra7013 • 3 years ago
Really, sir, how can someone explain like this? Thanks for your efforts.
@CmanComedy • 4 years ago
Great video, love the different real-world examples
@meb7499 • a year ago
Just after the 10-minute point, where you talk about the physics, you seem to describe a different multiplication depending on faces or flow fields, but the underlying multiplication is the same, correct? I understand e.g. there is no time element for the eigenfaces, but you seem to describe a different multiplication procedure?
@Sam-gq1bw • 3 years ago
You are a legend, Steve
@hugotalibart146 • 3 years ago
This is so useful and clearly explained. Thank you
@prakhars962 • 4 years ago
You made it so easy to understand.
@Eigensteve • 4 years ago
Thanks!
@michaelpadilla141 • a year ago
This is how you teach. Thank you.
@mohammadgharachorloo4486 • 4 years ago
Brilliant approach. So intuitive.
@AakarshNair • 3 years ago
This is pure genius.
@fernandojackson7207 • 8 months ago
Thanks for your great video(s). A technical question: as you're ordering your sigmas, are you assuming they're all real-valued?
@4096fb • a year ago
Thanks for this great lecture! Btw, assuming matrix X is composed of 200 proteins (columns), and the level of each of those proteins was measured at 2000 positions (rows) on human skin, how would you explain/interpret the U, Sigma, and VT matrices in this case? The technical calculation is straightforward with Python, but the meaning is quite ambiguous to me.
@mlkrei2010 • 3 years ago
Amazing series! New subscriber here! Love the aesthetics and how clearly you explain everything.
@Eigensteve • 3 years ago
Awesome, thank you!
@nickname8668 • 4 months ago
Shouldn't each sample in X be a row vector instead of a column vector? I.e., the X matrix should be m×n instead of the n×m shown in the video. Or am I missing something?
@Consistent_studying • 4 years ago
I could not understand why the SVD of a matrix is unique. Thanks for your great explanation.
@dr.alikhudhair9414 • a year ago
I can't find a suitable word to describe your effort. Great thanks
@mckstani • 4 years ago
Can you make a lesson about how to write things backwards x)
@youmarcube • 4 years ago
Writing backwards 101: write regularly on a transparent glass board and then flip the video horizontally... :P
@yuewang3962 • 4 years ago
@@youmarcube You're a genius
@xToTaLBoReDoMx • 3 years ago
@@youmarcube So you think he's just miming the hand movements afterwards then? That's ridiculous
@youmarcube • 3 years ago
@@xToTaLBoReDoMx Not sure what you mean... Just give it a try: write regularly on a glass and flip the video horizontally... What is the proof that this is ridiculous? Could you point to the moment in the video? I'm really curious...
@GaroKeNJiKunG • 3 years ago
@@xToTaLBoReDoMx Put a transparent glass between the teacher and the camera. When the teacher writes something on the glass, the camera sees the teacher writing from right to left. Then if we flip the video horizontally, it looks like the teacher is writing from left to right.