Singular Value Decomposition (SVD): Mathematical Overview

412,216 views

Steve Brunton

1 day ago

Comments: 300
@dragosmanailoiu9544 5 years ago
Why are comments disabled on other videos?! We all want to say thank you to this dude
@m.harrisonbaker462 4 years ago
this man is such a great teacher
@romanemul1 4 years ago
a legend
@craigcollings5568 4 years ago
You are so right. He skips straight through things but he brings you along. I have met some good teachers but SB is incredible!
@johnwt7333 4 years ago
He also works as a teacher (the two things are not related, just saying he also works as a teacher).
@abeke5523 3 years ago
@romanemul1 a god
@romanemul1 3 years ago
@abeke5523 A myth
@serkansenturk6945 2 years ago
Steve Brunton is saving my master's as he saved my undergraduate. What a guy!
@issamassafi 4 years ago
I don't know how they do it, but Steve Brunton and 3Blue1Brown can explain stuff in a very impressive, comprehensive way.
@ABC-hi3fy 3 years ago
Relating SVD to Fourier series is the most enlightening sentence I have ever heard. Thank you.
@PunmasterSTP 2 years ago
Yeah that blew my mind too!
@lucyhaddant1303 4 years ago
A great lecturer. Some people are born teachers. I am buying their book to show my appreciation and thanks to Mr Brunton - and his colleague. From downtown London, UK.
@waddahramzialhajar1471 1 year ago
Every time I see a tutorial video from professors at US universities I envy their students; they don't have to learn everything twice, once in the classroom from a bad teacher and then again on YouTube from amazing teachers like Steve and others.
@saltrocklamp199 1 year ago
@robensonlarokulu4963 Even at great schools, not all professors are equally good at teaching.
@jdormaar 2 years ago
Amazing! Well explained! I've had to watch a few times tbh, because I kept getting distracted by how impressively he writes backwards. He doesn't even get distracted by the effort!! Just keeps on talking, teaching, and drawing while writing tidy, informative, concise notes... backwards! Amazing!! Thank you!!
@icybrain8943 5 years ago
You’ve now become the only channel I’m subscribed to where I’ve also hit the bell - I really appreciate your approach
@Eigensteve 4 years ago
Awesome!
@LucyHealthy97 2 years ago
I feel so lucky that I found your videos and your channel; your lessons are amazing and they make me deeply happy. I can feel that you love to share your knowledge with everyone. Thank you so much, Steve Brunton.
@dragoncurveenthusiast 4 years ago
I've used PCA, but have never heard of SVD. After this video I can see how they are related and wonder how I never heard of it. Looking forward to the rest of the series! Thank you so much, you are great at explaining!
@MrHaggyy 2 years ago
Same for me; it's like you're told to use some math (PCA) but they left out the foundation (SVD). The lectures and the book together are really good, and that you get the material for free is awesome.
@douglasespindola5185 2 years ago
The fact that I'm seeing this video 2 years after it was posted makes me feel that I'm about two years behind the best of what exists in machine learning and data science. =( Well, better late than never! Haha! Great class, Steve! You're an awesome teacher! Greetings from Brazil!
@easa2008 4 months ago
The instructor’s machine learning basics course is clear and makes complex concepts easy to understand. The structure is well-organized, making it very beginner-friendly. Thank you for the thoughtful teaching!
@srikanthsridharan6062 4 years ago
Mr. Brunton certainly understands how to take the students along in the lecture and not leave them at sea. Also, I believe the video is a mirror reflection of Mr. Brunton's actions, which would produce the same effect.
@AdarshPathak 4 years ago
The professor has so patiently explained every individual part of the whole equation, with so much attention and beauty, that he literally made me "feel" the entire concept! Sir, kindly teach others how to teach as well. We need a lot more people like you in this world.
@princeebenezeradjei3008 4 years ago
This guy is like the greatest teacher ever!!
@ryanciminski4695 1 year ago
You obviously can see this through your viewership, but holy smokes you have an amazing delivery style. Thank you
@davipenha6340 4 years ago
I am doing an elective on this and you are practically saving my life.
@ShaunakDe 3 years ago
I think I have found a corner of YouTube that brings me true joy. Thank you.
@wiseguym 2 years ago
A good YouTube rule of thumb is to never read the comments. A caveat to that rule is if the poster is a skilled educator. Thank you so much for your wonderful video!
@ksitaraman 1 year ago
Outstanding and brilliant with the intuition. A great teacher, and the best set of videos, bar none, on the topic.
@adityaroshan1688 1 year ago
Such a smooth and intuitive explanation, saving me from the perplexity of these topics. Thanks, Sir!! May God bless you! Love from students in Bharat!
@adrianl6811 1 year ago
This man is the Guru of so many topics. His explanation is so good even I can understand the material.
@Eigensteve 1 year ago
Thanks!
@yayunhuang1132 2 years ago
This is the first time I've learned about SVD and I could fully understand it. Thank you!!!
@technosapien330 1 year ago
In my first semester of graduate school for ML, I was starting to think this wasn't for me based on how lost I had been on this topic. You saved me from imposter syndrome, thank you.
@Eigensteve 1 year ago
Happy to help and thanks for watching!
@angelzarate7480 2 months ago
Great videos! Attending UW at the moment as a junior in ACMS and I have been referencing your videos for just about every class.
@mustafizurrahman5699 8 months ago
I cannot thank you enough for the splendid elucidation of SVD.
@GiuseppeRomagnuolo 3 years ago
I haven't finished the video, but it is so great to have a preamble explaining what is what, especially something very simple but sometimes ambiguous like the direction of the matrix vectors! It might seem obvious to some, but in my experience this is often confusing: I tend to make an assumption, but often halfway through I have to backtrace the whole calculation to evaluate the alternative interpretation.
@nazniamul 3 years ago
This is the best video on SVD I have ever come across. Just wow.
@jorisbergman9815 3 years ago
Great stuff! You explain the story the mathematics is trying to tell in such a clear and understandable way!
@navalgupta5807 1 year ago
Steve, I feel a correction is required (I might be wrong) starting at 06:14, where you explain that U Uᵀ = Uᵀ U = I (n×n). Each will be an identity matrix, but will their dimensions be the same? Suppose U is an n×r matrix; then Uᵀ will be r×n. In that case U Uᵀ will be n×n and Uᵀ U will be r×r. Please guide.
@srivatsagrama5377 1 year ago
In the full SVD, U and V are unitary matrices, so they have to be square.
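The distinction being asked about is between the full and the economy (reduced) SVD. A quick NumPy check, with a random toy matrix standing in for the data (nothing here is from the video):

    import numpy as np

    X = np.random.randn(5, 3)   # n = 5, m = 3 toy matrix

    # Full SVD: U is n x n and unitary, so both products give the n x n identity
    U, s, Vt = np.linalg.svd(X, full_matrices=True)
    print(U.shape, Vt.shape)                        # (5, 5) (3, 3)
    print(np.allclose(U @ U.T, np.eye(5)))          # True
    print(np.allclose(U.T @ U, np.eye(5)))          # True

    # Economy SVD: Uhat is n x m (5 x 3), so only Uhat.T @ Uhat is an identity
    Uhat, s, Vt = np.linalg.svd(X, full_matrices=False)
    print(np.allclose(Uhat.T @ Uhat, np.eye(3)))    # True  (m x m identity)
    print(np.allclose(Uhat @ Uhat.T, np.eye(5)))    # False (a projection, not I)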
@bernhardneubuser8163 3 years ago
The SVD, in general, is not unique. Contrary to what the author said at 12:10, the non-uniqueness is not merely a question of the signs of the vectors in the unitary matrices. Rather, when singular values appear more than once, the corresponding singular vectors can be multiplied by any unitary transformation of the subspace belonging to the singular value with higher multiplicity.
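A tiny NumPy illustration of that point, using the 2×2 identity as a toy example (both of its singular values equal 1): any rotation Q yields another perfectly valid SVD.

    import numpy as np

    X = np.eye(2)                 # sigma_1 = sigma_2 = 1 (repeated singular value)

    theta = 0.7                   # any angle works
    Q = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    # U = Q, Sigma = I, V = Q is a valid SVD of X for every theta
    print(np.allclose(Q @ np.eye(2) @ Q.T, X))   # True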
@polares01 3 years ago
Dude, you and some other YouTube teachers have been making my life great and helping me love studying again; thank you, truly.
@meghaldarji598 2 years ago
Amazing content, amazing explanation, amazing videography. Thank you so much for your work. I am truly grateful and I wish I could be your student in my lifetime. I watched one video and immediately knew that I have to hit the subscribe button. Thank you once again
@gonzalocordova5934 3 years ago
Incredible production! Congrats from Spain!
@rizwanmuhammad6468 3 years ago
Oh my God. What a teacher. Thank you, sir. I needed to learn this mid-career and you are a godsend...
@kjaanishji2595 3 years ago
This is the best explanation I have seen so far! Great explanation!!
@sarathk8183 3 months ago
Buying Steve's book after watching a few of the videos. I am an experienced aerospace engineering PM and write Python code for automation. I want to learn SVD using Python. Thanks for the videos.
@RajeshSharma-bd5zo 4 years ago
Awesome explanation; before coming to this channel I watched other videos and got a bit confused, but here the concept is explained so smoothly. Thanks!!
@albertjtyeh53 4 years ago
@Steve Brunton or anyone else: what does Steve mean at 9:43 when he says that big Sigma captures the "energy" of the flows? How is Sigma sorted? Is it done manually? What mathematical process produces big Sigma?
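On the sorting question: no manual step is involved; by convention the factorization routines return the singular values already in decreasing order, and "energy" usually refers to the sum of their squares. A minimal sketch, with random data standing in for the flow snapshots:

    import numpy as np

    X = np.random.randn(100, 20)              # toy snapshot matrix
    s = np.linalg.svd(X, compute_uv=False)    # singular values, sorted descending
    energy = np.cumsum(s**2) / np.sum(s**2)   # cumulative fraction of "energy"
    print(energy[:5])                         # share captured by the leading modes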
@ammarshahzad9627 5 months ago
At 3:46 it is said that matrix U has shape n×n. Given that U is a collection of eigenvectors of the data, how can the total number of eigenvectors exceed the number of examples (m)? Why isn't the shape n×m, where the 1st column represents the eigenvector of the 1st example and the mth column the eigenvector of the mth example? What are the eigenvectors from the mth column to the nth column representing?
@IkramShah-s8j 2 months ago
Precise, comprehensive, tangible. Great ❤
@shubhamnayak5398 4 years ago
Wow. When you started describing the meaning of U, Sigma, and V, I could see how the concept is similar to that of the Fourier series and transform.
@JaswinKasi 4 years ago
In this pandemic, online teaching is a must, and this style of presentation is excellent, as the student can face the teacher the whole time. This method of teaching should be adopted across the world.
@K-Viz 2 years ago
Wow, I can clearly visualise in my head how the multidimensional data is stacked and matched across time.
@mohamed-mkh15 1 year ago
You are amazing. Thank you. The explanation was really clear and your way of teaching is great.
@bm-ub6zc 4 years ago
Looking like a nerd, teaching like a BOSS. Big like for this man.
@quocphantruong 1 year ago
Hello Prof. Steve, this is such an interesting series. For A (n×m) at 0:45, shouldn't n be the faces and m the pixels? Because people normally arrange data vertically (one example per row) rather than horizontally.
@tiff_intheclouds 1 month ago
This guy is amazing!! What a fantastic teacher 🙏
@nicholasjaramillo9561 1 year ago
This guy is awesome. FYI, this is the full SVD, not the reduced SVD.
@fabriciot4166 1 year ago
Thank you so much. Simple, clear and with examples, it's nice 👌
@user-wc7em8kf9d 4 years ago
WOOOOWW ! Amazing teacher! Thanks Professor, I'll get the book.
@saurabhdamle4176 4 years ago
OMG this is exactly what I was looking for! And you have explained it in the clearest way possible. Thank you! Instantly subscribed.
@prelude2752 2 years ago
Thank you for all the awesome lectures. We wish you all the best.
@softpeachhy8967 3 years ago
Thank you so much for this series! I just started and have a (possibly stupid) question: what does it mean when you say eigen-something? I know eigenvectors are the vectors that keep the same direction after a matrix transformation, but eigenflows/eigenfaces?
@Eigensteve 3 years ago
Good question. I use this to mean "characteristic" or "latent" (which is what it means for eigenvalues/vectors too). But eigenflows and eigenfaces are eigenvectors of a large data correlation matrix. So in a sense, they are eigenvectors of a particular linear algebra problem. Specifically, if I stack images as column vectors in a matrix X, then the "eigen-images" of X are the eigenvectors of X * X^T.
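A numerical check of that relationship, with random data standing in for the images (each eigenvector's sign is arbitrary, hence the np.abs):

    import numpy as np

    X = np.random.randn(64, 10)                  # 10 toy "images", 64 pixels each
    U, s, Vt = np.linalg.svd(X, full_matrices=False)

    # Eigen-images: eigenvectors of the correlation matrix X @ X.T
    evals, evecs = np.linalg.eigh(X @ X.T)
    order = np.argsort(evals)[::-1][:10]         # top 10, sorted descending

    print(np.allclose(np.abs(evecs[:, order]), np.abs(U)))  # True (up to sign)
    print(np.allclose(np.sqrt(evals[order]), s))            # eigenvalues = sigma^2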
@zack_120 6 months ago
Right-to-left writing behind glass is revolutionizing online teaching; the first instance seems to be on Nancy's calculus channel. Super cerebrum-cerebellum-hand axis 👍
@GeorgeRon 2 years ago
You're a star, Mr Brunton. Thanks!
@stekim 4 years ago
Blew my mind with the explanation of each of the three elements. Thanks!
@Eigensteve 4 years ago
Awesome, great to hear!
@ZhengQu 2 years ago
I cannot thank you enough for the awesome explanation! Thank you!
@booma2 1 year ago
Much respect from a mathematics teacher
@RajKumar-ob8wk 1 year ago
Thank you very much for explaining these in very simple words.
@navalgupta5807 1 year ago
At 10:32, the eigen-mixtures concept of U w.r.t. Vᵀ is explained.
@patrice_ojie 1 year ago
Quick question: What software do you use in order to write on the screen?
@edwardsung63 4 months ago
I'm also wondering the same thing.
@hcam93 3 years ago
At 7:11 you stated that sigma_m is greater than or equal to 0; I was reading through my linear algebra notes and I thought that sigma_m was strictly greater than 0.
@ayuanyang3092 3 years ago
Math needs to be reviewed again and again, and now I am back at this class to pick up the SVD... Orz. By the way, if you want the overall framework of this method, this class is excellent; if you want to go deep into the math, maybe you should buy a book, try some exercises, and work out the solutions yourself. Then you can really handle the concept (or method).
@chiragagrawal7104 3 years ago
Thanks a lot, teacher; I didn't know SVD was so simple to understand.
@abhishekranjan320 1 year ago
I am still impressed with how the professor can mirror-write
@atalwar00 3 years ago
Thank god you made this video, sir. Awesome!
@rs6392 4 years ago
Jesus started to teach Math. Super clear and comprehensive
@mybean1096 5 years ago
Awesome!! I'm not really good at it, but learning it or listening enhances and adds to other skills that are relevant to this, almost... if that makes sense...
@murraypatterson9190 1 year ago
SVD does not only solve Ax = b, but also (and more directly) Ax = 0, or more accurately Ax + e = 0. The least-squares solution, i.e. the coefficients x minimizing ||e|| subject to ||x|| = 1, is the right singular vector associated with the smallest singular value: the last column of V (the last row of Vᵀ).
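A minimal sketch of that homogeneous least-squares recipe, with a synthetic near-dependence planted in random data (all names here are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 4))
    A[:, 3] = A[:, 0] + A[:, 1] + 0.01 * rng.standard_normal(50)  # near-dependence

    U, s, Vt = np.linalg.svd(A)
    x = Vt[-1]                       # right singular vector of the smallest sigma
    print(s[-1])                     # smallest singular value (near zero)
    print(np.linalg.norm(A @ x))     # equals s[-1]: best unit-norm solution of Ax ~ 0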
@hiranabe 4 years ago
I've been looking around the topics of linear algebra and its application to data science, and I found the best path is from Gilbert Strang to Steve Brunton! Your videos are awesomely inspiring, and they align with Gil's fundamental lectures. A nice baton pass from basics to contemporary application. Sir, are you doing this intentionally, or influenced by his work?
@andreacervantes2485 2 years ago
Actually he mentions that he follows Brunton's lectures.
@just-ask-why 1 year ago
My linear algebra teacher used Strang's textbook and it was a great experience for learning linear algebra
@shawheennaderi8970 4 years ago
Keep up the great work! Excellent channel!
@hermanusscholtz 8 months ago
@Steve Brunton, amazingly explained!! It's super clear now in my head.
@sharonfong1746 1 month ago
Thank you so much for the clear explanation. I'm so touched.
@dcadfdfbdf2181 2 years ago
4:49 - how come there are n U vectors? Shouldn't there be m of them?
@u2coldplay844 3 years ago
This is the best SVD lecture I have ever known! And I have a question: why does SVD always give first-come-first-served importance to each vector? If X is a random matrix, how does SVD decide/know that the first vector is the most important one and that the importance of the other vectors decreases in order? Thank you.
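One way to see it: the ordering itself is just a convention (the routines return the sigmas sorted largest first), and the Eckart-Young theorem is what justifies calling the leading vector the most important: truncating the sorted SVD gives the best possible low-rank approximation. A quick sketch on a random matrix:

    import numpy as np

    X = np.random.randn(30, 30)
    U, s, Vt = np.linalg.svd(X)

    # Rank-1 truncation keeps the largest singular value's pair of vectors
    X1 = s[0] * np.outer(U[:, 0], Vt[0])

    # By Eckart-Young, its error in the 2-norm is exactly the next singular value
    print(np.linalg.norm(X - X1, 2), s[1])   # the two numbers agree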
@prabalghosh2954 3 years ago
Very, very nice explanation and presentation. Thank you!
@GabrielleYadventuremore 3 years ago
Hi Professor Brunton, I really appreciate the video. To be honest, I am a bit confused about the 'columns' and 'rows' you refer to for U and V, and I was wondering if you could clarify a bit more. If I heard you correctly, the original X matrix is an n×m matrix where n represents the 'features' of a table and m represents the observations. If the above is correct: U has shape n×n, where each column represents a feature of the table, a.k.a. a row of the original X matrix. Is that right? And similarly: V has shape m×m, where each column of V represents an observation of the table, a.k.a. a column of the original X matrix. I think in your video you mentioned that U represents the columns of the X matrix, which is different from the above. Can you let me know what I missed?
@Eigensteve 3 years ago
Good question. This can be a bit tricky. The columns of "U" are linear combinations of the columns of "X" (and vice versa: the columns of "X" are linear combinations of the columns of "U"), so that is why I'm saying that U represents the column information in X. Similarly for V and the rows of X.
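A numerical restatement of that answer on toy random data: every column of X is U times a coefficient vector, namely the matching column of Sigma @ Vt.

    import numpy as np

    X = np.random.randn(6, 4)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)

    coeffs = np.diag(s) @ Vt         # column j holds the coefficients for x_j
    print(np.allclose(X[:, 0], U @ coeffs[:, 0]))   # True: x_1 is a combination of U's columns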
@GabrielleYadventuremore 3 years ago
@Eigensteve Thank you so much for the response!! Just to make sure: the dimension of the linear combinations of columns of X is n×n... (which means the column count is m, but after the linear combination it becomes n?) Is that right?
@Eigensteve 3 years ago
@GabrielleYadventuremore I think of it a little differently. There will usually be r
@ivlev_channel 4 years ago
Probably the best explanation I've seen. At least I understood it))
@macmos1 7 months ago
Why is V transpose represented differently at 10:30?
@tjemath 8 months ago
Why does the data matrix necessarily have rank m? In general, isn't it possible for X to have rank less than both n and m?
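It is possible, and the factorization handles it gracefully: a rank-deficient X simply produces zero singular values. A quick sketch with a planted linear dependence:

    import numpy as np

    X = np.random.randn(6, 4)
    X[:, 3] = X[:, 0] + X[:, 1]            # force rank 3 < min(6, 4)

    s = np.linalg.svd(X, compute_uv=False)
    print(np.round(s, 10))                 # last singular value is (numerically) zero
    print(np.linalg.matrix_rank(X))        # 3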
@Eggleweb97 3 years ago
That's a great lecture on SVD. Hoping for more intuitive videos like this.
@inspiredbynature8970 2 years ago
Sir, you are the best; such good explanations and material.
@ravipratapmishra7013 3 years ago
Really, sir, how can someone explain like this? Thanks for your efforts.
@CmanComedy 4 years ago
Great video, love the different real-world examples.
@meb7499 1 year ago
Just after the 10-minute point, where you talk about the physics, you seem to describe a different multiplication depending on faces or flow fields, but the underlying multiplication is the same, correct? I understand that e.g. there is no time element for the eigenfaces, but you seem to describe a different multiplication procedure.
@Sam-gq1bw 3 years ago
You are a legend, Steve
@hugotalibart146 3 years ago
This is so useful and clearly explained. Thank you
@prakhars962 4 years ago
You made it so easy to understand.
@Eigensteve 4 years ago
Thanks!
@michaelpadilla141 1 year ago
This is how you teach. Thank you.
@mohammadgharachorloo4486 4 years ago
Brilliant approach. So intuitive.
@AakarshNair 3 years ago
This is pure genius.
@fernandojackson7207 8 months ago
Thanks for your great video(s). A technical question: as you're ordering your sigmas, are you assuming they're all real-valued?
@4096fb 1 year ago
Thanks for this great lecture! BTW, assume matrix X is composed of 200 proteins (columns), where the level of each protein was measured at 2000 positions (rows) on human skin. How would you explain/interpret the U, Sigma, and Vᵀ matrices in this case? The technical calculation is straightforward with Python, but the meaning is quite ambiguous to me.
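One plausible reading of that setup, sketched with random numbers standing in for the real measurements: columns of U would be spatial patterns over the 2000 positions, rows of Vᵀ would be protein signatures over the 200 proteins, and Sigma weights each pattern/signature pair.

    import numpy as np

    X = np.random.randn(2000, 200)                  # positions x proteins (toy stand-in)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)

    print(U.shape)    # (2000, 200): each column is a spatial pattern across positions
    print(Vt.shape)   # (200, 200): each row is a signature across the proteins
    print(s[:3])      # weight of the leading pattern/signature pairs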
@mlkrei2010 3 years ago
Amazing series! New subscriber here! Love the aesthetics and how clearly you explain everything.
@Eigensteve 3 years ago
Awesome, thank you!
@nickname8668 4 months ago
Shouldn't each sample in X be a row vector instead of a column vector? I.e., shouldn't the X matrix be m×n instead of the n×m matrix shown in the video? Or am I missing something?
@Consistent_studying 4 years ago
I cannot understand why the SVD of a matrix is unique. Thanks for your great explanation.
@dr.alikhudhair9414 1 year ago
I can't find a suitable word to describe your effort. Great thanks!
@mckstani 4 years ago
Can you make a lesson about how to write things backwards x)
@youmarcube 4 years ago
Writing backwards 101: write normally on a transparent glass board and then flip the video horizontally... :P
@yuewang3962 4 years ago
@youmarcube You're a genius
@xToTaLBoReDoMx 3 years ago
@youmarcube So you think he's just miming the hand movements afterwards then? That's ridiculous.
@youmarcube 3 years ago
@xToTaLBoReDoMx Not sure what you mean... Just give it a try: write normally on glass and flip the video horizontally... What is the proof that this is ridiculous? Could you point to the moment in the video? I'm really curious...
@GaroKeNJiKunG 3 years ago
@xToTaLBoReDoMx Put a transparent pane of glass between the teacher and the camera. When the teacher writes something on the glass, the camera sees the teacher writing from right to left. If we then flip the video horizontally, it looks as if the teacher is writing from left to right.
Singular Value Decomposition (SVD): Matrix Approximation
14:54
Steve Brunton
249K views
Principal Component Analysis (PCA)
13:46
Steve Brunton
408K views
Linear Systems of Equations, Least Squares Regression, Pseudoinverse
11:53
Lecture 47 - Singular Value Decomposition | Stanford University
13:40
Artificial Intelligence - All in One
339K views
Is the Future of Linear Algebra.. Random?
35:11
Mutual Information
380K views
Singular Value Decomposition (SVD) and Image Compression
28:56
Serrano.Academy
96K views
Five Factorizations of a Matrix
59:52
MIT OpenCourseWare
102K views
StatQuest: Principal Component Analysis (PCA), Step-by-Step
21:58
StatQuest with Josh Starmer
3M views
Singular Value Decomposition (SVD): Overview
6:44
Steve Brunton
589K views
6. Singular Value Decomposition (SVD)
53:34
MIT OpenCourseWare
231K views
Lecture: The Singular Value Decomposition (SVD)
44:36
AMATH 301
231K views