Singular Value Decomposition (SVD): Matrix Approximation

234,787 views

Steve Brunton

1 day ago

This video describes how the singular value decomposition (SVD) can be used for matrix approximation.
These lectures follow Chapter 1 from: "Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control" by Brunton and Kutz
Amazon: www.amazon.com...
Book Website: databookuw.com
Book PDF: databookuw.com/...
Brunton Website: eigensteve.com
This video was produced at the University of Washington

Comments: 210
@greenpumpkin172 4 years ago
This channel is so underrated; your explanations and overall video presentation are really good!
@dombowombo3076 4 years ago
Don't know why you think it's underrated... Everyone who is watching these videos knows how great they are.
@ayushsaraswat866 4 years ago
This series is by far the best explanation of SVD that I have seen.
@smilebig3884 4 years ago
The best thing about your lectures is that you do the coding implementation along with the heavy math. That makes you different from the rest of the traditional instructors. Kudos to you!!!
@Eigensteve 4 years ago
It's my pleasure
@AdityaDiwakarVex 4 years ago
SVD was at the very end of my college LinAlg class, so I never got a very good understanding of it before the final. This is truly amazing; you say "thank you" at the end of every video, but it should be us saying it to you. Keep doing your thing! I'm loving it.
@ris2043 4 years ago
The best explanation of SVD. Your videos are excellent. Thank you very much!
@omniscienceisdead8837 2 years ago
You explain math in such a way as to not make someone feel stupid, but feel like they're taking steps into understanding a larger concept, and the tools they need are the ones we already have. Big ups.
@alexpujoldartmouth 3 years ago
You have a talent for taking complicated topics and breaking them down into digestible pieces. That's the sign of a good teacher. Thank you.
@douglasespindola5185 2 years ago
Gosh, what a class! As Mr. Ayush said, this was indeed by far the best SVD explanation I've seen. You've made such a complicated subject way more approachable! I wish you all the best, Steve! Greetings from Brazil!
@Eigensteve 2 years ago
Thanks so much! That is great to hear!!
@wackojacko1997 1 year ago
Not an engineer/student, but I'm watching this to get a better understanding of PCA in statistics. I'm going to check the book and research this, but my only (nit-picky) complaint is trying to tell the difference between "M" and "N" when Steve speaks, which I know refer to the number of rows or columns of the matrix. But really, this was great and I am thankful that this is something I can study on my own. Much appreciated.
@rajkundaliya7796 2 years ago
It doesn't get better than this. I am so thankful to you. I don't know how to repay this help.... And yes, this is a highly underrated channel
@nathannguyen2041 2 years ago
This was, by far, the most comprehensible explanation of what the SVD is, mathematically and visually. The SVD is an incredible algorithm! Amazing how little you can keep and still understand the original system.
@sonilshrivastava1428 3 years ago
One of the best videos on singular value decomposition. It not only covers the math but also the intuition. Thanks!
@AkshatJha 1 year ago
What a wonderful way to simplify a complicated topic such as SVD. I wish more people in academia emulated your way of teaching, Mr. Brunton.
@skilambi 3 years ago
Please keep making these high quality lectures. They are some of the best I have seen on YouTube, and that goes a long way because I watch a lot of lectures online.
@YYchen713 2 years ago
Thank you for making linear algebra less boring and really connected to data science and machine learning. This series is so much more interpretable than what my professor explains.
@PunmasterSTP 1 year ago
Hey I know it's been nine months but I just came across your comment and was curious. How'd the rest of your class go?
@fabou7486 3 years ago
One of the best channels I have ever followed, appreciate it so much!
@peymanzirak5400 1 year ago
I find everything about these courses, even the way the board is arranged, just great. Many, many thanks for this wonderful explanation and all your effort to make it understandable and yet complete.
@zsun0188 3 years ago
I learned this in college but couldn't recall a bit of it after working in industry for over a year. This explanation not only helped me refresh my memory but also enhanced my understanding.
@douglashurd4356 3 years ago
Superlative production! Lighting, sound, set, rehearsals, material: these videos are among the best productions on YouTube. Even I understood some of it! :-)
@bnglr 4 years ago
Every time I think it's time to pause and comment "awesome" on this video, it surprises me with another informative perspective. Great job.
@alek282 4 years ago
Amazing lectures, immediately bought the book, thank you!
@LusidDreaming 3 years ago
The book is great, but relatively terse for someone like me who needs to brush up on his linear algebra. These video lectures are an excellent complement to the book and really help drive home the concepts.
@PunmasterSTP 1 year ago
Matrix approximation? More like "Magnificent explanation!" I really can't convey in words how absolutely outstanding *all* of your videos are.
@din_far 1 year ago
This is by far the best video explaining SVD on YouTube.
@nicholashawkins1017 2 years ago
Lightbulbs are finally going off when it comes to SVD. Can't thank you enough!
@eveninrose 4 years ago
Just started watching this playlist. Excellent explanations and a great way to promote the book while sharing knowledge; bought your book and can't wait to revisit with the text!
@Eigensteve 4 years ago
Awesome, thank you!
@Multibjarne 2 years ago
Explanations like this, for a dummy like me, make my life so much easier.
@NickKingIII 4 years ago
Wonderful explanation, clear and easy to understand. Thank you very much
@patf9770 3 years ago
Can't overstate how good this series is...
@alwaysaditi2001 4 months ago
Thank you so much for this easy to understand explanation. I was really struggling with the topic and this helped a lot. Thanks again 😊
@Eigensteve 4 months ago
Glad it was helpful!
@wudiNB 1 year ago
Best teacher that I have ever met.
@FezanRafique 3 years ago
Steve is a magician of explanation.
@Eigensteve 3 years ago
Thanks so much!
@mkhex87 1 year ago
To the point. Answers all the important questions. I mean, you should come to the party knowing some linear algebra, but it's great for the intermediate level.
@malekbaba7672 4 years ago
The best explanation of SVD I have ever seen!
@MassimoMorelli 3 years ago
Extremely clear. Just want to point out a fact which at first did not seem obvious to me: the outer product has rank 1 because all the columns are proportional to the first vector of the outer product, hence they are linearly dependent.
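That rank-1 observation is easy to verify numerically. The following sketch (my own illustration, not from the video) builds an outer product of two random vectors and checks its rank with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(5)   # column vector u, length 5
v = rng.standard_normal(3)   # column vector v, length 3

# Outer product u v^T: column j equals v[j] * u, a scalar
# multiple of u, so all columns are linearly dependent.
A = np.outer(u, v)

print(A.shape)                    # (5, 3)
print(np.linalg.matrix_rank(A))   # 1
```

The same holds for every term sigma_i * u_i * v_i^T in the SVD expansion: each is a rank-1 matrix.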
@pilmo11 1 year ago
Super informative series on the SVD.
@Nana-wu6fb 3 years ago
Literally the best SVD explanation, so meaningful.
@sabarathinamsrinivasan8832 29 days ago
Very nice lecture and clearly understandable... Thanks Steve... 🤗
@prabalghosh2954 2 years ago
Very, very nice explanation and presentation. Thank you!
@guitar300k 2 years ago
I like your series; also, the dark background makes my eyes feel more at ease than the white backgrounds other channels use.
@sanaomar2182 1 year ago
This is the best explanation ever
@Phi1618033 1 year ago
This all sounded like gibberish until I started to think of the first term of the expansion (sigma1*u1*v1^T) as the strongest "signal" and the rest of the terms as ever-decreasing amounts of "signal" and ever-increasing amounts of "noise". So the last term (sigmam*um*vm^T) is essentially all background "noise" in the data. Thinking of it that way, it all makes perfect sense.
@kansasmypie6466 3 years ago
Can you do a series on QR decomposition as well? This is so useful!
@zepingluo694 3 years ago
Thank you for giving us such an amazing experience learning about the SVD!
@saitaro 4 years ago
It was a pleasure to watch. You should do more educational videos, Mr. Brunton.
@tobyleung96 3 years ago
@14:52 No Steve, thank YOU!
@nikosips 4 years ago
Thank you very much for these videos; they are very explanatory. Keep up the good work, we need your lessons for our academic improvement.
@kiaranr 2 years ago
I've read about and even used SVD. But I never really understood it in this way. Thank you!
@carlossouza5151 4 years ago
You are a very very gifted teacher! Thank you for sharing this! :)
@user-hp1zj6hk5c 1 year ago
Really, really nice explanation! You are truly a great teacher!
@yasirsultani 2 years ago
These are the best videos out there. Biggest fan Steve, keep it up.
@sachingarg4385 3 years ago
Part 2 of the Eckart-Young theorem is that this video is the best explanation of the theorem's part 1 :P
@parnashish1910 2 years ago
Beautifully explained.
@billandrews6291 4 years ago
13:41 The way I like thinking about it is, for example, that two vectors in R^3 that are orthogonal are not necessarily orthogonal when projected into R^2, which is essentially what is being done by dropping some of the dimensions. Love the videos though; has me thinking about SVD again.
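A two-line numerical check of that point (my own hypothetical vectors): orthogonality in R^3 is not preserved when you drop a coordinate, which is why the truncated singular vector matrices are no longer unitary.

```python
import numpy as np

# Two orthogonal vectors in R^3
a = np.array([1.0, 0.0,  1.0])
b = np.array([1.0, 0.0, -1.0])
print(a @ b)          # 0.0 -> orthogonal in R^3

# "Project" into R^2 by dropping the last coordinate
a2, b2 = a[:2], b[:2]
print(a2 @ b2)        # 1.0 -> no longer orthogonal
```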
@HuadongWu 4 years ago
The best lecture on the SVD I have ever seen!
@neurochannels 1 year ago
I never *really* appreciated SVD until I saw this video. Mind blown!
@andrezabona3518 3 years ago
What happens for n < m? (For example, what happens if my dataset is composed of 5000 images of 32x32?)
@bryan_hiebert 1 year ago
Thank you so much for posting the course material. I was asking ChatGPT some questions about eigenvectors/eigenvalues and revisiting some linear algebra when I stumbled upon transition probability matrices (Markov matrices), PCA, and SVD as I was getting back to my Data Science studies. This is very exciting stuff, and your presentation is very clear and understandable.
@LyndaCorliss 1 year ago
Top-rate education; I'm happily learning a lot. Nicely done. Thank you.
@mohiuddinshojib2647 3 years ago
This is everything that I need. Thanks for the nice explanation.
@Eigensteve 3 years ago
You are welcome!
@johnberry5275 3 years ago
"I've been assuming the whole time in these lectures that n is much much larger than m, meaning I have many more entries in each column than I have columns." [question]: To agree with his choice of wording, didn't he actually mean to say that m (the number of rows; you could also say the number of entries down each column) is much much larger than n (the count of columns)? I think he got his m and n mixed up when he wrote n >> m. I think, instead, he meant to write m >> n.
@VainBrilliance 3 years ago
n is the number of entries in each column; m is the number of columns (which can be interpreted as the number of samples/individual observations). So, I think in your interpretation after the [question], you have swapped m and n: m is the number of columns, not the number of rows.
@JJChameleon 2 years ago
I believe in this video he makes m = number of columns and n = number of rows. It is standard to have it the other way around, and this confused me as well.
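One way to keep the convention in this thread straight is to check shapes in code. A minimal sketch (hypothetical sizes, following the video's convention of an n x m matrix with n rows/pixels and m columns/images, n >> m) using NumPy's economy SVD:

```python
import numpy as np

# Video convention: X is n x m, n rows (pixels) >> m columns (images)
n, m = 1000, 10
X = np.random.default_rng(2).standard_normal((n, m))

# Economy SVD: only the first m columns of U are computed,
# so there is no need to store a full n x n matrix U.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
print(U.shape, S.shape, Vt.shape)   # (1000, 10) (10,) (10, 10)
```

If your own data follows the more common statistics convention (rows = samples), the shapes come out transposed, but the decomposition works the same way.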
@xiaoyu5181 4 years ago
This is also the best explanation of SVD I have seen! Thanks for sharing!
@garrettosborne4364 3 years ago
This is answering a lot of my questions on SVD.
@khim2970 1 year ago
Really appreciate your efforts. Wish you all the best.
@RajeshSharma-bd5zo 3 years ago
Such a cool concept of decomposition and brilliantly explained here. Big thumbs up!!
@kaiyueli1372 2 years ago
This video series is so helpful!! Thank you Dr. Brunton!
@mohammedal-khulaifi7655 2 years ago
You are at the tip-top; I like your explanation.
@Streamoon 2 years ago
Thank you Prof. Brunton, excellent explanation! Just came from MIT 18.06.
@tusharnandy6711 4 years ago
Gentleman, you have done a very impressive job. I have just started exploring data science and have recently completed my college course in Linear Algebra. This was quite interesting.
@aryanshrajsaxena6961 4 months ago
Thank You Professor. Respects from India
@Eigensteve 4 months ago
You're welcome :)
@patrickgilbert6170 3 years ago
Great video. Should be required viewing for anybody learning the SVD!
@Aditya-ne4lk 4 years ago
Just in time for the new semester!
@maydin34 3 years ago
Sir, thank you for letting me be your student here for free! Great performance, great job!
@dhoomketu731 4 years ago
This one's a brilliant explanation. Simply loved it.
@btobin86 2 years ago
You are so talented at teaching, great explanations!
@juangoog 2 years ago
Wow, what a wonderful presentation. Congratulations.
@mdmamunurrashid2945 3 years ago
Love his explanation style
@arthurlee8961 1 month ago
That's so helpful! Thank you!
@ARSHABBIR100 4 years ago
Excellent explanation. Thank you very much.
@marcavila3938 3 years ago
Even better than the previous one.
@nolanzheng7810 2 years ago
I can't believe I am watching this for free. Thank you!
@arne9518 3 years ago
This is a gold mine! Thanks for your videos.
@raven5165 10 months ago
In the example, m is the number of images you have and n is their pixel count. So m doesn't have to be enough to represent your data. It is like saying that if you have 2 photos, 2 dimensions are enough to represent the image features.
@adelheidgang8217 1 year ago
Incredible explanation!
@deepthikiran8345 3 years ago
The explanation is really wow!! Very intuitive... thank you so much!!
@jonathanschwartz7256 4 years ago
Watch out Khan Academy, Steve Brunton is coming for ya! Seriously though, these videos are fantastic :)
@katieadamczyk937 1 year ago
This is a fantastic video!!
@johnberry5275 3 years ago
I'm glad he made it clear that *outer products* were taking place.
@chenqu773 3 years ago
Good explanation! Many thanks! How does one manage to get this stuff explained in such an elegant way?
@liuhuoji 3 years ago
Love the video; well explained and aesthetically pleasing.
@JCatharsis 2 years ago
Thank you so much professor.
@scottandrewbishop 4 years ago
Finished my master's in CV and ML with a vague understanding of this topic, a very pertinent one for my field. This series demystifies the SVD for me. Looking forward to checking out your other lectures.
@alexyang6755 3 years ago
It covers a lot. Thanks for the beautiful teaching!
@harrypotter1155 2 years ago
Mindblowing!
@florawoflour4501 11 months ago
Thank you so much sir, very helpful.
@hugeride 3 years ago
Just amazing explanation.
@4096fb 11 months ago
Thank you for this great explanation. I just lost you on one point: why does this matrix multiplication equal sigma1*u1*v1^T + sigma2*u2*v2^T + ... + sigmam*um*vm^T? Can someone explain how this reproduces the entire matrix product? I somehow got lost in the columns of U and rows of V.
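The identity in question is the "columns times rows" view of matrix multiplication: U Sigma V^T is the sum of the rank-1 outer products of each column of U with the corresponding row of V^T, weighted by sigma_i. A quick NumPy verification (my own sketch, small made-up sizes):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((6, 4))
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Column-times-row expansion: U @ diag(S) @ Vt equals the sum of
# rank-1 outer products  sigma_i * u_i * v_i^T  over all i.
X_sum = sum(S[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(S)))

print(np.allclose(X, X_sum))   # True
```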
@maciejmikulski7287 1 year ago
The assumption n >> m is contrary to what we often have in data science. In many problems, the number of samples (here m) is bigger than the number of features (here n). In such a case, do we just take the transpose and proceed the same way? Or are there additional considerations (aside from swapping the interpretations of the eigenvectors, etc.)?
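Transposing does work in the way the question suggests: if X = U Sigma V^T, then X^T = V Sigma U^T, so the singular values are unchanged and U and V simply swap roles (up to column signs). A small NumPy check under assumed toy dimensions:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal((5, 200))   # "fat" matrix: more samples than features

U, S, Vt = np.linalg.svd(X, full_matrices=False)
U2, S2, Vt2 = np.linalg.svd(X.T, full_matrices=False)

# Singular values are identical for X and X^T
print(np.allclose(S, S2))                     # True
# Left singular vectors of X are the right singular vectors of X^T,
# up to a sign flip per column
print(np.allclose(np.abs(U), np.abs(Vt2.T)))
```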
@inazuma3gou 3 years ago
Excellent, excellent content. Thank you so much!
@maipyaar 3 years ago
Thank you for this video series.
@fabiopadovani2359 4 years ago
Thank you very much. Excellent explanations.