Singular Value Decomposition (SVD): Dominant Correlations

130,887 views

Steve Brunton

4 years ago

This lecture discusses how the SVD captures dominant correlations in a matrix of data.
These lectures follow Chapter 1 from: "Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control" by Brunton and Kutz
Amazon: www.amazon.com/Data-Driven-Sc...
Book Website: databookuw.com
Book PDF: databookuw.com/databook.pdf
Brunton Website: eigensteve.com
This video was produced at the University of Washington
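For readers who want to try the lecture's central claim numerically, here is a minimal sketch (my own, not from the video or book), assuming NumPy: the singular values squared of a data matrix X match the eigenvalues of X^T X, and the right singular vectors match its eigenvectors up to ordering and sign.

```python
# Minimal sketch (my own, not from the video or book), assuming NumPy:
# the economy SVD of a tall-skinny data matrix X, checked against the
# eigendecomposition of the "correlation" matrix X^T X.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))                 # n = 100 samples, m = 5 columns

U, s, Vt = np.linalg.svd(X, full_matrices=False)  # economy SVD: X = U @ diag(s) @ Vt

# Eigendecomposition of the small m x m matrix X^T X (eigh returns ascending order)
eigvals, eigvecs = np.linalg.eigh(X.T @ X)

# Singular values squared are the eigenvalues of X^T X
print(np.allclose(np.sort(s**2), eigvals))                  # True

# Right singular vectors match the eigenvectors up to ordering and sign
print(np.allclose(np.abs(Vt.T), np.abs(eigvecs[:, ::-1])))  # True
```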

Comments: 144
@capsbr2100 4 years ago
I cannot put into words how much talent you have for teaching. It is just impressive. Thank you so much.
@ethan6708 20 days ago
I'm highly critical of so-called YouTube "educators." I just watched several videos on SVD from MIT and Stanford, all of which were garbage. But this... this is art in its purest form. You are a scholar among scholars! Absolutely beautiful to watch unfold. Thank you!!!
@phdrxakadennyblack4098 19 days ago
Brunton is unparalleled until others master the clear board as well. Re MIT, though: Strang is no joke, don't sleep on him. Abstruse but distilled well; a passionate dude, and an elder notable of linear algebra.
@ethan6708 10 days ago
@@phdrxakadennyblack4098 No doubt. If I could possess the mind of a modern mathematician, Strang is in my top 5, without question. But this video really emphasized the simple elegance of a good TEACHER.
@phdrxakadennyblack4098 10 days ago
​@@ethan6708 Foucault and Baudrillard always made me feel smarter, like I had learned something significant that they pondered long and hard over. When it is so good that the lecture and the concepts stick clearly in your head for years and years, that to me is the measure of a great teacher. I hope SB can address the latest stuff I'm seeing, like the math behind KAN networks as opposed to MLPs, or how ternary filters approximate so efficiently, cutting through data without losing information, if he hasn't already.
@mutiur7396 4 days ago
MIT and the like are either simplistic to the point of being superficial, wasting much time, or teach as if to seniors... Strang is passionate but teaches in a unique way; after that you are on your own, or you restart another similar course.
@SorenS_ 3 years ago
You do these fantastic explanations with your original behind-the-whiteboard technique AND you're giving away the book this information comes from for free?! What a gift you are to the internet, thank you so very much. I love your teaching style.
@Eigensteve 3 years ago
Wow, thank you!
@seungyunsong2798 3 years ago
If they had a Nobel prize for teaching on YouTube, this guy would be one of the top contenders
@eevibessite 1 year ago
kzbin.info/www/bejne/ZnLLm2uJgaman8k
@Melle-sq4df 25 days ago
This video is on a different level of teaching, thanks
@user-fv2hd2nw6g 1 year ago
I could complain about these videos being too fast-paced and technical for a more novice audience, or that the explanations are not very thorough, or even about the idiosyncratic notation (m for the number of columns; come on, matrices are m×n by default). But it still wouldn't change the fact that these videos motivated me to learn more about SVD and served as an invaluable resource for getting introduced to the maths I need for my job. Thank you. I hope there is more content in the making.
@silvadaniloc 3 years ago
This guy is the refreshed version of Professor Gilbert Strang. Professor Steve Brunton, you're an amazing linear algebra teacher. Glad I found these videos. Going to recommend them to all my computational linear algebra classmates. Thank you so much
@thomasfisherson 3 years ago
Strang will always be the GOAT.
@zichendu5565 2 years ago
OMG.... These videos really moved me. I haven't felt this kind of excitement and refreshment in learning for a long time. Thank you so much for being such a good teacher. I feel the beauty of math.
@iheavense 3 years ago
Usually some of these explanations can be skipped using Einstein notation, but it is often ignored how very educational it can be to explicitly walk through the process. Respect!
@katherinhuang5635 3 years ago
I learned SVD 4 years ago, and nobody ever explained it so well! Will recommend this series to every math student!
@ayankashyap5379 4 years ago
this series is like a gift, thank you so much. Will definitely buy the book when I can :)
@Eigensteve 4 years ago
Hope you enjoy it! (check out databookuw.com/databook.pdf until then)
@poiuwnwang7109 1 year ago
Professor Brunton is great. I saw the SVD covered briefly in his in-class lecture. Now he dedicates his time to making these video lectures. Super!
@mangomilkshakelol 3 years ago
the best grad school prof ever!
@Al-oy3le 2 years ago
A real genius can dissect complex subject matter and explain it in a simple and intuitive way, and this guy is just a splendid example of such genius
@jameswalters8755 3 years ago
Engaged delivery with exceptional content clarity. If Mr. Brunton isn't tenured it would be an indictment of the US university system. Cheers!
@Harish-ou4dy 3 years ago
Everyone is applauding the wonderful lesson, which is very true!! BUT no one is applauding Steve's wonderful ability to write inverted.
@invinity3982 4 years ago
One of the best things to have happened in Jan 2020 is seeing you share more of your knowledge! Much appreciated!
@JohnSmith-ok9sn 3 years ago
Sir, do you even have the slightest idea what a superawesomely-superawesome teacher you are?!!! Thank you, SO MUCH! 🙏🙏🙏
@markq448 4 years ago
Your descriptions here are, by far, the best I have seen. Maybe it is because you teach in the way I learn but these have been really useful.
@tems4real 4 years ago
Wow, thank you. I just found your channel today, and I'm grateful for these intuitive explanations. Cheers!
@Feschmesser2 2 years ago
Thank you for this fantastic series, definitely among the best educational content I have come across on YouTube!
@tongluo9860 1 year ago
Best video on the SVD topic: the concepts are crystal clear, and the production is movie quality. Really enjoyed it.
@sundaynyanabo2777 3 years ago
This is one of the best stuff I've seen on the internet... explicit, and taught with so much passion, rigor, aggressiveness in presentation, clear intuition, etc. With this stepwise approach, even a complete novice is equipped to be a genius and solve real-world problems. You don't want to know how grateful I am. Thanks
@victoriaborjas3066 3 years ago
One of the best YouTube explanations I have ever seen.
@ourybah6227 4 years ago
This is as good as it gets, thank you. I got the book last week, and the Python implementation from the website is a great addition. Thanks again for these priceless videos.
@Arkturium 4 years ago
That was so clear and well explained. It's really connecting a lot of dots in the maths I've had to learn for modelling in engineering.
@somdubey5436 2 years ago
I don't think I would ever have seen this sort of intuitive explanation of the U and V matrices in terms of eigenvalues and eigenvectors. Thanks a lot, Professor, for helping humanity understand SVD.
@supersnowva6717 1 year ago
Amazing lecture, thank you for making the SVD series! I appreciate the importance of SVD much more after watching your lectures. Thank you again!
@yuxiang3147 1 year ago
You are the best teacher I have seen
@hiryu87 3 years ago
This is so good. We need more teachers like this!
@nournote 4 years ago
Thanks a lot. I haven't watched the series yet, but I already know it's fantastic. I already liked your classroom course on the same topic.
@payman_azari 1 year ago
Wow, these tutorials demand a high level of attention! Thanks for the extraordinary lectures
@ayseharukaackbas3286 1 year ago
Thank you for making such excellent learning material freely available, you are a godsend.
@RS-el7iu 4 years ago
Thanks a lot ❤.. you've made someone in Lebanon get SVD in a very clear way :)))....
@magnusoksbltherkelsen2453 4 years ago
Thank you very much for these amazing lecture videos, I bought the book after watching a few of them. I am also very happy that you made it available as an ebook, it works great on my Kindle :)
@gregorymacchio4077 4 years ago
Making my research ever so meaningful! Thank you.
@StratosFair 8 months ago
This lecture cleared a misunderstanding I had about SVD. Thank you so much !
@Eigensteve 8 months ago
Glad it was helpful!
@PunmasterSTP 1 year ago
Dominant correlations? More like "Dang good pieces of information!" Being serious, this is some of the absolute best content on YouTube.
@giuniorcandido177 3 years ago
God bless you, Steve, because you make a big difference in how to teach ML. Salut from Brazil
@assiadehiles153 2 years ago
Thank you so much! This seemed so complicated and you made it so clear and easily understandable. Amazing!
@claire1806 2 years ago
Thank you so much for uploading these amazing lectures!
@SigmaChuck 2 years ago
I am not clear on how X^T X is a correlation matrix, which I understand to be a matrix of correlation coefficients between variables. Really enjoying this series.
@7898xd 1 year ago
Yeah, that's my problem too! The video is incredible but I don't get why those matrices are correlation matrices
@neurochannels 1 year ago
It technically isn't a correlation matrix. If you center each variable (column) by subtracting a mean, then X'X yields a covariance matrix (if you divide the whole result by n-1). If you normalize each column in X by its standard deviation, then X'X yields a correlation matrix. I think he leaves out these details to simplify.
@SigmaChuck 1 year ago
@@neurochannels ty .. that seems plausible
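To make the thread above concrete, here is a small numerical sketch (an illustration of the point, assuming NumPy; not from the video): the raw product X^T X is just a matrix of inner products, centering the columns first turns it into a covariance matrix, and standardizing them turns it into a correlation matrix.

```python
# Sketch of the covariance/correlation point above (assuming NumPy).
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 3))  # columns are correlated
n = X.shape[0]

Xc = X - X.mean(axis=0)                  # center each column
cov = Xc.T @ Xc / (n - 1)                # covariance matrix
print(np.allclose(cov, np.cov(X, rowvar=False)))         # True

Xs = Xc / X.std(axis=0, ddof=1)          # standardize each column
corr = Xs.T @ Xs / (n - 1)               # correlation matrix
print(np.allclose(corr, np.corrcoef(X, rowvar=False)))   # True
```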
@evanparshall1323 3 years ago
This video is incredible. Such a great explanation
@tavrion 2 years ago
Incredibly well explained. Thank you!
@mm__1659 3 years ago
Your lectures are a treasure for us...........👍👍❤️❤️
@Eigensteve 3 years ago
So nice of you!
@justmohsend 4 years ago
As always, it is really nice to have more Prof. Brunton content to learn from. This is the nudge I needed to get your new book and dive deeper into data-driven modeling and machine learning control. Thanks, Prof. Brunton. I just finished the series and I am really looking forward to the rest of it. If X^T X represents the correlation between the columns, does that mean that the diagonal has the largest elements? And does X X^T represent the correlation between the rows? Can that be interpreted in any meaningful way?
@user-rp7oo8vk9v 3 years ago
Your teaching is so impressive and awesome.
@Zinzin09 4 years ago
Thank you for the great video! A couple of questions: 1. Is the column space of the matrix X what people usually refer to as the feature space? 2. How does it relate to the standard basis on R^n? 3. When you take the inner products in X^T X, is that the inner product in R^n, or some other inner product, e.g. Pearson's correlation on the space of random variables?
@kezwikHD 3 years ago
Very good videos you are making! Keep it up! I honestly understood nothing at all when my professor tried to explain SVD and related material. I really don't like the lecture style of my university, because it is basically: "yeah, the SVD is a thing, you have this decomposition of the input, you can use it, and it is easy to see that these are the eigenvalues and these are the eigenvectors, and you easily get to that formula." That is how the professor says it, but without even writing anything down or showing anything; or if he writes something, it is literally the sentences he said. So I was just thinking: what is this, how does it work, why are these eigenvalues, why this formula? And sure, at a university you should always ask if something is unclear, but there is always way too much to ask about, simply because the explanations are a big amount of BS. Watching this, I understood so much, and it only took 11 minutes. I really wish my university would explain like that :( Anyway, I am glad there are people like you, so in my free time I can just learn all the stuff on my own 🤷‍♂️
@buh357 2 years ago
Thank you, your explanation is beautiful.
@kumarass2493 4 years ago
Very clear. I love it, professor
@djchrisi 3 years ago
For these amazing videos you are going to heaven.
@augustye3489 3 years ago
You're awesome. I wish I could have watched this video a few years earlier.
@allenwang149 3 years ago
You just have a genius for teaching complex things.
@robertof.8174 5 months ago
Excellent content, really love it! Thanks
@ksjksjgg 2 years ago
Great thanks, professor!!! At last I understood what the SVD is
@just_depie 1 year ago
Thank you so much, your videos helped a lot!
@ad2181 3 years ago
First SVD video I've ever liked, and I like spreading it.
@jbond5834 2 years ago
Greatly put. Again, linear algebra is magic.
@Plin34 3 years ago
Great interpretations!
@eriche6250 4 years ago
Great explanation! Thanks for sharing
@Eigensteve 4 years ago
My pleasure!
@marcavila3938 3 years ago
Nicely explained again
@parnashish1910 2 years ago
Beautifully explained :)
@sajidhaniff01 3 years ago
Awesome videos!
@ixy6864 2 years ago
Thank you very much! Your video is most intuitive!!!😊
@rs6392 4 years ago
YouTube should add an option to give multiple likes
@abhijitpai2409 3 years ago
This is gold.
@FalconSmart 4 years ago
Thank you so much! So clear, this is what you need to be a good engineer/scientist in any field.
@afarro 3 years ago
Very clean and clear explanation of the computational process for the SVD. However, it doesn't explain much of the conceptual intuition for the SVD as a decomposition of a linear transformation with a change of basis.
@lawrencekool 3 years ago
Awesome!!
@julianomachado8455 3 years ago
Very impressive and didactic.
@prandtlmayer 4 years ago
Mind blowing
@yuntianliu2612 3 years ago
THAT WAS A W E S O M E!
@karraguer 2 months ago
For sure, the best explanation I've seen of the whole topic. I'm very impressed by the production and staging. Prof. Brunton seems to be writing on a transparent surface with actual pens. However, it's not possible to film it from the perspective we are seeing unless he is doing mirror writing, like Leonardo da Vinci. Involved mathematics, fluent speaking, and mirror writing, all at the same time! Too much, even for a sharp mind. So, how was it done? Maybe the scene is reflected in a physical mirror and the camera points at that mirror, with some linear algebra to correct the perspective 🙂, or maybe everything is advanced post-processing. Please, we want to know how you shot this fantastic video. Greetings from Spain!
@forooghfarajzade8206 1 year ago
thank you so much
@federicopeconi8095 4 years ago
Amazing
@donaldslowik5009 2 years ago
I think, if the n×m matrix X holds n features for each of m faces, then (X-p)(X-p)^T, where p is the n×1 vector of row means (here broadcast from n×1 to n×m), is the n×n covariance matrix of the features across those faces. (X-q)^T (X-q), where q is the 1×m row vector of column means (again broadcast to n×m), is the m×m covariance between the faces across those features. After dividing by the number of data points minus 1.
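A quick numerical check of the two constructions described above (a sketch of my own, assuming NumPy; note np.cov treats rows as variables by default):

```python
# X is n x m: n features (rows) measured across m faces (columns).
import numpy as np

rng = np.random.default_rng(2)
n, m = 4, 10
X = rng.standard_normal((n, m))

p = X.mean(axis=1, keepdims=True)        # n x 1 row (feature) means, broadcast over columns
feature_cov = (X - p) @ (X - p).T / (m - 1)
print(np.allclose(feature_cov, np.cov(X)))               # True: n x n covariance of features

q = X.mean(axis=0, keepdims=True)        # 1 x m column (face) means, broadcast over rows
face_cov = (X - q).T @ (X - q) / (n - 1)
print(np.allclose(face_cov, np.cov(X, rowvar=False)))    # True: m x m covariance of faces
```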
@michaelcroucher2195 3 years ago
Great lecture, Steve. You say that we usually want the economy SVD. In what situations would we want to compute the full SVD, please?
@vardansahakyan5249 2 years ago
Brilliant. I think there is one little error at 6:10. Sigma has size n×m, so Sigma transpose is of size m×n. Both have sigma_1, ..., sigma_m on the diagonal, so instead of Sigma^2 we get another m×m matrix. It doesn't change the end result, but I thought it was important to mention to avoid confusion.
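For anyone checking the dimensions in this step, a short derivation in the lecture's convention (my own write-up, with X taken as n×m, so Σ is n×m and Σ^T is m×n):

```latex
% Dimension bookkeeping for the step at 6:10, using U^T U = I:
X^{\top} X
  = (U \Sigma V^{\top})^{\top} (U \Sigma V^{\top})
  = V \Sigma^{\top} U^{\top} U \Sigma V^{\top}
  = V \left( \Sigma^{\top} \Sigma \right) V^{\top},
\qquad
\Sigma^{\top} \Sigma = \operatorname{diag}\!\left( \sigma_1^2, \dots, \sigma_m^2 \right) \in \mathbb{R}^{m \times m}.
```

So the m×m diagonal product Σ^T Σ plays the role of the "Σ²" written in the video, whichever way round Σ's rectangular dimensions are labeled.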
@samhsu811 3 years ago
Why would you say U holds the eigenvectors of the column space of the data, when U corresponds to X X^T, which is the correlation between the rows of X?
@jocelinbordet5335 1 month ago
Thanks for the video! Does the data have to be centered/standardized for X^T X to be the "correlation matrix"?
@pbs1970 3 years ago
Hello professor, many thanks for the videos. I have been watching both your electrical engineering (Laplace transform) videos and this one on the SVD. I am a bit confused about where the approximation you state (towards the end of the video) comes from. Is the original SVD itself approximate? In the case of the truncated SVD, we are just dropping the columns that would be zeroed anyway, correct? So at what point do things become approximate?
@tnuts92 9 months ago
I am grateful for such a good series of videos with you, Steve, as you are really talented. I have a question that might be basic: to obtain a proper correlation matrix, one should center X and scale it afterwards? I asked ChatGPT but trust you more on this ^^'
@tnuts92 9 months ago
I know this specific video is about intuition, I'm just asking to confirm my understanding, thanks :)
@Eigensteve 9 months ago
Thanks, I'm glad you like them! Yes, typically you would center X (subtract the column-wise mean or row-wise mean first, and then take inner products with either all pairs of columns, or all pairs of rows). We discuss this in databookuw.com/databookV2.pdf if you want more details.
@tnuts92 9 months ago
OK, super, thanks! @@Eigensteve
@shobhitkulshreshtha7418 5 months ago
Hi. Great explanation! Just had a slight confusion: isn't it that the inner product signifies similarity but won't give a correlation, since technically correlation means a measure of the linear relationship between 2 variables (that is, how much one value changes in response to the other)? But in the video, it is presented as a correlation.
@andrewwilliam2209 3 years ago
Hi, I'm just wondering: does a column in X here represent a feature column and the rows the different samples, or do the columns represent the different samples and the rows the different features?
@renato5668 3 years ago
The class is just incredible, and you are a very good teacher. But I just want to know: how do you write it in front of you without the text and math symbols appearing inverted to us? That's a nice magic trick.
@bipanbhatta2736 4 years ago
I computed the eigenvectors of X^T X for V and of X X^T for U on a matrix X. But X does not equal U*S*VT. I even orthonormalized the columns of U and V. It still does not equal the original X. What am I doing wrong?
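A likely cause, with a sketch (my own illustration, assuming NumPy): eigenvectors of X X^T and X^T X computed independently come with arbitrary signs (and possibly different orderings), so they need not recombine into X. One fix is to compute V and the singular values from X^T X, then build a sign-consistent U via u_i = X v_i / σ_i:

```python
# Reconstructing X from an eigendecomposition of X^T X (assumes all
# sigma_i > 0, i.e. X has full column rank).
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((5, 3))

eigvals, V = np.linalg.eigh(X.T @ X)     # eigh returns ascending eigenvalues
idx = np.argsort(eigvals)[::-1]          # reorder to descending, like the SVD
V = V[:, idx]
s = np.sqrt(np.clip(eigvals[idx], 0.0, None))

U = X @ V / s                            # u_i = X v_i / sigma_i, sign-consistent with V
print(np.allclose(U @ np.diag(s) @ V.T, X))   # True
```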
@RoyTescaro 5 months ago
Thank you for the great content, I have a question: At 3:54 you say "if these are people's faces then the ij-th entry of this matrix (X^T X) is the inner product between person i and person j's face, so if a value of this matrix is large that means that those two people had a large inner product, their faces are similar, they have the same basic face structure. If you have a small value of this inner product that means they are nearly orthogonal and have very different faces." The inner product (as usually defined for a Euclidean space) of two vectors gives you an idea of the "similarity" of the two vectors; in this case each vector is a "list" of numbers, each representing a pixel of a picture of a face. The idea that "if the two vectors are similar then the two faces they represent are similar" supposes that the pictures were somehow taken such that the distribution of the pixel values is related to the facial characteristics, correct? Otherwise it's not clear to me how the similarity between two faces could be "detected" (for example, two photos of the same face in different lighting/positions would be associated with two very different vectors). Thanks!
@mikhailnovichkov1178 4 years ago
Why is the dot product of X a correlation matrix? Did we subtract the mean somewhere?
@Victory190 3 years ago
@7:28 can we say that the SVD form V Σ² V^T is equally represented by the Cholesky decomposition L D L^T?
@666lysis 1 year ago
Hi. Why isn't there a hat on V, since we are using the truncated SVD? Thanks.
@Chloe-ty9mn 1 month ago
Great video! Thanks so much! I have a question... what if you have data where m >> n, so my data points outnumber the parameters I have evaluated for each of them? Can you still interpret the SVD such that the columns in U and V are the eigenvectors of the row and column correlation matrices, respectively?
@Chloe-ty9mn 1 month ago
Ah sorry! I think that may have been a silly question! I think I'll just switch the rows and columns of my data so that n >> m and pull out whichever of U or V is more meaningful for my data!
@hananehome2108 3 years ago
I just can't not push the thumbs-up button, thanks
@nickknight5373 3 years ago
Good video but it would also be handy if you marked the array sizes of the SVD factors (both full & economy), not just those of the correlation matrix factors.
@Bonzazaz 3 years ago
Just to note, it is not the correlation matrix but the covariance matrix which is obtained by X^T X, etc. Otherwise great video, I really needed this type of intuition for SVDs.
@billperlman6168 1 year ago
I know that your post is a year old, but I was surprised that you seem to have been the only person that noticed the "correlation"/"covariance" confusion. Too bad that Steve Brunton did not respond to your post.
@changhanchao9654 2 years ago
Is V the singular vectors of the correlation matrix or of the data matrix? 8:12
@nathanaellwelter2581 4 years ago
In the book (p. 13) you say: "This provides an intuitive interpretation of the SVD, where the columns of U are eigenvectors of the correlation matrix XX* and columns of V are eigenvectors of X*X." Isn't it that the rows of U are eigenvectors of the correlation matrix XX* and the rows of V are eigenvectors of X*X?
@parisaghanad8042 4 years ago
I have a question. Does anyone know why the value of the inner product is related to the correlation?
@deepdeep3157 3 years ago
What kind of board do you use?
@carolinavertis5793 4 years ago
Hi Prof. Steve, I don't understand why the eigenvectors given by the right singular vectors do not correspond to the same vectors I get when I compute the Eigenvectors function (in Mathematica) of the matrix X^T.X. Could you please help me? I have a singular matrix X = {{-1, 0, -1}, {1, -1, 0}, {0, 1, 1}} with rank 2; with the economy SVD I get V = {{1/Sqrt[2], -(1/Sqrt[6])}, {0, Sqrt[2/3]}, {1/Sqrt[2], 1/Sqrt[6]}}. These clearly satisfy the definition of eigenvectors, i.e., X^T.X.v = 3v. However, when I run Eigenvectors[X^T.X] it returns {{1, 0, 1}, {-1, 1, 0}, {-1, -1, 1}}, where the first vector is OK and corresponds to 1/Sqrt[2]*{1, 0, 1}, but the remaining ones have no correlation with v2 = {-(1/Sqrt[6]), Sqrt[2/3], 1/Sqrt[6]}. However, I know that the last eigenvector {-1, -1, 1} corresponds to the (right) null space of X. Am I doing something wrong? Maybe it is because this matrix has all four fundamental subspaces and that is the reason these vectors do not correspond? So, my question is: why do I get different eigenvectors when considering two different techniques? Shouldn't they span the same space? Thank you