Linear Regression

49,779 views

Steve Brunton

Linear regression is a cornerstone of data-driven modeling; here we show how the SVD can be used for linear regression.
Book PDF: databookuw.com/...
Book Website: databookuw.com
These lectures follow Chapter 1 from: "Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control" by Brunton and Kutz
Amazon: www.amazon.com...
Brunton Website: eigensteve.com
This video was produced at the University of Washington
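As a companion to the description above, here is a minimal numpy sketch of using the SVD for linear regression (illustrative only; the data and variable names are made up and this is not the code from the video or the book):

    import numpy as np

    # Synthetic overdetermined problem: b is roughly 3*a plus noise.
    rng = np.random.default_rng(0)
    a = np.arange(1.0, 11.0)                     # inputs (a single column of data)
    b = 3.0 * a + rng.normal(0.0, 0.5, size=10)  # noisy outputs

    A = a.reshape(-1, 1)                         # n x 1 data matrix

    # Economy SVD: A = U @ np.diag(S) @ Vt
    U, S, Vt = np.linalg.svd(A, full_matrices=False)

    # Least-squares slope via the pseudoinverse: x = V S^-1 U^T b
    x_svd = Vt.T @ ((U.T @ b) / S)

    # Cross-check against numpy's built-in least-squares solver
    x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(x_svd, x_lstsq)                        # the two slopes agree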

Comments: 33
@nikosips 4 years ago
Those videos are the best resource for someone who wants to understand data driven models! Thank you very much for your work from an engineering student!!
@Ash-bc8vw 2 years ago
I don't think anybody is teaching LR with respect to SVD on YouTube right now, hence this video is more informative! Loved it, immediately subscribed.
@patrickxu8795 2 years ago
The lecture is so clear and well-organized! IT IS IMPRESSIVE!!!!
@appliedmathness8397 3 years ago
I love these videos! But in this one you point out the "squared projection error" while showing the segment going from the fitted line to the outlier (as in PCA); instead, in the case of linear regression, the residuals should be vertical lines.
@dmitrystikheev3384 3 years ago
I was looking for copper, but found gold! Boss, excellent as always. Love your way of conveying the material. I hope you will continue presenting more topics on statistics, cause in the multivariate case it can become really intimidating. Best regards from Russia!
@SoroushRabiei 4 years ago
Dear professor, you're a great teacher! Thank you so much for these videos.
@vahegizhlaryan5052 1 year ago
I am honestly surprised (I just accidentally discovered this channel) that such a great resource is not more popular with the YouTube algorithm.
@shakibyazdani9276 3 years ago
Absolutely awesome series, I will finish the whole series today:)
@Eigensteve 3 years ago
Hope you enjoy it!
@ParthaPratimBose 3 years ago
Hi Steve, I am a pharmaceutical data analyst, but you're just outstanding
@Eigensteve 3 years ago
Wow, thanks!
@saitaro 4 years ago
This is gold, professor!
@motbus3 3 years ago
Besides the very awesome explanation, the book is awesome, and he writes mirrored as if it were nothing 😄
@Chloe-ty9mn 4 months ago
i've been watching all the videos in this chapter and this is the one that got me to cave and purchase the book!! i was so surprised to see that it was so affordable. thank you and your team so so so much for the high quality accessible information
@hengzhang7039 4 years ago
Thank you sir, your courses are awesome!
@PaulNielan 11 months ago
Interesting. In the first lecture of this series, individual faces (i.e. people) were in the columns, but a face was really a column of many pixels. In this lecture, people are in the rows. So each use of SVD is different. And each setup of a data matrix is different.
@linnbjorkholm9237 1 year ago
Wow! Great video! I really liked your shirt, where is it from?
@Eigensteve 1 year ago
It’s a Patagonia capilene. My favorite shirt. I almost only wear them
@udriss1 2 years ago
Hello. In your book DATA DRIVEN SCIENCE & ENGINEERING, page 24, relation (1.26), you express the matrix B. In this relation it should read B = X - X bar, not B = X - B bar as printed, where X bar is the matrix of means.
@engr.israrkhan 4 years ago
sir great teacher you are
@Martin-iw1ll 10 months ago
In mechanics, overdetermined is named statically indeterminate
@a.danielhernandez2839 3 years ago
Excellent explanation! What happens with the y-intercept of the line? Is it b?
@kindleflow 4 years ago
Thanks
@sachavanweeren9578 3 years ago
Very nice series ... though it has been a while and I might be a bit rusty on my math. But if I recall correctly, there is nowhere an explicit link made between the SVD and least squares. It is explained that there is an SVD, and a theorem that it is the best approximation in some norm, but I have not seen an explicit link with OLS. It would be nice if that were more explicit in the video series...
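For reference, the explicit link can be stated in one line: the least-squares solution of Ax ≈ b is x_hat = V S^-1 U^T b, built from the economy SVD A = U S V^T, and this x_hat satisfies the normal equations A^T A x = A^T b. A quick numerical check of that identity with made-up data (a sketch, not code from the series):

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(50, 3))       # tall, full-column-rank data matrix
    b = rng.normal(size=50)

    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    x_hat = Vt.T @ ((U.T @ b) / S)     # x_hat = V * S^-1 * U^T * b

    # The SVD solution satisfies the normal equations A^T A x = A^T b,
    # so it is exactly the ordinary least-squares (OLS) solution.
    print(np.allclose(A.T @ A @ x_hat, A.T @ b))   # True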
@spidertube1000 4 years ago
Good vid bruh
@ralfschmidt3831 4 years ago
I am slightly confused: the orthogonal projection of b onto a should minimize the distance between b and its projection - which is ORTHOGONAL to the span of a. If I remember correctly, least squares, however, should minimize the VERTICAL distance between the projected and the original point. I am sure there is something wrong with my assumptions, but maybe someone can point me in the right direction.
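One way to reconcile the two pictures (offered as a sketch, not the video's own explanation): the projection lives in the n-dimensional space of observations, where a and b are length-n vectors. Projecting b onto span(a) gives the fitted values x_hat * a, and the residual b - x_hat * a is orthogonal to a in that space, yet entry by entry it is exactly the vertical gap at each data point in the 2-D scatter plot.

    import numpy as np

    a = np.array([1.0, 2.0, 3.0, 4.0])
    b = np.array([1.1, 1.9, 3.2, 3.9])

    x_hat = (a @ b) / (a @ a)   # coefficient of the projection of b onto span(a)
    fitted = x_hat * a          # projected vector = points on the fitted line
    residual = b - fitted       # orthogonal to a, yet each entry is a vertical gap
    print(np.isclose(residual @ a, 0.0), residual)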
@robsenponte3308 2 years ago
Cool
@anilsenturk408 4 years ago
How's it going?
@philrincon 2 years ago
Is he writing in reverse?
@moshuchitu203 5 months ago
By cross-referencing kzbin.info/www/bejne/q5LPnqyQnrWmb9k, one can clearly see that the slope derived at the end is nothing but covariance(a, b)/variance(a).
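That identity is easy to check numerically; a quick sketch with synthetic data, not taken from the linked video:

    import numpy as np

    rng = np.random.default_rng(2)
    a = rng.normal(size=200)
    b = 2.5 * a + rng.normal(scale=0.3, size=200)

    a_c, b_c = a - a.mean(), b - b.mean()       # mean-center both variables
    slope_lstsq = (a_c @ b_c) / (a_c @ a_c)     # least-squares slope on centered data
    slope_cov = np.cov(a, b, bias=True)[0, 1] / np.var(a)
    print(np.isclose(slope_lstsq, slope_cov))   # True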
@uzferry5524 2 years ago
based
@SkyaTura 1 year ago
Besides the undeniable quality of the video overall, isn't it awesome that he writes backwards in the air just to explain his points? 🤔
@clickle23 3 years ago
Can you explain why, in the example at the end, U = a/|a|? Is it because U contains the only eigenvector of the matrix AA^T, which is just a itself?
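As a quick numerical check (not the video's own derivation): for a single-column matrix A = a, the economy SVD has sigma_1 = ||a|| and a single left singular vector a/||a|| (up to sign), which is indeed the only eigenvector of AA^T with a nonzero eigenvalue.

    import numpy as np

    a = np.array([3.0, 4.0, 12.0])
    A = a.reshape(-1, 1)                      # single-column matrix

    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    print(S[0], np.linalg.norm(a))            # sigma_1 equals ||a|| (= 13 here)
    print(U[:, 0], a / np.linalg.norm(a))     # U column is a/||a|| up to a sign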