9 Regression as an Orthogonal Projection

3,734 views

Shaina Race Bennett

A day ago

Comments: 9
@AlphansoEric · 1 year ago
That's an amazing video. A beautiful explanation of linear regression in terms of linear algebra.
@MrOndra31 · 1 year ago
Great content! This was the missing link between my linear algebra and econometrics courses :D
@antonlinares2866 · 11 months ago
Thank you so much, you made linear algebra and regression click for me.
@asifzahir7512 · 1 year ago
Amazing! It cleared up a lot of confusion.
@chiarasacchetti8284 · 6 months ago
This video saved my life.
@breathemath4757 · 1 year ago
This is just way too good. Thanks a lot!
@sum1sw · 1 year ago
I'm not sure this is what I am looking for; if it is, then I missed it. I have an implicit function f(x, y, z) = 0 (it is actually a model with adjustable parameters), and an experimental data point (Xexp, Yexp, Zexp). You can probably see where I am heading with this: I want to know where a line orthogonal/perpendicular to the surface will intersect the surface. I'm calling this point of intersection (Xcalc, Ycalc, Zcalc).

How do I proceed? Based on other videos I watched, it looks like the first step is to linearize the surface using a Taylor series. So now I have a plane, in terms of the partial derivatives and (Xcalc, Ycalc, Zcalc), which is still unknown, and I want the point of intersection (Xcalc, Ycalc, Zcalc) of the orthogonal line from (Xexp, Yexp, Zexp). At first I thought it was a trial-and-error iterative procedure (I have to guess Xcalc, Ycalc, Zcalc), so I programmed that, but the answers I am getting do not seem to be correct. I'm also beginning to suspect that the solution can be direct, not iterative. Any thoughts?
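Not a definitive answer to the comment above, but the Taylor-linearization idea it describes is the standard starting point. A minimal numpy sketch (function and variable names are hypothetical, and the surface here is a unit sphere chosen only so the answer is checkable): repeatedly step from the current point along the gradient onto the tangent-plane linearization of f(x) = 0. This is iterative, not direct, and for curved surfaces it lands on the surface but only approximately at the exact orthogonal foot point; f is assumed smooth with a nonvanishing gradient near the data point.

```python
import numpy as np

def project_to_surface(f, grad_f, p, tol=1e-10, max_iter=100):
    """Iteratively project point p toward the implicit surface f(x) = 0.

    Each step solves the tangent-plane (first-order Taylor) linearization
    of f at the current point, moving along the gradient direction:
        x_next = x - f(x) * grad_f(x) / ||grad_f(x)||^2
    """
    x = np.asarray(p, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        step = (f(x) / np.dot(g, g)) * g
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Checkable toy example: unit sphere f = x^2 + y^2 + z^2 - 1.
# The orthogonal foot point of (2, 0, 0) is (1, 0, 0).
f = lambda x: np.dot(x, x) - 1.0
grad_f = lambda x: 2.0 * x
foot = project_to_surface(f, grad_f, [2.0, 0.0, 0.0])
```

For mildly curved surfaces this direct iteration is usually close to the true orthogonal projection; an exact foot point generally requires alternating this normal step with a tangential correction (or solving the Lagrange system x - p = lambda * grad_f(x), f(x) = 0, with a full Newton method).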
@GregorianWater · 2 months ago
Why do we begin at y sub zero and not y sub 1?
@teklehaimanotaman3150 · 1 year ago
A very amazing lecture! Thank you very much for your efforts. Is the line from the origin to the point y_hat the regression line?