Least squares approximation | Linear Algebra | Khan Academy

538,417 views

Khan Academy

14 years ago

Courses on Khan Academy are always 100% free. Start practicing (and saving your progress) now: www.khanacademy.org/math/line...
The least squares approximation for otherwise unsolvable equations
Watch the next lesson: www.khanacademy.org/math/line...
Missed the previous lesson?
www.khanacademy.org/math/line...
Linear Algebra on Khan Academy: Have you ever wondered what the difference is between speed and velocity? Ever tried to visualize in four dimensions, or six, or seven? Linear algebra describes things in two dimensions, but many of the concepts can be extended into three, four, or more. Linear algebra implies two-dimensional reasoning; however, the concepts covered in linear algebra provide the basis for multi-dimensional representations of mathematical reasoning. Matrices, vectors, vector spaces, transformations, and eigenvectors/eigenvalues all help us to visualize and understand multi-dimensional concepts. This is an advanced course normally taken by science or engineering majors after taking at least two semesters of calculus (although calculus really isn't a prerequisite), so don't confuse this with regular high school algebra.
About Khan Academy: Khan Academy offers practice exercises, instructional videos, and a personalized learning dashboard that empower learners to study at their own pace in and outside of the classroom. We tackle math, science, computer programming, history, art history, economics, and more. Our math missions guide learners from kindergarten to calculus using state-of-the-art, adaptive technology that identifies strengths and learning gaps. We've also partnered with institutions like NASA, The Museum of Modern Art, The California Academy of Sciences, and MIT to offer specialized content.
For free. For everyone. Forever. #YouCanLearnAnything
Subscribe to KhanAcademy’s Linear Algebra channel: / channel
Subscribe to KhanAcademy: kzbin.info_...

Comments: 96
@sarabenavidez3863 4 years ago
"Some of you might already know where this is going.." Me: Nope
@howbani 3 years ago
Hahaha
@tahiriqbal177 3 years ago
What do u mean
@aashaytambi3268 3 years ago
When I get a real job, I will donate my bonus to Khan Academy. This has saved me so much time and you are so awesome.
@Kialikhun123 a year ago
Did you get a job yet?
@mafiafiazmafiafiaz3410 7 months ago
Come on, tell us, dear ❤❤
@user-ew2sz3ez4n 2 months ago
They are still waiting for your bonus, mate.
@_spectre_1589 a year ago
This lesson is fantastic! I understood the problem in only 15 minutes! You're absolutely better than my numerical analysis teacher at university, who can't properly teach a topic in two hours! Thank you!
@haojiang4882 7 years ago
Comes in handy while studying machine learning.
@KeystoneScience 7 years ago
yes, same
@mwaleed2082 3 years ago
Very true. When I was studying the "normal equation" in ML, I really thought that I had seen it somewhere. Then I realized I had studied it in linear algebra.
@adamhuang2421 11 years ago
Very helpful! Thanks a lot! You are doing great things! I also listened to your other videos, all very wonderful!
@nicholascunningham6936 4 months ago
This channel is a blessing. I've had some really bad professors and I've had some really good professors, but even the really good ones never made the concepts click with me as well as these videos do. Not only do I understand the math better, but the little diagram showing A's column space, and how b sits outside A's column space yet can still be approximated using a vector in A's column space, just made it click for me.

Edit: I guess _one_ way to describe how it clicked for me: we use a line to approximate a bunch of data points on a graph, or plane. If these data points were in a straight line, the "approximation" would have no error. However, this is often not the case. Now, think about the equation y=mx+b. Let's use c instead of b to avoid confusion in the next step, so we have y=mx+c. Suppose b=y-c. Then we have mx=b, which looks a lot like Ax=b. And it is! A is just a 1x1 matrix. So the line is bounded by the column space of A, or m, and our variable (in this case, just x) can be changed to get b. Just basic algebra: if m=3 and b=6, then x=2. But say b is a 2D vector, e.g. b=(1, 2)^T. Now, no matter what x you use, you can't get b (unless b just happens to lie on the line). You can only get as close to b as the column space of A will allow. In the diagram drawn in the video, the column space of A is a plane, so it is 2-dimensional. For simplicity, suppose A is a 3x2 matrix (geometrically, its column space is a 2D plane "floating" in 3D space), while b is a 3D vector that could be anywhere in the 3D space. So, just like before, we get as close to b as the column space of A allows by changing our variables (in this case, x1 and x2). Correct me if my understanding is wrong :)
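A minimal NumPy sketch of the geometry described above; the 3x2 matrix and the vector b are made-up example values, with b deliberately chosen outside A's column space:

```python
import numpy as np

# A made-up 3x2 example: the column space of A is a 2D plane inside R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# A made-up b chosen to lie OUTSIDE the column space of A,
# so Ax = b has no exact solution.
b = np.array([1.0, 2.0, 0.0])

# Least-squares solution via the normal equation from the video:
# A^T A x* = A^T b.
x_star = np.linalg.solve(A.T @ A, A.T @ b)

# A @ x_star is the projection of b onto the column space of A:
# the closest point to b that Ax can actually reach.
proj = A @ x_star

print("x* =", x_star)                      # [0. 1.]
print("projection A x* =", proj)           # [0. 1. 1.]
print("residual b - A x* =", b - proj)     # [ 1.  1. -1.]

# The residual is orthogonal to every column of A: A^T (b - A x*) = 0.
print("A^T residual =", A.T @ (b - proj))  # [0. 0.]
```

The last check is the key fact from the video: the residual b - Ax* is orthogonal to the column space, which is exactly what the normal equation encodes.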
@rajj1567 11 years ago
Your videos are just great! The concepts with geometrical examples make very good sense! Thanks a lot.
@fireheart9715 6 months ago
This was incredible. I started this video off so confused about least squares, and now I just get it entirely! Thank you so much :)
@samirrimas 11 years ago
Very useful, man. You are doing an amazing job. This literally saved me hours of searching and reading; can't thank you enough :)
@elmiramb 14 years ago
Thanks a lot, very comprehensive! Great job!
@xesan555 7 years ago
Thanks so much, Khan... wonderful explanation in two videos that explain everything... great. You are wonderful.
@ozzyfromspace 4 years ago
I was doing an online machine learning course and got lost when the lecturer introduced the normal equation (which this is, with a different name). Needless to say, I'm finna binge-watch your linear algebra lectures now because I get insecure about using equations I don't understand. Thanks for the playlist, I really wanna put ML in my toolset so we're doing this!
@khaledsherif7056 4 years ago
Can you please mention the name/link of the course?
@mwaleed2082 3 years ago
@@khaledsherif7056 Not sure which course he used for ML, but I'm studying Machine Learning by Andrew Ng on Coursera. When he was teaching the normal equation as an alternative to gradient descent in Week 2 of the course, I realized I had seen this in linear algebra but with a different name, which is the title of this video.
@ottoomen5076 6 years ago
Excellent explanation of a valuable technique.
@jibran6635 3 years ago
This is super useful in solving assignments. Thanks, Khan Academy.
@aidawall8 11 years ago
You are like a billion times better than my professor... and my professor isn't even bad. On the contrary he's my favorite! You're just even better at explaining things. Plus it's impossible for me to lose focus with the pretty colors and your beautiful handwriting. lol I have my Linear Algebra final tomorrow (technically today) and I owe the A that I'm sure to get to you and all your helpful videos!
@potatojam6519 4 years ago
7 years later... did you get an A? :)
@HustleHeaven247 7 months ago
11 years later did you get that A?
@johnfykhikc 6 years ago
Best approach to the problem. No gradient, no multivariable calculus. You're a master!
@Jshizzle2 5 years ago
Helpful exploration of least squares properties.
@user-ot4rp8yn8r 2 years ago
Best linear algebra playlist.
@kartarsingh7776 6 years ago
Super clarity......
@Awhobiwom 5 years ago
Thank you so much. You just simplified long, boring hours of confusing lectures.
@seprage 8 years ago
It would be great to have links when he says "I explained (whatever) in a different video" so we can access that explanation. In this case I wanted to know why C(A)⊥ = N(Aᵀ). Thanks!
@Daski69 8 years ago
+Sergio Prada same thing here
@196phani 6 years ago
Go through this to understand why C(A)⊥ = N(Aᵀ): www.khanacademy.org/math/linear-algebra/alternate-bases/othogonal-complements/v/linear-algebra-orthogonal-complements
@MrVishyG 5 years ago
+1
@gulshanjangid3470 5 years ago
Consider any vector x perpendicular to the column space of A, i.e. x belongs to C(A)⊥. Then x is orthogonal to every column of A, which is exactly the statement Aᵀx = 0. Now write B = Aᵀ; the equation above reads Bx = 0, i.e. x lies in the null space of B = Aᵀ. Since x was an arbitrary vector of C(A)⊥, and the steps reverse, this shows C(A)⊥ = N(Aᵀ).
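In symbols, a compact sketch of the same argument (a_j denotes the j-th column of A):

```latex
x \in C(A)^{\perp}
\iff a_j \cdot x = 0 \;\text{ for every column } a_j \text{ of } A
\iff A^{\mathsf T} x = \mathbf{0}
\iff x \in N(A^{\mathsf T}),
\qquad\text{hence}\qquad C(A)^{\perp} = N(A^{\mathsf T}).
```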
@fascist27 14 years ago
really helpful
@rbfreitas 14 years ago
Good video!!!! And nice work! Good luck with the KhanAcademy :)
@tranzconceptual 9 years ago
God dang it, I knew I should have chosen another bachelor thesis...
@user-xedwsg 7 years ago
haha!!!!
@OurEverlastingYouth 6 years ago
just realizing this now as well
@rob6129 4 years ago
first semester stuff at my uni
@gangigooga7710 4 years ago
@@rob6129 what uni u attending?
@MistrVahag 13 years ago
Excellent video. Thanks much :)))))))) Vahag
@mattralston4969 4 years ago
Thank you Salman Khan. I appreciate the opportunity to relearn the method here. You can never hear this stuff enough times.
@batmendbatbaatar4290 3 years ago
This is surprisingly easy
@rob6129 4 years ago
Nice derivation of the normal equation
@kalvinsackey1804 6 months ago
Can we please get a video on maximum likelihood estimation?
@Matterhorn1125 12 years ago
Can you teach me cubic expressions and cubic equations :) e.g. solve the equation x^3 - 2x^2 - x + 2 = 0 by using the factor theorem :)
@adithyavarma758 a year ago
thank you very much sir
@lancelofjohn6995 2 years ago
I think this is the best video I have seen!
@inserthere6387 5 years ago
great geometric intuition of linear regression
@EWang-yn5sy 5 years ago
This guy is good...........
@mustafasabeeh8893 3 years ago
thanks
@arico94 5 years ago
Should have used n instead of k; it's usually m×n in R^n.
@ArafatAmin 11 years ago
What happens when AᵀA is singular? How do we solve for the least-squares solution?
@budharpey 11 years ago
Very useful! In my lecture slides I had the equation Hx = z for the same problem, and I couldn't make sense of how we could get to this as the best solution: x = (HᵀH)⁻¹Hᵀz. Now I understand :-)
@luffy08dn 13 years ago
thanks
@zhiqiguo803 10 years ago
love this guy
@BlackfireGippal 12 years ago
I wish to know how to solve this: x has values of -2, 0, 1, 2, 3 and y: 17, 5, 2, 1, 2, and I'm asked to use the least squares method, but I've been absent and I don't know exactly what my teacher meant by that or what the method consists of. Can anyone help me solve this?
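For readers with the same kind of exercise, here is a minimal NumPy sketch, assuming a straight-line fit y = mx + c is what's wanted:

```python
import numpy as np

# Data points from the question above.
x = np.array([-2.0, 0.0, 1.0, 2.0, 3.0])
y = np.array([17.0, 5.0, 2.0, 1.0, 2.0])

# Design matrix for the model y = m*x + c: a column of x values
# and a column of ones for the intercept c.
A = np.column_stack([x, np.ones_like(x)])

# Solve the normal equation (A^T A) [m, c] = A^T y, as derived in the video.
m, c = np.linalg.solve(A.T @ A, A.T @ y)

print(f"best-fit line: y = {m:.3f} x + {c:.3f}")
# For these points this comes out to roughly y = -3.081 x + 7.865.
```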
@SanwaOfficial 5 years ago
I have one question: is the LSS always consistent? If yes, how can I prove it? Please answer.
@mwaleed2082 3 years ago
Hi, not sure if you're still looking for the answer, but could you please describe what you mean by consistent?
@bb_-.-_-.407 6 months ago
It means: whether we can always find a least-squares solution of a system.
@utte12 11 years ago
Nice vid, but why did you take the length squared? I understand that the length of the vector would be sqrt(b1^2 + b2^2 + ... + bn^2), but why did you square even that?
@lucasm4299 6 years ago
@utte12 Because it's easier to work with minimizing the sum of squares than minimizing the square root of a sum of squares. Since the square root is increasing, minimizing ‖Ax − b‖² gives the same x as minimizing ‖Ax − b‖, and the squared form avoids the square root and is nicer to differentiate. That's my guess.
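In symbols (writing a_i^T for the i-th row of A), squaring doesn't change the minimizer but removes the square root and leaves a plain sum of squares:

```latex
\arg\min_{x} \lVert Ax - b \rVert
  \;=\; \arg\min_{x} \lVert Ax - b \rVert^{2}
  \qquad\text{(since } t \mapsto t^{2} \text{ is increasing for } t \ge 0\text{)},
\qquad
\lVert Ax - b \rVert^{2} \;=\; \sum_{i} \bigl( a_i^{\mathsf T} x - b_i \bigr)^{2}.
```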
@trejkaz 2 years ago
I tried using this trick for the problem I'm facing, but it turns out that when I multiply Aᵀ by A, I get a matrix which isn't invertible, so I still can't solve it. LOL. This _still_ seems odd to me, because even if some element in the input matrix A was contributing 0 to the result b, it should _still_ be possible to get a point as close as possible to the result.
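That intuition is correct: the closest point Ax* always exists; when AᵀA is singular (A has linearly dependent columns) the minimizer just isn't unique, so the plain inverse fails. A minimal NumPy sketch of one standard workaround, np.linalg.lstsq, which returns the minimum-norm least-squares solution; the matrix here is a made-up rank-deficient example:

```python
import numpy as np

# A made-up rank-deficient matrix: the second column is twice the first,
# so A^T A is singular and the normal equation has no unique solution.
A = np.array([[1.0, 2.0],
              [1.0, 2.0],
              [0.0, 0.0]])
b = np.array([1.0, 3.0, 2.0])

# np.linalg.lstsq still minimizes ||Ax - b||; among all minimizers
# it returns the one with the smallest norm.
x_star, residuals, rank, singular_values = np.linalg.lstsq(A, b, rcond=None)

print("x* =", x_star)            # minimum-norm least-squares solution
print("rank of A =", rank)       # 1 here, confirming the rank deficiency
print("A @ x* =", A @ x_star)    # projection of b onto C(A): [2. 2. 0.]
```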
@MrZulfiqar37 9 years ago
I have a question: does the least squares approximation always have a solution?
@CR-iz1od 8 years ago
+Zulfiqar Ali not if you don't solve it.
@Daski69 8 years ago
+Conor Raypholtz it still has a universally reasonable solution
@spindash64 7 years ago
I'm pretty sure that is the idea of least squares: to provide a close answer when you can't give an exact one
@shredding121 6 years ago
It does always have one: if Ax = b has a solution, then b is a vector in the column space of A, and if not, we use the projection of b onto the column space of A.
@natebush26 6 years ago
There is always a solution to the least squares problem. Why? Ax* is in colspace(A) by definition of being the projection of b onto C(A), so there must be a set of weights x* whose linear combination of the columns of A equals that projection.
@winnies1001 6 years ago
How did you know that it was a projection onto Col(A) and not anything else, like the range of A?
@lucasm4299 6 years ago
Winnie Shi Col(A) already is the range of A.
@kavishdoshi2408 8 years ago
It's good!
@user-te2hd9nb8q 2 years ago
You are my savior. Such a clear lecture. Thank you!! 👍👍👍
@shinigummyl1586 6 years ago
2018? I'm alone :(
@diesel7777777 5 years ago
I'm here.
@bli240 5 years ago
Onto 2019!
@bb_-.-_-.407 6 months ago
2024 here
@priestofrhythm 12 years ago
I am the 60th guy liking it!! :P :D Great vid, thank you. :)
@91leonetammie 4 years ago
This is the first Khan Academy video I've watched and not understood...
@mwaleed2082 3 years ago
For that you need to study orthogonal complements, and the concept of spanning sets, which leads to the concepts of column space, null space, etc.
@nilsclaessens5203 6 months ago
@Ben.N 2 years ago
Big brain
@spechtbert 12 years ago
n1
@dion9795 2 years ago
bro just do an example lol
@aysegocer3308 2 years ago
🤩
@jihyepark9139 3 years ago
Sometimes I can't see what he's writing.
@user-ud7nv6fp6q a year ago
gorgeous
@hugoderuyver 3 years ago
ICAM ! ICAM ! .... .. ...... !
@fascist27 14 years ago
Respond to this video...
@roy5180 2 years ago
thank you sir