Least squares | MIT 18.02SC Multivariable Calculus, Fall 2010

  283,074 views

MIT OpenCourseWare

Comments: 138
@vuluongtrieu2609 5 years ago
1:16 least squares explained · 4:42 fit-line equations · 6:39 fit-parabola equations
@mikebmccraw 7 years ago
Nice explanation!! Very clear!! If you want to know WHY the least squares method works, then watch this explanation. Only three data points are used here so as not to lose the method in a sea of unnecessary complication.
@joyofliving5352 3 years ago
The best explanation in decades, and more to come! Linear least squares. Could you do a non-linear LS as well? :D We are suckers for real teachers and materials that simplify concepts for proper digestion.
@vladimircastanon9682 2 months ago
Very rarely do the presenters on MIT OpenCourseWare take the time to SLOWLY and THOROUGHLY explain key concepts/steps and methods. This was brilliant, I love the way you broke everything down to a T and made the learning process so simple. Thank you! Please keep making more videos.
@peterp-a-n4743 11 years ago
The "challenge" was the real explanation.
@jan-torejakobsen6184 7 years ago
Finally I found a good explanation for this subject :) Keep up the good work
@EHBRod13 9 years ago
OMG!!!!!! We're doing Regression Analysis in Engineering Stats, which is exactly this!!!! Thanks so much for making it SO much easier to understand!!!!!!!! Much love!
@justpaulo 3 years ago
Regarding the parabola: with a bit of "contemplation" one can see that the Y values seem to be those of a parabola with equation Y = X². However, the Y values are not symmetric with respect to X = 0, but rather to X = 1. In other words, it seems the parabola Y = X² was shifted to the right by 1. That means the parabola Y = (X - 1)² will fit the data points perfectly, and so that must be the solution: Y = X² - 2X + 1.
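A quick numerical cross-check of this reasoning (a sketch in Python, not part of the video; the data points below are hypothetical, chosen to lie exactly on y = (x - 1)² as the comment describes):

```python
import numpy as np

# Hypothetical data points lying exactly on y = (x - 1)^2, as the comment describes.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = (x - 1.0) ** 2                      # -> [1, 0, 1, 4]

# Least-squares fit of y ≈ a*x^2 + b*x + c: build the design matrix
# [x^2, x, 1] and solve the resulting linear least-squares problem.
A = np.column_stack([x**2, x, np.ones_like(x)])
(a, b, c), *_ = np.linalg.lstsq(A, y, rcond=None)

print(a, b, c)                          # ≈ 1.0, -2.0, 1.0, i.e. y = x^2 - 2x + 1
```

Because the hypothetical points sit exactly on a parabola, the least-squares fit reproduces it with zero residual; with noisy data the same code returns the best-fitting a, b, c instead.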
@Zeddy27182 1 year ago
0:43 She literally went out and came back. She is so cute 🤣 Not to mention that she is such a great professor.
@joelngige5776 1 year ago
This made the concept of least squares even clearer than I had mastered it before. Thanks for the great video.
@rittwikchatterjee5347 5 years ago
Really helped me to understand the C-program algorithm for it...
@theodoresweger4948 5 years ago
Thanks, very clear by keeping it to its simplest form.
@jeeyabilal1266 2 years ago
Thank you, you don't know how helpful these lectures might be.
@valbercesar 6 years ago
Perfect class. Nice explanation. Thanks so much! It helped me a lot. :)
@eddiechen6389 2 years ago
It would be nice if you could show how you derived those two equations!!
@evansodhiambo2030 10 years ago
OMG you are so good, thanks
@fadiaburaid 5 years ago
Thanks for the good explanation, couldn't be better!! Awesome work
@michaelgenchi3026 8 months ago
Condensing my 2½-hour-long lecture into 10 minutes 🙏, thank you
@mohammedmhilal4129 2 years ago
After solving the problem: a = 1, b = -2, c = 1, so y = x^2 - 2x + 1.
@pabloramirez9711 4 years ago
what a legend, thank you
@smilex3 10 years ago
Love it! Great video that I will recommend to my students. Thank you!
@kumaransivan 12 years ago
Great lecture and teacher. Thank you!
@zhengwang5828 11 years ago
Brilliant explanation! Very clear!
@jitendranadhpalaparthi6394 6 years ago
First time I understood why we square the values... Awesome, thank you.
@Salahuddin-nv6kh 2 months ago
Thank you, Baris hocam (my teacher), for the video :)
@digigoliath 3 years ago
TQVM! My first exposure (self-study) to statistics, starting with least squares.
@cgleck780 4 years ago
Amazing explanation as I understood what least squares is at the two minute mark!
@95Deepanshu 2 years ago
Thank you professor.
@MyScienceSnacks 6 years ago
This is extremely good. I've been trying to wrap my head around this for a bit and this is the clearest explanation.
@KW-dg6fs 7 years ago
My thought/guess: instead of finding the least squares, what we are really finding is the least distance (absolute value, since some points are above and some are below the fitted function). However, we decide to use least squares because when we take the gradient, it makes the problem easy to solve.
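For reference, here is the standard calculation this comment is alluding to (a general derivation, not a transcript of the video). With a line y = ax + b and data points (xᵢ, yᵢ), the total squared error and its partial derivatives are:

```latex
D(a,b) = \sum_{i=1}^{n} \bigl(y_i - (a x_i + b)\bigr)^2,\qquad
\frac{\partial D}{\partial a} = -2\sum_{i} x_i\bigl(y_i - a x_i - b\bigr) = 0,\qquad
\frac{\partial D}{\partial b} = -2\sum_{i} \bigl(y_i - a x_i - b\bigr) = 0.
```

Dividing out the common factor of -2 gives the two linear "normal equations" used in the video, a·Σxᵢ² + b·Σxᵢ = Σxᵢyᵢ and a·Σxᵢ + b·n = Σyᵢ. With absolute values instead of squares, D would not be differentiable wherever a residual is zero, which is one practical reason the squared error is preferred.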
@OriakoPhi 13 years ago
Loved the challenge problem! Very engaging recitation video.
@garrettkuketz2715 4 years ago
Holy crap, she's a great teacher.
@dui15pucca 8 years ago
How could 9 people dislike this video???!!!! This video is very helpful for understanding the least squares method!
@lusinetalawally7611 6 years ago
They are jealous perhaps. Hahaha
@iraq2011iq-for-every-one 10 years ago
Thank you very much and best wishes
@panav731 12 years ago
Thanks for the video, I found it very straightforward. Can you please explain further how you found the two equations you used to solve for a and b? Are you taking the partial derivatives of y = ax + b? And do you have a video for your bonus question? I would like to see if what I worked out is correct. Thanks!!
@ahmedhisham9420 5 years ago
I realize this is way too late but I'll leave this here in case it helps someone :D kzbin.info/www/bejne/i4rIamt5mN2DmNU
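As a runnable cross-check of those two equations (a sketch, not the recitation's own worked example; the data points below are hypothetical, chosen only to reproduce the sums 13, 5, 14 and 6 quoted elsewhere in this thread):

```python
import numpy as np

# Hypothetical data points chosen only to reproduce the sums quoted in this thread:
# sum(x) = 5, sum(x^2) = 13, sum(y) = 6, sum(x*y) = 14, n = 3.
x = np.array([0.0, 2.0, 3.0])
y = np.array([1.0, 1.0, 4.0])
n = len(x)

# Normal equations for y ≈ a*x + b, obtained by setting the partial derivatives
# of sum((y_i - a*x_i - b)^2) with respect to a and b to zero:
#   a*sum(x^2) + b*sum(x) = sum(x*y)
#   a*sum(x)   + b*n      = sum(y)
M = np.array([[np.sum(x**2), np.sum(x)],
              [np.sum(x),    float(n)]])
rhs = np.array([np.sum(x * y), np.sum(y)])

a, b = np.linalg.solve(M, rhs)
print(a, b)   # a = 6/7 ≈ 0.857, b = 4/7 ≈ 0.571
```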
@armiakrolewska 12 years ago
You are a great teacher.
@peterp-a-n4743 11 years ago
Patents and copyrights expire eventually for a good reason. Generally I appreciate that we honor the ones who found something out sooner than the rest of the world, even if in fact that often happens by mere accident.
@pp7114 12 years ago
Thank you. Very good session!!
@standman007 1 year ago
Too Good. Thanks very much
@nandinijain2773 9 years ago
Thank you very much for helping us by spreading knowledge!!!!! Truly you are great!!!
@MrLongliveAmerica 11 years ago
Least squares is a data-fitting technique discovered by the mathematician Gauss. The system of equations that you see are given laws obtained by trial and error by the mathematician, so you don't need to break your head. Hence, by merely plugging in the x values you can not only interpolate but also extrapolate the y value. As for the second question, it is a challenge question. If you have done it, you can post it here or hint at your approach, and I can say whether you are right or not.
@malluhouston6788 11 months ago
Thank you. This is the point I was missing!
@aslcesur9243 6 years ago
We were looking for this explanation thank you so much 😊
@boogalooshrimpable 13 years ago
@SpitTanker Given certain assumptions there are different solutions and estimators. If all Gauss-Markov assumptions are valid you use ordinary OLS, which is BLUE (best linear unbiased estimator) under these assumptions. You minimize the summation by minimizing the residuals; the residuals are the difference between the true value of y and the estimated value y-hat. You do this by differentiating the regression equation with respect to b (the regression coefficient). Kinda complicated on YT.
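In matrix notation, the same idea reads as follows (a standard summary of what this comment describes, not something specific to this video); X is the design matrix, y the response vector, and β the coefficients:

```latex
\hat{\beta} = \arg\min_{\beta}\ \lVert y - X\beta \rVert^{2}
\;\Longrightarrow\;
X^{\mathsf T} X\,\hat{\beta} = X^{\mathsf T} y
\;\Longrightarrow\;
\hat{\beta} = (X^{\mathsf T} X)^{-1} X^{\mathsf T} y \quad (\text{when } X^{\mathsf T}X \text{ is invertible}).
```

The residuals are ê = y - Xβ̂, and under the Gauss-Markov assumptions this β̂ is the BLUE mentioned above.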
@YawnGod 11 years ago
AWW YEAH MIT.
@faisalmumtaz7363 2 years ago
the value of b is 11/7 at time 5:43
@d1a2n3i5e8l 2 years ago
Thank you
@yannis1982 11 years ago
Thanks, it was simple and clear and it didn't make me look stupid, because I understood 100% of what you said.
@RamilSanchez 12 years ago
This is very helpful, thanks ma'am Christine!
@nyunai298 3 years ago
Thanks, I found what I was looking for: a non-linear model using the least squares method.
@nyunai298 3 years ago
I'm sure you'll put the non-linear model in the exam since you didn't show it on the board.
@aditydud 6 years ago
Very, very helpful video... and very easy to understand... also the concept is explained very clearly.
@AlexReyesInHD 10 months ago
This was a good explanation, but I was kinda disappointed that the instructor didn't explain how she got the equations that were used to find the least squares
@NehaJain1704 10 years ago
very nice explanation. thank you very much :)
@abhinandankushwaha140 6 years ago
Very beautifully explained... thanks a lot !! It helped me for my semesters
@mussie2040 6 years ago
thank you teacher
@ebrahimmohammadsaleh1427 8 years ago
very nice explanation. thank you very much
@fatihaydogdu2519 6 years ago
Thank you so much. I need those formulas for faster GPS ambiguity estimation.
@imegatrone 13 years ago
I really like this "Least squares" video of yours.
@BossManTee 7 years ago
Segmented Least Squares: Multi-way Choices
@lusinetalawally7611 6 years ago
Awesome!!! Thanks a lot
@ganeshpadesur2944 5 years ago
Tqvm madam
@LAnonHubbard 13 years ago
Thank you very much!
@krishnar1836 1 year ago
In the challenge problem, how do we ensure that the error is not maximized, since we are trying to minimise the error?
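One standard way to answer this (not taken from the video): the total squared error is a sum of squares, so it is bounded below by zero and grows without bound as the coefficients grow; its only critical point must therefore be a minimum. In matrix form, with design matrix A,

```latex
D(\beta) = \lVert y - A\beta \rVert^{2},\qquad
\nabla D(\beta) = -2A^{\mathsf T}(y - A\beta),\qquad
\nabla^{2} D(\beta) = 2A^{\mathsf T} A \succeq 0,
```

so the Hessian is positive semidefinite everywhere and the critical point cannot be a maximum; this is exactly what the second-derivative test would confirm in the two-variable case.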
@johnsonekka6397 6 years ago
Thank you, that was very helpful.
@wade5941 6 years ago
I got more work to do but that was very helpful. Thank you.
@jetskiwillywilly7970 6 years ago
Mind Blown
@vinayakmallikarjunmali4863 3 years ago
2 hours of class in 10min
@bobkameron 4 years ago
nice video!
@duranieleonard7761 6 years ago
It was very helpful thank you so much
@rebinshw 13 years ago
very useful video.
@the_eternal_student 2 months ago
What function are they putting in terms of a and b at 3:36? I do not see any derivative being taken.
@yogiwp_ 6 years ago
Holy shit. Now I get it!
@jtang105 4 years ago
very nice!!
@zennologyofeverything7265 6 years ago
"She is mighty smaht" - Chuckie Sullivan of Good Will Hunting
@MultiAnkit1993 12 years ago
Very good video
@miguelpetrarca5540 9 years ago
How do we differentiate when we have a summation? Would the summation go away, since it is like an integral?
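A short answer (standard calculus, not specific to this video): the summation does not go away; the derivative simply passes through the finite sum term by term, for example

```latex
\frac{\partial}{\partial a} \sum_{i=1}^{n} \bigl(y_i - a x_i - b\bigr)^{2}
= \sum_{i=1}^{n} \frac{\partial}{\partial a}\bigl(y_i - a x_i - b\bigr)^{2}
= -2\sum_{i=1}^{n} x_i\bigl(y_i - a x_i - b\bigr).
```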
@sarahhope8516 7 years ago
What if "Y" is a function of a lot of variables, say 4 or 5, and each time we combine values of these variables we get a new value of "Y"? In this case, how can I proceed to get an approximate function of Y? THANK YOU SO MUCH :-)
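One common approach (a sketch with made-up data, not an answer from the instructor): treat it as multiple linear regression, i.e. stack the variables into a design matrix and solve the same least-squares problem:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: Y depends (noisily and linearly) on four variables x1..x4.
n = 50
X = rng.normal(size=(n, 4))
true_coeffs = np.array([2.0, -1.0, 0.5, 3.0])
y = X @ true_coeffs + 1.5 + 0.1 * rng.normal(size=n)

# Append a column of ones for the intercept, then solve min ||y - A @ beta||^2.
A = np.column_stack([X, np.ones(n)])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

print(beta)   # ≈ [2.0, -1.0, 0.5, 3.0, 1.5]  (four slopes plus the intercept)
```

If Y depends on the variables nonlinearly, the same machinery still applies after adding nonlinear basis columns (squares, products, etc.) to the design matrix.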
@peterp-a-n4743 11 years ago
That's clearly wrong! Many inventions and discoveries in history were made almost simultaneously and independently. It would've been a mere matter of time till anyone else had found it. And don't get me wrong Gauss was one of the greatest geniuses ever!
@kunjaai 12 years ago
Nice lecture....thanks a lot....:))
@MrLongliveAmerica 11 years ago
I appreciate your line of thought and the way you put it forth. But what would your opinion be on patent policies and rights? If I were to agree with what you say, it would mean no one would ever have a legal right to anything. In fact, as you said and claimed earlier, someone would, somehow, at least stumble on the thought of inventing all the existing concepts and their applications, and that means Rutherford, J.J. Thomson, Bohr and Einstein were just there by accident?
@aram1rasul751 10 years ago
Hi Madam, thank you very much indeed for a nice explanation. I really want to draw a best-fit curve through all the points; the curve may have a lot of fluctuations, rather than being a line or a parabola. Can you tell me how to do that? I desperately need this answer. I appreciate your help in advance. Looking forward to hearing from you. Aram
@alexandersoderstrom3610 5 years ago
Did you solve it sir?
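One simple option (a sketch, not advice from the instructor): fit a higher-degree polynomial with the same least-squares recipe, e.g. via numpy.polyfit; the degree below is an arbitrary, hypothetical choice and should be kept as low as the data allow to avoid overfitting.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "wiggly" data that neither a line nor a parabola would fit well.
x = np.linspace(0.0, 10.0, 40)
y = np.sin(x) + 0.1 * rng.normal(size=x.size)

# Least-squares fit of a degree-7 polynomial (the degree is an arbitrary choice here).
coeffs = np.polyfit(x, y, deg=7)
y_fit = np.polyval(coeffs, x)

print(np.round(coeffs, 3))
```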
@janzaibmbaloch5484 8 years ago
well explained!!!!!!!!
@mvv.3431 4 years ago
You are awesome!! Thank you!! You should be cloned and put into every college and school in the world.
@hevi6048 4 years ago
That would be, no. Differential?
@andrewjustin256 4 months ago
3:25 Hey there, I would like to ask you about the function you mention at this point in the video: you ask us to take the derivative of the function with respect to a and b, but where is that function, may I ask? Furthermore, when I find the equation of the scatter plot myself, it comes out evidently incorrect! May I please get your insight and assistance on that?
@PINEDARONALD 10 months ago
I don't understand how 13a + 5b = 14 gives a = 6/7 and 5a + 3b = 6 gives b = 4/7. Can someone please explain the step that leads to those conclusions?
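For anyone else stuck on that step, the elimination is just ordinary algebra on the two equations quoted above:

```latex
\begin{aligned}
39a + 15b &= 42 && (\text{first equation} \times 3)\\
25a + 15b &= 30 && (\text{second equation} \times 5)\\
14a &= 12 && (\text{subtract}) \;\Rightarrow\; a = \tfrac{6}{7},\\
3b &= 6 - 5\cdot\tfrac{6}{7} = \tfrac{12}{7} && \Rightarrow\; b = \tfrac{4}{7}.
\end{aligned}
```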
@yasmin_jsmn 8 years ago
thanks
@stewbate70 12 years ago
Question... Isn't Σxi squared = 5 squared = 5x5 = 25, so why do you have 13? And isn't Σxiyi = 5x6 = 30, so why do you have 14? Just started statistics and I can't see it... help! Stewy
@nicesacbro4891 5 years ago
Σxiyi is not equal to Σxi * Σyi. It's equal to the sum of each xi multiplied by its corresponding yi.
@manawilab5322 6 years ago
Wow, very illustrative, I wish you were my lecturer. Thanks, but how should we send you the assignment mentioned at the end of the lecture?
@MrLongliveAmerica 11 years ago
It would have been much better if the instructor had given due credit to the inventor/discoverer Carl Gauss. After all, without him there would never have been a technique such as least squares.
@frankyoung6006 2 years ago
I think my calculation might be wrong but could anyone tell me if there is a factor of 2 in front of the latter two terms in equation 1 at 3:32 here?
@KaviPriyan-qt6vc 4 years ago
great
@erikumble 3 years ago
Did anyone do the challenge? I got a=1, b= -2, and c=1. Can anyone confirm the answers?
@asiriindrajith8738 8 years ago
Hi, how do I solve something like this: f = 1/{K1(x1^a1)(y1^b1)(z1^c1)} + 1/{K2(x2^a2)(y2^b2)(z2^c2)}, where x1, x2, y1, y2, z1, z2 are all variables and a1, a2, b1, b2, c1, c2, K1, K2 are all constants that need to be found?
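For a model that is nonlinear in its unknown constants, one common route (a rough sketch, not something covered in this video) is an iterative nonlinear least-squares solver such as scipy.optimize.curve_fit. The model below is a simplified, hypothetical stand-in with made-up data, not the exact expression in the question:

```python
import numpy as np
from scipy.optimize import curve_fit

# Simplified, hypothetical stand-in model: f = 1 / (K * x**a * y**b),
# with unknown constants K, a, b (the real question has more variables/constants).
def model(xy, K, a, b):
    x, y = xy
    return 1.0 / (K * x**a * y**b)

# Made-up measurements generated from known constants plus a little noise.
rng = np.random.default_rng(2)
x = rng.uniform(1.0, 5.0, size=100)
y = rng.uniform(1.0, 5.0, size=100)
f = model((x, y), 2.0, 1.5, 0.7) * (1.0 + 0.01 * rng.normal(size=100))

# Nonlinear least-squares fit of the constants, starting from an initial guess p0.
popt, pcov = curve_fit(model, (x, y), f, p0=[1.0, 1.0, 1.0])
print(popt)   # ≈ [2.0, 1.5, 0.7]
```

For the full expression, the model function would take all six variables and eight constants; good initial guesses (p0) matter a lot for convergence in problems like this.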
@vigneshnagarajan3077 8 years ago
wow great
@therasmataz2168 7 months ago
got lost at 3:20
@ass-thetic6035 5 years ago
I have fallen in love.
@cloudwalker2730 9 months ago
Not even close to being correct; why is this video still up here...
@salihduzyol919 5 years ago
WE STAN
@ahmedsamawe5002 6 years ago
There is no mathematical proof so far that proves this method is the best way, but I have a simple proof that this method is the best way in this field.
@mallakbasheersyed1859 3 years ago
Can you provide the proof? I am thinking that we must reduce the error, and that's why we have that formula for the MSE; can you explain yours?
@TeddyJohnson 12 years ago
Regression