Linear Regression: Derivation

69,661 views

numericalmethodsguy

Learn how the linear regression formula is derived. For more videos and resources on this topic, please visit mathforcollege.com/nm/topics/l...

Comments: 86
@hottoniapalustris1541 3 years ago
Man, thank you! I thought my school project was really doomed before I saw this, but with your explanation, I finally found a way to make sense of my project data. Once more, thanks a lot!
@sharifahmed45 3 years ago
Prof, I can't say anything but immense gratitude for you and your channel; I am a grateful student. Thanks again.
@numericalmethodsguy 3 years ago
Thank you. Please subscribe and ask your friends to subscribe - our goal is to get to 100,000 subscribers by the end of 2021. To get even more help, subscribe to the numericalmethodsguy channel kzbin.info, and go to MathForCollege.com/nm and MathForCollege.com/ma for more resources. Follow the numerical methods blog at AutarKaw.org. You can also take a free massive open online course (MOOC) at canvas.instructure.com/enroll/KYGTJR Please share these links with your friends and fellow students through social media and email. Support the channel if you are able to do so at kzbin.info/store
@SaintRudi85 5 years ago
Nice explanation. It would also be really useful to have a similar video for multiple linear regression.
@ahmadibrahim3596 3 years ago
Thank you, professor; your explanation is very clear. I did the calculation and obtained the formulas for a and b.
@imglenngarcia 3 years ago
Wow! This will definitely be a key ingredient for my endeavor in transport, urban and regional planning. Thank you!
@muhammadkashim3229 4 years ago
Can you explain the model with 3 independent (explanatory) variables?
@ayushshaw3681 3 years ago
After watching the derivation I would say, awesome explanation.
@y_p7 3 years ago
This helped me a ton!!! God bless ya professor
@matard2940 3 years ago
All of this guy's videos are so clear and helpful; the best for numerical methods!
@alexm9744 4 years ago
VERY well explained. Thanks so much!
@bulakornsi7285 4 years ago
Thank you so much. You explain so clearly.
@sergten 3 years ago
Fantastic explanation.
@kvs123100 3 years ago
This is so awesome! Sir, pranam from my side! After having gone through so many videos, this is the perfect video I have seen!
@Vishwesh2 3 years ago
THANKS A LOT SIR!!!! I was choking at the derivative part but you made it clear. I have watched some other videos of yours. All are great. You earned a like and a subscriber. Really huge thanks sir. I'll watch other videos of yours also. You're a really good teacher
@lukepaluso9863 4 years ago
Wondrous! Thank you!!!
@user-zi5qq2ke4u 1 year ago
That is a fantastic explanation! I'm thankful for this video.
@delaware137 5 years ago
Enlightening! Thank you for teaching me this.
@numericalmethodsguy 5 years ago
Thank you. To get even more help, subscribe to the numericalmethodsguy channel, and go to MathForCollege.com/nm and MathForCollege.com/ma for more resources and share the link with your friends through social media and email. Support the site by buying the textbooks at www.lulu.com/shop/search.ep?keyWords=autar+kaw&type= Follow my numerical methods blog at AutarKaw.org. You can also take a free online course at www.canvas.net/?query=numerical%20methods Best of Learning Autar Kaw AutarKaw.com
@stephenbarnes5145 3 years ago
Excellent explanation! Thank you
@edwardmansal8459 2 years ago
Well explained. Grateful.
@dharasheth4107 3 years ago
I love it... Thank you so much!
@kunalparihar9224 1 year ago
Thank you, sir, for the clear explanation 🙏
@cheznikos 3 years ago
Seems you can set a0 = 0, find a1 very easily, then deduce a0 also easily. The reason is that the slope a1 of the straight line does not change if all yi are decreased by any constant. Also, in the end we can verify that a1 = cov(x,y)/var(x) = cov(x, y-a0)/var(x) for any a0. This will simplify the computations.
@numericalmethodsguy 3 years ago
I do not know about setting a0=0. If we are minimizing with respect to a0, we cannot assume it to be zero. A simpler derivation should not come at the cost of a logical explanation.
@cheznikos 3 years ago
@@numericalmethodsguy You're right, I was badly confused :(
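For the record, the slope identity a1 = cov(x,y)/var(x) quoted above does hold, even though the a0 = 0 shortcut does not. A minimal numeric check; the dataset is made up for illustration:

```python
import numpy as np

# Made-up data for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])
n = len(x)

# Slope from the least-squares formula derived in the video
a1 = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)

# Same slope from the covariance/variance identity (population versions)
a1_cov = np.cov(x, y, bias=True)[0, 1] / np.var(x)

print(a1, a1_cov)              # both ~0.97
assert np.isclose(a1, a1_cov)
```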
@mjf6125 4 years ago
Thanks, good explanation. Question: why does the partial derivative in this case yield a 'minimum'? How do we know it's not a maximum? Is it because SSR = (y - a_0 - sum(a_i*x_i))^2 is the multivariable function we're trying to minimize, and since it's squared we assume it's parabolic and opens upwards? Therefore the solution to the first partial derivative = 0 is a minimum?
@mjf6125 4 years ago
I'm sorry, I misspoke when I placed the a_i and x_i in the sum; I was getting confused with multiple regression. Is solving multiple regression the same process? Just taking the partial derivative with respect to each unknown variable and then solving the resulting equations?
@numericalmethodsguy 4 years ago
SSR = sum(y_i - a_0 - a_1*x_i)^2, where _ stands for subscript. Setting the first partial derivatives to zero ONLY yields a possible location of a local minimum or maximum (we do not know yet whether it is a local minimum, a local maximum, or an inflection point). It has to be followed by a second derivative test to see if it is the location of a local minimum or a local maximum. The second derivative test shows it is the location of a local minimum (see link below). Since the equations from setting the first partial derivatives to zero have only one solution, and SSR is a continuous function of a_0 and a_1, it also has to be the location of the absolute minimum. To see the complete math behind it, go here: autarkaw.org/2012/09/03/prove-that-the-general-least-squares-model-gives-the-absolute-minimum-of-the-sum-of-the-squares-of-the-residuals/ or look at the derivation and appendix of mathforcollege.com/nm/mws/gen/06reg/mws_gen_reg_txt_straightline.pdf
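For readers who want to see that test in action, here is a sketch of the second-derivative (Hessian) check in sympy; the symbol names are placeholders, and the full argument is in the links above:

```python
import sympy as sp

a0, a1 = sp.symbols('a0 a1')
xs = sp.symbols('x1:6')   # five symbolic data points x1..x5
ys = sp.symbols('y1:6')

# Sum of the squares of the residuals for the model y = a0 + a1*x
SSR = sum((yi - a0 - a1 * xi)**2 for xi, yi in zip(xs, ys))

# Hessian of SSR with respect to (a0, a1)
H = sp.hessian(SSR, (a0, a1))
print(H[0, 0])             # 10, i.e. 2*n > 0
print(sp.expand(H.det()))  # equals 4*(n*sum(xi^2) - (sum xi)^2) with n = 5
```

Since H[0,0] = 2n > 0 and det(H) = 4*(n*sum(xi^2) - (sum xi)^2) > 0 whenever the xi are not all equal (Cauchy-Schwarz), the stationary point is a local minimum.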
@numericalmethodsguy 4 years ago
@@mjf6125 Yes, multiple regression follows the same procedure, as it is all about minimizing SSR.
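Since multiple regression comes up in several comments here, a minimal sketch of that procedure: setting the partial derivatives of SSR to zero for y = a0 + a1*x1 + ... + am*xm gives the normal equations (X^T X) a = X^T y. The data below is made up for illustration:

```python
import numpy as np

# Made-up data: two explanatory variables, six observations
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([3.1, 3.9, 7.2, 7.8, 11.1, 11.9])

# Design matrix with a column of ones for the intercept a0
X = np.column_stack([np.ones_like(x1), x1, x2])

# Normal equations (X^T X) a = X^T y, from minimizing SSR
a = np.linalg.solve(X.T @ X, X.T @ y)
print(a)   # [a0, a1, a2]
```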
@studycenter8941 3 years ago
Very helpful 💓 Thank you, sir.
@visualizetheinfinitys.g.5048 2 years ago
Thank you so much, sir.
@seal0118 3 years ago
It's very clear, thank you.
@nD-ci7uw 4 years ago
Can you explain how you derived a0? I got it from Cramer's rule, but I can't derive it with a1 :/
@nD-ci7uw 4 years ago
OK, I did the derivation in reverse: I took your equation for a0 and set it equal to the Cramer's rule a0. So the equation is true, but how did you hit on this idea? :)
@numericalmethodsguy 4 years ago
@@nD-ci7uw If you look at the equations, you already got a1 using Cramer's rule. You will get a similar-looking expression for a0 by using Cramer's rule. But how I get the expression for a0 is just by using equation (1) without Cramer's rule, that is n*a0 + sum(xi)*a1 = sum(yi), and writing a0 in terms of a1. Also, sum(xi)/n = xbar and sum(yi)/n = ybar.
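A quick symbolic check of that step (Sx and Sy are shorthand introduced here for sum(xi) and sum(yi)):

```python
import sympy as sp

a0, a1, n, Sx, Sy = sp.symbols('a0 a1 n Sx Sy')

# Equation (1): n*a0 + Sx*a1 = Sy
eq1 = sp.Eq(n * a0 + Sx * a1, Sy)
sol = sp.solve(eq1, a0)[0]
print(sol)             # (Sy - Sx*a1)/n
print(sp.expand(sol))  # Sy/n - Sx*a1/n, i.e. a0 = ybar - a1*xbar
```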
@gp6957 1 year ago
Sir, I have learnt basic calculus, and I'm in doubt about how the exponent 2 becomes minus 2. When we use the power rule it is simply 2, but you are using -2; how did you get that?
@numericalmethodsguy 1 year ago
d/dx(u^2)=2*u*du/dx. The du/dx may be negative!
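A one-line sympy check of this chain rule, with generic placeholder symbols:

```python
import sympy as sp

a0, a1, x, y = sp.symbols('a0 a1 x y')

# One squared residual from the SSR
r2 = (y - a0 - a1 * x)**2
print(sp.diff(r2, a0))   # 2*(y - a0 - a1*x)*(-1): the inner derivative supplies the minus sign
print(sp.diff(r2, a1))   # 2*(y - a0 - a1*x)*(-x)
```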
@A.K04 4 years ago
Thank you very much, Sir.
@twinklecloud6645 3 years ago
Thank you so much Sir for explaining the derivation in such an easy way😇.
@numericalmethodsguy 3 years ago
Always welcome.
@shreyanawani4218 3 years ago
Sir, is it correct to call this method 'minimization using partial derivatives'? Kindly reply, as I have an exam tomorrow.
@numericalmethodsguy 3 years ago
One cannot conflate the two items. What is shown is the derivation of the linear regression model. The least-squares linear regression method is to find the best-fit straight line for given data. The straight-line regression model is found by minimizing the sum of the squares of the residuals. "Minimization using partial derivatives" is the concept used to find the constants of the model. math.libretexts.org/Courses/University_of_Maryland/MATH_241/03%3A_Differentiation_of_Functions_of_Several_Variables/3.08%3A_Maxima/Minima_Problems
@michaeljburt 3 years ago
@@numericalmethodsguy Good answer. Also @numericalmethodsguy, this derivation was fantastic, thanks much. I'm now using regression models in electrical engineering (power distribution demand forecast models) and wanted to take a bit of a dive to understand where the coefficients for linear regression came from.
@samirah1534 4 years ago
Why is the minimization of the error done with respect to a0 and a1? I mean, what is the general theory behind differentiating w.r.t. a0 and a1?
@numericalmethodsguy 4 years ago
nm.mathforcollege.com/mws/gen/06reg/mws_gen_reg_txt_straightline.pdf
@samirah1534 4 years ago
@@numericalmethodsguy Thanks loads
@sparrowp2251 1 year ago
Thank you sir really 🙏🙏🙏🙏
@Jayesh-uf6th 3 years ago
Sir... thank you sir.
@buttegowda 3 years ago
Thanks a lot sir
@natashawanjiru1018 4 years ago
What about using partial differentiation to derive the normal equations for a regression model... is it the same?
@numericalmethodsguy 4 years ago
That is what is being done in the video. I do not understand the question.
@natashawanjiru1018 4 years ago
The question is: "Using partial differentiation, derive the normal equations of a two-variable regression model."
@numericalmethodsguy 4 years ago
@@natashawanjiru1018 The question is ill-posed. First, the kind of model should be defined: is it y=a0+a1*x? Is it y=a*exp(b*x)? If it is just the straight line, go to nm.mathforcollege.com/mws/gen/06reg/mws_gen_reg_txt_straightline.pdf and look at the derivation as well as the appendix.
@natashawanjiru1018 4 years ago
Thanks so much
@hirakmondal6174 4 years ago
What is this method called? Is it the same as the gradient descent method?
@numericalmethodsguy 4 years ago
One cannot conflate the two items. What is shown is the derivation of the linear regression model. The gradient descent method is for finding the local minimum of any differentiable function. The least-squares linear regression method is for finding the best-fit straight line for given data. The straight-line regression model is found by minimizing the sum of the squares of the residuals. The gradient descent method surely can be used to find the minimum of the sum of the squares of the residuals.
@hirakmondal6174 4 years ago
@@numericalmethodsguy Thanks a lot for your reply. So both of these, i.e. OLS and gradient descent, can be used to achieve the same purpose, right?
@numericalmethodsguy 4 years ago
@@hirakmondal6174 No. You have to think of it this way: SSR is an objective function, and we can use a method such as GD to find where its minimum occurs.
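To make the distinction concrete, here is a minimal gradient-descent sketch that minimizes the same SSR objective and lands on the closed-form least-squares coefficients; the data, learning rate, and iteration count are assumptions for illustration:

```python
import numpy as np

# Made-up data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])
n = len(x)

# Closed-form least-squares solution (the formulas derived in the video)
a1_ols = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
a0_ols = np.mean(y) - a1_ols * np.mean(x)

# Gradient descent on SSR(a0, a1) = sum((y - a0 - a1*x)^2)
a0, a1 = 0.0, 0.0
lr = 0.01   # learning rate chosen by trial; an assumption
for _ in range(20000):
    r = y - a0 - a1 * x                 # residuals
    a0 -= lr * (-2.0 * np.sum(r))       # dSSR/da0
    a1 -= lr * (-2.0 * np.sum(r * x))   # dSSR/da1

print(a0, a1)           # approaches the closed-form values below
print(a0_ols, a1_ols)
```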
@lucasmoratoaraujo8433 1 year ago
Nice!
@arunbm123 4 years ago
Brilliant explanation!
@nipulsindwani117 3 years ago
Thanks professor
@AffanSamad 6 years ago
Very well explained.
@numericalmethodsguy 6 years ago
Thank you. Go to mathforcollege.com/nm/mws/gen/06reg/mws_gen_reg_txt_straightline.pdf to see how the second derivative test is done, as it is not shown in the video.
@gp6957 1 year ago
Sir, I couldn't solve the matrix; please show how to solve it.
@numericalmethodsguy 1 year ago
Multiply equation (1) by the sum of xi, and equation (2) by n. Subtract, and you will get rid of the unknown a0. You will get the equation for a1. To find a0, simply use equation (1) and write it in terms of a1, the sum of xi, and the sum of yi. You have already found a1. You can also look at the matrix form and use Cramer's rule. See equations 9.8.5 and 9.8.6 of math.libretexts.org/Bookshelves/Precalculus/Precalculus_(OpenStax)/09%3A_Systems_of_Equations_and_Inequalities/9.08%3A_Solving_Systems_with_Cramer's_Rule
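The same elimination can be checked symbolically; Sx, Sy, Sxx, Sxy below are shorthand introduced here for sum(xi), sum(yi), sum(xi^2), and sum(xi*yi):

```python
import sympy as sp

a0, a1, n = sp.symbols('a0 a1 n')
Sx, Sy, Sxx, Sxy = sp.symbols('Sx Sy Sxx Sxy')

# Normal equations (1) and (2) from setting the partial derivatives of SSR to zero
eq1 = sp.Eq(n * a0 + Sx * a1, Sy)
eq2 = sp.Eq(Sx * a0 + Sxx * a1, Sxy)

sol = sp.solve((eq1, eq2), (a0, a1))
print(sp.simplify(sol[a1]))   # (n*Sxy - Sx*Sy)/(n*Sxx - Sx**2)
print(sp.simplify(sol[a0]))   # (Sxx*Sy - Sx*Sxy)/(n*Sxx - Sx**2)
```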
@MuhammadHussain-ol4lw 3 years ago
Can anyone explain the a0 value... how does it come about?
@numericalmethodsguy 3 years ago
Just look at the first equation and write a0 in terms of a1.
@jamalnuman 1 year ago
great
@col.aureliano7352 3 years ago
Where did the -1 come from at 6:35?
@numericalmethodsguy 3 years ago
Taking the derivative of (-a0) with respect to a0 gives -1. It is a chain rule example: if u=u(a), then d/da(u^2) = 2*u*du/da.
@col.aureliano7352 3 years ago
@@numericalmethodsguy Yes, figured it out! But thanks for replying.
@manamsetty2664 1 year ago
Why do we add the errors?
@numericalmethodsguy 1 year ago
We cannot reduce each residual individually; if we reduce one, another will increase or decrease. When you have many points, it is hard to do that. So as a next step we say: let us add up the residuals and make the sum as small as possible. We find that this is not a good criterion. The sum of the absolute residuals is also not a good criterion. Both of these criteria result in non-unique straight lines. Minimizing the sum of the squares of the residuals works, and it gives a unique straight line.
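A quick numeric illustration of why the plain sum of the residuals fails as a criterion (made-up data): every line through the centroid (xbar, ybar) makes the residuals sum to zero, whatever its slope, so that criterion cannot single out one line, while least squares does:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])
xbar, ybar = x.mean(), y.mean()

# Any line through the centroid has sum of residuals = 0, for ANY slope
for a1 in (0.0, 1.0, 10.0):
    a0 = ybar - a1 * xbar
    print(a1, np.sum(y - (a0 + a1 * x)))   # all ~0

# Minimizing the sum of SQUARES of the residuals picks one slope uniquely
a1_ls = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar)**2)
print(a1_ls)   # ~0.97
```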
@ethanhunt987 5 years ago
I needed the solution of those equations at the point where you stopped solving and just wrote down the formulas for a0 and a1, so this video is not of much use for me.
@numericalmethodsguy 5 years ago
You can simply use Gaussian elimination symbolically to get the solution. Give it a try - it won't hurt. Or use the cofactor method as explained here. www.nabla.hr/MD-SysLinEquMatrics2.htm
@jamesoseiowusu8212 4 years ago
Thanks Prof, but you didn't prove a0.
@numericalmethodsguy 4 years ago
If you look at the equations, you already got a1 using Cramer's rule www.chilimath.com/lessons/advanced-algebra/cramers-rule-with-two-variables/ or by using Gaussian elimination symbolically. You will get a similar-looking expression for a0 by using Cramer's rule. But how I get the expression for a0 is just by using equation (1) without Cramer's rule, that is n*a0 + sum(xi)*a1 = sum(yi), and writing a0 in terms of a1. Also, sum(xi)/n = xbar and sum(yi)/n = ybar.
@romanemul1 3 years ago
Police line on the ground. DO NOT CROSS!
@sathiyanarayanan7245 1 year ago
Thank you very much, sir.
@numericalmethodsguy 1 year ago
Most welcome.