This was a very clear way of explaining it! I pay thousands of dollars for professors who ignore my requests for help and couldn't care less whether their students pass or not. You're a great teacher, unlike everyone at Krannert Management School of Purdue U. Thank you!!!!
@helpmereachjust10msubs48 a year ago
You mean thousands of dollars for getting a degree? Duh, imagine if you started investing.
@shireenkhan6847 a year ago
I spent the whole day trying to understand this, but I finally did thanks to your video. Thank you so much!
@hopekivuyo8826 10 months ago
I never thought I would come to understand the derivation of the estimates this easily.
@sushmithags4683 4 years ago
Thanks a lot. It was easy to understand and explained really well. I appreciate it.
@pruthvinbatham6137 3 years ago
Probably the best video of this topic, thanks for this!
@akshar2715 10 months ago
Thank you! This video helped me understand with ease. Great explanation
@AtsedeZinabu-dq4vm a year ago
great work keep it up
@aloybackbanerjeeup7634 2 years ago
Nice and wonderful content. I would be very happy if you could also show how to calculate the coefficients in the case of TLS, the total least squares method.
@peacego624 3 years ago
Man, thanks are not enough. But thanks from the bottom of my heart.
@ivioalves7228 8 months ago
You are a friend, my friend!! Thank you!
@dalkeiththomas9352 2 years ago
absolutely amazing
@1UniverseGames 4 years ago
Can you please show one more part of this calculation, like how we can obtain the intercept B0 and the slope B1 after shifting line l to l'?
@oghenefejiroagbi1469 2 years ago
This video is so helpful and clearly explained. Thank you so much! But please, does anyone know why alpha hat â (which is not indexed by i) stays inside the summation while beta hat b̂ gets taken out?
@MCxWillyxP 8 months ago
I would also like to know this @techinsightsjournal
@louie30003000 2 months ago
Since it's a partial derivative with respect to â, b hat is held constant and can therefore be pulled out of the sigma.
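A short sketch of the step being discussed, assuming the sum being minimized is S = Σ (yi − â − b̂xi)² as in the video: with respect to the index i, both â and b̂ are constants, so the first-order condition expands as Σ (yi − â − b̂xi) = Σ yi − n·â − b̂·Σ xi = 0. The â term carries no xi, so summing it n times simply gives n·â, while b̂ multiplies xi and therefore factors out in front of Σ xi.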
@hassanbolagligsman 2 months ago
@@louie30003000 Do you know why we focus the partial derivative on alpha?
@louie30003000 2 months ago
@@hassanbolagligsman Since the regression line y = â + b̂x has two unknowns (the y-intercept â and the slope b̂), minimizing the residual sum of squares requires taking two partial derivatives: first with respect to â (where b̂ is treated as a constant), and then with respect to b̂ (where â is treated as a constant).
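As a sketch of those two conditions (same notation, standard simple-regression setup assumed), setting both partial derivatives of Σ (yi − â − b̂xi)² to zero gives −2·Σ (yi − â − b̂xi) = 0 and −2·Σ xi·(yi − â − b̂xi) = 0, and solving this pair yields â = ȳ − b̂·x̄ and b̂ = Σ (xi − x̄)(yi − ȳ) / Σ (xi − x̄)².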
@kerkebedeley 2 years ago
It was adorable 😍😍 from Tigray
@Lucyferandtheson003 a year ago
Thank you man, but why are we minimizing with respect to a and not x?
@aimalkhan663 2 years ago
Gr8 bro ❤❤❤
@naturebedtime5156 a month ago
very helpful
@mathiasmbendela7939 7 months ago
It is good, but there are indeed some areas that need to be explained more clearly, as others have observed.
@abigaelmokeiramonii6390 2 years ago
Nice job
@siwedewadugna5002 2 years ago
Best teaching videos, thanks!
@Economics365 2 years ago
Why, after multiplying by 1/n, do alpha and beta become their respective hats?
@searamanat8168 2 months ago
Why did you multiply by 1/n?
@mishka7930 15 days ago
to remove the summation signs
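A brief sketch of that step, under the same assumptions: multiplying Σ (yi − â − b̂xi) = 0 by 1/n gives (1/n)·Σ yi − â − b̂·(1/n)·Σ xi = 0, i.e. ȳ − â − b̂·x̄ = 0, so â = ȳ − b̂·x̄. Strictly speaking, the 1/n turns the sums into the sample means ȳ and x̄ rather than removing them outright.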
@ele4984 2 months ago
I was beginning to think there was something wrong with my brain. I understand this, but in class I absolutely don't understand what those profs are doing and have to figure it out outside class.
@Aaron-kv6vr a year ago
thank you so much
@Akshay-cj3hq a year ago
But why is alpha an inner function? Hmm
@jujum4851 9 months ago
Thank you
@victoriabrimm5014 4 years ago
THANK YOU
@aslay1064 2 years ago
If you divide the first derivative by 2, shouldn't the answer be (yi - a^ - b^xi)/2 = 0? You just casually got rid of the 2 and the minus sign?? 🤔
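One way to see that step, assuming the first derivative shown is −2·Σ (yi − a^ − b^xi) = 0: because the right-hand side is zero, dividing both sides by −2 leaves Σ (yi − a^ − b^xi) = 0. The 2 and the minus sign disappear because 0 divided by −2 is still 0; nothing was dropped arbitrarily.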
@projectnew9172 2 years ago
helpful but your volume is low
@1UniverseGames 4 years ago
Sir, great video. I need a little help: I have a question that I'm not understanding properly. Can you make a video to explain it to us? I can share the question with you if you provide your email address. It would be really helpful. Thanks in advance.