Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/ Corrections: 13:05 When the residual is negative, the pink circle should be on the left side of the y-axis. And when the residual is positive, the pink circle should be on the right side.
@adolfocarrillo2482 жыл бұрын
This is an amazing explanation!!! Thanks
@statquest2 жыл бұрын
@@adolfocarrillo248 Thank you very much! :)
@anushreesaran2 жыл бұрын
Got my copy of The StatQuest Illustrated Guide to Machine Learning today! Quadruple BAM!!!!
@statquest2 жыл бұрын
@@anushreesaran Hooray! Thank you very much! :)
@tremaineification Жыл бұрын
@@statquest what do you mean by the last term not containing the intercept?
@moetasimrady8876 Жыл бұрын
I started my machine learning journey a month ago and I stumbled onto a myriad of resources that explain linear models using the RSS function, but no one, and I mean no one, managed to explain it with as much clarity and elegance as you have in just under 20 minutes. You sir are a boon to the world.
@statquest Жыл бұрын
Thank you!
@diyanair1582 жыл бұрын
Did I just UNDERSTAND the CHAIN RULE ? SURREAL, thank you!
@statquest2 жыл бұрын
:)
@revolution77N4 жыл бұрын
Man you are amazing. You should get a Nobel prize!
@statquest4 жыл бұрын
Thank you! :)
@marcus24413 жыл бұрын
Agree!
@mammamiachemale2 жыл бұрын
more than a nobel! book bought
@egyptianplanner Жыл бұрын
Yes Yes Yes
@afmartins666 Жыл бұрын
Or a Grammy!
@pperez12243 жыл бұрын
Amazing pedagogy. Slow pace, short sentences, visuals consistent with the talk. Great job ;-) Thanks
@statquest3 жыл бұрын
Glad you liked it!
@Ruostesieni2 жыл бұрын
As someone who is doing medical research and needs to learn little-by-little about statistics, neural networks and machine learning as my project goes on, your channel is a literal life-saver! It has been so hard to try to keep my M.D. stuff together with my PhD research all the while learning statistics, programming and neural network structures and machine learning. Trying to arrange courses from my uni to fit in with all the other stuff is simply impossible, so I've been left to my own devices to find a way to gain knowledge about said subjects, and your channel has done just that. Your teaching is great and down-to-earth enough to be easily grasped, but you also delve deep into the subject after the initial baby steps, so the person watching isn't just left with "nice to know"-infobits. Love it! Keep up the great work!
@statquest2 жыл бұрын
Thank you!
@amanrastogi6038 ай бұрын
I am a biostatistician, proclaiming that you are really a good teacher.
@statquest8 ай бұрын
Thank you very much!
@dc_amp8843 Жыл бұрын
The way you link equations to visuals and show how everything is working along with the math at the SAME time. Beautiful, elegant, easy to follow.
@statquest Жыл бұрын
Wow, thank you!
@jbboyne2 жыл бұрын
Your videos are fantastic, even without the sound effects... but the sound effects really bring them over the top.
@statquest2 жыл бұрын
Thank you! And thank you so much for supporting StatQuest!!! BAM! :)
@ayushbatra2471 Жыл бұрын
Over the past three years, I have been studying neural networks and delving into the world of coding. However, despite my best efforts, I struggled to grasp the true essence of this complex subject. That is until I stumbled upon your enlightening video. I cannot emphasize enough how much your video has helped me. It has shed light on the intricate aspects of neural networks, allowing me to comprehend the subject matter with greater clarity and depth. The way you presented the material was truly remarkable, and it made a profound impact on my understanding. What astounds me even more is that you provide such valuable content for free. It is a testament to your passion for educating and empowering individuals like myself. Your dedication to spreading knowledge and fostering learning is truly commendable. Thanks to your channel, I have been able to unlock the true essence of mathematics and its relationship with neural networks. The confidence and clarity I now have in this subject are invaluable to my personal and professional growth. Your video has been a game-changer for me, and I am grateful beyond words. Please continue your fantastic work and know that your efforts are deeply appreciated.
@statquest Жыл бұрын
Thank you very much! BAM! :)
@TheGreatFilterPodcast3 жыл бұрын
BY FAR the best explanation of the chain rule I have ever seen (and trust me - I've seen A LOT) You, sir, just earned yourself yet another well-deserved subscriber. F'n brilliant!!!
@statquest3 жыл бұрын
Thank you very much!!! BAM! :)
@RahulVerma-Jordan7 ай бұрын
If I watched your videos during my college, my career trajectory would be totally different. BIG BAM!!!!
@statquest7 ай бұрын
Thanks!
@user-ul2mw6fu2e2 жыл бұрын
Best chain rule explanation I have ever seen.
@statquest2 жыл бұрын
Thank you!
@vnaveenkumar9823 жыл бұрын
Take my word for it, Josh, you are the best teacher on the internet who teaches statistics........ and the chain rule used to drive me crazy.......... until your explanation.
@statquest3 жыл бұрын
Wow, thanks!
@vnaveenkumar9823 жыл бұрын
@@statquest ❤️
@markbordelon16017 ай бұрын
We could have had a "dreaded terminology alert": "decomposition of functions". But even without it, this was a perfect explanation of the chain rule, with great practical examples. Bravo, Josh!
@statquest7 ай бұрын
Thank you!
@Alchemist102413 жыл бұрын
Awesome!! None of my math teachers in high school or college ever explained to me WHY the chain rule works this way, but you explained it with a very simple example. I'm certain that from now on I'll never forget the chain rule formula. Thanks a million. 👌✔
@statquest3 жыл бұрын
BAM! :)
@putririzqiyah62943 жыл бұрын
This channel was suggested by my professor, and I always watch the videos while doing machine learning tasks. Big appreciation to you :D
@statquest3 жыл бұрын
Cool, thanks!
@ivanferreira50424 жыл бұрын
Nobody:
The demon in my room at 3am: 7:56
@statquest4 жыл бұрын
Dang! :)
@borisdjakovic4 жыл бұрын
jesus, this was funny xD
@mr.shroom4280 Жыл бұрын
Bro, yours is the only tutorial that actually helped me grasp this concept, thank you so much.
@statquest Жыл бұрын
Glad it helped!
@mr.shroom4280 Жыл бұрын
@@statquest I know this isn't related to this video, I just want you to help me because you replied to this comment. With gradient descent, how am I supposed to get the derivative for each weight and bias in a loss function dynamically? Surely for networks with more than 100 neurons there must be a way; I know there is, I just don't know it. When I calculate the derivative for one variable in the loss function to optimize it, I get some overly complicated function, but I see some papers on it and it isn't complicated.
@statquest Жыл бұрын
@@mr.shroom4280 See: kzbin.info/www/bejne/f3-ViaB4na5_qpY kzbin.info/www/bejne/n6rRY62adrGcn5o and kzbin.info/www/bejne/fXy9oIJ-jayWgtE
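In case it helps anyone else in this thread, here is a rough, hypothetical sketch of the idea those videos cover: backpropagation just applies the chain rule one link at a time and reuses the intermediate derivatives, so you get one derivative per weight and bias without deriving a giant formula by hand. The tiny one-neuron network and all the numbers below are made up for illustration.

```python
import math

# Made-up toy network: x -> hidden neuron (w1, b1, softplus) -> output (w2, b2) -> squared error
x, observed = 0.5, 1.0
w1, b1, w2, b2 = 0.3, 0.1, -0.2, 0.4

# Forward pass, keeping every intermediate value
z = w1 * x + b1                  # input to the activation function
a = math.log(1 + math.exp(z))    # softplus activation
predicted = w2 * a + b2
loss = (observed - predicted) ** 2

# Backward pass: the chain rule, one link at a time
dloss_dpred = -2 * (observed - predicted)
dpred_dw2, dpred_db2, dpred_da = a, 1.0, w2
da_dz = 1 / (1 + math.exp(-z))   # derivative of softplus is the sigmoid
dz_dw1, dz_db1 = x, 1.0

# Multiply the links together to get one derivative per parameter
gradients = {
    "w2": dloss_dpred * dpred_dw2,
    "b2": dloss_dpred * dpred_db2,
    "w1": dloss_dpred * dpred_da * da_dz * dz_dw1,
    "b1": dloss_dpred * dpred_da * da_dz * dz_db1,
}
print(gradients)
```

Deep learning libraries automate exactly this bookkeeping, which is how it stays manageable for networks with far more than 100 neurons.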
@mr.shroom4280 Жыл бұрын
@@statquest thank you so much, I watched those but I totally forgot about the chain rule lol
@varunparuchuri95443 жыл бұрын
Dear @statquest, you must have come from heaven to save students from suffering. Just an unbelievable explanation.
@statquest3 жыл бұрын
Thank you! :)
@rigobertomartell50292 жыл бұрын
Josh, you are a master in teaching; you make difficult topics so easy to understand, which is really amazing. My mother language is not English, but you explain so well and clearly that I can understand everything. Congratulations Sir, please keep doing this job.
@statquest2 жыл бұрын
Thank you very much! :)
@aydnndurmaz5 сағат бұрын
Explaining u-substitution alongside the chain rule is brilliant
@juanp.lievanok.3737Ай бұрын
You are a genius at this. I can't believe I hadn't heard of this channel before.
@statquestАй бұрын
Thanks!
@Maskedlapis644 ай бұрын
I’ve watched videos like this for work, yours is the best, I fully grasp what a derivative is!
@statquest4 ай бұрын
Glad you liked it!
@RealSlimShady74 жыл бұрын
Guess I will not be afraid of ***THE CHAAAAAINNNN RULE*** Thank you, Josh! Always waiting for your videos!
@statquest4 жыл бұрын
Bam! :)
@prydt Жыл бұрын
These seriously are some of my favorite videos on youtube!
@statquest Жыл бұрын
Thanks!
@alexg70823 ай бұрын
As always, clear and in simple language. Thank you!
@statquest3 ай бұрын
Glad it was helpful!
@louco2 Жыл бұрын
This is probably the best video about this on the internet!! Thank you so much for taking the time to do it!!
@statquest Жыл бұрын
Glad it was helpful!
@behrampatel35638 ай бұрын
This one outdoes all the best videos on the topic.
@statquest8 ай бұрын
Thank you!
@meow-mi33310 ай бұрын
This dude explains things clearly. Huge thanks!
@statquest10 ай бұрын
Thanks!
@amerjabar78252 жыл бұрын
The best video on the internet about the Chain Rule!
@statquest2 жыл бұрын
Thank you!
@nick_g2 жыл бұрын
I love StatQuest! I got my SQ mug in the morning and just got the Illustrated Guide to Machine Learning. Super excited to start! Thank you for all the great content!
@statquest2 жыл бұрын
That is awesome! TRIPLE BAM!!!! :)
@ShermanSitter4 жыл бұрын
I would insert a BAM at 5:25. :) ...also, I realized the thing I like about your videos is you explain things, not only in a clear way, but in a different way. It adds to the depth of our understanding. Thank you!
@statquest4 жыл бұрын
That is definitely a BAM moment! And thank you. One of my goals is to always explain things in a different way, so I'm glad you noticed! :)
@taiman94233 жыл бұрын
Top notch visualization.
@statquest3 жыл бұрын
Thank you! :)
@dhakalsandeep3452 Жыл бұрын
One of the best videos I have ever watched. Thank you guys for providing such wonderful content for free.
@statquest Жыл бұрын
Thanks!
@georgetzimas68823 жыл бұрын
13:15 Is the residual(squared) graph mirrored? Since residual=(observed - predicted), wouldn't that mean that when on the original graph the intercept is zero, the residual would be positive(2-1=1), so the position on the residual(squared) graph should be on the positive x-axis(x=1), as opposed to the negative side on the video, and vice versa?
@statquest3 жыл бұрын
Yes! You are correct. Oops!
@darshuetube2 жыл бұрын
You have great videos that help explain a lot of concepts very clearly, step by step. You have helped a lot of students for sure.
@statquest2 жыл бұрын
Thank you very much! :)
@gabrielcournelle30554 жыл бұрын
Now I can't read "the chain rule" without hearing your voice !
@statquest4 жыл бұрын
:)
@edphi3 жыл бұрын
Genius, serious, sincere. I'm a mathematician and am convinced you are a born sage.
@statquest3 жыл бұрын
Thanks!
@suneel84804 жыл бұрын
You have made my machine learning path easy!
@statquest4 жыл бұрын
Glad to hear that!
@salah6160 Жыл бұрын
Teaching is an art. Thank you, StatQuest.
@statquest Жыл бұрын
Thank you!
@chelsie2924 жыл бұрын
This is epic, simple, and shows the chain rule is applicable in real life too - we need more videos like this, damn
@statquest4 жыл бұрын
Thank you! :)
@joeyshias Жыл бұрын
I'm so moved to finally understand this, thank you!
@statquest Жыл бұрын
bam! :)
@Vinyl-vv3pz Жыл бұрын
Best reference for learning statistics. Btw, I would just like to point out that at 6:16, there appears to be a minor mistake. Actually, for every 1 unit increase in Weight, there is a 2 unit increase in Shoe Size, because the equation would be Size = (1/2)*Weight, or 2*Size = 1*Weight
@statquest Жыл бұрын
This video is actually correct. For every one unit increase in Weight, there is only a 1/2 unit increase in Shoe Size. What your equation shows is that for every unit increase in Size, there is a 2 unit increase in Weight. That's not the same thing as "for every unit increase in Weight, there is a 2 unit increase in Size".
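A tiny numerical check of that, using a hypothetical Size = (1/2) * Weight function:

```python
def shoe_size(weight):
    return 0.5 * weight  # hypothetical: Size = (1/2) * Weight

# raising Weight by 1 unit raises Size by only 0.5 units
print(shoe_size(101) - shoe_size(100))  # 0.5

# to raise Size by 1 unit, Weight has to go up by 2 units
print(shoe_size(102) - shoe_size(100))  # 1.0
```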
@Vinyl-vv3pz Жыл бұрын
@@statquest I calculated through the equation, and you are correct. Thanks for the verification!
@anashaat952 жыл бұрын
Very clear explanation. I saw different people explaining this topic but you are the best. Thank you so much.
@statquest2 жыл бұрын
Thank you!
@janscheuring2642 Жыл бұрын
Hi, I think I found a mistake. (?) The pink ball in the graph from 13:08 should be on the other side of the Y axis. It doesn't change the educational value of the whole video but it caught my eye.
@janscheuring2642 Жыл бұрын
Oh, I see someone already brought this up.
@statquest Жыл бұрын
yep
@harisjoseph1173 жыл бұрын
Dear Josh Starmer, Thank you so much. May God bless you with more knowledge so that you can energize learners like me. ❤. Thank you again.
@statquest3 жыл бұрын
Thank you very much!
@RumayzaNorova3 ай бұрын
After this awesome statquest, I will hear 'The Chain Rule' with the echo playing in my head
@statquest3 ай бұрын
bam! :)
@MrXiiaoSky3 жыл бұрын
thanks for clearing up the confusions i had with chain rule!
@statquest3 жыл бұрын
bam!
@dkutagulla Жыл бұрын
Simply the best explanation of the chain rule! Now I understand it well enough to teach my kid when she needs it... Thank you!!! Do you publish a book on calculus? I would love to buy it!
@statquest Жыл бұрын
Thanks! I don't have a book on calculus, but I have one on machine learning: statquest.org/statquest-store/
@Metryk Жыл бұрын
13:27 When the residual is negative, the pink circle is shown to be on the right side of the y-axis, but shouldn't it be on the left side? Aside from that, great content! Cheers from Germany
@statquest Жыл бұрын
Yep. Thanks for catching that! I've added a correction to the pinned comment.
@syedmustahsan48885 ай бұрын
Another concept well explained ❤
@statquest5 ай бұрын
Thanks a lot 😊!
@amirhossientakeh55402 жыл бұрын
You deserve a Nobel prize, Nobel man
@statquest2 жыл бұрын
Thank you!
@aswink1123 жыл бұрын
Great teaching Josh Starmer!
@statquest3 жыл бұрын
Thank you kindly!
@igorg4129 Жыл бұрын
In the 1st example both initial relationships (height to weight and shoe size to height) are given as linear. Thus the derivative multiplication gives me not only the derivative but also the slope of the model predicting the shoe size from weight (the final model). What I am missing are 2 things: 1) In some non-linear final model, what is the use of knowing the slope equation? It is not a model equation, so it can not be used for predictions... what am I missing? 2) Another thing that confuses me is that here, at least in the shoe size example, you use the chain rule to get the final model. But later, in backpropagation, the use of it in each iteration is different; it is kind of used to predict the weights, and using them we get the "final model". Could you please formulate this difference better than I am trying to? Thank you so much.
@statquest Жыл бұрын
I'm not sure I understand your questions. The idea is that we want to establish a relationship among variables - and how much one changes when we change another. This works for linear and non-linear equations. It also sounds like you are interested in how derivatives are used for backpropagation. For details, see: kzbin.info/www/bejne/qXXZZZlqqJeGeJo and kzbin.info/www/bejne/f3-ViaB4na5_qpY
@irischin61652 жыл бұрын
I graduated with stats degrees from college 10+ years ago and never touched it since. Now I feel I re-learned everything overnight!!!!!
@statquest2 жыл бұрын
BAM! :)
@Vanadium404 Жыл бұрын
Such beautiful intuition; the weight-to-height then height-to-shoe-size example was just commendable
@statquest Жыл бұрын
Thanks!
@tagoreji21432 жыл бұрын
Thank you Sir for the amazing Tutorial.
@statquest2 жыл бұрын
Thanks!
@jhfoleiss4 жыл бұрын
Awesome explanation Mr. Starmer! I wish your videos existed back when I was taking Calculus at university!!! (which was a long time ago =) )
@statquest4 жыл бұрын
Wow, thanks!
@saifqawasmeh9664 Жыл бұрын
Reading about loss in neural networks and optimization from 20+ sources and I could not understand it until watching this video. Big BAM!
@statquest Жыл бұрын
Hooray! Thank you!
@39_ganesh_ghodke983 ай бұрын
You are an amazing teacher!
@statquest3 ай бұрын
Thank you! 😃
@andersk3 жыл бұрын
Is 13:06 a slight error? The residual-intercept graph shows the point in the negative part of the residual's axis (negative y), yet the residual-sq-residual graph shows the point on the positive side of the residual's axis on that graph (positive x)
@statquest3 жыл бұрын
You are correct! The x-axis on the Residual vs Residual^2 graph is backwards.
@andersk3 жыл бұрын
@@statquest thanks for clarifying - and amazing video again, looking forward to your illustrated guide!
@Klyaa4 ай бұрын
At 6:54 you said that you fit an exponential line to the graph and got hunger = time^2 + 1/2. I have a few questions about that. 1. I've never heard the phrase 'exponential line' before. Do you just mean an exponential 'line' of best fit? 2. You said that the equation is exponential, but that looks quadratic to me. Am I missing something? I really like the way you explained this. Once you think about problems in the 'real world' like this it really starts to make sense how changing one function affects and changes the other and then why you need the chain rule to find the rate of change.
@statquest4 ай бұрын
1. I just mean that we fit a curve defined by the function hunger = time^2 + 1/2 2. I should have said quadratic instead of exponential. I apologize for any confusion that this may have caused.
@Klyaa4 ай бұрын
@@statquest Thanks for replying so quickly on an older video like this! I'm making some math videos of my own right now and I can't believe how easy it is to misspeak or write something wrong. You've done an amazing job with all your videos. This is the only video I've found that attempts to explain the chain rule in an intuitive way without using the limit definition.
@Dy-yg9wq3 жыл бұрын
Please add more ads so we can watch them and actually give back to you
@statquest3 жыл бұрын
Ha! I wish I could remove all the ads. But even then, KZbin will add them.
@jonathangallant-mills64342 жыл бұрын
Hey, can someone help me understand why at 14:55 we set Observed and Weight to 0 because they do not contain the intercept? I thought I understood until this point. Now I'm a bit confused and discouraged! Thank you!
@statquest2 жыл бұрын
When we change the value for the intercept, the Observed values do not change (because they are what we observed, they don't ever change). Since there is 0 change in the observed values when we change the intercept, the derivative of the observed values with respect to the intercept is 0.
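If it helps, here is a small symbolic sketch of that point (the variable names below are just placeholders):

```python
import sympy as sp

observed, weight, intercept = sp.symbols('observed weight intercept')

# 'observed' and '(1 * weight)' do not contain the intercept,
# so changing the intercept changes them by exactly 0
print(sp.diff(observed, intercept))    # 0
print(sp.diff(1 * weight, intercept))  # 0

# only the term that actually contains the intercept changes
print(sp.diff(intercept, intercept))   # 1
```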
@jonathangallant-mills64342 жыл бұрын
@@statquest Thank you!!!😄
@sheilawang38474 жыл бұрын
Such a clean and simple explanation! Can't wait for more math and statistics videos. You are the awesomeness on KZbin!
@statquest4 жыл бұрын
Thank you! :)
@warrenb74504 жыл бұрын
The best Chain Rule tutorial! Do you have any for ReLU? Thank you!!
@statquest4 жыл бұрын
Coming soon!
@tanubist77214 жыл бұрын
This is the first time I am laughing while learning stats🤣 Thanks a lot!
@statquest4 жыл бұрын
Hooray! :)
@jialushen62484 жыл бұрын
Oh boy that's a teaser for neural net. Been looking forward to this!!
@statquest4 жыл бұрын
YES!!! This is the first video in my series on Neural Nets!!!!!!! The next one should be out soon (hopefully late July, but I always run behind so maybe early August).
@alfcnz4 жыл бұрын
6:52 that's not an exponential line (2^x), it's just a parabola (x^2). Anyhow, you're awesome! BAM! Just subscribed!
@statquest4 жыл бұрын
Thanks for catching that. :)
@rosmontis064 ай бұрын
I have a couple of questions... At 6:54, what's the time^2 + 1/2 formula supposed to be representing? 🤔 And is that 1/2 supposed to be the intercept? Why do we plug it in? Is that just a set formula you've gotta learn?
@statquest4 ай бұрын
The formula is for the curve that fits our data. What you do is you get some data and then fit a line (or curve in this case) to it - so the line, and the equation for it, depend on the data.
@robelbelay40654 жыл бұрын
An epically clear explanation. Thank you so much!
@statquest4 жыл бұрын
Thank you! :)
@lightxx062 жыл бұрын
BAM! best explanation so far
@statquest2 жыл бұрын
Thank you! :)
@seabraes2134 жыл бұрын
Thanks for the video! In the last example, why not just plug in height = 2 and weight = 1 to solve for the intercept: When residual = 0, height - ( intercept + (1*weight)) = 0, so intercept = 1?
@statquest4 жыл бұрын
Sure, you could solve the equation directly, but the goal is to show how the chain rule works. Furthermore, by using the chain rule, we solve for the general equation and not just a specific equation tied to this specific data.
@KUMAWANI3 жыл бұрын
I would like to thank you from the bottom of my heart for such wonderful videos. Such a difficult topic made simple; you are awesome man, keep rocking!!!!
@KUMAWANI3 жыл бұрын
And Triple BAM!!!!
@statquest3 жыл бұрын
Thank you very much! :)
@pooravkadiyan2 жыл бұрын
Your explanation is awesome. Make more videos.
@statquest2 жыл бұрын
Thank you!
@favor2012able4 жыл бұрын
Yet another bravo tutorial video! Thank you, Josh! One question: what visual software/tool do you use to draw those beautiful plots? Did you, like 3Blue1Brown, write a JS front-end tool yourself? Thanks!
@statquest4 жыл бұрын
I'm glad you like the videos! I draw the pictures in Keynote.
@Adhithya20033 жыл бұрын
3b1b does not use a JS front-end tool. It's a Python animation lib powered by Cairo (a C lib), and now it uses OpenGL.
@RachelPun2 жыл бұрын
17:25 How do we plug in the observed height and weight when we have multiple data points?
@statquest2 жыл бұрын
Just create one term per data point and sum them together. For details, see: kzbin.info/www/bejne/qXXZZZlqqJeGeJo
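For other readers, here is a rough sketch of what "one term per data point, summed together" can look like in code (the data points, slope, and learning rate below are made up):

```python
# made-up (weight, observed height) pairs
data = [(0.5, 1.4), (2.3, 1.9), (2.9, 3.2)]
slope = 0.64  # pretend the slope is already known

def d_ssr_d_intercept(intercept):
    # chain rule, one term per data point:
    #   d/d(intercept) (obs - (intercept + slope*w))^2 = -2 * (obs - (intercept + slope*w))
    # then sum the terms over all the data points
    return sum(-2 * (obs - (intercept + slope * w)) for w, obs in data)

# simple gradient descent on the intercept
intercept, learning_rate = 0.0, 0.1
for _ in range(100):
    intercept -= learning_rate * d_ssr_d_intercept(intercept)
print(round(intercept, 3))
```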
@RachelPun2 жыл бұрын
@@statquest Thank you!
@syedmustahsan48885 ай бұрын
Thanks a lot, Sir Josh. JazakAllah. 😊 Emotional
@statquest5 ай бұрын
Thank you very much! :)
@alinadi94279 ай бұрын
your videos are fantastic
@statquest9 ай бұрын
Glad you like them!
@luis96xd2 жыл бұрын
Amazing video! Back to basics 😄👍
@statquest2 жыл бұрын
Thanks!
@krishj80115 ай бұрын
Awesome Tutorial...
@statquest5 ай бұрын
Thank you 🙂!
@lin14503 жыл бұрын
Thank you so much for your videos! I got a StatQuest Shirt for my Birthday... hurray! :)
@statquest3 жыл бұрын
BAM! :)
@honghur89773 жыл бұрын
13:39 How come the slope at a point on the squared residual curve can be written in terms of the derivative of the squared residual with respect to the intercept, and not the derivative of the squared residual with respect to the residual? Why do we set the derivative of the squared residual with respect to the intercept to zero when the slope of the squared residual curve should be written with respect to the residual? Shouldn't we set the latter to zero and solve for the intercept? Is it because the residual is a function of the intercept itself?
@statquest3 жыл бұрын
I'm not sure I understand your questions because they all seem to be answered immediately following that time point in the video. The goal is to find the optimal intercept for the graph of "weight vs height". So we use the chain rule to tell us the derivative of the residual^2 with respect to the intercept. This derivative has two parts, the residual^2 with respect to the residual and the residual with respect to the intercept.
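In case it is easier to see the two parts written out, here is the decomposition (standard calculus, using the names from this thread, with residual = observed - (intercept + slope * weight)):

```latex
\frac{d\,\text{residual}^2}{d\,\text{intercept}}
  = \underbrace{\frac{d\,\text{residual}^2}{d\,\text{residual}}}_{2\,\text{residual}}
    \times
    \underbrace{\frac{d\,\text{residual}}{d\,\text{intercept}}}_{-1}
  = -2\left(\text{observed} - (\text{intercept} + \text{slope}\times\text{weight})\right)
```

Setting this derivative (summed over all the data points) to zero is then what gives the optimal intercept.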
@studgaming6160 Жыл бұрын
Thanks for the informative video.
@statquest Жыл бұрын
Thanks!
@birzhanabdikhan453 жыл бұрын
There are people who love StatQuest and there are people who don't know about StatQuest yet... poor souls
@statquest3 жыл бұрын
Thanks! :)
@dhruvsharma79922 жыл бұрын
This video is not just an explanation of "The Chain Rule"; it also explains the intuition behind the various loss functions.
@statquest2 жыл бұрын
That's right. The chain rule is used a lot in machine learning so I tried to explain it from that perspective.
@dhruvsharma79922 жыл бұрын
@@statquest Thanks for all of the videos, they all really help a lot
@statquest2 жыл бұрын
@@dhruvsharma7992 Thanks!
@supernenechi Жыл бұрын
Despite how good you are at explaining, I'm still having a hard time with it all. My confidence isn't exactly helped by the fact that all the other people in the comments seem to somehow be doing PhDs and stuff, but okay... How can I try to understand it even better?
@statquest Жыл бұрын
Can you tell me what time point, minutes and seconds, you first got confused?
@animeadventures-n8j8 ай бұрын
In the explanation, could you provide example code for the chain rule based on what you have shown in the provided examples?
@statquest8 ай бұрын
I'll keep that in mind.
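In the meantime, here is a rough sketch in Python of the idea from this video: the chain-rule derivative of the squared residual with respect to the intercept, checked against a brute-force numerical derivative. The observed height, weight, and slope values below are invented.

```python
observed, weight, slope = 1.9, 2.3, 0.64  # invented values

def squared_residual(intercept):
    return (observed - (intercept + slope * weight)) ** 2

def chain_rule_derivative(intercept):
    residual = observed - (intercept + slope * weight)
    # d(residual^2)/d(residual) = 2 * residual  and  d(residual)/d(intercept) = -1
    return 2 * residual * (-1)

def numerical_derivative(intercept, h=1e-6):
    # central difference: (f(x + h) - f(x - h)) / (2h)
    return (squared_residual(intercept + h) - squared_residual(intercept - h)) / (2 * h)

print(chain_rule_derivative(0.57), numerical_derivative(0.57))  # the two should match closely
```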
@vaishnavi43544 жыл бұрын
Awesome StatQuest... loved the song at the beginning, and the concept too!!😎😎😎
@statquest4 жыл бұрын
Thanks! :)
@ThePCxbox Жыл бұрын
Another explanation I saw on reddit that solidifies my understanding of the chain rule:
"Mommy is a function with a baby function inside. This is how we find out where her bumps are:
Differentiate the mommy - keep the baby inside
Differentiate the baby
Times them together
And you're done :-)
Example: (3x^2 + x)^4
The mommy is ( )^4, the baby is 3x^2 + x
Differentiate the mommy: 4( )^3
Keep the baby inside: 4(3x^2 + x)^3
Differentiate the baby: 6x + 1
Times them together: (6x + 1) * 4(3x^2 + x)^3 = (24x + 4)(3x^2 + x)^3
And we're done"
For your last example: why is the derivative of observed and of (1*weight) equal to 0 though? The chain rule makes sense now, but I'm trying to grasp that math. You say it's because the terms do not contain the intercept, but what does that mean?
@statquest Жыл бұрын
When we take the derivative of something, we take the derivative relative to something. In this case, we take the derivative relative to the intercept. Thus, we are interested in how much things change when the intercept changes. (1 * weight) does not change when the intercept value changes - it stays the exact same, regardless of what value we plug in for the intercept. Since there is 0 change in (1 * weight) when we change the intercept, then the derivative of (1 * weight) with respect to the intercept = 0.
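And just to close the loop on the (3x^2 + x)^4 example from the comment above, here is a quick symbolic check (a sketch using sympy):

```python
import sympy as sp

x = sp.symbols('x')
mommy_with_baby = (3 * x**2 + x) ** 4

# sympy applies the chain rule for us; compare with the hand-derived
# answer (6x + 1) * 4 * (3x^2 + x)^3
automatic = sp.diff(mommy_with_baby, x)
by_hand = (6 * x + 1) * 4 * (3 * x**2 + x) ** 3
print(sp.simplify(automatic - by_hand))  # 0, so the two agree
```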
@krystof70596 ай бұрын
15:00 What does it mean that "this term does not contain the intercept"?
@statquest5 ай бұрын
The equation has 3 terms separated by minus signs. The first term "observed" is not multiplied or divided by the intercept, so it does not include the intercept. The second term is essentially 1 * intercept, so includes the intercept. The third term (1 * weight) is not multiplied or divided by the intercept, so does not include it.
@jliou3695 ай бұрын
Hi, just bought my copy of The StatQuest Illustrated Guide to Machine Learning (paperback) from Amazon a few days ago and wonder if I can get the Kindle version for free or not? BTW, Amazon doesn't show the Kindle version; does that mean it no longer exists? Thanks!
@statquest5 ай бұрын
Thank you very much for your support! Unfortunately the kindle version no longer exists.
@hassanalimohammadi45532 жыл бұрын
Thanks for all your amazing videos. I'm still learning from you :)
@statquest2 жыл бұрын
Thank you!
@user-or7ji5hv8y4 жыл бұрын
A video also on probability chain rule would be awesome
@statquest4 жыл бұрын
Noted! :)
@joeyshias Жыл бұрын
謝謝! (Thank you!)
@statquest Жыл бұрын
Hooray!!! Thank you so much for supporting StatQuest!!! BAM! :)
@motherisape2 жыл бұрын
Awesomeness = like statquest squared 😆 🤣
@statquest2 жыл бұрын
Thank you!
@elielberra28672 жыл бұрын
Amazing video thanks!
@statquest2 жыл бұрын
Thanks!
@wrjazziel4 жыл бұрын
LMAO, the song at the beginning xD, just for that I'm giving it a like.
@statquest4 жыл бұрын
BAM! :)
@Euglerio3 ай бұрын
Thank you for your videos! 07:20 Why do we represent the slope with a square root? How do we know? Okay, I got it, it is basic math. We should know.