
Backpropagation Details Pt. 2: Going bonkers with The Chain Rule

  128,249 views

StatQuest with Josh Starmer

1 day ago

Comments: 481
@statquest 2 years ago
The full Neural Networks playlist, from the basics to deep learning, is here: kzbin.info/www/bejne/eaKyl5xqZrGZetk Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
@TheLLMGuy 3 years ago
I think you are one of the few teachers on the planet who understands that the secret of understanding is simplicity! Thank you
@statquest 3 years ago
Thanks!
@Priyanshuc2425 4 months ago
Thank you for this amazing, fantastic video. No words are enough to say how great your channel is, and especially the Bam! 😊
@maksims4669 3 years ago
Throughout all my years of Bachelor studies I avoided computer science and statistics as much as possible, for I could not understand them. However, when I enrolled for my Masters I had no other choice but to figure it out. Last semester I had a compulsory course in Computational Intelligence, so in order to understand the material I had to find some additional sources. That's how I encountered StatQuest. You explained everything so well that over the summer I was inspired to take an additional course in Machine Learning. This semester I took several courses in statistics and optimization, next semester I will certainly take more, and now I am really considering connecting my life to one of these fields. All this with the help of your channel. I have no doubt that there are and will be many more people like me, to whom you have become a guiding light in their studies. Thank you for your work, Josh, and keep helping the curious ones find answers. Have a great year 2021!
@statquest 3 years ago
Wow! Thank you very much! Good luck with your career and happy 2021!!!! :)
@MelonsIncorporated 3 years ago
I'm surprised more teachers don't know about this channel. I'm not exaggerating when I say it's the MOST incredibly useful tool for helping me understand material in a class that makes no sense.
@statquest 3 years ago
Wow, thank you!
@tensorsj 3 years ago
Josh, I cannot stress enough what an empathetic teacher you are. I went through years and years of education in engineering and I tend to forget the low-level math behind it all. Here I can rescue that with you. You are just an amazing human being :)
@statquest 3 years ago
Thank you! :)
@Vanadium404 1 year ago
This is the best explanation on YT. No fancy animations, just pure calculation. Better than 3B1B and others.
@statquest 1 year ago
Thanks! I specifically tried to go deeper into topics that 3B1B only skimmed over because I wanted our videos to complement each other.
@nandakumar8936 4 months ago
Every time you mention the SoftPlus activation function you show the toilet paper pic. Every time. That's commitment right there.
@statquest 3 months ago
bam! :)
@mrglootie101 3 years ago
Finally! Tbh you're really good at teaching, Josh: simple but detailed. Keep it up! Bam! From Indonesia
@statquest 3 years ago
Thanks! 😃
@kanui3618 3 years ago
Awesome, bro!
@Nandeesh_N 3 years ago
"Quadruple bam" is what I feel as soon as I learn something from your video. It's so amazing and when I get any doubt, first thing I do is to check a relevant video on your channel! Thank you Josh for these amazing videos!
@statquest 3 years ago
Happy to help!
@seanfitzgerald6165 3 years ago
There were so many BAM's in this video it made my head spin, and I love it.
@statquest 3 years ago
BAM!
@rajathk9691 3 years ago
I'm really confused right now because the quality of the content really makes me want to promote you but I also wanna keep this channel a secret to myself.
@statquest 3 years ago
I'm glad you like my videos! :)
@aakarshanraj1176 2 years ago
@Rajath K "Share your knowledge. It’s a way to achieve immortality."
@None_me_ 2 years ago
This is the best thing ever... it's like you help me understand the complex concepts in 3Blue1Brown in the simplest ways in these videos. You both are the epitome of Bam!
@statquest 2 years ago
BAM! Thank you very much! :)
@kalpaashhar6522 3 years ago
Brilliantly explained. Not only is it easy to assimilate the information due to the colours, but the style in which you have broken down a complex explanation is a skill not many teachers have. Keep up the awesome work
@statquest 3 years ago
Many thanks!!
@tangh146 5 months ago
the preschool vibe really does make everything much less intimidating. love u josh 🥺
@statquest 5 months ago
:)
@wenyange2607 3 years ago
Thank you for making this video. It's very clear, easy to follow, and super helpful for understanding the algorithms.
@statquest 3 years ago
Glad it was helpful!
@brucebalfour1002 3 years ago
I love your videos so much. So helpful. I always look forward to the BAMs, not only because they are fun, but also because it gives a sense of fulfillment. BAM from Switzerland.
@statquest 3 years ago
BAM from North Carolina! :)
@zihaozhou1256 3 years ago
Honestly, some of these illustrations are better than the college professors'.
@statquest 3 years ago
Thank you! :)
@panku077 2 years ago
A phenomenal example of teaching from first principles. Josh's brand of elegance and charisma kept me engaged and on track to mastering this topic.
@statquest 2 years ago
Hooray! :)
@nabeelhasan6593 2 years ago
Understanding backpropagation always gives me a panic attack, but your style and simplicity are beyond amazing. Thanks for making such a complicated topic so easy to understand
@statquest 2 years ago
Thanks!
@siddharthmodi2740 2 years ago
I can't believe it, how can someone explain this weird topic with this level of simplicity? Hats off to your efforts. Thank you Josh
@statquest 2 years ago
Thank you!
@ramyav4689 2 years ago
Dear Josh, Thank you so much for making this video. I have always been intrigued by the way you explain the mathematics of complicated concepts! It is just amazing.
@statquest 2 years ago
Thank you!
@alejandrocanada2180 3 years ago
I wish all the teachers in the world explained as well as you do. Thank you very much!
@statquest 3 years ago
Wow, thank you!
@mathscorner9281 3 years ago
I wish I could find videos on this channel for every topic I'm having problems with. Sir, you are really a fabulous teacher.
@statquest 3 years ago
Thank you! :)
@nandankakadiya1494 3 years ago
The best and most perfect videos on machine learning I have ever seen. Bammmm, thank you!
@statquest 3 years ago
Wow, thanks!
@fengjeremy7878 1 year ago
Three lectures about backward propagation, and I finally understand what's going on in this fancy technique. Thank you sir!
@statquest 1 year ago
Triple bam!!! :)
@TheGoogly70 3 years ago
Awesome! It appeared daunting in the beginning, but at the end it was so easy to understand. Great job!
@statquest 3 years ago
Hooray! :)
@jancvrcek1541 3 years ago
Erudite, funny and free of charge (triple BAM!!). What more could we wish for? Absolutely love your videos!
@statquest 3 years ago
Thank you very much! :)
@parshanjavanrood368 1 year ago
With all the hype around AI and machine learning, it's not easy to find sources that teach the statistics behind these subjects, and the ones that do teach the math behind it make it very hard to understand (from the perspective of a second-year bachelor's student), but what you do is awesome. Thanks for this great series.
@statquest 1 year ago
Thank you!
@parijatkumar6866 3 years ago
As always, this brings a smile along with great clarity...
@statquest 3 years ago
Thank you! :)
@huidezhu7566 3 years ago
This is the clearest explanation I can find. Amazing work
@statquest 3 years ago
Thank you! :)
@hunter8831 3 years ago
You are such a great teacher that you make machine learning seem so easy and motivate anyone to learn more about it
@statquest 3 years ago
Wow, thank you!
@shivamgaur8624 2 years ago
I know I'm enjoying a video when I like it even before it starts. Amazing work!!
@statquest 2 years ago
BAM! :)
@tupaiadhikari 2 years ago
Thank You Josh, for these valuable Videos. I am immensely grateful to you for making these videos. You are truly a Legend in the Data Science Community. Love and Gratitude from Kolkata, India.
@statquest 2 years ago
Thank you very much! :)
@haozeli9660 3 years ago
damn, the quality of the content of this channel is just over the top. Hope u can get more views because u definitely deserve it!!!!!
@statquest 3 years ago
Thank you!
@lisun7158 2 years ago
[Notes]
1:27, 2:45, 3:06: Because of the Chain Rule, we get the formula for the derivative of the SSR with respect to w1 (w3 is used).
7:31: the formula for the derivative of the SSR with respect to b1 (w3 is used).
9:10: the formula for the derivative of the SSR with respect to w2 (w4 is used).
9:15: the formula for the derivative of the SSR with respect to b2 (w4 is used).
The formula for the derivative of the SSR with respect to b3 = (d SSR / d Predicted) * (d Predicted / d b3) [ref. 13:24 kzbin.info/www/bejne/f3-ViaB4na5_qpY&ab_channel=StatQuestwithJoshStarmer].
Based on the formulas above (the best StatQuest gives the most succinct formulas I've ever seen), I get the answer to the question below.
Q: Why is it called "backward"? Why dynamic programming?
A: The backpropagation algorithm computes the gradient of the loss function with respect to each weight by the chain rule. The calculation of the gradient proceeds backwards through the network, computing the gradient one layer at a time, iterating backward from the last layer to avoid redundant calculations of intermediate terms in the chain rule. [ref. Wikipedia]
11:00: use each derivative to calculate the respective step size and each new parameter value, i.e., optimize all parameters of the NN simultaneously.
@statquest 2 years ago
Double bam! :)
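To make the chain-rule notes above concrete, here is a minimal Python sketch (my own placeholder code, not code from the video) of the derivative of the SSR with respect to w1 for this video's toy network: two SoftPlus hidden nodes feeding a linear output through w3, w4 and b3. The dosages, efficacies and parameter values below are made-up placeholders.

```python
import numpy as np

def softplus(z):
    return np.log(1.0 + np.exp(z))

def softplus_derivative(z):
    # the derivative of SoftPlus is the sigmoid (logistic) function
    return 1.0 / (1.0 + np.exp(-z))

# made-up dosages and observed efficacies (placeholders, not the video's data)
x = np.array([0.0, 0.5, 1.0])
observed = np.array([0.0, 1.0, 0.0])

# placeholder parameter values
w1, b1, w3 = 2.74, -1.43, 0.36
w2, b2, w4 = -1.30, 0.57, 0.63
b3 = 0.0

# forward pass
z1 = x * w1 + b1                       # input to the first hidden node
y1 = softplus(z1)                      # activation of the first hidden node
y2 = softplus(x * w2 + b2)             # activation of the second hidden node
predicted = y1 * w3 + y2 * w4 + b3

# chain rule: d SSR / d w1 =
#   sum_i (d SSR / d Predicted_i) * (d Predicted_i / d y1_i) * (d y1_i / d z1_i) * (d z1_i / d w1)
d_ssr_d_pred = -2.0 * (observed - predicted)   # from SSR = sum (observed - predicted)^2
d_ssr_d_w1 = np.sum(d_ssr_d_pred * w3 * softplus_derivative(z1) * x)
print(d_ssr_d_w1)
```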
@avinashbabudebbati1095 3 months ago
"THE CHAAAAIN RULEEEEEEE" gets me every time 😂
@statquest 3 months ago
bam! :)
@azizmouhanna8996 8 months ago
Thank you from the bottom of my heart, from a phd student who learns a lot from you 🙏
@statquest 8 months ago
Happy to help!
@mitchynz 1 year ago
I love this explanation so much.... It actually helped me intuitively understand the chain rule too. I just purchased your book on machine learning which is the perfect compendium to this series.
@statquest 1 year ago
Thank you very much!
@powangsept 3 years ago
Watching the videos on your channel makes me love statistics more and more! Thank you for the great videos!
@statquest 3 years ago
Awesome, thank you!
@aakarshanraj1176 2 years ago
You really explained it better than anyone on YouTube. Thanks a lot, it was really helpful.
@statquest 2 years ago
Thank you very much! :)
@elhairachmohamedlimam9640 1 year ago
Thank you so much. I had wasted a lot of time trying to understand these things, but really, after watching your videos things became very easy. Thank you a lot
@statquest 1 year ago
Thanks!
@amyma2204 1 year ago
You saved tons of confused souls with this amazing explanation.
@statquest 1 year ago
Thank you!
@semenbondarenko3512 2 years ago
I'm your fan, seriously you are the best teacher I have ever met. ~~~Triple Thanks~~~
@statquest 2 years ago
Wow, thanks!
@praveerparmar8157 3 years ago
As a matter of fact, I was going bonkers trying to understand Backpropagation until I watched this video. Now I'm stable 😁
@statquest 3 years ago
bam!
@anilkumar-ki1xb 1 year ago
Josh, You are a true saviour....
@statquest 1 year ago
:)
@willw4096 1 year ago
Thanks for the great video! My notes: 0:59 9:49 10:54 - 11:07
@statquest 1 year ago
Nice work!
@lakshman587 3 years ago
Million billion trillion thanks for this Neural networks videos Josh! BAM!!! You are really awesome!!!!
@statquest 3 years ago
BAM! :)
@MADaniel717 3 years ago
I'm back Josh! Exactly what I needed. Now I'm going to try to implement a neural network from scratch in Python using your videos :D
@statquest 3 years ago
Go for it!
@MADaniel717 3 years ago
@@statquest github.com/danielmarostica/simple-neuralnet/blob/main/neural_network.py I did it! Can't believe haha. Thanks Josh!
@statquest 3 years ago
@@MADaniel717 TRIPLE BAM!!!! Congratulations. That is awesome!!!
@lin1450 2 years ago
Thank you so much for your content. I will be forever grateful. The way you convey the information with such simplicity, step by step, and humor makes it so fun to watch and builds up motivation. Concepts that seemed way out of reach for me are becoming something I'm slowly building the confidence to truly and deeply understand one day! Thank you so much!
@statquest 2 years ago
Thank you for your support!! When I have time, I'll design some more merch.
@tagoreji2143 1 year ago
Educating along with entertaining, and that too for a complicated topic. Thank you very much, Professor
@statquest 1 year ago
BAM! :)
@kostjamarschke4613 2 years ago
Love the video and explanation, but I love the SoftPlus commercial every time you mention the activation function even more.
@statquest 2 years ago
bam!
@franciscoruiz6269 2 years ago
You're a master! You have gained all my respect.
@statquest 2 years ago
Wow, thanks!
@magtazeum4071 3 years ago
this channel is lovely..Thank you Josh...
@statquest 3 years ago
Thank you!
@anasmomani647 3 years ago
You literally should get a Nobel Prize for your videos
@statquest 3 years ago
bam!
@yonasabebe 2 years ago
You make it look so easy. Thanks for your effort and contribution.🙏
@statquest 2 years ago
Thank you!
@TheGabytls 2 years ago
Thank you, thank you, thank you thank you, thank you!!! I never thought I could understand this type of thing. If somebody asks me to explain this to them I'll say... THE CHAAAIN RULE!!! Thanks Josh! I'll donate part of my first salary to you eventually :)
@statquest 2 years ago
Hooray! I'm glad my videos are helpful. :)
@pavapequeno 2 years ago
Hi Josh, what you have made was for me the first definitive guide to advanced stats and ML that is accessible and understandable by anyone with a basic scientific background. Multiple Bam! I searched a while before stumbling on this treasure trove. Thank you so much! Will you also be releasing some videos on graph embeddings, GNNs/GCNs? I think you would have a very eager audience (certainly including me at least)!
@statquest 2 years ago
Thanks! I'll keep those topics in mind, but for now I'm working on LSTM and Transformer neural networks.
@pavapequeno 2 years ago
Thanks for the quick reply, Josh :-) I‘ll keep my eyes peeled for future videos. I guess graphs and graph embeddings is almost a playlist in itself! Once again thank you for opening my eyes to this wonderful world! All the best with the current new videos!
@ek_minute_ 10 months ago
Thanks for the toilet paper reference for SoftPlus; now I will always revise backpropagation every time I go to the toilet.
@statquest 10 months ago
bam?
@petercourt 3 years ago
Fantastic work Josh! :)
@statquest 3 years ago
Thank you! And thank you for your support!
@marccrepeau6853 3 years ago
Thanks for this great series of tutorials! One thing I'm confused about: in this video some of the final parameter estimates (after the 450 gradient descent steps) end up larger than the original (random) parameter estimates, and some end up smaller. But the derivatives calculated for the first gradient descent step are all positive, so if you multiply them by a constant (positive) learning rate you will end up *decreasing* all parameters (by subtracting a positive value from them). Thinking about that parabola in your gradient descent tutorial, you would be starting with tangent lines (derivatives) all on the right side of the parabola. All derivatives are positive and thus all the tangent lines have positive slopes. Gradient descent will subtract positive step sizes from all parameters. All parameters will thus *decrease*. So how do some final parameter values end up greater than the original (random) estimates? (For example w1 is originally set at 2.74 and the final estimate is 3.34)
@statquest 3 years ago
I think the answer might just be that the 7-dimensional surface that we are trying to find the bottom of with gradient descent may have some complicated shape that includes local minima that we get out of because the step size is large enough to allow for that sort of thing.
@marccrepeau6853 3 years ago
@@statquest So if you step outside of a local minimum then you might end up with negative derivatives in subsequent descent cycles? I think that makes sense!
@statquest 3 years ago
@@marccrepeau6853 bam!
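To illustrate the point settled in this exchange, here is a tiny Python sketch with made-up numbers: a positive derivative always shrinks a parameter on that step (step size = derivative times learning rate, subtracted from the old value), but once the derivative turns negative on a later step, the parameter can climb back above its starting value.

```python
# one gradient-descent update: step size = derivative * learning rate,
# and the new value is the old value minus the step size
learning_rate = 0.1

def update(value, derivative):
    step_size = derivative * learning_rate
    return value - step_size

w1 = 2.74                  # starting value mentioned in the question above
w1 = update(w1, 1.8)       # hypothetical positive derivative: w1 drops to 2.56
w1 = update(w1, -15.0)     # hypothetical negative derivative on a later step
print(w1)                  # 4.06, above the 2.74 we started with
```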
@pelocku1234 3 years ago
This video was great and made me want to build this in R. One note to others who are trying to use the same starting values and get to the same optimal values: you will need to use a learning rate other than 0.1, because 0.1 finds optimal values that are not quite the same as the ones we see here. Just wanted to drop this note in case someone else tries to recreate it like I did. Again, it was the teaching that inspired me to do it.
@statquest 3 years ago
Glad you were able to figure something out that worked for you! :)
@twandekorte2077 3 years ago
Great video. You are exceptional with regards to explaining difficult concepts in an easy and straightforward way :)
@statquest 3 years ago
Thank you, and thanks for your support!
@ouche71 3 years ago
I really liked this series, but I still feel confused about what happens when we have more than one feature. Keep up the great work!
@statquest 3 years ago
We'll get there soon enough.
@yacinerouizi844 3 years ago
best tutorials on machine learning, thank you!
@statquest 3 years ago
Glad you think so!
@devharal6541 1 year ago
You are the best, Josh!!
@statquest 1 year ago
Thank you! :)
@dijkstra4678 2 years ago
Other videos always omit the key details and only explain the concepts in general terms, such that you come out of the video having learned absolutely nothing about how it actually works. The world needs more videos like yours, which actually explain these concepts in mathematical detail without getting too difficult either.
@statquest 2 years ago
Thank you very much! :)
@xinyuechang6062 2 years ago
Not me clapping my hands watching that green line fit perfectly with data 🐕
@statquest 2 years ago
:)
@tymothylim6550 2 years ago
Wonderful video! Now I will always think of toilet paper when seeing the softplus activation function :)
@statquest 2 years ago
BAM!!! :)
@user-ur2en1zq4f 1 year ago
Sir, you are gold. Thanks
@statquest 1 year ago
Thank you!
@simplifiedscience7497 1 year ago
You are so amazing!! You really made me love machine learning!!
@statquest 1 year ago
Wow, thank you!
@sreerajnr689 3 months ago
This is such fun to learn!! 😀😀
@statquest 3 months ago
Thanks!
@Luxcium 5 months ago
This was pretty cool to watch 😅 but the more I watch, the less remains 😢 which is so sad given how JS is so busy with the rest of his life and stuff 😮 I don't know if he will ever have enough time to make videos awesome again (MVAA)
@statquest 5 months ago
:)
@jiayiwu4101 7 months ago
I like this beginning song a lot!
@statquest 7 months ago
bam! :)
@1243576891 7 months ago
Thanks for the video. This is awesome!
@statquest 7 months ago
Glad you liked it!
@ignaciozamanillo9659 3 years ago
Thanks, as simple and good as always! Are you planning a tutorial on Neural Networks in R / Python? That would be great
@statquest 3 years ago
That's the plan! :)
@mischievousmaster 3 years ago
Hey Josh, could we expect videos where you write code in Python and implement ML models? I know you did one on decision trees and I really loved it. I know you gave us a conceptual understanding of ML in all of your videos, but what I'm requesting is examples for every popular ML algorithm along with Python code at two or three levels of complexity (basic, intermediate, etc.). There are other channels who do this, but if we could learn it from your teaching approach it would totally be one of a kind. The reason I ask you this is because I genuinely think you're a teacher who puts himself in the shoes of a beginner who is laboring to understand all the data science and statistics jargon, and tries to explain stuff so that it stokes up interest and ambition in them. It certainly worked for me. I was never good at math, let alone stats. But your videos helped me tremendously and now I am a little better, and thanks to you for that. I know I asked you a lot and you may not have the time for all that I requested, but hopefully one day. Thank you Josh.
@statquest 3 years ago
I've also done webinars on SVMs in python: kzbin.info/www/bejne/bnKafWN9qKecgrM and XGBoost in python: kzbin.info/www/bejne/faOtgWx8gbtmfKc and, in 2021 I hope to do more.
@mischievousmaster 3 years ago
@@statquest thank you!
@bhavikdhandhalya 5 months ago
Thank you for wonderful videos.
@statquest 5 months ago
Glad you like them!
@LEELEE-dg3xd 10 months ago
This video really helped me a lot!
@statquest 10 months ago
BAM! :)
@yeyuan4235 3 years ago
Thanks Josh for another great video! Do you plan to introduce RNN and CNN at some point? Looking forward to your clear explanations on these topics.
@statquest 3 years ago
I hope so.
@nikolatotev 2 years ago
I have a question about backpropagation: (edit after finishing writing, actually 2) In a real implementation, when performing backpropagation, do the weight values get updated after each layer is reached, or does the algorithm go through the whole network, saving how each weight & bias should change, and then after reaching the start of the network all of the values get updated? And a question related to a more complicated version of neural networks: in convolutional neural networks that use skip connections, during the forward pass the results from the previous layers get concatenated with a deeper layer. My question is: when performing backpropagation, are the skip connections used to pass the gradient directly to a layer closer to the start of the network, or are the skip connections just ignored? I'm not sure if anyone else is struggling with backpropagation in CNNs, but if there are more people, a video on the topic in your teaching style would be amazing!
@statquest 2 years ago
My understanding is that, for each iteration of backpropagation, there is a single "forward pass", where the data is run through the neural network, and this is used to calculate the loss (in this case, the sum of the squared residuals), and then it does a single "backwards pass", where all of the parameters are updated at once. If the parameters were updated one at a time, then we would have to do a bunch of extra forward passes, one per parameter that we want to update, and that would probably take a lot longer. As for your question about CNNs, I don't know the answer, but, believe it or not, 2022 is the year of the neural network for StatQuest, and I plan on making a lot more videos about them, so I might get to this question before too long (Unfortunately I can't promise I will!).
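For anyone curious what the reply above looks like in code, here is a minimal sketch (my own placeholder code, not the video's) of one training loop: each iteration does a single forward pass to get the predictions, a single backward pass that yields the derivative of the SSR for every parameter via the chain rule, and then one simultaneous update of all parameters. The data, starting values and learning rate are made up.

```python
import numpy as np

def forward(params, x):
    # forward pass of the toy network: two SoftPlus hidden nodes, linear output
    z1 = x * params["w1"] + params["b1"]
    z2 = x * params["w2"] + params["b2"]
    y1 = np.log(1.0 + np.exp(z1))
    y2 = np.log(1.0 + np.exp(z2))
    predicted = y1 * params["w3"] + y2 * params["w4"] + params["b3"]
    return predicted, z1, z2, y1, y2

def gradients(params, x, observed):
    # one backward pass: the chain rule gives the derivative of the SSR
    # with respect to every parameter at once
    predicted, z1, z2, y1, y2 = forward(params, x)
    d_pred = -2.0 * (observed - predicted)        # d SSR / d Predicted
    s1 = 1.0 / (1.0 + np.exp(-z1))                # SoftPlus derivatives
    s2 = 1.0 / (1.0 + np.exp(-z2))
    return {
        "w1": np.sum(d_pred * params["w3"] * s1 * x),
        "b1": np.sum(d_pred * params["w3"] * s1),
        "w2": np.sum(d_pred * params["w4"] * s2 * x),
        "b2": np.sum(d_pred * params["w4"] * s2),
        "w3": np.sum(d_pred * y1),
        "w4": np.sum(d_pred * y2),
        "b3": np.sum(d_pred),
    }

x = np.array([0.0, 0.5, 1.0])             # made-up dosages
observed = np.array([0.0, 1.0, 0.0])      # made-up efficacies
params = {"w1": 2.74, "b1": -1.43, "w2": -1.30, "b2": 0.57,
          "w3": 0.36, "w4": 0.63, "b3": 0.0}   # placeholder starting values

learning_rate = 0.05
for _ in range(1000):
    grads = gradients(params, x, observed)              # single backward pass
    params = {name: value - learning_rate * grads[name] # update every parameter
              for name, value in params.items()}        # at the same time
print(params)
```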
@ArifDolanGame 3 years ago
glad I found this channel! very helpful 👍
@statquest 3 years ago
Glad to hear it!
@sattanathasiva8080 3 years ago
Best videos for stat
@statquest 3 years ago
Thanks!
@juliank7408 6 months ago
Thank you very much! Appreciated!
@statquest 6 months ago
You're welcome!
@harishbattula2672 2 years ago
Thank you for the explanation.
@statquest 2 years ago
You are welcome!
@SuperYkf 6 months ago
OMG This is amazing! 😭
@statquest 6 months ago
bam! :)
@pratyushsinha374 3 years ago
This is a goldmine
@statquest 3 years ago
Thanks!
@namanjha4964 8 months ago
Thanks! Loved it!!!!!!!
@statquest 8 months ago
Thank you!
@justinwhite2725 3 years ago
Thank you for the animation at the end. I've been perplexed as to why my network with 2 hidden layers seems to find a 'happy medium' where all the outputs are muddled values around the median. Your graph showed this is normal. I suspect this means I need more time to train the deeper layers towards the inputs.
@statquest 3 years ago
Good luck! :)
@engindenizalpman 3 years ago
Amazing as always 😎👍
@statquest 3 years ago
Thank you!
@rameshmitawa2246 3 years ago
This guy is just too awesome :)
@statquest 3 years ago
Thanks!
@jitpackjoyride 3 years ago
Thank you for this! Love learning from StatQuest. Hope there are more videos on CNNs
@statquest 3 years ago
More to come!
@jakhongirkhatamov3694 2 months ago
You the man!
@statquest 2 months ago
Thanks!
@superk9059 2 years ago
Thorough, clear, amazing, awesome!!! Thousands of BAMs!!!
@statquest 2 years ago
Wow, thanks!
@suryatejakothakota7742 3 years ago
Bam from India 😍
@statquest 3 years ago
Thanks!
@joehsiao6224 3 years ago
awesome! so it doesn't matter which weight to start with!
@statquest 3 years ago
We do them all at the same time.
@MrDemultiplexer 1 year ago
I think this series is better than 3Blue1Brown's
@statquest 1 year ago
Thank you very much! I tried to include details that he did not.
@durrotunnashihin5480 2 years ago
Question please: why do we use both backpropagation and forward propagation in the training process, when only forward propagation could possibly find the optimal parameters? Is it faster than forward propagation alone? Or is there any other reason? Anyway, I watched a lot of videos from your channel; they are very interactive and make the complexity simpler. Thank you for the effort :)))
@statquest 2 years ago
I'm not sure I understand your question. Why do you think it is possible to find the optimal parameters using only forward propagation? I'm pretty sure that would be impossible, but maybe you know something I do not.
@durrotunnashihin5480 2 years ago
@@statquest Let's say we train on the data with forward propagation only. At iteration 500, we get all the gradients of the weights < 0.001. Is this not possible? Of course, in practice, it could be difficult to get the optimal parameters with only a few training samples
@statquest 2 years ago
@@durrotunnashihin5480 Again, I've never heard of anyone training a neural network only using forward propagation. Can you provide me a link to a reference where this is done?
@Luthea 3 years ago
And this is the first step in my adventure to create an earthquake early warning system. BAM!
@statquest 3 years ago
Let me know how it turns out! BAM! :)
@krizroycetahimic4087 3 years ago
You made me realize what AI truly is. THANKS!
@statquest 3 years ago
bam! :)
@kanui3618 3 years ago
Keep it up, josh🔥
@statquest 3 years ago
Thanks!