Neural Networks Pt. 2: Backpropagation Main Ideas

553,579 views

StatQuest with Josh Starmer

A day ago

Comments: 591
@statquest 3 years ago
The full Neural Networks playlist, from the basics to deep learning, is here: kzbin.info/www/bejne/eaKyl5xqZrGZetk Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
@motherisape 2 years ago
Bamm
@gbchrs 2 years ago
@seanleith5312 A year ago
Quit the singing, please
@statquest A year ago
@@seanleith5312 Noted
@MahediHasanBD 2 days ago
@@motherisape double Bammmm!
@dcscla 4 years ago
Man, your promotions are not shameless! Actually, what you do is a gift for us; for the price that you charge and for the level of the content, we are being gifted, not buying something. You are far better than a lot of paid (and expensive) courses. Just check out your video comments to see how happy people feel when they discover your videos!! Great work as always. Thank you so much!!!👏🏻👏🏻👏🏻👏🏻
@statquest 4 years ago
Thank you very much! :)
@Luxcium A year ago
He is using the concept of reverse psychology by presenting great stuff at a good price, and as you mentioned, these promotions are not shameless… They are shameful; as you hinted, he should indeed be ashamed of giving us such a good and advantageous offer… 😅😅😅😅
@jonforce360 4 years ago
You released this video just in time for my AI exam! Thank you. Sometimes I think professors use really complex notation just to feel smarter than students, it doesn't help learning. I love your content.
@statquest 4 years ago
Thank you very much!
@sarazahoor9133 3 years ago
I want to copy-paste this comment! :D
@puppergump4117 2 years ago
Ain't that right. They must be mad that they don't understand the actually smart people, so they don't want to be understood either.
@zhongtianjackwang5346 A year ago
lol, that is exactly what I want to say
@ElNick09 2 years ago
I have been a student my entire life and have taught college-level courses myself, and I must say you are one of the finest lecturers I have ever seen. This StatQuest is a gem. Your work is so succinct and clear, it's as much art as it is instruction. Thank you for this incredible resource!
@statquest 2 years ago
Thank you very much! :)
@hamzasaaran3011 2 years ago
I am studying for a Master's degree in bioinformatics now, and as someone who knows little about statistics, I really can't thank you enough for your videos and the effort that you have put into them.
@statquest 2 years ago
Thank you!
@juliocerono_stone5365 4 months ago
I am already 65, and your videos have helped me understand the basics behind NN. Thank you so much!!!!
@statquest 4 months ago
Bam! :)
@iskrega 2 years ago
I just want you to know your channel has been instrumental in helping me towards my Data Science degree, I'm currently in my last semester. I'll be forever grateful for your channel and the time you take to make these videos. Thank you so much.
@statquest 2 years ago
Thank you and good luck with your final semester! BAM! :)
@filipwojcik4133 A month ago
I started my data science journey back in 2010. At the time, I struggled a lot to learn backpropagation for NNs - I wanted to fully understand what was going on. Years passed, I worked at data science industry giants, got my Ph.D. in the field along the way, and started teaching my own students. I can say one thing - StatQuest's backpropagation explanation is the best I've ever seen. I redirect all my students to this series of videos, as I believe there is no better, cleaner, no-bull***t explanation available. Most of the videos and tutorials out there make it even more complicated by introducing **FANCY NOTATION** with deltas, nablas, etc., or other extra symbols. This one is simple, pure, and gets the job done. StatQuest - this is excellent work; I wish we had more handbooks and papers written as clearly as your videos + books. All the best wishes!
@statquest A month ago
Thank you very much! I really appreciate it. I'm going to have a new book with all of these details about neural networks coming out in the next few months.
@filipwojcik4133 A month ago
​@@statquest I'm really glad to hear that! Fingers crossed! It will be No. 1 on my reading list, and recommended reading for my students as well :) Keep up the pace, that's awesome! TRIPLE BAM!!!!!
@akeslx A year ago
I finished business school 25 years ago, where I studied statistics and math. So happy to see that neural networks are fundamentally just (much) more advanced regression analysis.
@statquest A year ago
BAM!!! Thank you for supporting StatQuest! Yes, neural networks are a lot like regression, but now we can fit non-linear shapes to the data, and we don't have to know in advance what that shape should be. Given enough activation functions and hidden layers, the neural network can figure it out on its own.
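To make the reply above concrete, here is a minimal hedged sketch (the weights, biases, and the softplus activation are made-up illustrative choices, not values from the video) of how a network's output "squiggle" is just a sum of scaled-and-shifted activation-function curves plus a final bias:

```python
import math

def softplus(x):
    # a smooth activation function: log(1 + e^x)
    return math.log(1.0 + math.exp(x))

def tiny_network(x):
    # two hidden nodes: run (input * weight + bias) through the activation
    node1 = softplus(x * 3.4 - 1.7)    # assumed weights and biases
    node2 = softplus(x * -2.5 + 1.2)
    # output: scale each node's curve, add them together, add a final bias
    return node1 * 1.1 + node2 * -0.9 + 0.3

# evaluating at several inputs traces out a non-linear "squiggle"
curve = [tiny_network(x / 10.0) for x in range(11)]
```

Because each hidden node contributes its own bent curve, stacking enough of them lets the network fit a non-linear shape without knowing that shape in advance.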
@raunak5344 9 months ago
I just iterated on gradient descent and found that this is the best possible way to teach this topic; no other lecture in existence is better than this one.
@statquest 9 months ago
bam!
@evangeliamm 3 years ago
You have no idea how much I appreciate your work. Your explanations are so fun and simple, I'm just so grateful!
@statquest 3 years ago
Thank you very much! :)
@Amir-gc8re 2 years ago
Finally a proper, detailed, step by step explanation. This guy is absolutely AMAZING ! Thank you so much for all the hard work in putting these videos together for us.
@statquest 2 years ago
Thank you very much! :)
@babarali4313 2 years ago
It's the teacher who makes a subject easy or difficult, and with the way you explained neural networks, I am speechless.
@statquest 2 years ago
Thanks!
@ML-jx5zo 4 years ago
Now I am reading about backpropagation. I worried that this video wouldn't come for a long time, and finally I got a treasure.
@statquest 4 years ago
bam! :)
@katwoods8514 3 years ago
Love this! You've explained it far better than anywhere else I've seen, and you made it entertaining at the same time! Thank you so much for making this.
@statquest 3 years ago
Awesome, thank you!
@jays9591 3 years ago
May I say ... you are such a good teacher that it is most enjoyable to watch your videos. I am proficient in statistics (via university econometrics 101) ... and I did not realise all those fancy terms in machine learning are actually concepts that were common items in the stats I learned in the 1970s, e.g., biases and weights, labels, activation functions, etc. Anyway, I can see that a lot of viewers appreciate your work and teaching. I have also 'updated' myself. Thank you.
@statquest 3 years ago
Thank you very much!
@yasameenmohammed4366 A year ago
My Machine Learning exam is tomorrow and re-watching your videos to review concepts is helping me so much! Thank you!!!
@statquest A year ago
Good luck! BAM! :)
@TheClearwall 4 years ago
Who else is using these videos to put together a semester project? So far, I've put Regression Trees, K-fold CV, complexity pruning, and now Neural networks as my final model construction. Josh is worth a double bam every time.
@statquest 4 years ago
BAM! Good luck with your project.
@adirozeri7162 16 days ago
You are amazing. Thank you for taking this long road of showing us all the calculations. It really deepens my understanding and intuition!
@statquest 16 days ago
Thank you!
@ksrajavel 4 years ago
Finally. The wait is over. BAM!!!
@statquest 4 years ago
TRIPLE BAM!!!
@mariolira9279 3 years ago
F I F T H B A M!
@syco_Rax 3 years ago
SUPER BAM!!!
@advaithsahasranamam6170 2 years ago
This is excellent stuff! As a visual learner, your channel is a BLESSING. Thank you so much for your fantastic work breaking down concepts into small, bite-sized pieces. It's much less intimidating, and you deserve so much more appreciation. You also gained my subscription to your channel! Keep doing a great job, and thank you SO MUCH for having my back!
@statquest 2 years ago
Thank you very much!!! :)
@mot7 3 years ago
You are the best. I wish every ML learner would find you first. I am going to do my part and tweet about you. Thanks for making these videos! Wishing you more success.
@statquest 3 years ago
Wow! Thank you very much! I really appreciate the support. BAM! :)
@hyonnj9563 8 months ago
Honestly, you do a much better job teaching with a pre-recorded video than my instructors do with both the written and live materials that I'm paying for.
@statquest 8 months ago
I'm glad my videos are helpful! :)
@tagoreji2143 2 years ago
Teaching such complicated topics in a simple, easily understandable way. 👏👏👏 Thank you, Professor!
@statquest 2 years ago
Thanks!
@subusrable 5 months ago
This video is a gem. I had to watch it a few times, and like in gradient descent, I got closer to the target level of knowledge with each step :)
@statquest 5 months ago
BAM! :)
@sarazahoor9133 3 years ago
For the first time ever in history, I have understood the concept behind Neural Networks! BAM!!!! :D Thanks Josh, so grateful :)
@statquest 3 years ago
BAM! :)
@jennystephens3215 4 years ago
Josh, this is amazing. You really make things so easy to visualise, which is crazy considering the hidden layers are meant to be so hard that neural networks are referred to as black boxes! Thanks for all your videos. I have used heaps over the last twelve months. Thank you again.
@statquest 4 years ago
Hooray!!! I'm so glad that you like my videos. :)
@mohammadrahman1126 3 years ago
Amazing explanation! I've spent years trying to learn this, and it always went too quickly into the gory mathematical details. The aha moment for me was when the green squiggle equaled the blue plus orange squiggles lol. Thank you for this, Josh!!!
@statquest 3 years ago
Glad it was helpful!
@amandak1396 3 years ago
Kind of like how Feynman reduced gory math in physics to actual squiggles, double bam!
@edrobinson8248 4 months ago
Simply brilliant. Learning is indeed a quest: a quest for someone who understands and can present understandably. Thanks.
@statquest 4 months ago
Thanks!
@katwoods8514 3 years ago
OMG yay! I just discovered that you've made a million videos on ML. I'm going to go binge all of them now :D
@statquest 3 years ago
Hope you enjoy!
@motherisape 2 years ago
Bamm
@dinara8571 3 years ago
JUST WOW! Thank you so much, Josh! I cannot express the feeling I had when EVERYTHING made sense!!! TRIPLE BAM! Never thought I would be extremely excited to pause the video and try to solve everything by hand before I look at the next steps
@statquest 3 years ago
BAM! :)
@NadaaTaiyab 2 years ago
oh that's a good idea!
@wong4359 2 years ago
I found your explanation far easier to understand than the edX online course I am taking, BAM !!!
@statquest 2 years ago
bam!
@manalisingh1128 2 years ago
Wow Josh, way to go!!!! You have the concepts so clear in your own head that it seems a piece of cake for us 🍰♥️ Love from India! 🇮🇳
@statquest 2 years ago
Thanks so much!!
@jblacktube A year ago
I didn't even get through the jingle before I gave a thumbs up. Thanks for the chuckle, can't wait to watch the rest of this!
@statquest A year ago
BAM! :)
@mashmesh 4 years ago
OMG, protect this man at all costs, this was pure gold!!! Also, thank you, sir, for talking so slowly, because if my brain squiggles had to work faster they would burn up x)
@statquest 4 years ago
Glad you enjoyed it!
@O5MO 3 years ago
I never understood backpropagation. I knew some things from other tutorials, but as a beginner, it was very hard to understand. This video (and probably the series) is the best I could find. Thank you.
@statquest 3 years ago
Glad it was helpful!
@codeman2 3 years ago
I searched for neural nets and again your video popped up, just 4 months old. I love getting your helpful videos right before my semester.
@statquest 3 years ago
:)
@lukasaudir8 A year ago
I am really glad that people like you exist!! Thank you so much for these incredible lessons!
@statquest A year ago
Glad you like them!
@BlochSphere 10 months ago
The level of detail in this video is just 🤯 I hope I can make my Quantum Computing videos this clear!
@statquest 10 months ago
Good luck!
@edphi A year ago
It should be made a crime to see other videos on backpropagation before reaching StatQuest. The world is confused by teachers who tell the big story before the basics. Learn the basics, and the picture falls into place like the chain rule 😊
@statquest A year ago
bam! :)
@perhaps467 2 years ago
Thank you so much for this series! I haven’t been able to find any other videos that really break down the mechanics of neural networks like this.
@statquest 2 years ago
Thanks!
@user-re1bi2bc8b 4 years ago
Incredible. Sometimes I need a refresher on these topics. There’s much to remember as a data scientist. I’m so glad I found your channel!
@statquest 4 years ago
Bam!
@iliasaarab7922 3 years ago
Best explanation that I've seen so far on backpropagation!
@statquest 3 years ago
Thank you! :)
@madghostek3026 A year ago
9:00 at this moment I realised I'm watching the best math content on earth, because you never see simple stuff like this given attention. Luckily I already know how the summation symbol works, but I didn't know it in the past, and nobody cared to explain it. And it's not just about the summation symbol: imagine the other 1000 small things somebody might not understand, and doesn't realise they don't understand, because they've been skimmed over.
@statquest A year ago
Thank you so much! I really appreciate it! :)
@maliknauman3566 3 years ago
How amazing is the way you convey complex concepts.
@statquest 3 years ago
Thank you!
@evie389 A year ago
I was reading an article on backpropagation and I did not understand a single word. I had to watch all your videos starting from the Chain Rule, Gradient Descent, NNs... I re-read the article and understood everything!!! But now I can't get the beep-boop and small/double/triple bam out of my head lol.
@statquest A year ago
BAM! I'm glad my videos were helpful! :)
@ABCEE1000 2 months ago
I have no idea how I can thank you as much as you deserve.. thank you so much
@statquest 2 months ago
Thanks! :)
@mjcampbell1183 2 years ago
Wow! This is an incredible video. Thank you SO MUCH for making this for us. This is one of the best videos I've seen to explain this concept. The hard work you have put into this is something that I am incredibly appreciative of. Thanks, man.
@statquest 2 years ago
Wow, thank you!
@nojoodothmanal-ghamdi1026 2 years ago
I . JUST . LOVE . YOUR . CHANNEL !! You literally explain things very clearly and step by step! I just cannot thank you enough, really.
@statquest 2 years ago
Wow, thank you!
@deepanjan1234 4 years ago
This is really awesome. I thank you for your effort in developing this highly enriched content. BAM !!!
@statquest 4 years ago
Thank you!
@David5005ful 2 years ago
The type of in depth video I’ve always wanted!
@statquest 2 years ago
Thank you!
@CHERKE_JEMA5575 2 years ago
You rescued me from the unknown!! Much Love from Ethiopia
@statquest 2 years ago
Bam! :)
@aryabartarout5697 A year ago
You have cleared my doubts on backpropagation, gradient descent and the chain rule. Triple Bam!
@statquest A year ago
:)
@AhmadAbuNassar 4 months ago
Thank you very much for this comprehensive yet simple explanation
@statquest 4 months ago
Glad it was helpful!
@ucanhnguyen4751 4 years ago
Thank you for this video. I have been waiting for this the whole time. Finally, it appeared just one day before my exam. You are a lifesaver!!
@statquest 4 years ago
Good luck with your exam! :)
@knt2112 A year ago
Hello sir, thanks for such a simple explanation; I never understood backpropagation in such depth with such ease. 🎉
@statquest A year ago
Thank you!
@yiliu5403 A year ago
Best Neural Networks Lectures! Just ordered the book from Amazon to support!
@statquest A year ago
Wow! Thank you very much! :)
@willw4096 A year ago
Thanks for the great video!! My notes: 7:23 8:11 8:48 10:00 10:22❗,11:13 - 11:48, 11:56 12:08 13:30❗,
@statquest A year ago
BAM! :)
@mrglootie101 4 years ago
I've been waiting for this the whole time, checking the notifications haha
@statquest 4 years ago
Hooray! The wait is over.
@epistemophilicmetalhead9454 A year ago
Backpropagation (aka finding the w's and b's): start with b_final = 0. You'll notice that the error, (observed - predicted)^2, is really high, so you use gradient descent on the squared error with respect to b_final to find the value of b_final where the squared error is minimized. That is your optimal b_final.
Gradient descent: d(sum of squared errors)/d(b_final) = d(sum of squared errors)/d(predicted) * d(predicted)/d(b_final).
d(observed - predicted)^2 / d(predicted) = -2 * (observed - predicted).
d(predicted)/d(b_final) = d(sum of the curves from each node of the last layer + b_final)/d(b_final) = 0 + 0 + ... + 0 + 1 = 1.
Evaluate the slope at the current parameter value, set step size = slope * learning rate, and update: new b_final = old b_final - step size. Keep repeating until the slope is essentially 0. This is how gradient descent finds the optimal b_final.
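The update loop in the comment above can be sketched in a few lines of Python. This is a minimal illustration: the observed values, the per-sample node-output sums, and the learning rate are made-up numbers, not taken from the video.

```python
learning_rate = 0.1

# Observed targets and the fixed part of each prediction (the summed,
# scaled node outputs), so that prediction = fixed_part + b_final.
observed = [0.0, 1.0, 0.0]           # assumed targets
fixed_part = [-2.6, 1.6, -2.6]       # assumed node-output sums per sample

b_final = 0.0                        # start the bias at zero
for step in range(200):
    # d(SSR)/d(b_final) = sum over samples of -2 * (observed - predicted) * 1
    slope = sum(-2 * (obs - (fp + b_final)) for obs, fp in zip(observed, fixed_part))
    step_size = slope * learning_rate
    b_final = b_final - step_size    # take a step against the slope
    if abs(step_size) < 1e-9:        # stop once the steps become tiny
        break
```

With a single bias and squared error, the loop converges to the mean residual, mean(observed - fixed_part), which is exactly the value that zeroes the slope.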
@statquest A year ago
double bam
@terrepus9856 4 years ago
The timing couldn't be more perfect ... 3 hours before my machine learning exam!! Thank you!!!!
@statquest 4 years ago
Good luck with your exam! I hope it goes well.
@aviknash 3 years ago
Excellent job Josh!!! Just loved it!!! Thanks a ton for your fun-filled tutorials :)
@statquest 3 years ago
Glad you like them!
@utkugulgec5508 4 years ago
These videos should be protected at all costs
@statquest 4 years ago
:)
@rodrigovm 25 days ago
Best teacher ever. Even better than Andrew Ng.
@statquest 24 days ago
Thank you!
@viethoalam9958 7 months ago
With all due respect to my math teacher, this is so much easier to understand.
@statquest 7 months ago
bam! :)
@chrislee4531 2 years ago
I learn more from four of your videos than from 200 pages of textbook gibberish.
@statquest 2 years ago
Thanks!
@Luxcium A year ago
Wow 😮 I didn't know I had to watch *Gradient Descent, Step-by-Step!!!* before I could watch the video related to *Neural Networks part 2*, which I must watch before I can watch *The StatQuest Introduction To PyTorch...*, before I can watch *Introduction to coding neural networks with PyTorch and Lightning* 🌩️ (it's something related to the cloud, I understand). I am genuinely so happy to learn about this stuff with you, Josh ❤ I will go watch the other videos first and then I will backpropagate to this video...
@statquest A year ago
Getting warmer...
@aminmoghaddam7624 8 months ago
I wish our lecturers watched these videos before making their own teaching slides! (With acknowledgement, of course!)
@statquest 8 months ago
bam!
@superk9059 2 years ago
Thank you very much for your video~ Your videos make me feel that studying English makes so much sense; otherwise I couldn't enjoy such beautiful things~ 👍👍👍❤❤❤
@statquest 2 years ago
WOW! Thank you very much!!! And thank you for your support!!! :)
@josephif A year ago
The lecture was awesome: more effective and easier to understand. Thanks!
@statquest A year ago
Thank you! :)
@VishalKhopkar1296 2 years ago
You taught this better than the professors at CMU, not kidding.
@statquest 2 years ago
Thank you! :)
@richarda1630 3 years ago
Where were you 5 years ago???!?!?! :D Awesome work man! Keep it up :)
@statquest 3 years ago
Thanks! I have 4 more neural network videos coming out in the next month.
@richarda1630 3 years ago
@@statquest awesome! can't wait :D
@d_polymorpha 11 months ago
Hello, thank you for the video! This series has been really helpful for learning about deep learning. I have a couple of questions. 1. When using gradient descent and backpropagation, do we always use the SSR to measure how good a fit the parameter we are estimating is, or are there other ways? 2. The second question is about using the chain rule to calculate derivatives. The first part is d SSR / d Predicted. In that first part, @ 11:25, are you using the chain rule again within it? And when deriving the inside part, Observed - Predicted, @ 11:34, where do you get the 0 and 1 from?
@statquest 11 months ago
1. The "loss function" we use for gradient descent depends on the problem we are trying to solve. In this case, we can use the SSR. However, another commonly used "loss function" is called Cross Entropy. You can learn more about cross entropy here: kzbin.info/www/bejne/bHLVhKypatZ7d7c and kzbin.info/www/bejne/rnOomWlsi56akNE 2. You can learn how the chain rule works (and understand the 0 and 1) here: kzbin.info/www/bejne/rZ2Unqyup9mEfrM
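For the second question, the derivative in question can be checked numerically. A small hedged sketch (the observed and predicted values are made-up): the 0 and 1 come from differentiating Observed - Predicted with respect to Predicted, since d(Observed)/d(Predicted) = 0 (a constant) and d(Predicted)/d(Predicted) = 1, so the inner derivative is 0 - 1 = -1.

```python
observed, predicted = 1.0, 0.3   # made-up values for illustration
h = 1e-6                         # finite-difference step

def squared_error(pred):
    # the SSR term for a single sample: (Observed - Predicted)^2
    return (observed - pred) ** 2

# central finite difference vs. the chain-rule result -2 * (Observed - Predicted)
numeric = (squared_error(predicted + h) - squared_error(predicted - h)) / (2 * h)
analytic = -2 * (observed - predicted)

assert abs(numeric - analytic) < 1e-6  # the two derivatives agree
```

Because the squared error is a simple quadratic, the central difference matches the analytic derivative to rounding error.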
@alexissanchezbro 3 years ago
You're getting better and better. Thank you!
@alexissanchezbro 3 years ago
BAAAAAAAMMM
@statquest 3 years ago
:)
@SPLICY 4 years ago
The understated BAM at 4:40 cracked me up 😂
@statquest 4 years ago
SPLICY in the house!!! BAM! :)
@marpin6162 4 years ago
Thank you. Now everything is much clearer.
@statquest 4 years ago
BAM! :)
@miriza2 4 years ago
BAM! Thanks Josh! You’re the best! Got myself a pink T-shirt 😍😍😍
@statquest 4 years ago
Hooray! And thank you for supporting StatQuest!!!
@royazullay7556 A year ago
That Josh guy is just awesome!! Definitely will support!!
@statquest A year ago
Thank you!
@preetikharb8283 3 years ago
This video made my day, thank you so much, Josh!!
@statquest 3 years ago
Thanks!
@igorg4129 4 years ago
Josh, finished watching. Thank you again. 1. If I as a researcher know roughly which range of inputs I am going to insert, and which range of outputs I expect to get in the end, would I want to adjust the range of the weights from the very beginning, maybe the weight distribution, and likewise the biases and activation functions, or do we let the algorithm do this job today? 2. The most interesting question: let's say that while finding the prediction curve we kind of discover some "hidden truth". I think our curve might never be exact, partly because we do not know all of the independent variables that affect our dependent variable in nature. Say we know one, but there is another one we do not know about. If so, would it be right to say that when a neural network with one input splits the input by different weights into two neurons of a hidden layer (from which the final output is calculated), it is like somehow simulating the presence of another "secret independent variable", even without knowing what it is? Thanks
@statquest 4 years ago
I'll be honest, I'm not sure how to answer question #1. I don't know. I do know that some of the methods used for initializing the weights with random values increase the variation allowed in the values based on how many layers are in the neural network - so that might do the trick. As for the second question: Adding the second node in the hidden layer allows the squiggle to go up *and* go down. If I just had one node, I would only be able to go up *or* down. So, in some sense, that is sort of like adding a secret independent variable.
@igorg4129 4 years ago
@@statquest I also thought this way. Thank you again and again; you do a titanic job here, Josh. If not for you, I wouldn't be here asking new questions. :)!
@ZachariahRosenberg 4 years ago
@@igorg4129 It's tempting to want to initialize weights to a target range in the hopes of speeding up convergence, however this actually might be counter productive. The weights of individual nodes do not have to conform to the same distribution as your output. When you use an appropriate (adaptive) optimizer, it should be able to tune the weights pretty quickly, considering that the first few passes will likely have larger gradients.
@zombieeplays3146 7 months ago
I come to this channel for the intros, tbh!
@statquest 7 months ago
bam! :)
@danielkim4151 A month ago
Wow! Thank you so much for this video!!!! It helped me so much for my AI assignment.
@statquest A month ago
Glad it helped!
@JamesWasTakenOhWell A year ago
Thank you for the amazing effort you put into this video and BAM!!! as always!
@statquest A year ago
Thanks!
@xuantungnguyen9719 4 years ago
StatQuest is the best
@statquest 4 years ago
Thank you very much! :)
@DanielRamBeats A year ago
This is finally all making sense to me, thank you!
@statquest A year ago
Thanks!
@Morais115 4 years ago
I'm buying the shirt! Kudos to you sir.
@statquest 4 years ago
Awesome! Thank you!
@constantthomas3830 3 years ago
Thank you from France
@statquest 3 years ago
Merci! :)
@abhishekm4996 4 years ago
Much awaited... it finally came.
@statquest 4 years ago
Bam! :)
@SPLICY 4 years ago
This is what she said
@tuhinsuryachakraborty A month ago
Long live StatQuest
@statquest A month ago
bam! :)
@ssanand3 3 years ago
I wish someday you'd make a video in person so that we can see the saint behind the voice 😀
@statquest 3 years ago
:)
@zhenyuhe1537 2 years ago
You can imagine how complicated professors make it, when even StatQuest has to use equations.
@statquest 2 years ago
:)
@ge13r 6 months ago
Greetings from San Cristóbal, Venezuela!!!
@statquest 6 months ago
:)
@amirhossientakeh5540 2 years ago
Perfect! You explain complicated things very understandably; it's amazing.
@statquest 2 years ago
Thank you very much! :)
@akashsoni5870 4 years ago
Thanks a lot, Sir. I was waiting for this.
@statquest 4 years ago
Bam! :)
@lisun7158 2 years ago
[Notes] 3:00 "Backpropagation starts from the last parameter and works its way backwards to estimate all of the other parameters". 10:40 The Chain Rule
[Question] 3:22 Why is it reasonable to assume that all of the other optimal parameters are known, except the last one, in this part? Because in practice we use dynamic programming (backpropagation) to save computation, i.e., we calculate the last layer first and then work backwards through the NN. -- ref.: kzbin.info/www/bejne/n6rRY62adrGcn5o&ab_channel=StatQuestwithJoshStarmer kzbin.info/www/bejne/fXy9oIJ-jayWgtE&ab_channel=StatQuestwithJoshStarmer
@statquest 2 years ago
Triple bam! :)
@jieunboy 2 years ago
Insane teaching quality, thanks!
@statquest 2 years ago
Glad you think so!
@vokoramusyuriy106 A year ago
Thanks a lot, Josh!
@statquest A year ago
My pleasure!
@Cam-su7os 2 years ago
It actually baffles me how videos like this are free, yet we have to pay astronomical amounts for sub-standard tuition where we are made to feel stupid.
@statquest 2 years ago
bam! :)
@teetanrobotics5363 4 years ago
Waiting for a complete DL playlist
@statquest 4 years ago
noted
@vishnukeyen7244 3 years ago
I have a small suggestion for your otherwise AWESOME videos: please use colorblind-friendly colors. Green and red are indistinguishable for me, so sometimes I don't get the visual distinction you are drawing! Again, it's not a complaint, just a suggestion. Thanks a lot for making these videos; I cannot stress enough how useful they are for me!
@statquest 3 years ago
I'll keep that in mind.