The full Neural Networks playlist, from the basics to deep learning, is here: kzbin.info/www/bejne/eaKyl5xqZrGZetk Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
@motherisape 2 years ago
Bamm
@gbchrs 2 years ago
@seanleith5312 1 year ago
Quit the singing, please
@statquest 1 year ago
@seanleith5312 Noted
@MahediHasanBD 2 days ago
@motherisape double Bammmm!
@dcscla 4 years ago
Man, your promotions are not shameless! Actually, what you do is a gift to us: for the price that you charge and for the level of the content, we are being gifted, not buying something. You are far better than a lot of paid (and expensive) courses. Just check out your video comments to see how happy people feel when they discover your videos!! Great work as always. Thank you so much!!!👏🏻👏🏻👏🏻👏🏻
@statquest 4 years ago
Thank you very much! :)
@Luxcium 1 year ago
He is using the concept of reverse psychology by presenting great stuff at a good price, and as you mentioned, these promotions are not shameless… They are shameful: as you hinted, he should indeed be ashamed of giving us such a good and advantageous offer… 😅😅😅😅
@jonforce360 4 years ago
You released this video just in time for my AI exam! Thank you. Sometimes I think professors use really complex notation just to feel smarter than students; it doesn't help learning. I love your content.
@statquest 4 years ago
Thank you very much!
@sarazahoor9133 3 years ago
I want to copy-paste this comment! :D
@puppergump4117 2 years ago
Ain't that right. They must be mad that they don't understand the actually smart people, so they don't want to be understood either.
@zhongtianjackwang5346 1 year ago
lol, that is exactly what I wanted to say
@ElNick09 2 years ago
I have been a student my entire life and have taught college-level courses myself, and I must say you are one of the finest lecturers I have ever seen. This StatQuest is a gem. Your work is so succinct and clear it's as much art as it is instruction. Thank you for this incredible resource!
@statquest 2 years ago
Thank you very much! :)
@hamzasaaran3011 2 years ago
I am studying for a Master's degree in bioinformatics now, and as someone who knows little about statistics, I really can't thank you enough for your videos and the effort that you have put into them.
@statquest 2 years ago
Thank you!
@juliocerono_stone5365 4 months ago
I am already 65, and your videos have helped me understand the basics behind NNs. Thank you so much!!!!
@statquest 4 months ago
Bam! :)
@iskrega 2 years ago
I just want you to know your channel has been instrumental in helping me towards my Data Science degree; I'm currently in my last semester. I'll be forever grateful for your channel and the time you take to make these videos. Thank you so much.
@statquest 2 years ago
Thank you and good luck with your final semester! BAM! :)
@filipwojcik4133 1 month ago
I started my data science journey back in 2010. At that time, I struggled a lot to learn backpropagation for NNs - I wanted to fully understand what was going on. Years passed, I worked at data science industry giants, earned my Ph.D. in the field along the way, and started teaching my own students. I can say one thing - StatQuest's backpropagation explanation is the best I've ever seen. I redirect all my students to this series of videos, as I believe there is no better, cleaner, no-bull***t explanation available. Most of the videos and tutorials out there make it even more complicated by introducing **FANCY NOTATION** with deltas, nablas, etc., or other extra symbols. This one is simple, pure, and gets the job done. StatQuest - this is excellent work; I wish we had more handbooks and papers written as clearly as your videos + books. All the best wishes!
@statquest 1 month ago
Thank you very much! I really appreciate it. I'm going to have a new book with all of these details about neural networks coming out in the next few months.
@filipwojcik4133 1 month ago
@statquest I'm really glad to hear that! Fingers crossed! It will be No. 1 on my reading list, and recommended reading for my students as well :) Keep up the pace, that's awesome! TRIPLE BAM!!!!!
@akeslx 1 year ago
I finished business school 25 years ago, where I studied statistics and math. So happy to see that neural networks are fundamentally just (much) more advanced regression analysis.
@statquest 1 year ago
BAM!!! Thank you for supporting StatQuest! Yes, neural networks are a lot like regression, but now we can fit non-linear shapes to the data, and we don't have to know in advance what that shape should be. Given enough activation functions and hidden layers, the neural network can figure it out on its own.
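To make the "regression with flexible shapes" idea concrete, here is a minimal Python sketch of a one-input network with two hidden nodes and softplus activations. All of the weights and biases are made-up values (not the ones from the video), just to show how scaled activation curves add up to a non-linear fit:

```python
import math

def softplus(x):
    # Softplus activation: ln(1 + e^x) turns a straight line into a smooth bend
    return math.log(1.0 + math.exp(x))

def tiny_network(x):
    # Hypothetical parameters, chosen only for illustration
    y1 = softplus(2.0 * x - 1.0)      # hidden node 1 bends the input one way
    y2 = softplus(-2.0 * x + 1.0)     # hidden node 2 bends it the other way
    return 1.5 * y1 - 1.5 * y2 + 0.5  # scale, sum, and shift: the "squiggle"

# Sample the squiggle over the input range 0..1
print([round(tiny_network(i / 10.0), 3) for i in range(11)])
```

With different weights the same structure produces very different shapes, which is exactly why nothing about the final curve has to be decided in advance.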
@raunak5344 9 months ago
I just iterated on gradient descent and found that this is the best possible way to teach this topic; no other lecture in existence is better than this one
@statquest 9 months ago
bam!
@evangeliamm 3 years ago
You have no idea how much I appreciate your work. Your explanations are so fun and simple, I'm just so grateful!
@statquest 3 years ago
Thank you very much! :)
@Amir-gc8re 2 years ago
Finally a proper, detailed, step-by-step explanation. This guy is absolutely AMAZING! Thank you so much for all the hard work in putting these videos together for us.
@statquest 2 years ago
Thank you very much! :)
@babarali4313 2 years ago
It's the teacher who makes a subject easy or difficult, and the way you explained neural networks left me speechless
@statquest 2 years ago
Thanks!
@ML-jx5zo 4 years ago
Now I am studying backpropagation. I was worried because this video didn't come out for a long time, and finally I got a treasure.
@statquest 4 years ago
bam! :)
@katwoods8514 3 years ago
Love this! You've explained it far better than anywhere else I've seen, and you made it entertaining at the same time! Thank you so much for making this.
@statquest 3 years ago
Awesome, thank you!
@jays9591 3 years ago
May I say .... you are such a good teacher that it is most enjoyable to watch your videos. I am proficient in statistics (via university econometrics 101) ... and I did not realise all those fancy terms in machine learning are actually concepts that are common items in the stats that I learned in the 1970s, e.g., biases and weights, labels, activation functions, etc. Anyway, I can see that a lot of viewers appreciate your work and teaching. I have also 'updated' myself. Thank you.
@statquest 3 years ago
Thank you very much!
@yasameenmohammed4366 1 year ago
My Machine Learning exam is tomorrow, and re-watching your videos to review concepts is helping me so much! Thank you!!!
@statquest 1 year ago
Good luck! BAM! :)
@TheClearwall 4 years ago
Who else is using these videos to put together a semester project? So far, I've used regression trees, k-fold CV, complexity pruning, and now neural networks for my final model construction. Josh is worth a double bam every time.
@statquest 4 years ago
BAM! Good luck with your project.
@adirozeri7162 16 days ago
You are amazing. Thank you for taking this long road of showing us all the calculations. It really deepens my understanding and intuition!
@statquest 16 days ago
Thank you!
@ksrajavel 4 years ago
Finally. The wait is over. BAM!!!
@statquest 4 years ago
TRIPLE BAM!!!
@mariolira9279 3 years ago
F I F T H B A M!
@syco_Rax 3 years ago
SUPER BAM!!!
@advaithsahasranamam6170 2 years ago
This is excellent stuff! As a visual learner, your channel is a BLESSING. Thank you so much for your fantastic work on breaking down concepts into small, bite-sized pieces. It's much less intimidating, and you deserve so much more appreciation. You also gained my subscription to your channel! Keep doing a great job, and thank you SO MUCH for having my back!
@statquest 2 years ago
Thank you very much!!! :)
@mot7 3 years ago
You are the best. I wish every ML learner would find you first. I am going to do my part and tweet about you. Thanks for making these videos! Wishing you more success.
@statquest 3 years ago
Wow! Thank you very much! I really appreciate the support. BAM! :)
@hyonnj9563 8 months ago
Honestly, you do a much better job teaching with a pre-recorded video than my instructors do with both the written and live materials that I'm paying for.
@statquest 8 months ago
I'm glad my videos are helpful! :)
@tagoreji2143 2 years ago
Teaching such complicated topics in a simple, easily understandable way. 👏👏👏 Thank you, Professor
@statquest 2 years ago
Thanks!
@subusrable 5 months ago
this video is a gem. I had to watch it a few times, and, like gradient descent, I got closer to the target level of knowledge with each step :)
@statquest 5 months ago
BAM! :)
@sarazahoor9133 3 years ago
For the first time ever in history, I have understood the concept behind Neural Networks! BAM!!!! :D Thanks Josh, so grateful :)
@statquest 3 years ago
BAM! :)
@jennystephens3215 4 years ago
Josh, this is amazing. You really make things so easy to visualise, which is crazy considering neural networks are meant to be so hard that they are referred to as black boxes! Thanks for all your videos. I have used heaps over the last twelve months. Thank you again.
@statquest 4 years ago
Hooray!!! I'm so glad that you like my videos. :)
@mohammadrahman1126 3 years ago
Amazing explanation! I've spent years trying to learn this, and it always went too quickly into the gory mathematical details. The aha moment for me was when the green squiggle equaled the blue plus orange squiggles lol. Thank you for this, Josh!!!
@statquest 3 years ago
Glad it was helpful!
@amandak1396 3 years ago
Kind of like how Feynman reduced gory math in physics to actual squiggles, double bam!
@edrobinson8248 4 months ago
Simply brilliant. Learning is indeed a quest - a quest for someone who understands and can present understandably. Thanks.
@statquest 4 months ago
Thanks!
@katwoods8514 3 years ago
omg yay! I just discovered that you've made a million videos on ML. I'm going to go binge all of them now :D
@statquest 3 years ago
Hope you enjoy!
@motherisape 2 years ago
Bamm
@dinara8571 3 years ago
JUST WOW! Thank you so much, Josh! I cannot express the feeling I had when EVERYTHING made sense!!! TRIPLE BAM! Never thought I would be extremely excited to pause the video and try to solve everything by hand before I look at the next steps
@statquest 3 years ago
BAM! :)
@NadaaTaiyab 2 years ago
oh that's a good idea!
@wong4359 2 years ago
I found your explanation far easier to understand than the edX online course I am taking, BAM!!!
@statquest 2 years ago
bam!
@manalisingh1128 2 years ago
Wow Josh, way to go!!!! You have the concepts so clear in your own head that they seem like a piece of cake for us 🍰♥️ Love from India! 🇮🇳
@statquest 2 years ago
Thanks so much!!
@jblacktube 1 year ago
I didn't even get through the jingle before I gave a thumbs up. Thanks for the chuckle; can't wait to watch the rest of this!
@statquest 1 year ago
BAM! :)
@mashmesh 4 years ago
Omg, protect this man at all costs, this was pure gold!!! Also, thank you, sir, for talking so slowly, because if my brain squiggles need to work faster they will burn up x)
@statquest 4 years ago
Glad you enjoyed it!
@O5MO 3 years ago
I never understood backpropagation. I knew some things from other tutorials, but as a beginner, I found it very hard to understand. This video (and probably the series) is the best I could find. Thank you.
@statquest 3 years ago
Glad it was helpful!
@codeman2 3 years ago
I searched for neural nets and again your video popped up, just 4 months old. I love getting your helpful videos right before my semester
@statquest 3 years ago
:)
@lukasaudir8 1 year ago
I am really glad that people like you exist!! Thank you so much for these incredible lessons
@statquest 1 year ago
Glad you like them!
@BlochSphere 10 months ago
The level of detail in this video is just 🤯 I hope I can make my Quantum Computing videos this clear!
@statquest 10 months ago
Good luck!
@edphi 1 year ago
It should be made a crime for anyone to see other videos on backpropagation before they reach StatQuest. The world is confused by teachers who tell the big story before the basics. Learn the basics and the picture falls into place, like the chain rule 😊
@statquest 1 year ago
bam! :)
@perhaps467 2 years ago
Thank you so much for this series! I haven't been able to find any other videos that really break down the mechanics of neural networks like this.
@statquest 2 years ago
Thanks!
@user-re1bi2bc8b 4 years ago
Incredible. Sometimes I need a refresher on these topics. There's much to remember as a data scientist. I'm so glad I found your channel!
@statquest 4 years ago
Bam!
@iliasaarab7922 3 years ago
Best explanation that I've seen so far on backpropagation!
@statquest 3 years ago
Thank you! :)
@madghostek3026 1 year ago
9:00 At this moment I realised I'm watching the best math content on earth, because you never see simple stuff like this being given attention. Luckily I already know how the summation symbol works, but I didn't know it in the past, and nobody cared to explain it. And it's not just about the summation symbol - imagine the other 1000 small things somebody might not understand, and doesn't realise they don't understand, because they've been skimmed over
@statquest 1 year ago
Thank you so much! I really appreciate it! :)
@maliknauman3566 3 years ago
How amazing is the way you convey complex concepts!
@statquest 3 years ago
Thank you!
@evie389 1 year ago
I was reading an article on backpropagation and I did not understand a single word. I had to watch all your videos, starting from the Chain Rule, Gradient Descent, NNs... I re-read the article and understood everything!!! But now I can't get the beep-boop and small/double/triple bam out of my head lol.
@statquest 1 year ago
BAM! I'm glad my videos were helpful! :)
@ABCEE1000 2 months ago
I have no idea how I can thank you as you deserve .. thank you so much
@statquest 2 months ago
Thanks! :)
@mjcampbell1183 2 years ago
Wow! This is an incredible video. Thank you SO MUCH for making this for us. This is one of the best videos I've seen to explain this concept. The hard work you have put into this is something that I am incredibly appreciative of. Thanks, man.
@statquest 2 years ago
Wow, thank you!
@nojoodothmanal-ghamdi1026 2 years ago
I . JUST . LOVE . YOUR . CHANNEL !! You literally explain things very clearly and step by step! I just cannot thank you enough, really
@statquest 2 years ago
Wow, thank you!
@deepanjan1234 4 years ago
This is really awesome. Thank you for your effort in developing this highly enriched content. BAM!!!
@statquest 4 years ago
Thank you!
@David5005ful 2 years ago
The type of in-depth video I've always wanted!
@statquest 2 years ago
Thank you!
@CHERKE_JEMA5575 2 years ago
You rescued me from the unknown!! Much love from Ethiopia
@statquest 2 years ago
Bam! :)
@aryabartarout5697 1 year ago
You have cleared my doubts on backpropagation, gradient descent, and the chain rule. Triple Bam!
@statquest 1 year ago
:)
@AhmadAbuNassar 4 months ago
Thank you very much for this comprehensive yet simple explanation
@statquest 4 months ago
Glad it was helpful!
@ucanhnguyen4751 4 years ago
Thank you for this video. I have been waiting for it this whole time. Finally, it appeared just 1 day before my exam. You are a life saver!!
@statquest 4 years ago
Good luck with your exam! :)
@knt2112 1 year ago
Hello sir, thanks for such a simple explanation; I never understood backpropagation in such depth with this much ease. 🎉
@statquest 1 year ago
Thank you!
@yiliu5403 1 year ago
Best Neural Networks lectures! Just ordered the book from Amazon to support!
@statquest 1 year ago
Wow! Thank you very much! :)
@willw4096 1 year ago
Thanks for the great video!! My notes: 7:23 8:11 8:48 10:00 10:22❗, 11:13 - 11:48, 11:56 12:08 13:30❗
@statquest 1 year ago
BAM! :)
@mrglootie101 4 years ago
I've been waiting for this the whole time, checking the notifications haha
@statquest 4 years ago
Hooray! The wait is over.
@epistemophilicmetalhead9454 1 year ago
Backpropagation (aka finding the w's and b's): Start with b_final = 0. You'll notice that the error = (observed - predicted)^2 is really high, so you use gradient descent on the squared error with respect to b_final to find the value of b_final for which the squared error is minimal. That is your optimal b_final. Gradient descent: the derivative of the sum of squared errors with respect to b_final = the derivative of the sum of squared errors with respect to the predicted value y × the derivative of y with respect to b_final. d(y observed - y predicted)^2 / d(y predicted) = -2 × (y observed - y predicted), and d(y predicted) / d(b_final) = d(sum of all those previous curves obtained through each node of the layer + b_final) / d(b_final) = 0 + 0 + ... + 0 + 1 = 1. Evaluate that slope at the current parameter values; then step size = slope × learning rate, and new b_final = old b_final - step size. Keep repeating until the slope approaches 0. This is how gradient descent works, and you've found your optimal b_final.
@statquest 1 year ago
double bam
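For anyone who wants to check the arithmetic in the notes above, here is a small Python sketch of that exact loop, optimizing only b_final. The data points, the fixed "squiggle", and the learning rate are all invented for illustration:

```python
# Toy data: (input, observed) pairs -- invented values
data = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)]

def squiggle(x):
    # Stand-in for the fixed part of the network's output (everything but b_final)
    return 1.2 * x - 1.1 * x * x

b_final = 0.0          # start at zero, as described above
learning_rate = 0.1
for _ in range(200):
    # Chain rule: d(SSR)/d(b_final) = sum of -2 * (observed - predicted) * 1
    slope = sum(-2.0 * (y - (squiggle(x) + b_final)) for x, y in data)
    step_size = slope * learning_rate
    b_final = b_final - step_size
    if abs(step_size) < 1e-6:  # slope is close enough to zero: stop
        break

print(round(b_final, 4))   # the optimal b_final for this toy setup
```

The loop stops when the step size (slope × learning rate) is tiny, i.e., when the slope of the squared error with respect to b_final is essentially zero, exactly as the notes describe.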
@terrepus9856 4 years ago
The timing couldn't be more perfect ... 3 hours before my machine learning exam!! Thank you!!!!
@statquest 4 years ago
Good luck with your exam! I hope it goes well.
@aviknash 3 years ago
Excellent job, Josh!!! Just loved it!!! Thanks a ton for your fun-filled tutorials :)
@statquest 3 years ago
Glad you like them!
@utkugulgec5508 4 years ago
These videos should be protected at all costs
@statquest 4 years ago
:)
@rodrigovm 25 days ago
Best teacher ever. Even better than Andrew Ng.
@statquest 24 days ago
Thank you!
@viethoalam9958 7 months ago
With all due respect to my math teacher, this is so much easier to understand.
@statquest 7 months ago
bam! :)
@chrislee4531 2 years ago
I learn more from four of your videos than from 200 pages of textbook gibberish
@statquest 2 years ago
Thanks!
@Luxcium 1 year ago
Wow 😮 I didn't know I had to watch *Gradient Descent, Step-by-Step!!!* before I could watch *Neural Networks Part 2*, which I must watch before *The StatQuest Introduction To PyTorch*, before I can watch *Introduction to coding neural networks with PyTorch and Lightning* 🌩️ (it's something related to the cloud, I understand). I am genuinely so happy to learn about this stuff with you, Josh ❤ I will go watch the other videos first and then I will backpropagate to this video...
@statquest 1 year ago
Getting warmer...
@aminmoghaddam7624 8 months ago
I wish our lecturers watched these videos before trying to make their own teaching slides! (With acknowledgement, of course!)
@statquest 8 months ago
bam!
@superk9059 2 years ago
Thank you very much for your video~ Your videos make me feel that studying English makes so much sense; otherwise, I couldn't enjoy such a beautiful thing~ 👍👍👍❤❤❤
@statquest 2 years ago
WOW! Thank you very much!!! And thank you for your support!!! :)
@josephif 1 year ago
The lecture was awesome - more effective and easy to understand. Thanks
@statquest 1 year ago
Thank you! :)
@VishalKhopkar1296 2 years ago
You taught this better than professors at CMU, not kidding
@statquest 2 years ago
Thank you! :)
@richarda1630 3 years ago
Where were you 5 years ago???!?!?! :D Awesome work, man! Keep it up :)
@statquest 3 years ago
Thanks! I have 4 more neural network videos coming out in the next month.
@richarda1630 3 years ago
@statquest awesome! Can't wait :D
@d_polymorpha 11 months ago
Hello, thank you for the video! This series has been really helpful for learning about deep learning. I have a couple of questions. 1. When using gradient descent and backpropagation, do we always use the SSR to measure how good a fit the parameter we are estimating is? Or are there other ways? 2. The second question is about using the chain rule to calculate derivatives. The first part is d SSR / d Predicted. In that first part @ 11:25, are you using the chain rule again within that first part? And when taking the derivative of the inside, Observed - Predicted, @ 11:34, where do you get the 0 and 1 from?
@statquest 11 months ago
1. The "loss function" we use for gradient descent depends on the problem we are trying to solve. In this case, we can use the SSR. However, another commonly used "loss function" is called Cross Entropy. You can learn more about cross entropy here: kzbin.info/www/bejne/bHLVhKypatZ7d7c and kzbin.info/www/bejne/rnOomWlsi56akNE 2. You can learn how the chain rule works (and understand the 0 and 1) here: kzbin.info/www/bejne/rZ2Unqyup9mEfrM
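As a concrete footnote to that answer, here is a tiny Python comparison of the two loss functions on invented numbers: the SSR treats the outputs as plain values, while (binary) cross entropy treats them as probabilities:

```python
import math

# Invented labels and predicted probabilities, just for illustration
observed = [1.0, 0.0, 1.0]    # class labels
predicted = [0.9, 0.2, 0.6]   # model outputs in (0, 1)

# Sum of squared residuals: penalizes the squared distance to each label
ssr = sum((o - p) ** 2 for o, p in zip(observed, predicted))

# Binary cross entropy: penalizes assigning low probability to the true label
cross_entropy = sum(-(o * math.log(p) + (1 - o) * math.log(1 - p))
                    for o, p in zip(observed, predicted))

print(round(ssr, 3), round(cross_entropy, 3))
```

Either quantity can be plugged into gradient descent as the thing to minimize; which one fits best depends on whether the outputs are continuous values or probabilities.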
@alexissanchezbro 3 years ago
You're getting better and better. Thank you
@alexissanchezbro 3 years ago
BAAAAAAAMMM
@statquest 3 years ago
:)
@SPLICY 4 years ago
The understated BAM at 4:40 cracked me up 😂
@statquest 4 years ago
SPLICY in the house!!! BAM! :)
@marpin6162 4 years ago
Thank you. Now everything is much clearer.
@statquest 4 years ago
BAM! :)
@miriza2 4 years ago
BAM! Thanks Josh! You're the best! Got myself a pink T-shirt 😍😍😍
@statquest 4 years ago
Hooray! And thank you for supporting StatQuest!!!
@royazullay7556 1 year ago
That Josh guy is just awesome!! I will definitely support!!
@statquest 1 year ago
Thank you!
@preetikharb8283 3 years ago
This video made my day. Thank you so much, Josh!!
@statquest 3 years ago
Thanks!
@igorg4129 4 years ago
Josh, finished watching. Thank you again. 1. If I, as a researcher, know roughly which range of inputs I am going to insert and which range of outputs I expect to get in the end, would I want to adjust the range of the weights from the very beginning, maybe the distribution of the weights, and the same for the biases and activation functions? Or, today, do we let the algorithm do this job? 2. The most interesting question: let's say that while finding the prediction curve we kind of discover some "hidden truth". I think our curve might never be exact, partly because we do not know all of the independent variables which in nature affect our dependent variable. Say we know one, but there is another one which we do not know about. If so, would it be right to say that when a neural network with one input splits the input by different weights into two neurons of a hidden layer (from which the final output is calculated), it is something like simulating the presence of another "secret independent variable", even without knowing what it is? Thanks
@statquest 4 years ago
I'll be honest, I'm not sure how to answer question #1. I don't know. I do know that some of the methods used for initializing the weights with random values increase the variation allowed in the values based on how many layers are in the neural network - so that might do the trick. As for the second question: adding the second node in the hidden layer allows the squiggle to go up *and* go down. If I just had one node, I would only be able to go up *or* down. So, in some sense, that is sort of like adding a secret independent variable.
@igorg4129 4 years ago
@statquest I also thought this way. Thank you again and again; you are doing a titanic job here, Josh. If not for you, I wouldn't be here asking new questions. :)
@ZachariahRosenberg 4 years ago
@igorg4129 It's tempting to want to initialize weights to a target range in the hopes of speeding up convergence; however, this might actually be counterproductive. The weights of individual nodes do not have to conform to the same distribution as your output. When you use an appropriate (adaptive) optimizer, it should be able to tune the weights pretty quickly, considering that the first few passes will likely have larger gradients.
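On the initialization point raised in this thread, here is a sketch of one widely used scheme of the kind mentioned above, often called Xavier (Glorot) initialization, where the spread of the random starting weights is scaled by the size of each layer rather than by the target output range; the layer sizes below are just example values:

```python
import math
import random

def init_layer(n_in, n_out):
    # Xavier/Glorot uniform initialization: the spread shrinks as the
    # layer gets bigger, which helps keep signals from exploding or vanishing
    limit = math.sqrt(6.0 / (n_in + n_out))
    return [[random.uniform(-limit, limit) for _ in range(n_out)]
            for _ in range(n_in)]

weights = init_layer(n_in=1, n_out=2)  # e.g., one input feeding two hidden nodes
print(weights)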
@zombieeplays3146 7 months ago
I come to this channel for the intros, tbh!
@statquest 7 months ago
bam! :)
@danielkim4151 1 month ago
Wow! Thank you so much for this video!!!! It helped me so much with my AI assignment.
@statquest 1 month ago
Glad it helped!
@JamesWasTakenOhWell 1 year ago
Thank you for the amazing effort you put into this video, and BAM!!! as always!
@statquest 1 year ago
Thanks!
@xuantungnguyen9719 4 years ago
StatQuest is the best
@statquest 4 years ago
Thank you very much! :)
@DanielRamBeats 1 year ago
This is all finally making sense to me, thank you
@statquest 1 year ago
Thanks!
@Morais115 4 years ago
I'm buying the shirt! Kudos to you, sir.
@statquest 4 years ago
Awesome! Thank you!
@constantthomas3830 3 years ago
Thank you from France
@statquest 3 years ago
Merci! :)
@abhishekm4996 4 years ago
Much awaited.... It finally came..
@statquest 4 years ago
Bam! :)
@SPLICY 4 years ago
That's what she said
@tuhinsuryachakraborty 1 month ago
Long live StatQuest
@statquest 1 month ago
bam! :)
@ssanand3 3 years ago
I wish someday you'd make a video in person so that we can see the saint behind the voice 😀
@statquest 3 years ago
:)
@zhenyuhe1537 2 years ago
You can imagine how complicated professors make this when even StatQuest has to use equations
@statquest 2 years ago
:)
@ge13r 6 months ago
Greetings from San Cristóbal, Venezuela!!!
@statquest 6 months ago
:)
@amirhossientakeh5540 2 years ago
Perfect - you explain complicated things very understandably. It's amazing
@statquest 2 years ago
Thank you very much! :)
@akashsoni5870 4 years ago
Thanks a lot, Sir. I was waiting for this
@statquest 4 years ago
Bam! :)
@lisun7158 2 years ago
[Notes] 3:00 "Backpropagation starts from the last parameter and works its way backwards to estimate all of the other parameters." 10:40 The Chain Rule. [Question] 3:22 Why is it reasonable to assume all the other optimal parameters are known except the last one in this part? Because in practice, we use dynamic programming (backpropagation) to save computation, i.e., calculating the last layer first, then working backward through the NN. -- ref.: kzbin.info/www/bejne/n6rRY62adrGcn5o&ab_channel=StatQuestwithJoshStarmer kzbin.info/www/bejne/fXy9oIJ-jayWgtE&ab_channel=StatQuestwithJoshStarmer
@statquest 2 years ago
Triple bam! :)
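To illustrate the "work backwards and reuse" idea from those notes, here is a minimal Python sketch: the derivative of the SSR with respect to the prediction is computed once at the output and then shared by every upstream parameter via the chain rule (all numbers are invented):

```python
# One training sample, invented numbers
observed, predicted = 1.0, 0.7

# Computed once at the output...
d_ssr_d_pred = -2.0 * (observed - predicted)

# ...then reused for each parameter further back in the network.
d_pred_d_bfinal = 1.0   # prediction = (sum of scaled curves) + b_final
d_pred_d_w3 = 0.55      # hypothetical: a hidden node's output for this sample

d_ssr_d_bfinal = d_ssr_d_pred * d_pred_d_bfinal
d_ssr_d_w3 = d_ssr_d_pred * d_pred_d_w3
print(d_ssr_d_bfinal, d_ssr_d_w3)
```

That shared factor is why starting from the last parameter is cheap: each step backwards only multiplies one new local derivative onto work that was already done.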
@jieunboy 2 years ago
Insane teaching quality, thanks!
@statquest 2 years ago
Glad you think so!
@vokoramusyuriy106 1 year ago
Thanks a lot, Josh!
@statquest 1 year ago
My pleasure!
@Cam-su7os 2 years ago
It actually baffles me how videos like this are free, yet we have to pay astronomical amounts for sub-standard tuition where we are made to feel stupid.
@statquest 2 years ago
bam! :)
@teetanrobotics5363 4 years ago
Waiting for a complete DL playlist
@statquest 4 years ago
noted
@vishnukeyen7244 3 years ago
I have a small suggestion for your otherwise AWESOME videos: please use colorblind-friendly colors. Green and red are indistinguishable for me, so sometimes I don't get the visual distinction you are drawing! Again, it's not a complaint, just a suggestion. Thanks a lot for making these videos; I cannot stress enough how useful they are for me!