The full Neural Networks playlist, from the basics to deep learning, is here: kzbin.info/www/bejne/eaKyl5xqZrGZetk Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
@motherisape2 жыл бұрын
Bamm
@gbchrs2 жыл бұрын
@seanleith5312 Жыл бұрын
Quit the singing, please
@statquest Жыл бұрын
@@seanleith5312 Noted
@dcscla4 жыл бұрын
Man, your promotions are not shameless! Actually, what you do is a gift for us; for the price that you charge and the level of the content, we are being gifted, not buying something. You are far better than a lot of paid (and expensive) courses. Just check out your video comments to see how happy people feel when they discover your videos!! Great work as always. Thank you so much!!!👏🏻👏🏻👏🏻👏🏻
@statquest4 жыл бұрын
Thank you very much! :)
@Luxcium Жыл бұрын
He is using the concept of reverse psychology by presenting great stuff at a good price, and as you mentioned, these promotions are not shameless… They are shameful: as you hinted, he should indeed be ashamed of giving us such a good and advantageous offer… 😅😅😅😅
@jonforce3604 жыл бұрын
You released this video just in time for my AI exam! Thank you. Sometimes I think professors use really complex notation just to feel smarter than students; it doesn't help learning. I love your content.
@statquest4 жыл бұрын
Thank you very much!
@sarazahoor91332 жыл бұрын
I want to copy-paste this comment! :D
@puppergump41172 жыл бұрын
Ain't that right. They must be mad that they don't understand the actually smart people so they don't want to be understood either.
@zhongtianjackwang5346 Жыл бұрын
lol, that is exactly what I want to say
@ElNick092 жыл бұрын
I have been a student my entire life and have taught college-level courses myself, and I must say you are one of the finest lecturers I have ever seen. This StatQuest is a gem. Your work is so succinct and clear that it's as much art as it is instruction. Thank you for this incredible resource!
@statquest2 жыл бұрын
Thank you very much! :)
@juliocerono_stone53654 ай бұрын
I am already 65, and your videos have helped me understand the basics behind NN. Thank you so much!!!!
@statquest4 ай бұрын
Bam! :)
@iskrega2 жыл бұрын
I just want you to know your channel has been instrumental in helping me towards my Data Science degree, I'm currently in my last semester. I'll be forever grateful for your channel and the time you take to make these videos. Thank you so much.
@statquest2 жыл бұрын
Thank you and good luck with your final semester! BAM! :)
@hamzasaaran30112 жыл бұрын
I am studying for a Master's degree in bioinformatics now, and as someone who knows little about statistics, I really can't thank you enough for your videos and the effort that you have put into them.
@statquest2 жыл бұрын
Thank you!
@akeslx Жыл бұрын
I finished business school 25 years ago where I studied statistics and math. So happy to see that neural networks are fundamentally just a (much) more advanced regression analysis.
@statquest Жыл бұрын
BAM!!! Thank you for supporting StatQuest! Yes, neural networks are a lot like regression, but now we can fit non-linear shapes to the data, and we don't have to know in advance what that shape should be. Given enough activation functions and hidden layers, the neural network can figure it out on its own.
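A minimal sketch of that reply's point, assuming PyTorch (which later videos in this playlist use); the 1-2-1 architecture, the softplus activation, and the toy data below are made up for illustration and are not from the video:

```python
# Sketch: "regression that can bend". One hidden layer with a non-linear
# activation lets the model fit a squiggle instead of a straight line.
import torch
from torch import nn

x = torch.linspace(0, 1, 20).reshape(-1, 1)   # toy inputs, e.g. doses from 0 to 1
y = torch.sin(3.14 * x)                       # a non-linear "truth" to fit

model = nn.Sequential(
    nn.Linear(1, 2),    # two hidden nodes, each with its own weight and bias
    nn.Softplus(),      # the activation function that bends the straight lines
    nn.Linear(2, 1),    # scale and sum the two curves, then add a final bias
)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
for step in range(10000):
    optimizer.zero_grad()
    ssr = ((model(x) - y) ** 2).sum()   # sum of squared residuals
    ssr.backward()                      # backpropagation fills in the derivatives
    optimizer.step()                    # gradient descent nudges every weight and bias

print(f"final SSR: {ssr.item():.4f}")   # a small SSR means the squiggle fits the data
```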
@filipwojcik4133Ай бұрын
I started my data science journey back in 2010. At that time, I struggled a lot to learn backpropagation for NNs, because I wanted to fully understand what was going on. Years passed, I worked at data science industry giants, earned my Ph.D. in the field along the way, and started teaching my own students. I can say one thing: StatQuest's backpropagation explanation is the best I've ever seen. I redirect all my students to this series of videos, as I believe there is no better, cleaner, no-bull***t explanation available. Most of the videos and tutorials out there make it even more complicated by introducing **FANCY NOTATION** with deltas, nablas, etc., or other extra symbols. This one is simple, pure, and gets the job done. StatQuest, this is excellent work; I wish we had more handbooks and papers written as clearly as your videos + books. All the best wishes!
@statquestАй бұрын
Thank you very much! I really appreciate it. I'm going to have a new book with all of these details about neural networks coming out in the next few months.
@filipwojcik4133Ай бұрын
@@statquest I'm really glad to hear that! Fingers crossed! It will be No. 1 on my reading list, and recommended reading for my students as well :) Keep up the pace, that's awesome! TRIPLE BAM!!!!!
@adirozeri71624 күн бұрын
You are amazing. Thank you for taking this long road of showing us all the calculations. It really deepens my understanding and intuition!
@statquest4 күн бұрын
Thank you!
@evangeliamm3 жыл бұрын
You have no idea how much I appreciate your work. Your explanations are so fun and simple, I'm just so grateful!
@statquest3 жыл бұрын
Thank you very much! :)
@yasameenmohammed4366 Жыл бұрын
My Machine Learning exam is tomorrow and re-watching your videos to review concepts is helping me so much! Thank you!!!
@statquest Жыл бұрын
Good luck! BAM! :)
@Amir-gc8re2 жыл бұрын
Finally a proper, detailed, step-by-step explanation. This guy is absolutely AMAZING! Thank you so much for all the hard work in putting these videos together for us.
@statquest2 жыл бұрын
Thank you very much! :)
@jays95913 жыл бұрын
May I say... you are such a good teacher that it is most enjoyable to watch your videos. I am proficient in statistics (via university econometrics 101), and I did not realise that all those fancy terms in machine learning are actually concepts that were common items in the stats I learned in the 1970s, e.g., biases and weights, labels, activation functions, etc. Anyway, I can see that a lot of viewers appreciate your work and teaching. I have also 'updated' myself. Thank you.
@statquest3 жыл бұрын
Thank you very much!
@babarali43132 жыл бұрын
It's the teacher who makes the subject easy or difficult, and the way you explained neural networks, I am speechless
@statquest2 жыл бұрын
Thanks!
@raunak53449 ай бұрын
I just iterated on gradient descent and found that this is the best possible way to teach this topic; no other lecture in existence is better than this one
@statquest9 ай бұрын
bam!
@katwoods85143 жыл бұрын
Love this! You've explained it far better than anywhere else I've seen, and you made it entertaining at the same time! Thank you so much for making this.
@statquest3 жыл бұрын
Awesome, thank you!
@advaithsahasranamam61702 жыл бұрын
This is excellent stuff! As a visual learner, your channel is a BLESSING. Thank you so much for your fantastic work on breaking down concepts into small, bite-sized pieces. It's much less intimidating, and you deserve so much more appreciation. You also gained my subscription to your channel! Keep doing a great job, and thank you SO MUCH for having my back!
@statquest2 жыл бұрын
Thank you very much!!! :)
@mot73 жыл бұрын
You are the best. I wish every ML learner would find you first. I am going to do my part and tweet about you. Thanks for making these videos! Wish you more success.
@statquest3 жыл бұрын
Wow! Thank you very much! I really appreciate the support. BAM! :)
@ML-jx5zo4 жыл бұрын
Now I am reading about backpropagation. I was worried this video wouldn't come for a long time, and finally I got a treasure.
@statquest4 жыл бұрын
bam! :)
@tagoreji21432 жыл бұрын
Teaching such complicated topics in a simple, easily understandable way. 👏👏👏 Thank you, Professor
@statquest2 жыл бұрын
Thanks!
@hyonnj95638 ай бұрын
Honestly, you do a much better job teaching with a pre-recorded video than my instructors do with both the written and live materials that I'm paying for.
@statquest8 ай бұрын
I'm glad my videos are helpful! :)
@edrobinson82484 ай бұрын
Simply brilliant. Learning is indeed a quest: a quest for someone who understands and can present understandably. Thanks.
@statquest4 ай бұрын
Thanks!
@TheClearwall4 жыл бұрын
Who else is using these videos to put together a semester project? So far, I've used Regression Trees, K-fold CV, complexity pruning, and now Neural Networks for my final model construction. Josh is worth a double bam every time.
@statquest4 жыл бұрын
BAM! Good luck with your project.
@subusrable5 ай бұрын
this video is a gem. I had to watch it a few times and, like in gradient descent, I got closer to the target level of knowledge with each step :)
@statquest5 ай бұрын
BAM! :)
@wong43592 жыл бұрын
I found your explanation far easier to understand than the edX online course I am taking. BAM!!!
@statquest2 жыл бұрын
bam!
@ksrajavel4 жыл бұрын
Finally. The wait is overBAM!!!
@statquest4 жыл бұрын
TRIPLE BAM!!!
@mariolira92793 жыл бұрын
F I F T H B A M!
@syco_Rax3 жыл бұрын
SUPER BAM!!!
@BlochSphere9 ай бұрын
The level of detail in this video is just 🤯 I hope I can try to make my Quantum Computing videos this clear!
@statquest9 ай бұрын
Good luck!
@mohammadrahman11263 жыл бұрын
Amazing explanation! I've spent years trying to learn this, and it always went too quickly into the gory mathematical details. The aha moment for me was when the green squiggle equaled the blue plus orange squiggles lol. Thank you for this, Josh!!!
@statquest3 жыл бұрын
Glad it was helpful!
@amandak13963 жыл бұрын
Kind of like how Feynman reduced the gory math in physics to actual squiggles, double bam!
@katwoods85143 жыл бұрын
omg yay! I just discovered that you've made a million videos on ML. I'm going to go binge all of them now :D
@statquest3 жыл бұрын
Hope you enjoy!
@motherisape2 жыл бұрын
Bamm
@lukasaudir8 Жыл бұрын
I am really glad that people like you exist!! Thank you so much for those incredible lessons
@statquest Жыл бұрын
Glad you like them!
@codeman23 жыл бұрын
I searched for neural nets and again your video popped up, and it's just 4 months old. I love getting your helpful videos right before my semester.
@statquest3 жыл бұрын
:)
@jblacktube Жыл бұрын
I didn't even get through the jingle before I gave a thumbs up. Thanks for the chuckle, can't wait to watch the rest of this!
@statquest Жыл бұрын
BAM! :)
@O5MO3 жыл бұрын
I never understood backpropagation. I knew some things from other tutorials, but as a beginner, it was very hard to understand. This video (and probably the whole series) is the best I could find. Thank you.
@statquest3 жыл бұрын
Glad it was helpful!
@jennystephens32154 жыл бұрын
Josh, this is amazing. You really make things so easy to visualise, which is crazy considering neural networks are meant to be so hard that they are referred to as a black box! Thanks for all your videos. I have used heaps of them over the last twelve months. Thank you again.
@statquest4 жыл бұрын
Hooray!!! I'm so glad that you like my videos. :)
@ABCEE10002 ай бұрын
I have no idea how I can thank you as you deserve... thank you so much
@statquest2 ай бұрын
Thanks! :)
@dinara85713 жыл бұрын
JUST WOW! Thank you so much, Josh! I cannot express the feeling I had when EVERYTHING made sense!!! TRIPLE BAM! Never thought I would be extremely excited to pause the video and try to solve everything by hand before I look at the next steps
@statquest3 жыл бұрын
BAM! :)
@NadaaTaiyab2 жыл бұрын
oh that's a good idea!
@AhmadAbuNassar4 ай бұрын
Thank you very much for this comprehensive yet simple explanation
@statquest4 ай бұрын
Glad it was helpful!
@mashmesh4 жыл бұрын
Omg, protect this man at all costs, this was pure gold!!! Also, thank you, sir, for talking so slowly because if my brain squiggles need to work faster they will burn up x)
@statquest4 жыл бұрын
Glad you enjoyed it!
@sarazahoor91332 жыл бұрын
For the first time ever in history, I have understood the concept behind Neural Networks! BAM!!!! :D Thanks Josh, so grateful :)
@statquest2 жыл бұрын
BAM! :)
@manalisingh11282 жыл бұрын
Wow Josh way to go!!!! You have the concepts so clear in your own head that it seems a piece of cake for us 🍰♥️ Love from India! 🇮🇳
@statquest2 жыл бұрын
Thanks so much!!
@madghostek3026 Жыл бұрын
9:00 At this moment I realised I'm watching the best math content on earth, because you never see simple stuff like this being given attention. Luckily I already knew how the summation symbol works, but I didn't know it in the past, and nobody cared to explain it. And it's not just about the summation symbol: imagine the other 1000 small things somebody might not understand, and doesn't realise they don't understand, because they've been skimmed over
@statquest Жыл бұрын
Thank you so much! I really appreciate it! :)
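For anyone who lands here before that part of the video: the summation symbol is just a compact for-loop. A tiny, self-contained illustration (mine, not from the video):

```python
# The big sigma means "add these up": the sum over i from 1 to n of x_i.
x = [3.0, 1.5, 2.5]

total = 0.0
for x_i in x:        # i runs over every term
    total += x_i     # accumulate each one

print(total)         # 7.0, identical to the built-in sum(x)
```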
@perhaps467 Жыл бұрын
Thank you so much for this series! I haven’t been able to find any other videos that really break down the mechanics of neural networks like this.
@statquest Жыл бұрын
Thanks!
@user-re1bi2bc8b4 жыл бұрын
Incredible. Sometimes I need a refresher on these topics. There’s much to remember as a data scientist. I’m so glad I found your channel!
@statquest4 жыл бұрын
Bam!
@maliknauman35662 жыл бұрын
How amazing is the way you convey complex concepts.
@statquest2 жыл бұрын
Thank you!
@edphi Жыл бұрын
It should be made a crime for anyone to see other videos on backpropagation before they reach StatQuest. The world is confused by teachers who tell the big story before the basics. Learn the basics and the picture falls into place, like the chain rule 😊
@statquest Жыл бұрын
bam! :)
@yiliu5403 Жыл бұрын
Best Neural Networks Lectures! Just ordered the book from Amazon to support!
@statquest Жыл бұрын
Wow! Thank you very much! :)
@iliasaarab79223 жыл бұрын
Best explanation that I've seen so far on backpropagation!
@statquest3 жыл бұрын
Thank you! :)
@knt2112 Жыл бұрын
Hello sir, thanks for such a simple explanation; I never understood backpropagation in such depth and with such ease. 🎉
@statquest Жыл бұрын
Thank you!
@CHERKE_JEMA55752 жыл бұрын
You rescued me from the unknown!! Much Love from Ethiopia
@statquest2 жыл бұрын
Bam! :)
@nojoodothmanal-ghamdi10262 жыл бұрын
I . JUST . LOVE . YOUR . CHANNEL !! You literally explain things very clearly and step by step! I just cannot thank you enough, really
@statquest2 жыл бұрын
Wow, thank you!
@aryabartarout5697 Жыл бұрын
You have cleared my doubts about backpropagation, gradient descent, and the chain rule. Triple Bam!
@statquest Жыл бұрын
:)
@David5005ful2 жыл бұрын
The type of in depth video I’ve always wanted!
@statquest2 жыл бұрын
Thank you!
@rodrigovm13 күн бұрын
Best teacher ever. Even better than Andrew Ng.
@statquest12 күн бұрын
Thank you!
@ucanhnguyen47514 жыл бұрын
Thank you for this video. I have been waiting for this all the time. Finally, it appeared just 1 day before my exam. You are a life saver!!
@statquest4 жыл бұрын
Good luck with your exam! :)
@terrepus98564 жыл бұрын
The timing couldn't be more perfect... 3 hours before my machine learning exam!! Thank you!!!!
@statquest4 жыл бұрын
Good luck with your exam! I hope it goes well.
@viethoalam99587 ай бұрын
All respect to my math teacher, but this is so much easier to understand.
@statquest7 ай бұрын
bam! :)
@chrislee45312 жыл бұрын
I learn more from four of your videos than 200 pages of textbook gibberish
@statquest2 жыл бұрын
Thanks!
@mjcampbell11832 жыл бұрын
Wow! This is an incredible video. Thank you SO MUCH for making this for us. This is one of the best videos I've seen to explain this concept. The hard work you have put into this is something that I am incredibly appreciative of. Thanks, man.
@statquest2 жыл бұрын
Wow, thank you!
@superk90592 жыл бұрын
Thank you very much for your videos~ Your videos make me feel that studying English makes so much sense; otherwise I couldn't enjoy such a beautiful thing~ 👍👍👍❤❤❤
@statquest2 жыл бұрын
WOW! Thank you very much!!! And thank you for your support!!! :)
@josephif Жыл бұрын
The lecture was awesome, more effective and easy to understand. Thanks
@statquest Жыл бұрын
Thank you! :)
@evie389 Жыл бұрын
I was reading an article on backpropagation and I did not understand a single word. I had to watch all your videos starting from the Chain Rule, Gradient Descent, NNs... I re-read the article and understood everything!!! But now I can't get the beep-boop and small/double/triple bam out of my head lol.
@statquest Жыл бұрын
BAM! I'm glad my videos were helpful! :)
@aminmoghaddam76248 ай бұрын
I wish our lecturers watched these videos before trying to make their own teaching slides! (With acknowledgement of course!)
@statquest8 ай бұрын
bam!
@deepanjan12344 жыл бұрын
This is really awesome. I thank you for your effort in developing this highly enriched content. BAM !!!
@statquest4 жыл бұрын
Thank you!
@willw4096 Жыл бұрын
Thanks for the great video!! My notes: 7:23, 8:11, 8:48, 10:00, 10:22❗, 11:13 - 11:48, 11:56, 12:08, 13:30❗
@statquest Жыл бұрын
BAM! :)
@d_polymorpha10 ай бұрын
Hello, thank you for the video! This series has been really helpful for learning about deep learning. I have a couple of questions. 1. When using gradient descent and backpropagation, do we always use the SSR to measure how good a fit the parameter we are estimating is? Or are there other ways? 2. The second question is about using the chain rule for calculating derivatives. The first part is d SSR / d Predicted. In that first part @ 11:25, are you using the chain rule again within that first part? And when deriving the inside term, Observed - Predicted, @ 11:34, where do you get the 0 and 1 from?
@statquest10 ай бұрын
1. The "loss function" we use for gradient descent depends on the problem we are trying to solve. In this case, we can use the SSR. However, another commonly used "loss function" is called Cross Entropy. You can learn more about cross entropy here: kzbin.info/www/bejne/bHLVhKypatZ7d7c and kzbin.info/www/bejne/rnOomWlsi56akNE 2. You can learn how the chain rule works (and understand the 0 and 1) here: kzbin.info/www/bejne/rZ2Unqyup9mEfrM
@royazullay7556 Жыл бұрын
That Josh guy is just awesome!! Definitely will support!!
@statquest Жыл бұрын
Thank you!
@danielkim415125 күн бұрын
Wow! Thank you so much for this video!!!! It helped me so much for my AI assignment.
@statquest24 күн бұрын
Glad it helped!
@VishalKhopkar12962 жыл бұрын
you taught this better than professors at CMU, not kidding
@statquest2 жыл бұрын
Thank you! :)
@Luxcium Жыл бұрын
Wow 😮 I didn't know I had to watch *Gradient Descent, Step-by-Step!!!* before I could watch *Neural Networks Part 2*, which I must watch before *The StatQuest Introduction To PyTorch...*, which comes before *Introduction to coding neural networks with PyTorch and Lightning* 🌩️ (it's something related to the cloud, I understand). I am genuinely so happy to learn about this stuff with you, Josh ❤ I will go watch the other videos first and then I will backpropagate to this video...
@statquest Жыл бұрын
Getting warmer...
@mrglootie1014 жыл бұрын
I've been waiting for this the whole time, checking my notifications haha
@statquest4 жыл бұрын
Hooray! The wait is over.
@alexissanchezbro3 жыл бұрын
You're getting better and better. Thank you
@alexissanchezbro3 жыл бұрын
BAAAAAAAMMM
@statquest3 жыл бұрын
:)
@utkugulgec55084 жыл бұрын
These videos should be protected at all costs
@statquest4 жыл бұрын
:)
@epistemophilicmetalhead9454 Жыл бұрын
Backpropagation (aka finding the w's and b's): start with b_final = 0. You'll notice that the error = (actual - predicted)^2 is really high, so you use gradient descent on the squared error with respect to b_final and find the value of b_final for which the squared error is minimal. That is your optimal b_final. Gradient descent: the derivative of the sum of squared errors with respect to b_final = the derivative of the sum of squared errors with respect to the predicted value y × the derivative of y with respect to b_final. d(y observed - y predicted)^2 / d(y predicted) = -2 × (y observed - y predicted), and d(y predicted) / d(b_final) = d(sum of all those previous curves obtained through each node of the layer + b_final) / d(b_final) = 0 + 0 + ... + 0 + 1 = 1. Plug the predicted curve's x-values in to find the derivative/slope, then step size = slope × learning rate, and new b_final = old b_final - step size. Keep repeating until the slope reaches 0. This is how gradient descent works, and you've found your optimal b_final.
@statquest Жыл бұрын
double bam
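A bare-bones sketch of the loop summarized in the comment above, in plain Python; the observed values and the squiggle outputs are invented here just so the loop has numbers to work with:

```python
# Gradient descent for b_final only: start at 0, step by slope * learning rate,
# and stop once the slope is essentially 0.
observed            = [0.0, 1.0, 0.0]
predicted_without_b = [-0.3, 0.6, 0.2]   # hypothetical squiggle values before adding b_final

b_final       = 0.0
learning_rate = 0.1

for step in range(1000):
    # d(SSR)/d(b_final) = sum of -2 * (observed - predicted), because
    # d(predicted)/d(b_final) = 1
    slope = sum(-2 * (obs - (pred + b_final))
                for obs, pred in zip(observed, predicted_without_b))
    step_size = slope * learning_rate
    b_final  -= step_size                # new b_final = old b_final - step size
    if abs(slope) < 1e-6:                # the slope has flattened out, so stop
        break

print(b_final)   # settles at the b_final that minimizes the SSR for these numbers
```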
@amirhossientakeh55402 жыл бұрын
Perfect. You explain complicated things in a very understandable way; it's amazing.
@statquest2 жыл бұрын
Thank you very much! :)
@aviknash3 жыл бұрын
Excellent job Josh!!! Just loved it!!! Thanks a ton for your fun-filled tutorials :)
@statquest3 жыл бұрын
Glad you like them!
@DanielRamBeats Жыл бұрын
This is finally all making sense to me thank you
@statquest Жыл бұрын
Thanks!
@alhaanali2502 Жыл бұрын
You've got the best way to teach, thank you ❤
@statquest Жыл бұрын
Thanks!
@ge13r6 ай бұрын
Greetings from San Cristóbal, Venezuela!!!
@statquest6 ай бұрын
:)
@cthutu10 ай бұрын
Excellent content, excellent delivery - just bought your book!
@statquest10 ай бұрын
Thank you so much for supporting StatQuest! BAM! :)
@jieunboy2 жыл бұрын
insane teaching quality, thanks !
@statquest2 жыл бұрын
Glad you think so!
@zombieeplays31467 ай бұрын
I come to this channel for the intros tbh!
@statquest7 ай бұрын
bam! :)
@marpin61624 жыл бұрын
Thank you. Now everything is clearer.
@statquest4 жыл бұрын
BAM! :)
@JamesWasTakenOhWell Жыл бұрын
Thank you for the amazing effort you put into this video and BAM!!! as always!
@statquest Жыл бұрын
Thanks!
@preetikharb82833 жыл бұрын
This video made my day, thank you so much, Josh!!
@statquest3 жыл бұрын
Thanks!
@constantthomas38303 жыл бұрын
Thank you from France
@statquest3 жыл бұрын
Merci! :)
@harishbattula26723 жыл бұрын
Great explanation, sir. Triple BAM... kudos on your presentation.
@statquest3 жыл бұрын
Thank you! :)
@igorg41294 жыл бұрын
Thanks Josh! You're simply the best
@statquest4 жыл бұрын
Thank you very much. I can't wait to get the other videos out soon.
@Morais1154 жыл бұрын
I'm buying the shirt! Kudos to you sir.
@statquest4 жыл бұрын
Awesome! Thank you!
@gilao5 ай бұрын
Another great one. Thanks!
@statquest5 ай бұрын
Thanks again!
@mike___-fi5kp Жыл бұрын
You always are the best.
@statquest Жыл бұрын
Thanks!
@igorg41294 жыл бұрын
Josh, finished watching. Thank you again. 1. If I, as a researcher, know roughly (+/-) which range of inputs I am going to insert and which range of outputs I expect to get in the end, would I want to adjust the weight range somehow from the very beginning (maybe the weight distribution, and the same for the biases and activation functions), or do we let the algorithm do this job today? 2. The most interesting question: let's say that while finding the prediction curve we kind of discover some "hidden truth". I think our curve might never be exact, partly because we do not know all of the independent variables that affect our dependent variable in nature. Say we know one, but there is another one that we do not know about. If so, would it be right to say that when a neural network with one input splits the input, via different weights, into two neurons of a hidden layer (from which the final output is calculated), it is like simulating the presence of another "secret independent variable", even without knowing what it is? Thanks
@statquest4 жыл бұрын
I'll be honest, I'm not sure how to answer question #1. I don't know. I do know that some of the methods used for initializing the weights with random values increase the variation allowed in the values based on how many layers are in the neural network - so that might do the trick. As for the second question: Adding the second node in the hidden layer allows the squiggle to go up *and* go down. If I just had one node, I would only be able to go up *or* down. So, in some sense, that is sort of like adding a secret independent variable.
@igorg41294 жыл бұрын
@@statquest I also thought this way. Thank you again and again; you do a titanic job here, Josh. If not for you, I wouldn't be here asking new questions. :)!
@ZachariahRosenberg4 жыл бұрын
@@igorg4129 It's tempting to want to initialize weights to a target range in the hope of speeding up convergence; however, this might actually be counterproductive. The weights of individual nodes do not have to conform to the same distribution as your output. When you use an appropriate (adaptive) optimizer, it should be able to tune the weights pretty quickly, considering that the first few passes will likely have larger gradients.
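On the initialization point, a minimal PyTorch sketch (mine, not from the video or this thread) of the depth-aware schemes mentioned above; Xavier/Glorot initialization scales the random weights by each layer's fan-in and fan-out rather than by the expected range of the inputs or outputs:

```python
import torch
from torch import nn

# A hypothetical 3-layer network, just to show the pattern.
net = nn.Sequential(nn.Linear(1, 8), nn.ReLU(),
                    nn.Linear(8, 8), nn.ReLU(),
                    nn.Linear(8, 1))

for layer in net:
    if isinstance(layer, nn.Linear):
        # The variance is chosen from fan_in/fan_out so signals neither explode
        # nor vanish as they pass through many layers.
        nn.init.xavier_uniform_(layer.weight)
        nn.init.zeros_(layer.bias)

# The spread of the initial weights depends on layer size, not on the data's range.
print(net[0].weight.std().item(), net[2].weight.std().item())
```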
@pranjalpatil96592 жыл бұрын
Perfect explanation!
@statquest2 жыл бұрын
Thank you!
@harshpahade27033 ай бұрын
Hey, thanks for this playlist, it was a great find. Just pointing out a typo at 1:44 where the Note says 'spit' instead of 'split'.
@miriza24 жыл бұрын
BAM! Thanks Josh! You’re the best! Got myself a pink T-shirt 😍😍😍
@statquest4 жыл бұрын
Hooray! And thank you for supporting StatQuest!!!
@richarda16303 жыл бұрын
Where were you 5 years ago???!?!?! :D Awesome work man! Keep it up :)
@statquest3 жыл бұрын
Thanks! I have 4 more neural network videos coming out in the next month.
@richarda16303 жыл бұрын
@@statquest awesome! can't wait :D
@SPLICY4 жыл бұрын
The understated BAM at 4:40 cracked me up 😂
@statquest4 жыл бұрын
SPLICY in the house!!! BAM! :)
@davidlu10033 ай бұрын
I think I understand the neural network clearly now.😁😁😁
@statquest3 ай бұрын
bam! :)
@ssanand33 жыл бұрын
I wish someday you'd make a video in person so that we can see the saint behind the voice 😀
@statquest3 жыл бұрын
:)
@zhenyuhe15372 жыл бұрын
You can imagine how complicated professors make it sound when even StatQuest has to use equations