10.5: Neural Networks: Multilayer Perceptron Part 2 - The Nature of Code

158,999 views

The Coding Train

A day ago

Comments: 132
@KayYesYouTuber 5 years ago
This is beautiful. For the first time I got a clear understanding of the basics of neural networks. Thank you very much
@trickstur5994 a year ago
This channel should have an order of magnitude more subscribers than it does. Truly a great resource.
@Chuukwudi 3 years ago
I can't believe I watched this for free!!! Thank you very much!!
@EntwinedGraces 7 years ago
I'm taking a deep learning course this semester and I'm so glad these vids exist lol.
@zinsy23 4 years ago
This is really fascinating! I hear about things like this, but I never thought I was going to understand how it works. Now I'm starting to see that change way sooner than I would have ever thought! I don't really want to learn this from anyone else but you. This is probably one of the best ways to thoroughly teach and learn this!
@FabianMendez 7 years ago
Dude, you're amazing as a teacher. First time ever I've been able to properly understand this. The one with the Markov chains is also a super awesome video. Thank you for your work!!!
@Qual_ 6 years ago
Amazing how the same topic taught by different people can either be boring/ultra complicated as hell, or so inspiring that you want to build something right away after the lesson. This guy falls immediately into the second category. It's like he can unlock something inside you which makes you understand something you never understood before.
@pchol5972 7 years ago
Hi Daniel, this video blew my mind! Quick explanation: I'm 19 and I've spent the last year learning linear algebra in maths lessons (vector spaces, matrices...) and for the first time, someone is giving me a practical use for all of these (very) theoretical things, and it's furthermore related to my favourite subject! Thank you a lot for all your work, it's awesome and it's enabling me to improve myself in IT, but also in English as I'm French :P
@Ben-up4lj 5 years ago
I want to say there are more common uses for matrix calculations, which we (in Germany) were taught in school - mostly systems of equations related to real-world problems.
@guilhermetorresj 3 years ago
Finding solutions for systems of linear equations, doing linear transformations to map between coordinate systems (this very channel has a series of videos where he used some linear algebra to get room-scale point cloud data from a Kinect sensor that basically outputs a matrix), electrical circuits, and I've even heard about using linear algebra to model real-life traffic flow. The way all this math is taught where I live (Brazil) is not optimal from the perspective of a student who just wants to know how to use it. To be honest, most of them won't anyway. But once you get a glimpse of how versatile this bit of maths is, everything just clicks in your mind and it's wonderful.
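To make the "solving systems of linear equations" use concrete, here is a tiny illustrative sketch in JavaScript (my own example, not from the video); solve2x2 is a made-up helper that applies Cramer's rule to a 2x2 system:

```javascript
// Solve a 2x2 linear system with Cramer's rule:
//   a*x + b*y = e
//   c*x + d*y = f
// (Illustrative only; real applications would use a linear algebra library.)
function solve2x2(a, b, c, d, e, f) {
  const det = a * d - b * c;
  if (det === 0) throw new Error("No unique solution");
  return {
    x: (e * d - b * f) / det,
    y: (a * f - e * c) / det,
  };
}

// Example: 2x + y = 5 and x + 3y = 10  ->  x = 1, y = 3
console.log(solve2x2(2, 1, 1, 3, 5, 10));
```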
@BajoMundoUnderground 7 years ago
Keep up the good work... finally someone who can explain this the right way.
@cameronnichols9905 4 years ago
I know he later corrects it one way, but at 17:30, if you want to keep the subscript of the weights as (row, col) while also keeping the row index as the input index and the column index as the hidden-layer index, you could take the transpose of the input matrix and multiply it (on the left) by the weight matrix. This would result in a 1 x n matrix for the output, but all the subscripts would work out in a way that's easier to understand.
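A rough sketch of the indexing that comment describes (forwardRowVector is a made-up helper name; this is not the video's code): weights[i][j] connects input i to hidden node j, so multiplying the 1 x numInputs row vector on the left by the numInputs x numHidden weight matrix gives a 1 x numHidden result.

```javascript
function forwardRowVector(xT, weights) {
  const numInputs = weights.length;
  const numHidden = weights[0].length;
  const h = new Array(numHidden).fill(0);
  for (let j = 0; j < numHidden; j++) {
    for (let i = 0; i < numInputs; i++) {
      h[j] += xT[i] * weights[i][j]; // subscript w_ij reads row = input, col = hidden
    }
  }
  return h;
}

// Example: 2 inputs, 3 hidden nodes -> [0.9, 1.2, 1.5]
console.log(forwardRowVector([1, 2], [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]));
```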
@alonattar3836 7 years ago
Good work! I'm a beginner programmer and you really help me get better!! Never stop making videos! Keep it up.
@elocore1702 6 years ago
These videos are awesome. I've watched many other videos and I always ended up getting lost, basically just copying and pasting code and hoping it works. But with your videos I am actually understanding what's going on and am able to troubleshoot on my own if I do something wrong and figure out what I did incorrectly, because I actually understand how it's supposed to work. Great job!
@adeelfitness4993 4 years ago
Honestly speaking, you are the best teacher I have ever had. Love from Pakistan. Keep this great stuff going 👍
@shrikanthsingh8243 6 years ago
Very descriptive explanation. Thank you for making my lectures easier.
@devinvenable4587 7 years ago
I really enjoy your teaching style. Quirky, funny and informative.
@TheCodingTrain 7 years ago
thank you!
@VijayChethanSFCD 4 years ago
Well explained. After watching many videos, I found this is the best one; it gave me a clear idea about NNs.
@DustinTWilliams 7 years ago
Great video! I hope you spring back quickly and well from your incident, my good sir!
@miteshsharma3106 7 years ago
Dan I hope you get well soon ....... waiting for your return on the coding train!!!
@HeduAI 5 years ago
Awesome explanation! This is how 'teaching' is done! :)
@hr4735 3 years ago
Love the energy and excitement, happy to have discovered you!
@anuragdixit7107 6 years ago
Hey sir, you are the most wonderful teacher I've ever had. Just love your classes.
@sanchitverma2892 5 years ago
Quiet, kid.
@lutzruhmann7162 5 years ago
Hello, I just would like to say Thank You! After your videos I get it.
@cristianrgreco 7 years ago
Thank you for these videos. Genuinely interesting and very well presented.
@mademensociety5210 4 years ago
This video is still good learning material today!! Thank you.
@marianagonzales3201 5 years ago
Hi! I loved your explanation! It's really clear and easy to understand, great teacher, thank you very much ☺️
@EscapeMinecraft 7 years ago
Great explanation of neural networks. Based on your most recent code and Tariq Rashid's code, I've written my own neural network library in C++, but with customizable multiple layers. I mean it's not great, but it's working. Can't wait to see how the series continues.
@surobhilahiri 5 years ago
Thanks so much for explaining the concept so well!
@zumamusic5246 3 years ago
Following along and making a library for Java! Nice video!
@osmanmustafaquddusi318 7 years ago
Sorry to hear about your arm. I will pray to God for your fast recovery. Get well soon ☺☺☺ brother.
@DaleIsWigging 7 years ago
Are you going to eventually move on to evolving the overall topology (how the graph is connected), for example the NEAT algorithm? (Note: you used * for scalar multiplication and x for matrix and vector notation; if you are planning to cover the cross product, you should probably avoid using x for multidimensional products that are associative... uh, I mean, use it only for the cross product.)
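The notation point is that several different products get written with the same symbols; a small illustration in plain JavaScript (my own example, not the video's library):

```javascript
const A = [[1, 2], [3, 4]];
const B = [[10, 20], [30, 40]];

// Scalar multiplication: every entry times 2.
const scaled = A.map(row => row.map(v => v * 2));                   // [[2, 4], [6, 8]]

// Element-wise (Hadamard) product: entry by entry.
const hadamard = A.map((row, i) => row.map((v, j) => v * B[i][j])); // [[10, 40], [90, 160]]

// Matrix product: each row of A dotted with each column of B.
// (The cross product is a different operation again and only exists for 3D vectors.)
const matmul = A.map((row, i) =>
  B[0].map((_, j) => row.reduce((sum, v, k) => sum + v * B[k][j], 0))
);                                                                  // [[70, 100], [150, 220]]

console.log(scaled, hadamard, matmul);
```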
@pantepember 4 years ago
Also thank you for your clear pronunciation.
@alirezanet 5 years ago
This video is awesome, thanks... for the first time I completely understand why we need linear algebra in neural networks xD
@suvranjansanyal2717 7 years ago
Excellent!! You are an awesome professor. Keep uploading your videos.
@majdoubwided6666 4 years ago
You are the best, really. You make it easy and I like MLPs just because of you!
@omicron296 a year ago
You are fantastic! I'm learning from your explanations! Thank you!!!!!
@ParitoshBaronVLOGS 6 years ago
Fall in Love With ML ❤️
@MarcelRuland 7 years ago
19:44 I got goosebumps when I heard Dan say 'Python'.
@armandasbarkauskas4485 6 years ago
Best tutorials on YouTube!!! Keep going ;p
@taihatranduc8613 4 years ago
I love you so much. You are so likable, so funny. You are awesome.
@mightyleguan1451 7 years ago
Hi Daniel, get well soon and keep up the amazing work!
@ajiththiyar7609 6 years ago
Man, you just help a lot of people... Thanks 😁
@gajuahmed4426 4 years ago
Great explanation. What about the bias that should be added to the weighted input during forward propagation? You didn't mention it here.
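For anyone wondering where the bias fits, it is just an extra term added to each weighted sum before the activation function; a minimal sketch (my own, loosely modeled on the toy library built in this series, not the actual code):

```javascript
// h = sigmoid(W * x + b), where W is (numHidden x numInputs)
// and b holds one bias per hidden node.
function sigmoid(z) {
  return 1 / (1 + Math.exp(-z));
}

function feedforwardLayer(W, x, b) {
  return W.map((row, j) => {
    let sum = b[j]; // the bias is simply added to the weighted sum
    for (let i = 0; i < x.length; i++) sum += row[i] * x[i];
    return sigmoid(sum);
  });
}

// Example: 2 inputs, 2 hidden nodes
console.log(feedforwardLayer([[0.5, -0.5], [1, 1]], [1, 0], [0.1, -0.2]));
```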
@Lionel_Ding 5 years ago
You are just amazing! When you explain it, everything seems simple ^^'
@youssefnim a year ago
Great video, very clearly explained, thank you!
@Gentleman217 5 years ago
OMG. I thought that I would never understand it, but it was all about the teacher. Neural networks are so easy. I think I will retake the BBM406 Fundamentals of Machine Learning lecture and write down the increase in my grade. When I first saw the AND/OR truth table I thought I wasn't going to watch it, but after I watched, I got the concept. It really helped me, thanks a lot.. :P
@climito a year ago
This man is gold
@georgechristoforou991 5 years ago
You talk about the normalisation of inputs that are much larger than other inputs. Could this just be handled by adjustment of the weights to normalise the very large input?
@raimetm 3 years ago
Daniel, you are an amazing guy and do an amazing job! Thank you so much!
@luqmanahmad3153 5 years ago
Very amazing stuff and explanation. I want to ask: will you do a playlist on convolutional NNs? It would help a lot.
@viktorstrate 7 years ago
11:46 You say you would need to normalize the inputs. Is this really needed, since all the inputs are weighted? If the house area is weighted by a really small number, let's say 0.00001, it would automatically get normalized, right?
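A tiny weight can indeed scale a large input down, but gradient descent then has to discover that tiny weight, and during training a feature that is thousands of times larger than the others tends to dominate the updates; that is why inputs are usually rescaled up front. A minimal sketch of min-max normalization (my own example, not from the video):

```javascript
// Map each feature into [0, 1] so that, e.g., house area (~2000)
// and number of bedrooms (~3) start on a comparable scale.
function normalize(values) {
  const min = Math.min(...values);
  const max = Math.max(...values);
  return values.map(v => (v - min) / (max - min));
}

const areas = [800, 1200, 2000, 3500];
console.log(normalize(areas)); // [0, ~0.15, ~0.44, 1]
```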
@ilaapattu9845 7 years ago
This Video Really Has Helped Me! Thumbs Up!
@Cnys100 4 years ago
Thank you ! This is really helping me!
@MorneBooysen 6 years ago
How would you manage "degrees of freedom", or is there a feasibility problem, before you dump data into a neural net which will try to fit the data regardless? Great vids!
@smrazaabidi1495 7 years ago
Excellent!! Keep it up, but could you please tell me how we decide how many neurons to put in the hidden layers? I mean, is there any formula?
@grapheggspecies8907 5 years ago
Hi Daniel Shiffman! I have loads of questions from this video and several others of yours I've watched... I am 14; we were taught all of this linear algebra last year in school, so I have no problem with that. I want to know why we use perceptrons and neural networks when we can do the same thing with linear and polynomial regression (I've tried it and it took very little time and processing power, and it also had a lower loss). Also, can we optimize the weights in a neural net using the genetic algorithms you described in a video series you have already made? Please, whoever can answer these questions... I have no idea who to ask and I don't trust Stack Exchange (somehow they aren't allowing me to post anything there). Thank you for making this video; it really helped me, as did all the others I've watched. They have pulled me out of video game addiction... (I think). *PS (for everyone who has had trouble with matrices): these videos on neural nets by you and fastai really help me ace my math tests...
@TheCodingTrain 5 years ago
I like to use simple scenarios that don't need neural networks to practice / learn about neural networks. So that's really the main reason here!
@grapheggspecies8907 5 years ago
@@TheCodingTrain Hi again! I understood that from the video, but what I want to know is: in what kind of situation would a neural network work better than polynomial regression (because in my eyes they seem the same)? I hope I'm not bothering you too much... Also, I am not making any reference to the video's content. Cheers, -my_youtube_username!
@MrVildy 7 years ago
Hey Dan, first of all I love your content and your personal way of engaging with your viewers! You have a very natural and captivating presence. I would like to ask you and the rest of the audience a question. I'm hopefully starting software development in August at a university, and I have just been wondering: could there be any interest in the programming community in following a fresh programmer? I have been thinking about starting a blog or a YouTube channel where I would talk about the things we learn and some of the challenges. Is it interesting to learn alongside a new student, or should I wait until I get more experienced? If it is in any way interesting, how can I prepare to start?
@dinoswarleafs 7 years ago
Hope your arm gets better soon :)
@mightyleguan1451 7 years ago
Hi man, nice to see you here! I enjoy your CS:GO videos just as much as Dan's videos
@CocoChanel1313 6 years ago
Thank you, I've just started to study machine translation. Your videos help me to understand it more deeply. I haven't seen the next videos yet, but I hope you will also make one in which you use Python 3 and explain more of the maths, like the sigmoid function, softmax, .. :)
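For reference, the two functions mentioned are short enough to show here; a sketch in JavaScript (the language used in the series) rather than the Python the comment asks about:

```javascript
// Sigmoid squashes a single number into the range (0, 1).
function sigmoid(z) {
  return 1 / (1 + Math.exp(-z));
}

// Softmax turns a vector of scores into probabilities that sum to 1.
// Subtracting the maximum first is a standard numerical-stability trick.
function softmax(scores) {
  const maxScore = Math.max(...scores);
  const exps = scores.map(s => Math.exp(s - maxScore));
  const total = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / total);
}

console.log(sigmoid(0));         // 0.5
console.log(softmax([1, 2, 3])); // ~[0.09, 0.24, 0.67]
```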
@MJ-in9wt 7 years ago
Hey, does p5 support ECMAScript 6? Specifically classes and inheritance, because those are easier to teach junior students than objects in JSON format.
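p5.js sketches are plain JavaScript, so ES6 classes work in any browser that supports them; a small illustrative sketch (my own example, assuming p5.js is loaded on the page):

```javascript
class Walker {
  constructor(x, y) {
    this.x = x;
    this.y = y;
  }
  step() {
    this.x += random(-1, 1); // p5's random()
    this.y += random(-1, 1);
  }
  show() {
    point(this.x, this.y);
  }
}

let walker;

function setup() {
  createCanvas(400, 400);
  walker = new Walker(width / 2, height / 2);
}

function draw() {
  walker.step();
  walker.show();
}
```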
@davederdudigedude 7 years ago
Hello all, what I don't get is: what's the difference between a feedforward neural network trained with backpropagation and an MLP? Is there someone here who can explain the difference?
@xrayer4412 5 years ago
What programming environment are you using? Why aren't you using Processing anymore in the series?
@thevfxwizard7758 6 years ago
THANK YOU.
@patrik8641 3 years ago
Isn't the house prediction linearly separable though?
@anuragghosh1139 6 years ago
You are awesome so far!
@wawied7881 7 years ago
Keep it up
@marcolinyi 7 years ago
If I want to program games, which language do I need to study and which program do I need to download? PS: I'm an Italian boy, so my English is bad.
@amc8437 3 years ago
Can you also create a perceptron using Python and PyTorch?
@ahmedel-asasey1982 7 years ago
Very good, keep going!
@xxsunmanxx5430 6 years ago
Hey, I have a question: can you do this in Processing?
@Jabh88073 5 years ago
Your video helps a lot. Thank you!
@ranjithkumarpranjithkumar 7 years ago
Is a bias not required for hidden layers? What if all the values are zero?
@ItsGlizda 7 years ago
Typically, a single bias node is added for the input layer and every hidden layer in a feedforward network. You are right, it helps with dealing with zero inputs.
@tw7522 7 years ago
I'm not sure if you've ever introduced yourself? Anyway, keep up the good work. Thank you!
@fvcalderan 7 years ago
He introduces himself at the beginning of every livestream.
@georgechristoforou991 5 years ago
You should have gone for a football formation such as 4-4-2 or a 3-5-2
@abhishekjadav2371 7 years ago
When is the next live session for NN?? (I think it is today, but I'm not sure.)
@marufhasan9365 7 years ago
I was wondering the same thing. Can anybody clarify the actual schedule?
@AHuMycK 6 years ago
Good video, but you made a mistake at 18:22. It should be: h1 = w11*x1 + w12*x2 and h2 = w21*x1 + w22*x2.
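Written as a matrix-vector product, the corrected version is just each row of the weight matrix dotted with the input vector; a quick illustrative sketch (matVec is a made-up helper, not the video's library):

```javascript
// h = W * x, where row j of W holds the weights feeding hidden node j:
//   h1 = w11*x1 + w12*x2
//   h2 = w21*x1 + w22*x2
function matVec(W, x) {
  return W.map(row => row.reduce((sum, w, i) => sum + w * x[i], 0));
}

const W = [[0.9, 0.3],  // w11, w12
           [0.2, 0.8]]; // w21, w22
console.log(matVec(W, [1, 0.5])); // [0.9*1 + 0.3*0.5, 0.2*1 + 0.8*0.5] = [1.05, 0.6]
```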
@modernmirza5303 4 years ago
One genius guy.^
@inbasasis8221 7 years ago
great video sir
@StandardName562 4 years ago
Thank you so much!
@OmkaarMuley 7 years ago
thanks for making this video!
@peter_dockit a year ago
The best!
@michaellewis8211 6 years ago
Can you tell us the process you use to make a project?
@mohan_manju 7 years ago
Hey Daniel, how do we modify it when multiple hidden layers are required? It would be useful if you taught that. Thank you :)
@datdemlomikstik4443 7 years ago
I have made a version with multiple hidden layers github.com/lomikstik/javascript-p5/blob/master/neuro-evolution-shiffman%2Blib/flappy_for_multi_layer_nn/libraries/nn_multi.js
@grapheggspecies8907 5 years ago
yeah! how to modify weights in a neural net...maybe he explains that later on
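One common way to generalize the feedforward step to any number of hidden layers is to keep an array of weight matrices and bias vectors and loop over them; a minimal sketch of that idea (my own, not the linked library):

```javascript
function sigmoid(z) {
  return 1 / (1 + Math.exp(-z));
}

// layers: an array of { weights, biases } objects, one per layer.
// weights is (numOut x numIn); biases has numOut entries.
function feedforward(layers, input) {
  let activations = input;
  for (const { weights, biases } of layers) {
    activations = weights.map((row, j) => {
      let sum = biases[j];
      for (let i = 0; i < activations.length; i++) sum += row[i] * activations[i];
      return sigmoid(sum);
    });
  }
  return activations;
}

// Example: 2 inputs -> 3 hidden -> 1 output
const net = [
  { weights: [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]], biases: [0, 0, 0] },
  { weights: [[0.7, 0.8, 0.9]], biases: [0.1] },
];
console.log(feedforward(net, [1, 0]));
```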
@equarbeyra5501 4 years ago
Hello dear, how can I get this software, Neusciences Neuframe v4?
@siddhantkumar7257 6 years ago
What are you taking................. Humphry Davy's N2O (nitrous oxide), the laughing gas.
@ThomasLe 7 years ago
Is this a repost? I swear I saw this exact same thing a few days ago when I binged the series. I even called out that you made a mistake when connecting the lines from the hidden layer to the output layer. How are there comments from 3 days ago but it says published June 30 (TODAY)? Was there a glitch in The Matrix?! :P The reason I am back at this video is that it was in my notifications as a new video in the series, but it obviously is not.
@TheCodingTrain 7 years ago
It's because you saw this video while it was still unlisted. We make all the videos available as soon as we can to anybody who wants to watch the whole series, but we publish them roughly once a day to keep a steady flow of videos coming out. -MB
@ThomasLe 7 years ago
Okay, that makes sense! Thanks!
@sololife9403 3 years ago
THEEEE best!
@chancenigel0186 5 years ago
Why can't you do this stuff with Java??? Please let me know, because I'm trying to do NN coding with Java.
@TheCodingTrain 5 years ago
You can find a Java port of this series here! github.com/CodingTrain/Toy-Neural-Network-JS/blob/master/README.md#libraries-built-by-the-community
@srikanthshankar8871 7 years ago
Collab with Siraj Raval! Waiting to see you both in a video; that would be a blast!
@0xTim 7 years ago
kzbin.info/www/bejne/l5WahIivg5qJbaM
@corrompido7680 7 years ago
Is your Matrix powered by human beings?
@Ikpoppy 7 years ago
0:16:20 you are at 10.5
@sukanyabasu7090 6 years ago
It was helpful
@algeria7527 7 years ago
Wow, you are amazing!
@camjansen5025 4 years ago
Why don't you make this course with Python? (Python for AI)!! Nice work.
@pyclassy 4 years ago
Can't you write the code in Python? It would be more helpful if you could do so.
@arezhomayounnejad8399 3 years ago
Hi there, how's it going? Thank you for the energetic description. Can I have your email for further discussion?
@seanliu5582 7 years ago
Nice video! First 750 views :D
@nkosanamabuza109 3 years ago
I wish you could do it in Python.
@kristianwichmann9996 7 years ago
w_{12} and w_{21} should be switched. Edit: Ah, it was caught :-)
@grapheggspecies8907 5 years ago
I didn't understand...
@demetriusdemarcusbartholom8063 2 years ago
ECE 449
@waleedazam6916 6 years ago
Very good, but one thing I must say: it's a bit too comprehensive. It should be more summarized so the listener can get more in less time.
@yassine321 4 years ago
The 17 dislikes are from the "AI takeover" ideology.