This is beautiful. For the first time I got a clear understanding of the basics of neural networks. Thank you very much.
@EntwinedGraces7 жыл бұрын
I'm taking a deep learning course this semester and I'm so glad these vids exist lol.
@trickstur5994 Жыл бұрын
This channel should have a magnitude more subscribers than it does. Truly a great resource.
@pchol59727 жыл бұрын
Hi Daniel, this video blew my mind! Quick explanation: I'm 19 and I've spent the last year learning linear algebra in maths lessons (vector spaces, matrices...) and for the first time, someone is giving me a practical use for all of these (very) theoretical things, and it's furthermore related to my favourite subject! Thank you so much for all your work, it's awesome and it's enabling me to improve myself in IT, but also in English as I'm French :P
@Ben-up4lj5 жыл бұрын
I want to say there are more common uses for matrix calculations, which we (in Germany) were taught in school. Mostly equations related to real-world problems.
@guilhermetorresj3 жыл бұрын
Finding solutions for systems of linear equations, doing linear transformations to map between coordinate systems (this very channel has a series of videos where he used some linear algebra to get room scale point cloud data from a kinect sensor that basically outputs a matrix), electrical circuits, and I've even heard about using linear algebra to model real-life traffic flow. The way all this math is taught where I live (Brazil) is not optimal from the perspective of a student who just wants to know how to use it. To be honest, most of them won't anyway. But once you get a glimpse of how versatile this bit of maths is, everything just clicks in your mind and it's wonderful.
@Chuukwudi3 жыл бұрын
I can't believe I watched this for free!!! Thank you very much!!
@zinsy234 жыл бұрын
This is really fascinating! I hear about things like this, but I never thought I would understand how it works in my life! Now I'm starting to see that change way sooner than I would have ever thought! I don't really want to learn this from anyone else except you! This is probably one of the best ways to thoroughly teach and learn this!
@FabianMendez7 жыл бұрын
Dude, you're amazing as a teacher. First time ever I've been able to properly understand this. The video on Markov chains was also super awesome. Thank you for your work!!!
@Qual_5 жыл бұрын
Amazing how the same topic taught by different people can either be boring/ultra complicated as hell or be so inspiring that you want to build something right away after the lesson. This guy falls immediately into the second category. It's like he can unlock something inside you which makes you understand something you never understood before.
@elocore17026 жыл бұрын
These videos are awesome. I've watched many other videos and I always ended up getting lost, basically just copying and pasting code and hoping it works right, but with your videos I am actually understanding what's going on and am able to troubleshoot on my own if I do something wrong and figure out what I did incorrectly, because I actually understand how it's supposed to work. Great job!
@BajoMundoUnderground7 жыл бұрын
Keep up the good work, nice... finally someone who can explain this in the right way.
@alonattar38367 жыл бұрын
Good work! I'm a beginner programmer and you really help me get better!! Never stop making videos! Keep it up.
@adeelfitness49934 жыл бұрын
Honestly speaking, you are the best teacher I have ever had. Love from Pakistan. And keep this great stuff going 👍
@VijayChethanSFCD4 жыл бұрын
Well explained. After watching many videos, I found this is the best one; it gave me a clear idea about NNs.
@miteshsharma31067 жыл бұрын
Dan, I hope you get well soon....... waiting for your return on the Coding Train!!!
@devinvenable45877 жыл бұрын
I really enjoy your teaching style. Quirky, funny and informative.
@TheCodingTrain7 жыл бұрын
thank you!
@shrikanthsingh82435 жыл бұрын
Very descriptive explanation. Thank you for making my lectures easier.
@DustinTWilliams7 жыл бұрын
Great video! I hope you spring back quick and well from your incident, my good sir!
@anuragdixit71076 жыл бұрын
Hey sir, you are the most wonderful teacher I ever got. Just love your classes.
@sanchitverma28925 жыл бұрын
Quiet, girl.
@HeduAI5 жыл бұрын
Awesome explanation! This is how 'teaching' is done! :)
@cameronnichols99054 жыл бұрын
I know he later corrects it one way, but at 17:30, if you want to keep the subscripts of the weights as (row, col) while also keeping the row index as the input index and the column index as the hidden-layer index, you could take the transpose of the input matrix and multiply it (on the left) by the weight matrix. This would result in a 1 x n matrix for the output, but all the subscripts would work out in a way that's easier to understand.
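For anyone who wants to check this numerically, here is a minimal NumPy sketch (my own illustration with made-up sizes, not the video's JavaScript library) of the convention described above: with the weights indexed as (input row, hidden column), multiplying the transposed input on the left gives exactly the same hidden values as the usual weights-times-column-vector form.

```python
import numpy as np

# Hypothetical shapes for illustration: 2 inputs feeding 3 hidden nodes.
rng = np.random.default_rng(1)
n_inputs, n_hidden = 2, 3

W = rng.standard_normal((n_inputs, n_hidden))  # W[i, j] = weight from input i to hidden node j
x = rng.standard_normal((n_inputs, 1))         # inputs as a column vector

h_row = x.T @ W    # (1 x n_hidden) row vector; subscripts stay (input, hidden)
h_col = W.T @ x    # equivalent column-vector form with weights stored as (hidden x input)

assert np.allclose(h_row, h_col.T)  # both conventions produce identical hidden values
```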
@hr47353 жыл бұрын
Love the energy and excitement, happy to have discovered you!
@pantepember4 жыл бұрын
Also thank you for your clear pronunciation.
@EscapeMinecraft7 жыл бұрын
Great explanation of neural networks. Based on your latest code and Tariq Rashid's code, I've written my own neural network library in C++, but with customizable multiple layers. I mean, it's not great, but it's working. Can't wait to see how the series continues.
@omicron296 Жыл бұрын
You are fantastic! I am learning from your explanations! Thank you!!!!!
@DaleIsWigging7 жыл бұрын
Are you going to eventually move on to evolving the overall topology (how the graph is connected), for example the NEAT algorithm? (Note: you used * for scalar multiplication and x for matrix and vector notation; if you are planning to cover the cross product you should probably avoid using x for multidimensional products that are associative... uh, I mean use it only for the cross product.)
@zumamusic52463 жыл бұрын
Following along and making a library for Java! Nice video!
@mademensociety52104 жыл бұрын
This video is still good learning material today!! Thank you.
@cristianrgreco7 жыл бұрын
Thank you for these videos. Genuinely interesting and very well presented.
@lutzruhmann71625 жыл бұрын
Hello, I just would like to say Thank You! After your videos I get it.
@marianagonzales32015 жыл бұрын
Hi! I loved your explanation! It's really clear and easy to understand, great teacher, thank you very much ☺️
@ParitoshBaronVLOGS5 жыл бұрын
Fall in Love With ML ❤️
@osmanmustafaquddusi3187 жыл бұрын
Sorry to hear about your arm. I will pray to God for your fast recovery. Get well soon, ☺☺☺ brother.
@surobhilahiri5 жыл бұрын
Thanks so much for explaining the concept so well!
@climito Жыл бұрын
This man is gold
@taihatranduc86134 жыл бұрын
I love you so much. You are so likable, so funny. You are awesome.
@MrVildy7 жыл бұрын
Hey Dan, first of all I love your content and your personal way of engaging with your viewers! You have a very natural and captivating appearance. I would like to ask you and the rest of the audience a question. I'm hopefully starting software development at a university in August, and I have just been wondering: could there be any interest in the programming community to follow a fresh programmer? I have been thinking about starting a blog or a YouTube channel where I would talk about the things we learn and some of the challenges. Is it interesting to learn with a new student, or should I wait until I get more experienced? If it is in any way interesting, how can I prepare to start?
@gajuahmed44263 жыл бұрын
Great explanation. What about the bias that should be added along with the weights and inputs when you do forward propagation? You didn't mention it here.
@georgechristoforou9915 жыл бұрын
You talk about the normalisation of inputs that are much larger than other inputs. Could this just be handled by adjustment of the weights to normalise the very large input?
@youssefnim Жыл бұрын
great video, very clearly explained, thank you !
@alirezanet5 жыл бұрын
this video is awesome thanks... for the first time I completely understand why we need linear algebra in neural networks xD
@suvranjansanyal27177 жыл бұрын
EXCELLENT .. YOU ARE AWESOME PROFESSOR.. KEEP LOADING YOUR VIDEOS.
@majdoubwided66664 жыл бұрын
You are the best, really. You make it easy and I like MLPs just because of you!
@mightyleguan14517 жыл бұрын
Hi Daniel, get well soon and keep up the amazing work!
@ajiththiyar76096 жыл бұрын
Man you just help a lot of people......Thanks😁
@Cnys1003 жыл бұрын
Thank you ! This is really helping me!
@MorneBooysen6 жыл бұрын
How would you manage "degrees of freedom", or is there a feasible-solution problem, before you dump data into a neural net, which will try to fit the data regardless? Great vids!
@luqmanahmad31535 жыл бұрын
Very amazing stuff and explanation. I want to ask: will you do a playlist on convolutional NNs? It would help a lot.
@thevfxwizard77586 жыл бұрын
THANK YOU.
@MarcelRuland7 жыл бұрын
19:44 I got goosebumps when I heard Dan say 'Python'.
@dinoswarleafs7 жыл бұрын
Hope your arm gets better soon :)
@mightyleguan14517 жыл бұрын
Hi man, nice to see you here! I enjoy your CS:GO videos just as much as Dan's videos
@patrik86413 жыл бұрын
Isn't the house prediction linearly separable though?
@raimetm3 жыл бұрын
Daniel, you are an amazing guy and do an amazing job! Thank you so much!
@xrayer44125 жыл бұрын
What programming environment are you using? Why aren't you using Processing anymore in the series?
@Gentleman2174 жыл бұрын
OMG. I thought that I would never understand it. But it was all about the teacher. Neural networks are so easy. I think I will retake the BBM406 Fund. of Mach. Learning lecture.. and I will write down the increase in my grade. When I saw the AND/OR truth table, at first I was like 'Really? I'm not gonna watch that', but after I watched, I got the concept. It really helped me, thanks a lot.. :P
@CocoChanel13136 жыл бұрын
Thank you, I've just started to study machine translation. Your videos help me to understand it more deeply. I haven't seen the next videos yet, but I hope you will also make one in which you use Python 3 and explain more of the maths, like the sigmoid function, softmax,... :).
@smrazaabidi14957 жыл бұрын
Excellent!! Keep it up, but could you please tell me how we come to know how many hidden neurons to choose in the hidden layers? I mean, is there any formula?
@amc84373 жыл бұрын
Can you also create a perceptron using Python and PyTorch?
@MJ-in9wt7 жыл бұрын
Hey, does p5 support ECMAScript 6? Specifically classes and inheritance, because those are easier to teach junior students than objects in JSON format.
@Lionel_Ding5 жыл бұрын
You are just amazing ! When you explain it, everything seems simple ^^'
@grapheggspecies89075 жыл бұрын
Hi Daniel Shiffman! I have loads of questions from this video and several others of yours I've watched... I am 14; we were taught all of this linear algebra last year in school, so I have no problem with that. I want to know why we use perceptrons and neural networks when we can do the same thing with linear and polynomial regression (I've tried it and it took very little time and processing power; it also had a lower loss). Also, can we optimize the weights in a neural net using the genetic algorithms you described in a video series you have already made? Please, whoever can answer these questions... I have no idea who to ask and I don't trust Stack Exchange (somehow they aren't allowing me to post anything there). Thank you for making this video, it really helped me, as did all the others I've watched. They have pulled me out of video game addiction... (I think) *PS (for everyone who has had trouble with matrices): these videos on neural nets by you and fastai really help me ace my math tests...
@TheCodingTrain5 жыл бұрын
I like to use simple scenarios that don't need neural networks to practice / learn about neural networks. So that's really the main reason here!
@grapheggspecies89075 жыл бұрын
@@TheCodingTrain Hi again! I understood that from the video, but what I want to know is: in what kind of situation would a neural network work better than polynomial regression (because in my eyes they seem the same)? I hope I'm not bothering you too much... Also I am not making any reference to the video's content. Cheers, -my_youtube_username!
@ilaapattu98457 жыл бұрын
This Video Really Has Helped Me! Thumbs Up!
@viktorstrate7 жыл бұрын
11:46 You say you would need to normalize the inputs. Is this really needed, since all the inputs are weighted? If the weight for house area is a really small number, let's say 0.00001, it would automatically get normalized, right?
@armandasbarkauskas44855 жыл бұрын
Best tutorials on YouTube!!! Keep going ;p
@ahmedel-asasey19827 жыл бұрын
VERY GOOD KEEP GOING
@peter_dockit Жыл бұрын
The best!
@modernmirza53034 жыл бұрын
One genius guy.^
@anuragghosh11396 жыл бұрын
You are awesome so far!
@StandardName5624 жыл бұрын
Thank you so much!
@xxsunmanxx54306 жыл бұрын
Hey, I have a question: can you do this in Processing?
@inbasasis82216 жыл бұрын
great video sir
@sololife94033 жыл бұрын
THEEEE best!
@OmkaarMuley7 жыл бұрын
thanks for making this video!
@wawied78817 жыл бұрын
keep it up
@michaellewis82116 жыл бұрын
Can you tell us the process you use to make a project?
@davederdudigedude7 жыл бұрын
Hello all, what I don't get is: what's the difference between a feedforward neural network trained with backpropagation and an MLP? Is there someone here who can explain the difference?
@ThomasLe7 жыл бұрын
Is this a repost? I swear I saw this exact same thing a few days ago when I binged the series. I even called out that you made a mistake when connecting the lines from the hidden layer to the output layer. How are there comments from 3 days ago but it says published June 30 (TODAY) ? Was there a glitch in The Matrix?! :P The reason I am back to this video is because it was in my notifications as a new video in the series, but it, obviously, is not.
@TheCodingTrain7 жыл бұрын
It's because you saw this video while it was still unlisted. We make all the videos available as soon as we can to anybody who wants to watch the whole series, but we publish them roughly once a day to keep a steady flow of videos coming out. -MB
@ThomasLe7 жыл бұрын
Okay, that makes sense! Thanks!
@tw75227 жыл бұрын
I'm not sure if you've ever introduced yourself? Anyway, keep up the good work. Thank you!
@fvcalderan7 жыл бұрын
he introduces himself at the beginning of every livestream.
@marcolinyi7 жыл бұрын
If I want to program games, which language do I need to study and what do I need to download? PS: I'm an Italian boy so my English is bad.
@ranjithkumarpranjithkumar7 жыл бұрын
Is bias not required for hidden layers? What if all the values we get are zero?
@ItsGlizda7 жыл бұрын
Typically, a single bias node is added for the input layer and every hidden layer in a feedforward network. You are right, it helps with dealing with zero inputs.
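To make that concrete, here is a minimal sketch (assumed sizes and NumPy, not the toy JS library from the series) of one hidden layer with a bias vector; without the bias, an all-zero input would make every hidden value exactly sigmoid(0) = 0.5.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical sizes for illustration only.
n_inputs, n_hidden = 2, 3
W = np.random.randn(n_hidden, n_inputs)   # weights: hidden x input
b = np.random.randn(n_hidden, 1)          # one bias per hidden node

x = np.zeros((n_inputs, 1))               # all-zero input
h = sigmoid(W @ x + b)                    # without b, every entry would be sigmoid(0) = 0.5
print(h)
```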
@siddhantkumar72576 жыл бұрын
What are you taking................. Humphry Davy's N2O (nitrous oxide), the laughing gas.
@corrompido76807 жыл бұрын
Is your Matrix powered by human beings?
@Jabh880735 жыл бұрын
Your video helps a lot. Thank you!
@georgechristoforou9915 жыл бұрын
You should have gone for a football formation such as 4-4-2 or a 3-5-2
@equarbeyra55014 жыл бұрын
Hello dear, how can I get this software, Neusciences Neuframe v4?
@sukanyabasu70905 жыл бұрын
It was helpful
@camjansen50254 жыл бұрын
Why don't you make this course with Python? (Python for AI)!! Nice work
@abhishekjadav23717 жыл бұрын
When is the next live session for NN?? (I think it is today but I'm not sure)
@marufhasan93657 жыл бұрын
I was wondering the same thing. Can anybody clarify the actual schedule ?
@algeria75277 жыл бұрын
wow, u r amazing
@chancenigel01865 жыл бұрын
Why can't you do this stuff with Java??? Please let me know, because I'm trying to do NN coding with Java.
@TheCodingTrain5 жыл бұрын
You can find a Java port of this series here! github.com/CodingTrain/Toy-Neural-Network-JS/blob/master/README.md#libraries-built-by-the-community
@srikanthshankar88717 жыл бұрын
Collab with Siraj Raval! Waiting to see you both in a video, that would be a blast!
@0xTim7 жыл бұрын
kzbin.info/www/bejne/l5WahIivg5qJbaM
@AHuMycK6 жыл бұрын
Good video, but you made a mistake at 18:22. It has to be: h1 = w11*x1 + w12*x2 and h2 = w21*x1 + w22*x2.
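Written out in matrix form (just restating the corrected formulas above, nothing beyond them), the correction reads:

$$\begin{pmatrix} h_1 \\ h_2 \end{pmatrix} = \begin{pmatrix} w_{11} & w_{12} \\ w_{21} & w_{22} \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}$$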
@pyclassy4 жыл бұрын
Can't you write the code in Python? It would be more helpful if you could do so.
@mohan_manju7 жыл бұрын
Hey Daniel, how do I modify this when I require multiple hidden layers? It would be useful if you taught that. Thank you :)
@datdemlomikstik44437 жыл бұрын
I have made a version with multiple hidden layers github.com/lomikstik/javascript-p5/blob/master/neuro-evolution-shiffman%2Blib/flappy_for_multi_layer_nn/libraries/nn_multi.js
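As a rough illustration of the idea (a NumPy sketch with hypothetical layer sizes, not taken from the linked library), extending to multiple hidden layers just means feeding each layer's output into the next layer's weight matrix:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

layer_sizes = [2, 4, 4, 1]   # hypothetical: input, two hidden layers, output
weights = [np.random.randn(n_out, n_in) for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]
biases  = [np.random.randn(n_out, 1) for n_out in layer_sizes[1:]]

a = np.random.randn(layer_sizes[0], 1)   # input column vector
for W, b in zip(weights, biases):
    a = sigmoid(W @ a + b)               # each layer's output becomes the next layer's input
print(a)                                 # final output of the network
```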
@grapheggspecies89075 жыл бұрын
yeah! how to modify weights in a neural net...maybe he explains that later on
@nkosanamabuza1093 жыл бұрын
I wish you could do it in python
@Ikpoppy7 жыл бұрын
0:16:20 you are at 10.5
@seanliu55827 жыл бұрын
Nice video! First 750 views :D
@waleedazam69166 жыл бұрын
Very good, but one thing I must say: it's a bit too comprehensive. It should be more summarized so the listener can get more in less time.
@demetriusdemarcusbartholom80632 жыл бұрын
ECE 449
@hjjol93616 жыл бұрын
why not stop now ?
@kristianwichmann99967 жыл бұрын
w_{12} and w_{21} should be switched. Edit: Ah, it was caught :-)
@grapheggspecies89075 жыл бұрын
I didn't understand...
@arezhomayounnejad83993 жыл бұрын
Hi there, how's it going? Thank you for the energetic description. Can I have your email for further discussion?