One year ago I was watching your tutorials on how to draw squares on a canvas with code. One year later I'm trying to build machine learning models, also with the help of your tutorials. I'm not even a CS student, I'm a pianist!
@rakib17874 3 years ago
Great! Do you play in concerts??
@bilourkhan3345 5 years ago
A happy face always helps you learn with ease and fun. Keep it up, man!
@saramariacl 4 years ago
Yes, it's so true.
@bilalazeemshamsi7895 3 years ago
Sir, the question is how can a person who is in this field be this happy? lol :P
@optymystyc 1 year ago
I'm here because the Coursera course instructor in the class I'm taking just can't explain it with as much joy and happiness as you. I feel like the information I'm getting here is paired with enthusiasm, and that's the way it should first be introduced to my brain.
@stopaskingmetousemyrealnam3810 4 years ago
Taking it back to Boolean Algebra makes it very clear why MLPs are a natural solution to the XOR problem, thank you. Nobody's done that yet in anything I've seen, even though it's obvious in hindsight and maybe should have been obvious in advance.
@jonkleiman8018 7 years ago
It's because of your teaching that I've decided to pursue a career in this field. A brilliant balance of fun and seriousness.
@TheCodingTrain 7 years ago
Best of luck to you!
@seemarai5310 6 years ago
Well, I have to say you could be elected for a best teacher award. You are simply a perfect teacher.
@8eck 4 years ago
The explanation of linearly separable vs. not linearly separable is the best! Now it's logical why multiple layers are required! GREAT! Thank you!
@vanshitagupta4183 2 years ago
You teach this subject with such passion. It is kinda getting me excited about learning it too
@LeandroBarbksa 6 years ago
I love how excited you are explaining this.
@chandranshsharma1685 6 years ago
Amazing teacher. I have my semester exam tomorrow and was searching a lot about the multi-layer perceptron on the internet and wasn't able to find a good explanation. Thank god I found your video. 💙
@morphman86 5 years ago
I have been trying for quite some time to figure out what the "hidden layer" is, how it works and what the purpose is. So many others either get right up to that subject and then stop posting, or talk about it as if I should already know. So for some time, I have only been able to do simple perceptrons. Now I finally understand that hidden layers are just layers of multiple perceptrons being pushed into other perceptrons, where each perceptron has been trained to complete a different task. Thank you!
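For anyone who wants to see that idea concretely, here is a minimal Python sketch of "perceptrons feeding into perceptrons" (the weights are hand-picked for illustration, not learned and not taken from the video): a hidden layer made of an OR-like and a NAND-like perceptron feeds an AND-like output perceptron, and together they compute XOR.

def perceptron(inputs, weights, bias):
    # Weighted sum followed by a hard threshold (step activation).
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0

def xor(a, b):
    h1 = perceptron([a, b], [1, 1], -0.5)      # hidden unit acting like OR
    h2 = perceptron([a, b], [-1, -1], 1.5)     # hidden unit acting like NAND
    return perceptron([h1, h2], [1, 1], -1.5)  # output unit acting like AND

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))  # prints 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0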
@muskansaxena5708 6 months ago
The way you teach is fun; it's like you yourself are enjoying teaching, which we students love... one could fall in love with the knowledge presented here!!
@joachimsaindon3658 3 years ago
This video is a great example of why your channel is one of my favorites.
@shedytaieb1083 3 years ago
Man, you have no idea how useful and interesting the content you're creating is. GOOD JOB
@marcocastellano2451 5 years ago
I found you years ago when I needed to learn steering algorithms. You made the math and algorithm simple(r) to understand and your videos are a lot like a drop of sunshine in my day. It reminds me of Reading Rainbow when I was a youngster. Now I am back to continue my work on CNNs. And there you are again in my suggested videos :D
@ericmrozinski6143 7 months ago
An excellent and exciting explanation! This is exactly what I was looking for in trying to understand the motive behind the multi-layer perceptron. Not to be taken for granted!
@henriqueb287 3 years ago
Keep going, man, I wish you had been my teacher in college. Fun, smiles, and learning together. Such a great experience to learn with you; 15 minutes passed like nothing but were full of knowledge. Love from Brazil! Keep going!
@scipsyche5596 7 years ago
Good job! The topic is very interesting; what's more interesting is the way he teaches ☺
@waisyousofi9139 2 years ago
It is a unique talent to teach and bring a smile at the same time. Wow...
@danishshaikh2994 2 years ago
Man, I'm speechless, god-level explanation 🔥🔥🔥
@CloverSerena 3 years ago
I like you. You are the ideal teacher. The genuine sincere pleasure of teaching what you love to others. I can feel that love.
@waisyousofi9139 2 years ago
What a nice teacher. Truly enjoying the way you teach and convey your knowledge. Please keep going...
@redIroncool 6 years ago
I actually love your enthusiasm!!!!
@kineticsquared 6 years ago
Outstanding explanation of linearly separable. You make it very easy to understand why multiple perceptrons are required. Plus I love Boolean logic. Thank you.
@SidVanam 4 years ago
Cool to see how you linked the "linearly separable" terminology to the Boolean truth tables! Learned something applicable and new!
@mkthakral 2 years ago
Teachers like you are so rare. Gem.
@rogerhom1512 1 year ago
I've seen a lot of videos about neural networks, both advanced ones (which go over my head) and beginner ones (which are too general). That XOR example in this video was an epiphany for me! Now I have an intuitive sense of what makes neural networks so special (vs., say, linear classifiers). Now I feel like I'm finally ready to go deeper into this subject.
@TheCodingTrain 1 year ago
I'm so happy to hear this!
@rogerhom1512 1 year ago
@@TheCodingTrain Yah, that bit about how a single layer network can only solve linearly-separable problems, and how hidden layers fix this limitation, finally makes intuitive sense to me thanks to the XOR example. Thanks! Not sure if you cover this in subsequent videos, but I'd be interested to hear your take about why having multiple hidden layers can be useful, vs. just one hidden layer.
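A quick way to convince yourself of that limitation is a brute-force search (my own Python illustration, not code from the video): try a whole grid of weights and biases for a single step-function perceptron and check which Boolean functions it can reproduce. AND and OR come out solvable, XOR never does, because no single straight line separates XOR's 1s from its 0s.

import itertools

def perceptron(a, b, w1, w2, bias):
    # Single perceptron with a hard threshold (step) activation.
    return 1 if (w1 * a + w2 * b + bias) > 0 else 0

def solvable(target):
    # target maps (a, b) -> desired output for a Boolean function.
    grid = [x / 2 for x in range(-4, 5)]  # weights/biases from -2.0 to 2.0
    for w1, w2, bias in itertools.product(grid, repeat=3):
        if all(perceptron(a, b, w1, w2, bias) == out
               for (a, b), out in target.items()):
            return True
    return False

AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
OR  = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1}
XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

print("AND solvable:", solvable(AND))  # True  (linearly separable)
print("OR solvable: ", solvable(OR))   # True  (linearly separable)
print("XOR solvable:", solvable(XOR))  # False (not linearly separable)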
@paulorugal 6 years ago
You're the BEST CS TEACHER THAT I NEVER HAD
@anshrastogi9430 2 years ago
I literally want this sort of sense of humour in my college professor. Thanks for saving my semester. Love from India.
@justincollinns 6 years ago
Your answer to "But what is XOR really?" at 10:46 was just what I needed! Thank you!
@FredoCorleone 2 years ago
What a master. We are really fortunate to have Daniel as an instructor here on YouTube!
@kumudtripathi4054 5 years ago
Loved the way you are teaching... I already knew MLPs, but your way of teaching makes me watch it again.
@parths.1903 3 years ago
This dude is so awesome, I can watch him teach all day. Love you, pal.
@kdpoint4221 5 years ago
You made me understand better than any simplified notes...
@samwakieltojar8154 4 years ago
this man has ENERGY
@usmanmehmood7614 6 years ago
This video just made me simply happy. Great thanks from Pakistan. NUST needs to hire professors like this.
@najibsaad5765 7 years ago
You are outstandingly interesting. Keep going!
@TheCodingTrain 7 years ago
Thanks for the nice feedback!
@anaibrahim4361 1 year ago
I was extremely happy when I discovered that you had posted a video on a topic that I was searching for.
@d.g.7417 2 years ago
I'm speechless. What a beautiful explanation!
@Matt23488 5 years ago
"Maybe you just watch my previous videos on the Perceptron" Yes. Yes I did.
@graju2000 5 years ago
Man, I wish they'd give you a Nobel Prize for teaching!
@HeduAI 5 years ago
Awesome explanation! You are so gifted!
@venkatdinesh4469 3 years ago
Your teaching style is really awesome...
@TheAsimjan 4 years ago
Amazing explaining... magically delivers a complex topic.
@fernandolasheras6068 4 years ago
OMG. Best video on NN basic concepts by far. And the craziest too. Very fun to watch. Congrats!!!
@doug8171 6 years ago
Great example of the need for more than one perceptron layer for the XOR.
@nicholask9251 7 years ago
Great videos and tutorials, big fan here. Cool that you don't just write code but also explain the concept at the beginning.
@joshvanstaden7615 3 years ago
Give this man some Concerta! Lol, in all honesty, I love being taught by people who are passionate about what they do. Keep it up!
@Cipherislive 5 years ago
What a genius teacher you are. Appreciate you, sir.
@Sworn973 7 years ago
Interesting, so it's basically the same analogy as in electronics, building logic gates from transistors. You kind of add them together to get more complex operations. Very good material. Keep going, I'm really into this.
@TheTimeforwar 4 years ago
If every black kid in the hood had a teacher like this they'd all succeed at understanding this easily; why? Because this guy's likability makes you want to learn. When you enjoy the person teaching you, you will usually enjoy 'what' they're teaching you. The 'capacity' to 'understand' has very little to do with 'achievability' in human affairs, and 'thinking' certainly pertains to human affairs. I'm understanding concepts I've never encountered before, not because I'm 'smart', but because the instructor in this video is interesting, funny, has a charm all his own, and is not intimidating or threatening in any way; least of all is he boring. Every young person deserves a teacher like this.
@grainfrizz 7 years ago
6:57 genius. Very effective teacher
@backtashmohammadi3824 3 years ago
Holy juice. That was an amazing explanation. My professor at the uni confused me a lot, but this video made my day.
@webberwang6520 6 years ago
I haven't heard this great an explanation before on YouTube, great stuff!
@anonymousvevo8697 2 years ago
The only channel with no haters! Amazing, sir! Good luck, love you.
@critstixdarkspear5375 6 years ago
You should be given your own show on the science network. More educational, fun, engaging and entertaining than 99% of the crap we pay for. Better than most courses I have seen on programming. Bill Nye + Bob Ross + Mr. Rogers. 11/10
@Bo_om2590 7 months ago
This guy has a golden heart
@carlosdebourbondeparme6021 4 years ago
You can only have lunch if you are hungry AND thirsty. Love the videos :)
@missiongrandmastercurvefev8726 7 years ago
Awesome. Your way of teaching is perfect.
@baog4937 5 years ago
Sir, your method is excellent.
@my_dixie_rect8865 6 years ago
Love this video. Explained it really well. I have an exam on Wednesday which covers MLP and the functions of layers and neurones. This should help form my answer.
@60pluscrazy 3 years ago
Very well explained and expressed 👌🙏
@drakshayanibakka11 4 years ago
More excited to watch your videos. Keep rocking with your enthusiasm!
@ahmarhussain8720 3 years ago
I got that click where you suddenly understand a concept by watching this video, thanks so much.
@Sripooja.Mahavadi 5 years ago
How can someone dislike his video? He seems to be a genuinely happy man... exuding joy... let him be :) The kind of excitement he has toward his code is what I need toward my life ;)
@leylasuleymanli725 6 years ago
Today I was supposed to study MLPs, but because of some problems I could not concentrate. After watching your tutorial, you made me smile, forget about my problems, and understand the topic. Thanks a lot :)
@TheCodingTrain 6 years ago
Glad to hear!
@NightRyder 5 years ago
Thanks, got my exam in 8 days!
@mohamedchawila9734 5 years ago
Ting!!! I've learned something, 'xor' => 8:04
@jt-kv3mn 5 years ago
This is more than just a neural networks tutorial! Thx
@Smile-to2ii 2 years ago
I love your energy and smiling face.
@tecnoplayer 7 years ago
Thanks for teaching us assembly, sensei.
@kashan-hussain3948 5 years ago
Thank you, sir, for making concepts easier.
@yisenliang8114 1 year ago
Fantastic explanation! This is just what I need.
@elizabethmathewst 5 years ago
Beautiful presentation
@battatia 7 years ago
third! Really appreciating these tutorials, much friendlier than others!
@TheCodingTrain 7 years ago
Thanks, that's nice to hear!
@gururajahegdev2086 2 years ago
Very nicely explained. Great tutorial.
@AM-jx3zf 4 years ago
wow this guy is so animated. instantly likeable.
@PoojaYadav-hr2ub 4 years ago
Woweeeeee ... Another level of explanation
@algeria7527 7 years ago
I really love the way you teach. Good work, keep it up.
@4Y0P 7 years ago
I love the way you explain things, energetic but informative, loving these videos!
@morphman86 5 years ago
Another way to see linearly separable problems: If it has a binary output, as in it either is or it isn't. With the dots on the canvas, they are either below the line, or they aren't. We just picked "aren't" to mean "above", but that's how we humans chose to read the output. We read it as "below" or "above", the computer reads it as "is" or "isn't". If you draw a line across your data and define a relationship between the data point and the line, the point either falls into that relationship, or it doesn't.
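That "is or isn't" relationship can be written in a couple of lines; here's a tiny illustrative Python snippet (the line and the point values are made up for the example, not from the video):

def side_of_line(x, y, w1, w2, b):
    # The line is w1*x + w2*y + b = 0; the output is purely binary:
    # 1 if the point falls on one side of it, 0 if it doesn't.
    return 1 if (w1 * x + w2 * y + b) > 0 else 0

# Example with the line y = x, written as -x + y = 0:
print(side_of_line(1.0, 3.0, -1.0, 1.0, 0.0))  # 1: the point is above the line
print(side_of_line(3.0, 1.0, -1.0, 1.0, 0.0))  # 0: the point isn't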
@nageshbs8945 4 years ago
11:40 very well explained, thank you!!
@KishanKa 6 years ago
Nice way you have explained the basics, thanks 😊
@sarveshrajan1624 6 years ago
Awesome and easy explanation. Thanks!
@TheCodingTrain 6 years ago
Thank you!
@RafaelBritodeOliveira 7 years ago
I'm really enjoying those videos. Thank you very much for all your hard work.
@montserratcano2389 7 years ago
Great video! Thank you very much! You just saved my academic life :)
@TheCodingTrain 7 years ago
Glad to hear!
@raitomaru 6 years ago
Really enjoyable class!
@furrane 7 years ago
Great video as usual Dan, I'm looking forward to the sequel =) On a side note, I think everyone here understands !AND but the usual way is to call this gate NAND (for Not AND).
@TheCodingTrain 7 years ago
oh, hah, yes, good point!
@endritnazifi3356 1 year ago
Amazing explanation
@eassis2 4 years ago
Outstanding teaching method, really, thank you.
@sachinsharma-kw4zd 6 years ago
You are amazing, bro. Keep it up. I'm learning a lot from you.
@TheCodingTrain 6 years ago
Thank you!
@srinivasadineshparupalli5139 4 years ago
Awesomeness at its best.
@likeyou3317 6 years ago
Damn, Dan, you seem to be such a lovely person, and I say it as a man! Keep doing these tutorials because I don't know if there is any other channel on YT explaining neural networks in code as well as you do.
@abdulhaseebqadeer1062 4 years ago
You are the best, sir.
@learnapplybuild 6 years ago
So much excitement you have for sharing knowledge... I liked that gesture... keep it up, dude... Thank you.
@justsmoking2355 7 years ago
You are really very good when you're teaching... thank you for this video.
@ilariavestale9842 4 years ago
Hi, thanks for the explanation, but are there ways to understand:
- how many hidden layers and how many hidden neurons to use?
- the network type, training function, adaption learning function, performance function, and transfer function?
Thank you so much.
@Vikram-od6ur 4 years ago
Thank you for making these videos
@wawied7881 7 years ago
Good job! Quite an interesting topic.
@xavmanisdabestest 5 years ago
Wait, so perceptrons are these crazy learning logic gates that work on linear systems. That's rad!