To try everything Brilliant has to offer, free, for a full 30 days, visit 👉 brilliant.org/DanielBoctor/. You'll also get 20% off an annual premium subscription! THANKS FOR WATCHING ❤ JOIN THE DISCORD! 👉 discord.gg/WYqqp7DXbm 👇 Let me know what type of content you would like to see next! 👇 Thank you for all of the support, I love all of you
@MaakaSakuranbo 10 days ago
"you clicked on this video because you wanted to know more about neural networks" Actually 3x+1 just reminded me of the Collatz conjecture
@lindhe 9 days ago
Yeah, I was completely click baited by that! 😅
@machine-boy 8 days ago
same I got baited by it too
@w花b 8 days ago
Right?? I was expecting some link between the conjecture and NNs :(
@senyai 8 days ago
I feel Rickrolled
@LambdaJack 8 days ago
@15:35 It was intentional. I don't see the point. Exactly as _advertised_. I was mistaken to expect anything less from the _Brilliant_ title!
@PascalProducers 8 days ago
This was the first video that explained neural networks in a way that actually made sense to me: it showed me the math concepts behind them without going down to the level of the actual calculations and becoming overwhelming, which made it really enjoyable and easy to follow, and left me excited to learn more about the ReLU/log functions and how they truly work. Thank you!!
@DanielBoctor 5 days ago
thank you for the support, I'm glad it was helpful 😊
@imyourmanzi 5 days ago
you explained succinctly in 20 minutes what my college AI professor couldn't get across in 15 weeks
@ASCENDEDPulsar 9 days ago
I remember trying to write basic neural networks in 2019 and getting lost in the terminology of activation functions. I want to say thank you for giving clear explanations of sophisticated fields, be it reverse engineering or neural networks
@bradleystone6498 9 days ago
You really did a great job explaining this. It would really be great if you extended this to show how probabilistic functions are included in this process and how LLMs generate the next token based on the previous chains of tokens.
@EnginoIsRunning 10 days ago
Your videos are astonishingly clear in every way. Even with just a basic knowledge of coding, it's possible to understand everything. Good job!
@Shaunmcdonogh-shaunsurfing 8 days ago
Very well presented and thank you
@garbage_blob 8 days ago
Very informative video. Not my field but you've explained it really well to a layman
@ataarono 9 days ago
9:40 negative house prices are perfectly reasonable. If, for example, you input a given prison cell, it should output the bail price.
@mohammeddigital 9 days ago
Damn!
@tdv8686 9 days ago
😂😂
@Bola382 8 days ago
clickbaited by the Collatz conjecture
@jacobjayme6280 9 days ago
ANOTHER BANGERRR BY DANIEL BOCTOR!!!!
@airman122469 9 days ago
Yes. Neural nets are just combinations of linear functions. It’s like a learned Taylor series expansion for nearly any learnable pattern.
@MartinBlaha 9 days ago
Thank you very much ❤ I like the way you explain quite complex topics. I'd love to see more on AI-related topics for coders. We need to learn how to elevate AI, and this starts with understanding the relevant concepts like neural networks. Well done 👍
@texastoland 9 days ago
I already understood NNs but liked the revelation that using it is just like training without back prop 👏🏼
@royalsrivastava2079 9 days ago
Good work. I really liked the way you explained these topics. It's in my syllabus and this was the best explanation. Now I can solve numericals easily
@RealRatchet 9 days ago
Calling them linear is only technically true, because a network can't really do a proper approximation of a non-linear problem without non-linear activation functions; what it trains are the slopes of the linear activation functions. But I imagine this will be addressed by the time I finish watching the video.
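The point above can be checked in a few lines. This is a standalone NumPy sketch (not code from the video, and the layer sizes are arbitrary): stacking two linear layers with no activation between them collapses into a single affine map, while inserting a ReLU breaks that collapse and lets the network represent non-linear functions.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)  # first layer
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)  # second layer
x = rng.normal(size=3)

# Two linear layers with no activation in between...
no_activation = W2 @ (W1 @ x + b1) + b2

# ...collapse into ONE affine map W x + b, so the extra depth buys nothing:
W, b = W2 @ W1, W2 @ b1 + b2
assert np.allclose(no_activation, W @ x + b)

# A ReLU between the layers prevents that collapse: the output becomes
# piecewise linear in x rather than a single linear map.
with_relu = W2 @ np.maximum(W1 @ x + b1, 0) + b2
print(no_activation, with_relu)
```

The assertion passes for any choice of weights, which is exactly why depth without non-linearity is pointless.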
@shantanusapru 9 days ago
Nicely explained! Looking forward to those other 2 in-depth videos...!
@MrVohveli 9 days ago
"So it's all functions?" - Always has been.
@oliver1784 9 days ago
are we just 4 dimensional neural networks? whatever that means... also, great video! ❤
@ginebro1930 9 days ago
Non-linear, and they can imitate any non-linear function.
@genesisreaper2113 9 days ago
Weights and biases, neurotransmitters and minimum activation levels, tomato tomato, potato potato. All the same thing the further down you go. With advanced enough technology you could turn anyone's brain into an algorithm. Assuming quantum physics can be put into an equation, of course.
@xanaxity 9 days ago
It's y = mx + c. 🤔
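In the single-input case that really is what one neuron computes before its activation function is applied. A toy sketch (the function and parameter names here are mine, for illustration):

```python
def neuron(x, m, c):
    # One pre-activation neuron: weight m is the slope, bias c is the intercept.
    return m * x + c

# A neuron with weight 2 and bias 1 maps input 3 to 2*3 + 1:
print(neuron(3, 2, 1))  # 7
```

Training just searches for the m and c (and their many-dimensional analogues) that make the line fit the data.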
@nickthetalldude 3 days ago
Thumbnail infringes on CGP Grey trademark
@artemis1825 10 days ago
3b1b OST?
@DeGandalf 9 days ago
Just a very small correction: For the definition of deep neural networks it actually also needs to extract the features on its own. So just stuffing the housing features into an MLP doesn't really make it deep learning.
@entropycat 9 days ago
I hate your click bait title!
@Darkev77 9 days ago
It’s not though
@Not_Him_FrFr 9 days ago
@Darkev77 they probably just meant the way it attracts people to click, making them expect more, not that the video itself is clickbait
@LambdaJack 8 days ago
Too much to expect from the author, an LLM itself, and its followers... RegExps, I guess.
@TrusePkay 8 days ago
Video is good but stop using other people's thumbnails and theme music
@ImperialRoads 9 days ago
This man is the goat fill me up with more content king 🔥🔥🗣️