Neural nets can learn any function, but they are also prone to overfitting the data. If you are interested in finding out why models overfit and underfit, make sure to check out this video: kzbin.info/www/bejne/a57FiWl_id-hfs0
@SirGisebert 2 years ago
Clear, fluid, and approachable - good job!
@datamlistic 2 years ago
Thank you!!
@simonetruglia 6 months ago
This is a very good video, mate. Thanks for it!
@datamlistic 6 months ago
Thanks! Happy to hear that you liked it! :)
@kraigochieng6395 8 months ago
Thanks. You explained all the parameters well. The animations of how the weights and biases affect the final output really helped. Thanks again.
@datamlistic 8 months ago
Thanks for the feedback! Happy to hear that you liked the explanation! :)
@amatinqureshi4917 1 year ago
I was looking for exactly this type of explanation. Thank you!!!
@datamlistic 1 year ago
Glad it was helpful! :)
@exxzxxe 11 months ago
Very well done, sir!
@datamlistic 11 months ago
Thanks! Glad you liked it! :)
@Nzargnalphabet 11 months ago
I would suggest x/|x| as a step function that works on discontinuous functions, but it is itself discontinuous and actually not defined at zero. It's better to simply go into the base code and access the sign bit of the number, although that really isn't very generalizable at all.
@datamlistic 11 months ago
Never heard of x/|x| being used as a step function. Have you encountered it implemented in a neural network anywhere?
@Nzargnalphabet 11 months ago
@@datamlistic No, it's more of a thing I've been experimenting with. I was mostly trying to find a better way to approximate the floor function, but it could still have a use here.
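For what it's worth, x/|x| is just the sign function written as a ratio, so it inherits the hole at zero; reading the sign bit (e.g. via Python's `math.copysign`) sidesteps that. A small sketch of the two options from this thread (the function names are my own, not from the video):

```python
import math

def sign_ratio(x):
    """The x/|x| step function from the comment above -- undefined at zero."""
    if x == 0:
        raise ZeroDivisionError("x/|x| is not defined at 0")
    return x / abs(x)

def sign_bit(x):
    """Read the sign bit instead; this also distinguishes 0.0 from -0.0."""
    return -1.0 if math.copysign(1.0, x) < 0 else 1.0

print(sign_ratio(3.5))   # 1.0
print(sign_bit(-0.0))    # -1.0 (copysign sees the sign bit of -0.0)
```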
@AfreediZ 3 months ago
Great, please keep going
@datamlistic 19 days ago
Will do, thanks!🙏
@florin-andreirusu6424 2 years ago
Nicely explained! Thanks!
@datamlistic 2 years ago
Many thanks!!
@neolinksrv 1 year ago
2:00 What is the operation inside the second neuron?
@datamlistic 1 year ago
That's the sigmoid function. :)
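For readers following along, a minimal sketch of that sigmoid step (the weight, bias, and input values below are made-up numbers for illustration, not from the video):

```python
import math

def sigmoid(z):
    """Logistic sigmoid: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# A neuron computes a weighted input plus a bias, then applies the activation.
w, b, x = 4.0, -2.0, 0.5
print(sigmoid(w * x + b))  # sigmoid(0.0) = 0.5
```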
@uc5331 2 years ago
This was super helpful for me!
@datamlistic 2 years ago
Thank you! I am glad it helped you!
@Nzargnalphabet 11 months ago
If you could detect a discontinuity and its position, which I know to be possible using limits, you could use such a function in that case
@iacobsorina6924 2 years ago
Great video! Thanks!
@datamlistic 2 years ago
Thank you!!!
@JanKowalski-dm5vr 1 year ago
Nice video, this is exactly what I was wondering about. So adding one node in a layer gives the function 2 new inflection points. But what happens to the function when we add another layer with one node?
@datamlistic 1 year ago
Thanks for this! Hopefully I'm not wrong in what I'm about to say, but you get 2^(n+1) inflection points if you have n layers. However, the produced regions are symmetrical. Take a look at my 'why we build deep neural networks' video if you wanna learn more about this. :)
@kadirbasol82 1 year ago
How can we fit an infinite sine wave or other infinite series? Is every input automatically mapped to a modded output, or is there a better solution for infinite series?
@datamlistic 1 year ago
That's a very interesting question! Well, infinite problems require infinite solutions? You could theoretically fit an infinite sine wave or other infinite series using the logic explained in this video if you had an infinite number of neurons. However, I think you may need another type of neural network if you wanted to fit such functions with a finite number of neurons (periodic neural nets?). Unfortunately, I'm not aware of such networks, but I can search the literature a bit if you're still interested.
@kadirbasol82 1 year ago
@@datamlistic Yeah, we are interested in this question. Yes, we can mod the input by the sine wave's period. But what if we don't know that it's an infinite series repeating the same operation all the time?
@datamlistic 1 year ago
Hmm, to the best of my knowledge, and please correct me if I'm wrong, it's impossible to approximate such functions at an infinite number of points without prior knowledge that the data is periodic, or without an infinite number of neurons. Moreover, using the construction logic in this video, you can't perfectly approximate a function even within a bounded range without an infinite number of neurons.
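To make the bounded-range point concrete, here is a rough sketch (not the video's code; `step_approx` and the steepness constant are my own) of the steep-sigmoid "staircase" construction: it tracks sin well inside [0, 2π], but outside that interval every sigmoid has saturated and the output goes flat:

```python
import math

def sigmoid(z):
    """Numerically stable logistic sigmoid."""
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def step_approx(f, lo, hi, n):
    """Approximate f on [lo, hi] with n very steep sigmoids.
    Each neuron switches on near one grid point and contributes
    the next small increment of f -- the 'staircase' construction."""
    xs = [lo + (hi - lo) * i / n for i in range(n + 1)]
    steep = 1000.0  # large weight, so each sigmoid acts like a step

    def g(x):
        y = f(xs[0])
        for i in range(1, n + 1):
            y += (f(xs[i]) - f(xs[i - 1])) * sigmoid(steep * (x - xs[i - 1]))
        return y
    return g

g = step_approx(math.sin, 0.0, 2 * math.pi, 200)
print(abs(g(1.0) - math.sin(1.0)) < 0.05)   # True: accurate inside [0, 2*pi]
print(abs(g(20.0) - math.sin(20.0)) > 0.5)  # True: flat (useless) outside it
```

With 200 neurons the error inside the range shrinks with the grid spacing, but no finite number of neurons built this way says anything about inputs beyond [lo, hi].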
@florinrusu3409 2 years ago
Awesome!
@datamlistic 2 years ago
Thanks mate!
@ikartikthakur 5 months ago
Then does that mean that if a function needs 30 steps to approximate, you'll need 30 hidden neurons? Is that the analogy?
@datamlistic 5 months ago
Yeah... but it's easier to visualize it this way.
@ikartikthakur 5 months ago
@@datamlistic Yes, I really liked this animation. Thanks a lot for sharing!
@SuperMaDBrothers 1 year ago
But this isn't how actual NNs learn functions at all! You could just use intuition to reason this, it's pretty obvious!
@datamlistic 1 year ago
Thank you for your feedback! That is a correct observation: this is indeed not how NNs learn and are used in practice. However, the video presents a theoretical proof that, through intelligent manipulation of a NN's weights, you can approximate any continuous function. It's a nice thing to know and be aware of when working with NNs, because not all statistical models have this property.