If you found this video helpful, then hit the *_like_* button👍, and don't forget to *_subscribe_* ▶ to my channel as I upload a new Machine Learning Tutorial every week.
@sumitsp01 • 2 years ago
This is by far the best video on introduction to optimizers. Very precise and articulate, and it clears all the doubts. Thanks a lot, brother!
@MachineLearningWithJay • 2 years ago
Glad it helped you 😇
@alexkonopatski429 • 2 years ago
Good video! But I have only one question: where does the noise that you mentioned at 5:11 come from?
@MachineLearningWithJay • 2 years ago
That's how the loss changes when we have a larger number of features.
@arvinflores5316 • 3 years ago
Oh wow. When I started learning ML last year, your linear regression video was one of the first I watched. Now I'm a data scientist, and you're still uploading high-quality videos. Thank you! Hopefully we'll get to see LSTMs and Transformers in the future. :P Good day.
@MachineLearningWithJay • 3 years ago
Wow... really good to know this. Thank you for sharing your story! And yes, I will be uploading videos on LSTMs.
@arvinflores5316 • 3 years ago
@MachineLearningWithJay Wow, nice! Can I make a suggestion? Maybe in the future you could cover weight initialization, like Xavier and He initialization. Those topics tend to be ignored because the framework basically handles them for us (I'm guilty of that :P) without us knowing the reasoning behind them, e.g. the disadvantages of initializing weights with zero values.
@MachineLearningWithJay • 3 years ago
@arvinflores5316 Thank you for the suggestion. I will definitely consider making videos on these topics.
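The initializations suggested above can be sketched in a few lines of NumPy. This is a minimal illustration, not from the video: the function names and shapes are my own, Xavier scales the variance by fan-in plus fan-out (suited to tanh/sigmoid), He scales by fan-in alone (suited to ReLU), and the last lines show why all-zero initialization fails — every unit in the layer produces the same output and would receive the same gradient, so the units never differentiate.

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    # Glorot/Xavier: Var(W) = 2 / (fan_in + fan_out)
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

def he_init(fan_in, fan_out):
    # He: Var(W) = 2 / fan_in, keeps activation variance stable under ReLU
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

# The symmetry problem with zero init: every hidden unit
# computes the identical output, so gradients are identical too.
W_zero = np.zeros((3, 2))
x = np.array([1.0, 2.0, 3.0])
h = x @ W_zero  # both units output 0 — the layer never breaks symmetry
```

With random (Xavier/He) initialization each unit starts from a different point, so the symmetry is broken from step one.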
@dr.ranjinips3103 • 8 months ago
Very nice explanation. Super!
@devanshsanghavi • 2 years ago
Brilliant explanation... Keep it up!
@MachineLearningWithJay • 2 years ago
Thank you!
@afn8370 • 7 months ago
Very good explanation. You need more views.
@harpbeat500 • 2 months ago
Great video!
@MachineLearningWithJay • 2 months ago
Glad it was helpful!
@hardikdas2378 • 3 years ago
Very informative and precise.
@MachineLearningWithJay • 3 years ago
Thank you!
@AbhinavSingh-oq7dk • 2 years ago
So from what I understand, in mini-batch gradient descent the model trains on the 1st mini-batch and updates the weights, then those updated weights are used to train on the 2nd mini-batch, then the 3rd, ..., until the last mini-batch (1 epoch). Then the weights updated after the last mini-batch are used again on the 1st mini-batch in the 2nd epoch, and so on? Do correct me if I'm wrong.
@MachineLearningWithJay • 2 years ago
Your understanding is correct, Abhinav. I would just like to refine the wording: the weights updated after training on any mini-batch are used to propagate forward, and then they are updated again during backward propagation. E.g., randomly initialize the weights at the beginning; perform forward propagation on the 1st mini-batch, then backward propagation, then update the weights. Use those updated weights for forward propagation on the 2nd mini-batch, then backward propagation, update the weights again, and so on.
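The loop described in this exchange can be sketched in NumPy. This is a minimal illustration with my own function name and hyperparameters (not code from the video), using linear regression with MSE loss as the model: each mini-batch does a forward pass with the current weights, a backward pass (gradient computation), and an immediate update, so the next mini-batch always sees the freshly updated weights, and each new epoch reshuffles and starts again from the 1st mini-batch.

```python
import numpy as np

def minibatch_gd(X, y, lr=0.1, batch_size=32, epochs=100, seed=0):
    """Mini-batch gradient descent for linear regression with MSE loss."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.normal(scale=0.01, size=d)  # random init, not zeros
    b = 0.0
    for _ in range(epochs):
        idx = rng.permutation(n)  # reshuffle the data each epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # forward propagation with the *current* (already-updated) weights
            err = Xb @ w + b - yb
            # backward propagation: MSE gradients w.r.t. w and b
            grad_w = 2 * Xb.T @ err / len(batch)
            grad_b = 2 * err.mean()
            # update immediately; the next mini-batch uses these new weights
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b
```

Setting `batch_size=n` recovers batch gradient descent, and `batch_size=1` recovers stochastic gradient descent, which is where the noisy loss curve mentioned earlier in the thread comes from.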
@AbhinavSingh-oq7dk • 2 years ago
@MachineLearningWithJay Thanks :)
@amirrezamousavi5139 • 2 years ago
Great job
@MachineLearningWithJay • 2 years ago
Thank you!
@choice_of_royals5268 • 2 years ago
Amazing, deeply explained. Thanks!
@EEDNAGELIVINAYAKSHRINIWAS • 3 years ago
Your videos are helpful. Can you suggest a good book on the same?
@MachineLearningWithJay • 3 years ago
Hi... I don't refer to any book, so I can't suggest one. But you can search for good books on ML online; I once found an article listing the top 10 books for learning ML.
@EEDNAGELIVINAYAKSHRINIWAS • 3 years ago
@MachineLearningWithJay Ok! Thanks! But does it cover topics like batch normalization and standard networks like LeNet, AlexNet, VGG, and GoogLeNet in detail?
@MachineLearningWithJay • 3 years ago
@EEDNAGELIVINAYAKSHRINIWAS Have you read the original research papers for these? I think their research papers are the only place to really learn about them.
@EEDNAGELIVINAYAKSHRINIWAS • 3 years ago
@MachineLearningWithJay Ok. Can you mail me at my ID? Let's discuss some projects separately.
@MachineLearningWithJay • 3 years ago
Hi Vinayak, you can mail me at codeboosterjp@gmail.com with your query. I will see what I can do to help.