Mini Batch Gradient Descent | Deep Learning | with Stochastic Gradient Descent

  Views: 20,150

Learn With Jay



Comments: 42
@MachineLearningWithJay • 3 years ago
If you found this video helpful, then hit the *_like_* button👍, and don't forget to *_subscribe_* ▶ to my channel as I upload a new Machine Learning Tutorial every week.
@sumitsp01 • 2 years ago
This is by far the best video on introduction to optimizers. Very precise, well articulated, and it clears all the doubts. Thanks a lot, brother!
@MachineLearningWithJay • 2 years ago
Glad it helped you 😇
@alexkonopatski429 • 2 years ago
good video! but I have only one question: where does the noise come from, that you mentioned at 5:11?
@MachineLearningWithJay • 2 years ago
That's how the loss changes when we have a larger number of features
@arvinflores5316 • 3 years ago
Oh wow. When I started learning ML last year, your linear regression video was one of the first vids I watched. Now I'm a data scientist, and you're still uploading high-quality vids. Thank you! Hopefully we could get to see LSTMs and Transformers in the future. :P good day.
@MachineLearningWithJay • 3 years ago
Wow... Really good to know this. Thank you for sharing your story! And yes, I will be uploading videos on LSTM.
@arvinflores5316 • 3 years ago
@@MachineLearningWithJay Wow nice! Can I make a suggestion? Maybe in the future you can include weight initialization methods like Xavier and He norm. Those topics tend to be ignored because the framework basically covers them for us (I'm guilty of that :P) without us knowing the reason behind them, e.g. the disadvantages of initializing weights to 0.
@MachineLearningWithJay • 3 years ago
@@arvinflores5316 Thank you for giving this suggestion. I will definitely consider making videos on these topics.
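For readers curious about the initialization schemes mentioned in this thread, here is a minimal NumPy sketch of zero, Xavier, and He initialization. The formulas are the standard ones from the Glorot and He papers; the `fan_in`/`fan_out` naming and the use of a fixed RNG seed are just illustrative conventions, not anything from the video.

```python
import numpy as np

rng = np.random.default_rng(0)

def zero_init(fan_in, fan_out):
    # All-zero weights: every neuron in a layer computes the same output
    # and receives the same gradient, so neurons never differentiate.
    return np.zeros((fan_in, fan_out))

def xavier_init(fan_in, fan_out):
    # Glorot/Xavier (uniform form): variance ~ 1/fan_avg keeps activation
    # variance roughly constant across tanh/sigmoid layers.
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_init(fan_in, fan_out):
    # He: variance 2/fan_in compensates for ReLU zeroing half the inputs.
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))
```

The zero-init function is included only to make the disadvantage concrete: its gradient symmetry is exactly why random initialization is needed at all.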
@dr.ranjinips3103 • 8 months ago
Very nice Explanation. Super
@devanshsanghavi • 2 years ago
Brilliant explanation... Keep it up!
@MachineLearningWithJay • 2 years ago
Thank you!
@afn8370 • 7 months ago
Very good explanation. You need more views
@harpbeat500 • 2 months ago
great video
@MachineLearningWithJay • 2 months ago
Glad it was helpful!
@hardikdas2378 • 3 years ago
very informative and precise.
@MachineLearningWithJay • 3 years ago
Thank you!
@AbhinavSingh-oq7dk • 2 years ago
So from what I understand, in Mini Batch Gradient Descent, the model will train on the 1st mini-batch, update the weights, and then those updated weights will be used to train on the 2nd mini-batch, update, and then the 3rd mini-batch, ..., till the last mini-batch (1 epoch). Then the weights updated on the last mini-batch will again be used on the 1st mini-batch during the 2nd epoch, and so on? Do correct me if wrong.
@MachineLearningWithJay • 2 years ago
Your understanding is correct, Abhinav. I would just like to correct the words. You can say that the updated weights after training on any mini-batch are used to propagate forward, and then they are updated again in backward propagation. E.g., randomly initialize the weights at the beginning. Perform forward propagation using the 1st mini-batch, then perform backward propagation, then update the weights. Use those updated weights to perform forward propagation on the 2nd mini-batch, then backward propagation, update the weights again, and so on.
@AbhinavSingh-oq7dk • 2 years ago
@@MachineLearningWithJay Thanks :)
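The loop described in this thread can be sketched in a few lines of NumPy. This is a minimal illustration for a plain linear model with mean-squared-error loss, not the code from the video; the names `lr`, `batch_size`, and `epochs` are illustrative hyperparameters.

```python
import numpy as np

def minibatch_gd(X, y, lr=0.1, batch_size=32, epochs=200, seed=0):
    """Mini-batch gradient descent for linear regression with MSE loss."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for epoch in range(epochs):
        order = rng.permutation(n)               # reshuffle each epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            # forward propagation on this mini-batch (uses latest weights)
            err = Xb @ w + b - yb
            # backward propagation: gradients of the mini-batch MSE
            grad_w = 2.0 * Xb.T @ err / len(idx)
            grad_b = 2.0 * err.mean()
            # update weights before moving on to the next mini-batch
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b
```

Note that the weight update happens inside the inner loop, so the 2nd mini-batch always sees the weights already updated by the 1st, and the next epoch starts from the weights left by the last mini-batch — exactly the order described above.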
@amirrezamousavi5139 • 2 years ago
Great job
@MachineLearningWithJay • 2 years ago
Thank you!
@choice_of_royals5268 • 2 years ago
Amazing, deeply explained. Thanks
@EEDNAGELIVINAYAKSHRINIWAS • 3 years ago
Your videos are helpful. Can you suggest a good book on the same?
@MachineLearningWithJay • 3 years ago
Hi... I don’t refer to any book, so I can’t suggest one. Although you can search for good books on ML online. I once found an article listing the top 10 books for learning ML.
@EEDNAGELIVINAYAKSHRINIWAS • 3 years ago
@@MachineLearningWithJay Ok! Thanks!... But do cover topics like batch normalisation and standard networks like LeNet, AlexNet, VGG, GoogLeNet in detail
@MachineLearningWithJay • 3 years ago
@@EEDNAGELIVINAYAKSHRINIWAS Have you read the original research papers for these? I think the research papers are the best place to learn about them.
@EEDNAGELIVINAYAKSHRINIWAS • 3 years ago
@@MachineLearningWithJay Ok.. can you mail me at my ID? Let's discuss some projects separately
@MachineLearningWithJay • 3 years ago
Hi Vinayak, you can mail me at codeboosterjp@gmail.com with your query. I will see what I can do to help.
@GK-jw8bn • 3 years ago
Very good explanation! Well done!
@MachineLearningWithJay • 3 years ago
Thank you!
@chenmoasis • 1 year ago
Amazing, helpful video!
@malikhamza9286 • 3 years ago
Amazing dude. Keep it up.
@MachineLearningWithJay • 3 years ago
Thank You So Much !!
@SaschaRobitzki • 2 years ago
Great job!
@hosseinkarimi3381 • 2 years ago
Thank you so much for your videos.
@MachineLearningWithJay • 2 years ago
You're Welcome. Glad you like them! 😇
@jordiwang • 1 year ago
You got a new sub, bro. Good video
@sharangkulkarni1759 • 2 years ago
ty
@MachineLearningWithJay • 2 years ago
Welcome!
@azadehazhdari931 • 1 year ago
👌
@mugomuiruri2313 • 1 year ago
good boy