I mean the way sir explained what dropout finally achieves was beyond awesome. Giving out the answer and letting us think why, and then putting our thoughts into words. Just wonderful!
@adityarajora7219 4 years ago
Moral of the story: Do not ignore NPTEL.
@neotodsoltani5902 2 years ago
Yeah, I used to ignore NPTEL and never watched any of their videos. But after this series of lectures, NPTEL is one of my top priorities for learning anything :)
@anilyarlapalli3866 2 years ago
Finally my search for the dropout intuition ended here. Great video, thanks sir and NPTEL.
@yatrikshah4492 2 years ago
I have gone through many videos, but this one is the best. Finally my search for dropout ended here.
@shambhabchaki5408 2 months ago
The best! Great!
@manmeetpatel9475 1 year ago
Very good. Enjoyable. Please, someone take care of the volume spike in the opening and closing themes :)
@musasall5740 5 years ago
EXCELLENT!
@ernikitamalviya 4 years ago
Thank you Sir for the fantastic explanation.
@vaibhavthalanki7320 2 years ago
10:50 — the explanation (for scaling the parameters) I was looking for after watching Andrew Ng's course.
@akshayjain2211 2 years ago
Same here. In fact, I got this video as a recommendation after Andrew Ng's video on dropout.
@Dyslexic_Neuron 1 year ago
But if the keep probability of every node is the same, then scaling the weights just scales down the final output. I don't see how that is equivalent to passing the test set through all 2^n models and averaging.
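A small sketch may help with the question above. For a single *linear* unit the equivalence is exact: averaging the unit's output over all 2^n dropout masks (each mask weighted by its probability) gives the same result as simply scaling the weights by the keep probability p, because the expected value of each Bernoulli mask entry is p and the dot product is linear. The variable names and setup here are illustrative, not from the lecture:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 4                      # number of input nodes (small, so 2^n is enumerable)
p = 0.5                    # keep probability for each node
x = rng.normal(size=n)     # input activations
w = rng.normal(size=n)     # weights into one linear output unit

# Brute-force ensemble average: enumerate all 2^n binary keep/drop masks,
# weight each sub-network's output by the probability of that mask.
avg = 0.0
for mask in itertools.product([0, 1], repeat=n):
    m = np.array(mask)
    prob = np.prod(np.where(m == 1, p, 1 - p))  # P(this mask occurs)
    avg += prob * np.dot(m * x, w)              # sub-network output

# Test-time trick: keep all nodes, scale the weights by p once.
scaled = np.dot(x, p * w)

print(np.isclose(avg, scaled))  # True: exact match for a linear unit
```

With nonlinear hidden units the two quantities are no longer exactly equal; the weight-scaling rule is then only an approximation to the ensemble average (related to a geometric mean over sub-networks), which is why it is described as an intuition rather than a proof.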
@dorgeswati 3 years ago
In some cases the loss increases after adding dropout. What should be done in such cases? Allow more epochs so the model can learn more during training?