Deep Learning (CS7015): Lec 8.11 Dropout

24,885 views

NPTEL-NOC IITM

A day ago

Comments: 17
@rashmidhant3364 4 years ago
The way sir explained what dropout finally achieves was beyond awesome: giving out the answer, letting us think about why, and then putting our thoughts into words. Just wonderful!
@adityarajora7219 4 years ago
Moral of the story: do not ignore NPTEL.
@neotodsoltani5902 2 years ago
Yeah, I was very dismissive of NPTEL and never watched any of their videos. But after this series of lectures, NPTEL is one of my top priorities for learning anything :)
@anilyarlapalli3866 2 years ago
My search for the dropout intuition finally ended here. Great video, thanks sir and NPTEL.
@yatrikshah4492 2 years ago
I have gone through many videos, but this one is the best. My search for dropout finally ended here.
@shambhabchaki5408 2 months ago
Sera! (Bengali: "the best") Great!
@manmeetpatel9475 A year ago
Very good, enjoyable. Please, someone take care of the volume spike in the opening and closing themes :)
@musasall5740 5 years ago
EXCELLENT!
@ernikitamalviya 4 years ago
Thank you, Sir, for the fantastic explanation.
@vaibhavthalanki7320 2 years ago
10:50 is the explanation (for scaling the parameters) I was looking for after watching Andrew's course.
@akshayjain2211 2 years ago
Same here. In fact, I got this video as a recommendation after Andrew Ng's video on dropout.
@Dyslexic_Neuron A year ago
But if the keep probability of every node is the same, then it just scales down the final output. I don't see how that is equivalent to passing the test set through all 2^n models and averaging.
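For a single linear layer the equivalence is in fact exact: the probability-weighted average over all 2^n masked sub-networks equals scaling the output by the keep probability p. With nonlinearities in between, weight scaling is only an approximation (roughly a geometric-mean ensemble), which is the subtlety the lecture alludes to. A minimal NumPy sketch of the linear case (all variable names are illustrative):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 4                           # hidden units, small enough to enumerate all 2^n masks
p = 0.5                         # keep probability for each unit
h = rng.standard_normal(n)     # hidden activations
w = rng.standard_normal(n)     # output weights of a single linear output unit

# Exact average over all 2^n dropout sub-networks, each weighted by the
# probability of its mask: prod(p if unit kept else 1 - p).
avg = 0.0
for bits in itertools.product([0, 1], repeat=n):
    m = np.array(bits)
    prob = np.prod(np.where(m == 1, p, 1 - p))
    avg += prob * (w @ (m * h))

# Test-time weight-scaling rule: multiply the output by p once.
scaled = p * (w @ h)

print(np.isclose(avg, scaled))  # True: for a linear layer the two coincide
```

Because expectation is linear, E[w · (m ⊙ h)] = w · (p · h) = p (w · h); a nonlinearity between the mask and the output breaks this exact identity.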
@dorgeswati 3 years ago
In some cases the loss increases after adding dropout. What should be done in such cases? Allow more epochs so the model can learn more during training?
@PratikShetty-sv9ig 10 months ago
Is scaling required during inference?
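It depends on the variant. With the original scheme from the lecture, the weights (or activations) are multiplied by the keep probability p at test time; with "inverted dropout", which most modern frameworks implement, activations are divided by p during training, so inference needs no scaling at all. A small NumPy sketch of both schemes (variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.8                                   # keep probability
h = np.array([1.0, -2.0, 0.5, 3.0])       # activations of one layer

# Many Bernoulli(p) keep-masks, one per (simulated) training step.
masks = rng.random((100_000, h.size)) < p

# Standard dropout: mask at train time, scale by p at inference.
train_std = masks * h                     # masked training activations
test_std = p * h                          # scaling IS required at inference

# Inverted dropout: divide by p at train time; inference is a plain pass.
train_inv = (masks * h) / p
test_inv = h                              # no scaling needed at inference

# Under either scheme, the training-time expectation matches the test pass.
print(np.allclose(train_std.mean(axis=0), test_std, atol=0.05))
print(np.allclose(train_inv.mean(axis=0), test_inv, atol=0.05))
```

Inverted dropout is usually preferred precisely because the trained network can be deployed unchanged, with dropout simply switched off.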
@venkateshmunagala8089 3 years ago
It's a beautiful video.
@morancium 3 years ago
Dude, this was awesome.