What is Vanishing/Exploding Gradients Problem in NNs

8,934 views

Mısra Turp

A day ago

Comments
@ProgrammingCradle · 2 years ago
Wonderful explanation, Misra. It's a simple concept, but I have seen many people get confused by it. I am sure it will help many learners.
@deniz.7200 · A year ago
The best explanation of this issue! Thank you very much Mısra!
@angelmcorrea1704 · A year ago
Thank you so much, excellent explanation.
@noelleletoile8980 · 2 months ago
Thanks, super helpful!!
@mmacaulay · 2 years ago
Hi Misra, your videos are amazing and your explanations are usually very accessible. However, while the vanishing/exploding gradient problem in NNs is a complex concept, I unfortunately found your train of thought in this video confusing. Would it be possible to provide another video on the vanishing/exploding gradient problem? Many thanks.
@Sickkkkiddddd · A year ago
Essentially, deeper networks increase the risk of unstable gradients because of the multiplicative effect of the chain rule during back-propagation. In the vanishing case, the gradients reaching the earlier layers shrink towards zero, so those neurons learn essentially nothing during backprop and the network takes forever to train. In the exploding case, the gradients reaching the earlier layers blow up, which destabilises the training process and produces unreliable parameters.
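The multiplicative effect described in this comment can be sketched numerically. In a minimal toy model (one scalar weight per layer, sigmoid activations, randomly chosen values; not the video's code), backprop multiplies the gradient by `w * sigmoid'(z)` at each layer, and since the sigmoid derivative never exceeds 0.25, the product collapses towards zero over many layers:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # maximum value is 0.25, at x = 0

# Toy backprop through a deep stack of scalar sigmoid "layers":
# each layer multiplies the incoming gradient by w * sigmoid'(z).
rng = np.random.default_rng(0)
n_layers = 50
weights = rng.normal(0, 1, n_layers)   # one scalar weight per layer
pre_acts = rng.normal(0, 1, n_layers)  # pre-activation values z

grad = 1.0
for w, z in zip(weights, pre_acts):
    grad *= w * sigmoid_deriv(z)       # chain rule, layer by layer

print(f"gradient after {n_layers} layers: {grad:.3e}")  # vanishingly small
```

Swapping the weights for large values (e.g. `rng.normal(0, 10, n_layers)`) flips the same loop into the exploding case, which is why weight initialization and activation choice both matter.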
@noelleletoile8980 · 2 months ago
Her explanation was clear even to a neophyte neuroscientist
@bay-bicerdover · A year ago
There is only one video on YouTube about Dynamic Length Factorization Machines. If you could explain how that machine works in a hands-on video for novices like me, it would be much appreciated.
@khyatipatni6117 · 11 months ago
Hi Misra, I bought the notes for deep learning. I did not know it was a one-time download; I downloaded it at the time and then lost it. How can I re-download it without paying again? Please help.
@misraturp · 11 months ago
Hello, it is not a one-time download. Did you try the link sent to your email again?
@bay-bicerdover · A year ago
The maximum value of the sigmoid function's derivative is 0.25.
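This comment's claim is easy to verify numerically: the sigmoid derivative is `sigmoid(x) * (1 - sigmoid(x))`, which peaks at x = 0 where sigmoid(0) = 0.5, giving 0.5 × 0.5 = 0.25. A quick check (my own sketch, not from the video):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)), maximised at x = 0
xs = np.linspace(-10, 10, 10001)
derivs = sigmoid(xs) * (1 - sigmoid(xs))

print(derivs.max())  # ≈ 0.25, the peak at x = 0
```

This bound is exactly why stacked sigmoid layers tend to vanish gradients: every layer scales the gradient by at most 0.25 before the weight is even taken into account.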