How Does Batch Normalization Work

4,631 views

Mısra Turp

A day ago

Vanishing and exploding gradients are two of the main problems we face when building neural networks. Before jumping into fixes, it is important to understand what these terms mean, why the problems happen, and what trouble they cause for our networks. In this video, we will learn what it means for gradients to vanish or explode, and we will take a quick look at the techniques available for dealing with them.
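One of those techniques is the batch normalization this video covers. As an illustrative sketch (assuming TensorFlow/Keras; this is not the course's exact code from the GitHub repo linked below), a BatchNormalization layer is typically inserted between a layer's linear transformation and its activation:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# A small dense network for 28x28 inputs (e.g. MNIST), with batch
# normalization keeping each layer's activations well-scaled so that
# gradients neither vanish nor explode as easily during training.
model = keras.Sequential([
    layers.Flatten(input_shape=(28, 28)),
    layers.Dense(300, use_bias=False),   # bias is redundant before batch norm
    layers.BatchNormalization(),         # normalize, then learn scale (gamma) and shift (beta)
    layers.Activation("relu"),
    layers.Dense(100, use_bias=False),
    layers.BatchNormalization(),
    layers.Activation("relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

Placing BatchNormalization before the activation is one common choice; placing it after the activation also works in practice, and the video's ordering may differ.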
Previous lesson: • How to Choose an Activ...
Next lesson: • Gradient Clipping and ...
📙 Here is a lesson notes booklet that summarizes everything you learn in this course in diagrams and visualizations. You can get it here 👉 misraturp.gumr...
👩‍💻 You can get access to all the code I develop in this course here: github.com/mis...
❓To get the most out of the course, don't forget to answer the end-of-module questions:
fishy-dessert-...
👉 You can find the answers here:
fishy-dessert-...
RESOURCES:
🏃‍♀️ Data Science Kick-starter mini-course: www.misraturp....
🐼 Pandas cheat sheet: misraturp.gumr...
📥 Streamlit template (updated in 2023, now for $5): misraturp.gumr...
📝 NNs hyperparameters cheat sheet: www.misraturp....
📙 Fundamentals of Deep Learning in 25 pages: misraturp.gumr...
COURSES:
👩‍💻 Hands-on Data Science: Complete your first portfolio project: www.misraturp....
🌎 Website - misraturp.com/
🐥 Twitter - / misraturp

Comments: 14
@anngladyo5668
@anngladyo5668 21 days ago
You really have a knack for teaching, thank you so much!! Gotta kick my deep learning exam in the ass.
@pra1699
@pra1699 A year ago
This topic is very complex; it might require a rewatch for me. You are very good at teaching.
@misraturp
@misraturp A year ago
Thank you! Good to hear you liked it :)
@BobbyWicked
@BobbyWicked A year ago
Very nice! Re: incorrect calculations, there's a typo at 5:40 in the right-side version of x hat. I believe you meant 46 rather than 46^2?
@bay-bicerdover
@bay-bicerdover A year ago
Well spotted.
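For readers following this thread: the x hat at that timestamp is the normalized value in the standard batch normalization computation (as in Ioffe & Szegedy, 2015), given for reference by

$$\mu_B = \frac{1}{m}\sum_{i=1}^{m} x_i, \qquad \sigma_B^2 = \frac{1}{m}\sum_{i=1}^{m}(x_i - \mu_B)^2$$

$$\hat{x}_i = \frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}}, \qquad y_i = \gamma\,\hat{x}_i + \beta$$

where gamma and beta are learned scale and shift parameters and epsilon is a small constant for numerical stability. The comment above refers to a numeric slip in the on-screen example, not to the formula itself.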
@aadilminhaz1637
@aadilminhaz1637 7 months ago
Perfect explanation in the simplest way. 👏
@bay-bicerdover
@bay-bicerdover A year ago
You clearly know your craft; this is by far the most explanatory video on the batch normalization layer.
@GregThatcher
@GregThatcher 4 months ago
Thanks!
@massoudkadivar8758
@massoudkadivar8758 A year ago
Best teacher ever, thanks
@misraturp
@misraturp A year ago
Wow, thanks!
@nguyenhaidung8833
@nguyenhaidung8833 A year ago
Hi Misra, in your previous example with MNIST, you divided the input values by 255. Is that batch normalization for the input layer?
@bay-bicerdover
@bay-bicerdover A year ago
See the definition of normalization at 2:10.
A year ago
@@bay-bicerdover 1:30
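A small illustration of the distinction raised in this thread (a sketch, not code from the video): dividing MNIST pixel values by 255 is a fixed rescaling of the inputs to [0, 1], with nothing learned, whereas batch normalization standardizes each feature using statistics computed from the current batch (shown here with gamma = 1 and beta = 0):

import numpy as np

# A toy "batch" of two samples with three features each.
x = np.array([[0.0, 128.0, 255.0],
              [64.0, 192.0, 32.0]])

rescaled = x / 255.0                   # fixed input scaling; no statistics involved

mu = x.mean(axis=0)                    # per-feature batch mean
var = x.var(axis=0)                    # per-feature batch variance
eps = 1e-3                             # small constant for numerical stability
x_hat = (x - mu) / np.sqrt(var + eps)  # batch-normalized values

So scaling by 255 is plain input preprocessing rather than batch normalization in the layer sense, though both aim to keep values in a well-behaved range.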
@AbdallahBoukouffallah
@AbdallahBoukouffallah 10 months ago
You are so pretty, I can't stop watching your videos.
Batch normalization | What it is and how to implement it
13:51
AssemblyAI
60K views
Batch normalization
15:33
Brandon Rohrer
10K views
How to Solve Vanishing Gradients in Keras and Python
10:06
Mısra Turp
1.7K views
Why Does Batch Norm Work? (C2W3L06)
11:40
DeepLearningAI
199K views
How to select the correct optimizer for Neural Networks
13:48
Mısra Turp
2.2K views
NN - 21 - Batch Normalization - Theory
18:09
Meerkat Statistics
801 views
Batch Normalization - EXPLAINED!
8:49
CodeEmporium
107K views
Layer Normalization - EXPLAINED (in Transformer Neural Networks)
13:34
What is Vanishing/Exploding Gradients Problem in NNs
6:07
Mısra Turp
8K views
But what is a convolution?
23:01
3Blue1Brown
2.6M views