Batch Normalization in neural networks - EXPLAINED!

5,127 views

CodeEmporium

1 day ago

Comments: 14
@KiyotakaAyanokoji1 11 months ago
Option B, my thought... it is definitely unstable since it is an ellipse and not a circle. The red represents high loss (peak), green is less loss (valley). Seeing this, a little change along the y-axis (Param 2) quickly leads us to the valley (high slope), giving large loss jumps.
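The intuition in this comment can be checked numerically. On a hypothetical elongated (elliptical) quadratic loss, the same small parameter step produces a much larger loss change along the steep axis than along the shallow one, which is why such surfaces train less stably; this is a minimal illustration, not the video's exact example.

```python
# Elongated "ellipse" loss: shallow along param 1, steep along param 2.
def loss(p1, p2):
    return p1**2 + 100.0 * p2**2

# Apply the same small step (0.1) along each axis from the point (1, 1).
base = loss(1.0, 1.0)
jump_p1 = abs(loss(1.1, 1.0) - base)  # loss change from nudging param 1
jump_p2 = abs(loss(1.0, 1.1) - base)  # loss change from nudging param 2

# jump_p2 is ~100x larger: the steep axis dominates the gradient,
# so gradient descent oscillates there while crawling along param 1.
```

Normalizing the inputs (as batch norm does for hidden activations) makes the contours closer to circles, so step sizes behave similarly in every direction.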
@sccur 11 months ago
Man, I love your content! I like challenging hikes solo and chill hikes with the kids.
@algorithmo134 7 months ago
Hi @CodeEmporium, do you have the solution to quiz 2 at 8:46?
@Sickkkkiddddd 6 months ago
How is information preserved with BN? If there was some pattern in the data pre-normalisation, wouldn't that be lost after batch norm?
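On this question: batch norm's learnable per-feature scale (gamma) and shift (beta) exist precisely so that normalization need not destroy information; the network can even undo the standardization entirely. A minimal NumPy sketch (hypothetical data, training-style batch statistics):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Standardize each feature over the batch dimension (axis 0).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learnable per-feature scale and shift restore expressiveness.
    return gamma * x_hat + beta

x = np.array([[1.0, 200.0],
              [2.0, 400.0],
              [3.0, 600.0]])

# If training drives gamma to the batch std and beta to the batch mean,
# batch norm reproduces the input almost exactly, so the pre-normalization
# pattern can be fully recovered when it is useful.
gamma = x.std(axis=0)
beta = x.mean(axis=0)
y = batch_norm(x, gamma, beta)
```

In practice gamma and beta rarely converge to exactly these values; the point is that the identity mapping is in the model's hypothesis space, so batch norm reshapes the optimization landscape without forcing information loss.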
@x_avimimius3294 11 months ago
Hi, I have an idea for an AI-based podcast, where I want AI to be the mainframe of the podcast. Can you guide me on this?
@thesuriya_3 10 months ago
Bro, your transformer video is amazing, zero to hero 👾
@rethickpavan4264 11 months ago
You are Mr. Consistent 🎉
@neetpride5919 11 months ago
Yes, I like challenging hikes. Really curious to see how this relates to the topic lol
@Pure_Science_and_Technology 11 months ago
I’m being selfish here, but can you create a video about creating datasets for fine-tuning an LLM?
@shreysrivastava7515 11 months ago
I believe it should be B, B, C
@poorinspirit8322 6 months ago
1 - B, 2 - B, 3 - C. Are these answers correct?
@AbhijithJayanarayan 10 months ago
B,A,C
@FaAbYorker 11 months ago
To be honest, your "QUIZZ TIME" announcement is really annoying 😶