Batch Normalization in neural networks - EXPLAINED!

5,250 views

CodeEmporium
A day ago

Comments: 15
@alhadparashtekar1314 10 days ago
This is the best explanation of batch normalization that I have seen. I really can't code unless I understand the reasoning behind the mathematics. Thanks for the explanation.
@sccur 11 months ago
Man I love your content! I like challenging hikes solo and chill hikes with the kids
@KiyotakaAyanokoji1 11 months ago
Option B, my thought... it is definitely unstable since it is an ellipse and not a circle. The red represents high loss (a peak), green is lower loss (a valley). Seeing this, a small change along the y-axis (Param 2) quickly leads us into the valley (high slope), giving large loss jumps.
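The intuition in this comment can be checked with a toy experiment (my own sketch, not from the video; the step size and curvatures are illustrative). On an elongated elliptical loss surface, a learning rate that is fine for the shallow direction can make the steep direction overshoot and diverge, while the same learning rate converges on a circular surface — which is closer to what normalized inputs produce.

```python
import numpy as np

def gd_steps(hessian_diag, lr, steps=100):
    """Gradient descent on the quadratic loss f(w) = 0.5 * sum(h_i * w_i**2)."""
    w = np.array([1.0, 1.0])
    for _ in range(steps):
        w = w - lr * hessian_diag * w  # gradient of f is h_i * w_i
    return w

# Elliptical surface: steep along Param 2 (curvature 25), shallow along Param 1.
w_ellipse = gd_steps(np.array([1.0, 25.0]), lr=0.1)
# Circular surface: equal curvature in both directions.
w_circle = gd_steps(np.array([1.0, 1.0]), lr=0.1)

print(w_ellipse)  # Param 2 oscillates and blows up
print(w_circle)   # both params shrink toward the minimum at 0
```

With `lr=0.1`, each step multiplies the steep coordinate by (1 − 0.1·25) = −1.5, so it diverges while oscillating in sign, exactly the "large loss jumps" described above.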
@algorithmo134 7 months ago
Hi @CodeEmporium, do you have the solution to quiz 2 at 8:46?
@Sickkkkiddddd 6 months ago
How is information preserved with BN? If there was some pattern in the data pre-normalisation, wouldn't that be lost after batch norm?
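A minimal NumPy sketch (my own, not from the video) of why batch norm need not destroy information: normalization is an invertible affine map per feature, and the learnable scale γ and shift β can restore any mean and variance that standardization removed. Relative patterns within each feature are preserved.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Standardize each feature over the batch dimension, then rescale.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learnable scale/shift: the network can undo the normalization
    # entirely by learning gamma = std(x) and beta = mean(x).
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(5.0, 3.0, size=(32, 4))       # batch of 32 samples, 4 features
y = batch_norm(x, np.ones(4), np.zeros(4))   # zero mean, unit variance output
x_back = batch_norm(x, x.std(axis=0), x.mean(axis=0))  # recovers x (up to eps)
```

So the pre-normalization pattern is not lost: it is rescaled, and the network keeps the freedom to scale it back if that helps the loss.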
@thesuriya_3 11 months ago
Bro, your transformer video is amazing, zero to hero 👾
@x_avimimius3294 11 months ago
Hi, I have an idea about an AI-based podcast, where I want AI to be the main frame of the podcast. Can you guide me on this?
@neetpride5919 11 months ago
Yes, I like challenging hikes. Really curious to see how this relates to the topic lol
@rethickpavan4264 11 months ago
You are Mr. Consistent 🎉
@Pure_Science_and_Technology 11 months ago
I'm being selfish here, but can you create a video about creating datasets for fine-tuning an LLM?
@poorinspirit8322 7 months ago
1 - B, 2 - B, 3 - C. Are these answers correct?
@shreysrivastava7515 11 months ago
I believe it should be B, B, C
@AbhijithJayanarayan 11 months ago
B,A,C
@FaAbYorker 11 months ago
To be honest, your "QUIZZ TIME" announcement is really annoying 😶