Here is how Transformers ended the tradition of Inductive Bias in Neural Nets

7,387 views

Neural Breakdown with AVB

A day ago

Comments: 22
@hieunguyentranchi947 · 10 months ago
This is gold. I hope it gets the ATTENTION it deserves.
@avb_fj · 10 months ago
Thanks!! More attention will surely TRANSFORM this channel! 😂
@TP-ct7qm · 11 months ago
Awesome video! This (together with the last two videos) is one of the best explanations of Transformers I've seen. Thanks and keep it up!
@karthikdulam9830 · 13 days ago
This video needs to be more popular!
@AI_ML_DL_LLM · 11 months ago
The gist of this video is at 4:29. Great job, thanks!
@JasFox420 · 11 months ago
Dude, you are a treasure, keep it up!
@amoghjain · 11 months ago
Wowww! What a great explanation! It helps knit so many individual concepts together into one cohesive knowledge base!! Thanks a lot for making this video and all the animations!
@avb_fj · 11 months ago
Thanks!!
@ZenTheSapo · A month ago
Incredible series, helped me a lot.
@IdPreferNot1 · 6 months ago
Came here after getting interested via 3Blue1Brown. It's clear you've got a great explanation style... plus you got there earlier ;). Hope your channel's following grows to match your outstanding quality.
@avb_fj · 6 months ago
Welcome! Thanks a lot for the shoutout!
@matiasalonso6430 · 11 months ago
Congrats!! Awesome channel!
@avb_fj · 11 months ago
Thanks!
@SuebpongPruttipattanapong · 27 days ago
Thank you so much!
@GabrielAnguitaVeas · 7 months ago
Thank you!
@user-yr9sf2yr3n · A month ago
great, great video.
@sahhaf1234 · 10 months ago
I really don't want to leave you just one like. Instead, I want to leave you one hundred likes... Unfortunately, Google limits me to one...
@avb_fj · 10 months ago
Thanks!! Super appreciated!
@user-wm8xr4bz3b · 5 months ago
At 2:33, you mentioned that self-attention is more biased, but at 2:54 you also mentioned that self-attention reduces inductive bias?? Sorry, but I'm a bit confused.
@avb_fj · 5 months ago
Self-Attention indeed reduces inductive bias and adopts a more general learning framework. At 2:33, I am asking a question: "IS Self-Attention more general or more biased?" And then I continue with "I'll argue that Self-Attention is not only more general than CNNs and RNNs but even more general than MLP layers".
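(Not from the video, just a toy illustration of the point above.) A minimal Python/numpy sketch: plain self-attention with no positional encodings is permutation-equivariant, so it bakes in no ordering or locality assumptions the way RNNs and CNNs do. The function name and toy weights below are made up for the example.

import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Single-head self-attention over a (seq_len, d_model) matrix X.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])            # every token attends to every other token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # row-wise softmax
    return weights @ V

rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(5, d))                             # 5 toy tokens
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

perm = rng.permutation(5)                                # shuffle the token order
out = self_attention(X, Wq, Wk, Wv)
out_shuffled = self_attention(X[perm], Wq, Wk, Wv)
print(np.allclose(out[perm], out_shuffled))              # True: outputs shuffle the same way

Shuffling the inputs just shuffles the outputs, which is exactly the "no built-in assumptions about structure" property being argued for; positional encodings are what add ordering information back in.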
@axe863 · A month ago
Should have a hybrid structure... 😅
@avb_fj · A month ago
This is true. A lot of computer vision research has now moved toward adding elements of CNNs/inductive bias back into the Vision Transformer architecture, for example Swin Transformers.
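A rough numpy sketch of that hybrid idea (my own toy illustration, not actual Swin Transformer code): restricting attention to local, non-overlapping windows reintroduces a CNN-like locality bias while keeping the attention mechanism itself.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def windowed_self_attention(X, Wq, Wk, Wv, window=4):
    # Attention computed independently inside non-overlapping windows of tokens.
    outputs = []
    for start in range(0, X.shape[0], window):
        Xw = X[start:start + window]                     # only the local neighbourhood is visible
        Q, K, V = Xw @ Wq, Xw @ Wk, Xw @ Wv
        A = softmax(Q @ K.T / np.sqrt(K.shape[-1]))
        outputs.append(A @ V)
    return np.vstack(outputs)

rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(16, d))                             # e.g. 16 flattened image patches
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
print(windowed_self_attention(X, Wq, Wk, Wv).shape)      # (16, 8): same shape, but only local mixing

In Swin Transformers the windows are also shifted between layers so information can still flow across the whole image, which is how the locality bias and global context get combined.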