Attention is all you need || Transformers Explained || Quick Explained

  21,532 views

Developers Hutt

Comments: 43
@LoneXeaglE · 1 year ago
Every time I need to remember key concepts in transformers I go back to this video. God bless you bro.
@DevelopersHutt · 1 year ago
Thanks bro ❤️
@adokoka · 2 years ago
@8:47 should be: the query (Q) comes from the decoder. The key (K) and value (V) come from the encoder.
@ivant_true · 1 year ago
Yep
@adscript4713 · 7 months ago
So at 7:47, the encoder output should actually read Ke and Ve, right?
@adokoka · 7 months ago
@@adscript4713, that is correct.
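The correction in this thread, that in encoder-decoder (cross) attention Q is projected from the decoder's states while K and V are projected from the encoder's output, can be sketched in NumPy. The names and shapes below are illustrative assumptions, not taken from the video:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(decoder_states, encoder_states, Wq, Wk, Wv):
    # Q is projected from the DECODER's hidden states...
    Q = decoder_states @ Wq
    # ...while K and V are projected from the ENCODER's output.
    K = encoder_states @ Wk
    V = encoder_states @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # (tgt_len, src_len)
    return softmax(scores) @ V        # (tgt_len, d_model)

rng = np.random.default_rng(0)
d = 8
enc = rng.normal(size=(5, d))   # 5 source tokens from the encoder
dec = rng.normal(size=(3, d))   # 3 target tokens generated so far
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = cross_attention(dec, enc, Wq, Wk, Wv)
print(out.shape)  # (3, 8): one attended vector per decoder position
```

Note the output has one row per decoder (query) position, while the attention weights span the encoder (key/value) positions.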
@akankshaaggarwal394 · 2 months ago
A very good and simplified video. Thank you.
@Gank-SquadUK · 2 months ago
This is really good, man. Nice video, and you've done a fantastic job explaining some very complex ideas.
@shravanshravan4402 · 2 months ago
Beautiful explanation, brother!
@nicklam3594 · 1 year ago
I cannot like this enough! This channel is SO underrated.
@DevelopersHutt · 1 year ago
I'm glad you liked it ❤️
@jessicahoffmann6278 · 2 years ago
Impressively clear and easy to follow while still being precise!
@karthikeyasharmaparunandi · 11 months ago
Thanks!
@DevelopersHutt · 11 months ago
Thank you so much 🙏
@karthikeyasharmaparunandi · 11 months ago
@@DevelopersHutt Keep up the good work. Your short and to-the-point videos are unique and amazing!
@vil9386 · 9 months ago
This is very clear. I understood a lot about transformers through this video. Thank you.
@DevelopersHutt · 9 months ago
Glad you liked it!
@fazelamirvahedi9911 · 1 year ago
Thank you for the informative video. I was wondering where I can find the image captioning video that you mentioned at the end of your video?
@DevelopersHutt · 1 year ago
kzbin.info/www/bejne/aZaXmaZ-gal7kKMsi=9zsaiRaCKkmRLJEF
@ihsanhariadi1056 · 1 year ago
Thank you, very clear visualization of the processes involved in the AIAYN (Attention Is All You Need) concept.
@natureClipsHimalayan · 1 year ago
Best explanation of transformers ever, thank you 👍
@DevelopersHutt · 1 year ago
Glad you liked it!
@AliHamza-le1of · 2 years ago
Very nice explanation. Absolutely excellent! Thanks.
@ibrahimahmethan586 · 1 year ago
You saved my day! Thank you so much!
@hansolo7516 · 2 years ago
Very easy to understand, nice job!
@23232323rdurian · 1 year ago
? how about: ? or maybe ?
@edelweiss7680 · 1 year ago
Hi there! Excellent work, thanks! But at 5:06, should the PE vectors be APPENDED to the word embeddings rather than ADDED?
@DevelopersHutt · 1 year ago
Yes, correct. I will put a note in the description. Thanks ❤
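For reference, the original "Attention Is All You Need" formulation *adds* (element-wise sums) the sinusoidal positional encodings to the word embeddings, which keeps the model dimension unchanged; concatenation is a variant used in some other architectures. A minimal sketch of the additive formulation, with toy dimensions chosen here for illustration:

```python
import numpy as np

def sinusoidal_pe(seq_len, d_model):
    # PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    # PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model // 2)[None, :]
    angles = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even dimensions
    pe[:, 1::2] = np.cos(angles)  # odd dimensions
    return pe

seq_len, d_model = 4, 8
embeddings = np.random.default_rng(1).normal(size=(seq_len, d_model))
# Element-wise sum: the result has the SAME shape as the embeddings.
x = embeddings + sinusoidal_pe(seq_len, d_model)
print(x.shape)  # (4, 8)
```

Appending instead would grow the last dimension to `2 * d_model`, which is why the sum is the standard choice.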
@FreeMind-00x · 1 year ago
In the attention filter example, the dot product between the query "WHY" and the key "WHY" is 0, which seems impossible unless the embedding of "Why" is 0.
@DevelopersHutt · 1 year ago
Let me check
@drzobekenobe · 1 year ago
Brilliant 👏
@ummara2020 · 11 months ago
What tool do you use for animations?
@DevelopersHutt · 11 months ago
After Effects
@LoneXeaglE · 1 year ago
Such amazing work! Thank you, sir!
@JAINHEMENDRAH123 · 2 years ago
Very well explained
@junaidiqbal5018 · 2 years ago
Wow, well explained!
@AlirezaGolabvand · 1 year ago
Very nice job. Can you please tell me what software you are using to make such amazing animated presentations? It would be very helpful.
@DevelopersHutt · 1 year ago
After Effects
@sudhansubaladas2322 · 2 years ago
Nice explanation... But how does the embedding layer convert tokens into vector representations? If you could take an example and explain this, it would be helpful for those watching this video.
@DevelopersHutt · 2 years ago
The embedding layer learns the relationships between words. For more info, read about the word2vec model. I'll definitely make a video about how it works in the near future.
@sudhansubaladas2322 · 2 years ago
@@DevelopersHutt Ok, waiting for your video... My suggestion: the description of the Transformer you are giving is already available in write-ups as well as videos, but viewers want to see an explanation using examples, taking a sentence and showing how things happen to it. Nice start, though.
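As the reply above describes, the embedding layer is a learned lookup from token IDs to dense vectors (word2vec is one way such vectors can be trained). A minimal sketch of the lookup itself, using a made-up toy vocabulary and random weights in place of trained ones:

```python
import numpy as np

# Hypothetical toy vocabulary mapping tokens to integer IDs.
vocab = {"<pad>": 0, "attention": 1, "is": 2, "all": 3, "you": 4, "need": 5}
d_model = 8

# The embedding "layer" is just a trainable (vocab_size, d_model) matrix;
# converting tokens to vectors is a row lookup by token ID. In a real
# model these rows are learned during training, not random.
rng = np.random.default_rng(3)
embedding_matrix = rng.normal(size=(len(vocab), d_model))

tokens = ["attention", "is", "all", "you", "need"]
ids = np.array([vocab[t] for t in tokens])
vectors = embedding_matrix[ids]
print(vectors.shape)  # (5, 8): one d_model-dim vector per token
```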
@Dhirajkumar-ls1ws · 1 year ago
👌
@pontinhonoceu611 · 2 years ago
You are a badass