DeepMind x UCL | Deep Learning Lectures | 8/12 | Attention and Memory in Deep Learning

64,842 views

Google DeepMind

1 day ago

Comments: 45
@leixun 4 years ago
*DeepMind x UCL | Deep Learning Lectures | 8/12 | Attention and Memory in Deep Learning*
*My takeaways:*
*1. Introduction 1:24*
1.1 Attention, memory and cognition 1:28
1.2 Attention in neural networks 2:50
- Implicit attention
- Can be checked through the Jacobian
1.3 Explicit attention: hard attention, non-differentiable 17:00
- It has several advantages over implicit attention:
-- Computational efficiency
-- Scalability (e.g. a fixed-size glimpse for any size of image)
-- Sequential processing of static data (e.g. a moving gaze)
-- Easier to interpret
- Neural attention models 19:24
- Glimpse distribution 20:25
- Attention with reinforcement learning 21:12
- Complex glimpses 22:46
*2. Explicit attention: soft attention, differentiable 26:27*
2.1 Basics 28:15
2.2 Attention weights 29:22
2.3 An example: handwriting synthesis with RNNs 32:40
2.4 Associative attention 38:38
2.5 Differentiable visual attention 45:30
*3. Introspective attention 49:23*
3.1 Neural Turing Machine 51:02
3.2 Selective attention 52:53
3.3 Content-based and location-based attention 55:28
3.4 Differentiable Neural Computer 1:12:04
*4. Further topics 1:13:51*
4.1 Self-attention in Transformers 1:14:00
*5. Summary 1:34:14*
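(A minimal numpy sketch of the soft, content-based "associative" attention summarized in items 2.2 and 2.4 of the outline above, assuming the usual dot-product-similarity-plus-softmax formulation; the function and variable names are illustrative and not taken from the lecture.)

```python
import numpy as np

def soft_attention(query, keys, values):
    """query: (d,), keys: (n, d), values: (n, dv) -> (dv,)"""
    scores = keys @ query                    # dot-product similarity with each memory slot
    weights = np.exp(scores - scores.max())  # numerically stable softmax over slots
    weights /= weights.sum()
    return weights @ values                  # convex combination of the stored values

# Toy usage: 4 memory slots, 3-d keys, 2-d values.
rng = np.random.default_rng(0)
out = soft_attention(rng.normal(size=3), rng.normal(size=(4, 3)), rng.normal(size=(4, 2)))
print(out.shape)  # (2,)
```

Because every step is differentiable, the attention weights can be trained with ordinary backpropagation, which is the key contrast with the hard-attention models covered earlier in the lecture.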
@softerseltzer 4 years ago
22:25 : I'm checking all my tabs for notifications
@leixun 3 years ago
@jawad mansoor You’re welcome
@menesun 2 years ago
From Lei Xun's comment (I added a 0:00 timestamp so the chapters show up in the video):
0. Opening 0:00
1. Introduction 1:24
1.1 Attention, memory and cognition 1:28
1.2 Attention in neural networks 2:50
- Implicit attention
- Can be checked through the Jacobian
1.3 Explicit attention: hard attention, non-differentiable 17:00
- It has several advantages over implicit attention:
-- Computational efficiency
-- Scalability (e.g. a fixed-size glimpse for any size of image)
-- Sequential processing of static data (e.g. a moving gaze)
-- Easier to interpret
- Neural attention models 19:24
- Glimpse distribution 20:25
- Attention with reinforcement learning 21:12
- Complex glimpses 22:46
2. Explicit attention: soft attention, differentiable 26:27
2.1 Basics 28:15
2.2 Attention weights 29:22
2.3 An example: handwriting synthesis with RNNs 32:40
2.4 Associative attention 38:38
2.5 Differentiable visual attention 45:30
3. Introspective attention 49:23
3.1 Neural Turing Machine 51:02
3.2 Selective attention 52:53
3.3 Content-based and location-based attention 55:28
3.4 Differentiable Neural Computer 1:12:04
4. Further topics 1:13:51
4.1 Self-attention in Transformers 1:14:00
5. Summary 1:34:14
@kimchi_taco 1 year ago
Alex Graves invented CTC and RNNT, which by 2013 had essentially become the modern end-to-end ASR model. That work created tens of thousands of research jobs, and then he left to follow his own path. His journey is inspiring: he doesn't seek fame, money, or status, he seeks answers to his internal curiosity. I wanna live like him.
@drpchankh 4 years ago
A no-nonsense, detailed attention lecture. Very well prepared for everyone, beginners and experienced deep learning practitioners alike. Highly recommended for anyone who wants context on how attention was first thought through in the research world. Thank you Alex. Enjoyed the lecture.
@barisdenizsaglam 4 years ago
Great lecture! I really appreciate how he explains the thought process behind the new ideas.
@skySanter 3 years ago
Hello, Turk
@stephennfernandes 4 years ago
Great lecture! I highly recommend this video to anyone looking for an in-depth understanding of attention and the different families of attention mechanisms. It's the best explanation of attention available on the entire web.
@pw7225 2 years ago
This is sooooooo good. So well explained. It's like a Neuralink knowledge upload to my brain. Thanks, Alex!
@peterdavidfagan 2 years ago
This is one of my all-time favorite lectures, thanks for making this available. DNCs are very interesting.
@agamemnonc 1 year ago
I like the "Thank you very much for your attention" punch line at the end.
@Naghaas 1 year ago
Another high-quality course from DeepMind, thanks!
@marcelomanteigas 4 years ago
wonderful! Thanks for putting these lectures out!!
@BlackHermit 4 years ago
This is only the beginning.
@lukn4100 3 years ago
Great lecture and big thanks to DeepMind for sharing this great content.
@kaymengjialyu5086 4 years ago
Dear DeepMind, the link for the slides seems to be invalid. Can anyone fix that?
@ansh6848 2 years ago
I was looking for a lecture on the attention mechanism, and this was the best.
@Letsfeelthenaturee 4 years ago
You are really brilliant, sir. I am from your friendly country, Bangladesh 🇧🇩. I hope you will keep helping us more and more.
@stanislavjirak2894 1 year ago
Splendid!! 🎉
@ProfessionalTycoons 4 years ago
thank you for this lecture, learned a lot about attention
@quentinpaden1481 9 months ago
Revive YouTube
@PresidentGollumSmeag 11 months ago
hey! i really enjoyed the machine lecturing! BUT!!!! your name is graves but i dont see a scar and i played u quite a bit in aram and also no cigar and also no shotgun and also no collector in ur item list in the background! PROPS FOR THE BEARD!!! FRAUD!!!!!!!
@Adrixin 11 months ago
totally agree. needless to say he was camping base the entire time afk
@GrigorySapunov 4 years ago
Thanks Alex for the cool lecture and research!
@YaroslavVolovich 4 years ago
Thanks Alex for a great lecture!
@AnonymousIguana 1 year ago
Fantastic :)
@siyn007 4 years ago
For anyone who has watched both this lecture and his lecture from two years ago: is the difference large enough to make the older one worth watching as well? Thanks
@mabbasiazad 4 years ago
The section discussed after 1:14:00 (Further topics) is new.
@siyn007 4 years ago
@@mabbasiazad thanks!
@hosseinsheikhi5596 4 years ago
Amazing lecture!
@Letsfeelthenaturee 4 years ago
Really... so interesting.
@robertfoertsch 4 years ago
Excellent, Added To My Research Library, Sharing Through TheTRUTH Network...
@priancho 4 years ago
Thank you for such a good lecture! :=)
@pratik245 3 years ago
These things seem eerily similar to an idea I had 5 years ago, when I wrote an innocuous LinkedIn article around the same time that Transformers delved into the attention mechanism. But only those who are actually at Harvard, MIT, or DeepMind can implement it with the resources it requires.
@pratik245 1 year ago
@@robensonlarokulu4963 yeah.. Also true that losers will be losers right from the time they are born.
@pratik245 1 year ago
Also, understand the difference between me and you: I never want credit for anything, but I don't like stealing. Do you know who Rudra is? That is what I become when I see too much injustice. So better stay away from sucking any future generation's blood. If I see such injustice, believe me, you will know God's wrath.
@amniasalma307 4 years ago
Thanks for sharing this
@deeplearningpartnership 4 years ago
Amazing.
@rogerab1792 4 years ago
Memory Augmented Neural Networks are the next big thing.
@quentinpaden1481 9 months ago
Police YouTube
4 years ago
"Attention is all you need"? What a missed opportunity to call the paper "Give me some attention and I'll do everything you want". ;)