Lecture 12: Recurrent Networks

36,483 views

Michigan Online

Comments: 13
@chriswang2464 · 1 year ago
Mad respect to Justin!
@neerajkrishna1983 · 2 years ago
The gradient flow analysis and the introduction to the LSTM were great!
@roboticseabass · 4 years ago
Another common RNN trick worth mentioning is the bidirectional RNN: you run two independent RNN layers, one going through the sequence forwards and the other backwards, and concatenate their hidden outputs. If you have full sequences available, this can help!
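A minimal PyTorch sketch of the bidirectional setup described above (toy shapes assumed, not code from the lecture):

    import torch
    import torch.nn as nn

    # Two independent recurrent passes, one forward and one backward;
    # bidirectional=True runs both and concatenates their features.
    rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True,
                 bidirectional=True)

    x = torch.randn(4, 10, 8)          # (batch, sequence length, features)
    out, h_n = rnn(x)

    print(out.shape)   # (4, 10, 32): 16 forward + 16 backward features per step
    print(h_n.shape)   # (2, 4, 16): final hidden state for each direction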
@mostafashahhosseini3378 · 1 month ago
I wish Justin could teach any topic in the world
@aritraroygosthipaty3662 · 4 years ago
At 42:05 Justin says that the color blue represents "all off", but in his paper it is quite clearly stated that -1 is red and +1 is blue. Another question: the explanation of the text coloring is reasonable, but the paper states that the text color corresponds to tanh(c). Are we looking at the hidden states of the LSTM or at the memory state?
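For what it's worth, in a standard LSTM the hidden state is just the output-gated version of tanh(c), so a visualization of tanh(c) shows the cell (memory) state before the output gate is applied. A minimal sketch of one LSTM step (weight shapes assumed):

    import torch

    def lstm_step(x, h_prev, c_prev, Wx, Wh, b):
        # One LSTM step with the usual four gates (input, forget, output, g).
        gates = x @ Wx + h_prev @ Wh + b
        i, f, o, g = gates.chunk(4, dim=-1)
        i, f, o, g = i.sigmoid(), f.sigmoid(), o.sigmoid(), g.tanh()
        c = f * c_prev + i * g        # cell ("memory") state
        h = o * torch.tanh(c)         # hidden state: output-gated tanh(c)
        return h, c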
@eddie31415 · 4 years ago
Thanks a lot!
@kainatyasmeen5608 · 2 years ago
Great learning. Thanks a lot!
@MrAmgadHasan · 1 year ago
Image captioning: 43:42
@HesitantOne · 3 months ago
At 27:09, shouldn't the embeddings of the last 2 inputs be equal? They are both the same token, so why are their embeddings different?
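A quick way to see why they should be equal: an embedding layer is a lookup table, so identical token ids always map to the same vector (a toy check, not code from the lecture):

    import torch
    import torch.nn as nn

    embed = nn.Embedding(num_embeddings=10, embedding_dim=4)

    tokens = torch.tensor([3, 7, 7])       # the last two inputs are the same token
    vecs = embed(tokens)

    # A lookup table returns identical rows for identical indices.
    print(torch.equal(vecs[1], vecs[2]))   # True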
@mostinho7 · 9 months ago
Start at 13:00
@taghyeertaghyeer5974 · 1 year ago
@32:00, I am wondering why Justin said: "Once you process one chunk of data you can throw it away and evict it from memory, because all the information needed for training from this chunk is stored in the final hidden state of the RNN at the end of processing the chunk." I would guess the information from this chunk is stored in all the hidden states obtained while processing the chunk. Am I correct?
@itchainx4375 · 1 year ago
Probably not, just the last output of this chunk.
@thinhvu6902 · 1 year ago
It should be the last hidden state obtained at the end of the forward pass over the chunk (see the sketch below this thread).
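A minimal sketch of the truncated backpropagation-through-time pattern being discussed, where only the final hidden state is carried across chunk boundaries (data_chunks is a hypothetical iterable of (chunk, target) pairs):

    import torch
    import torch.nn as nn

    rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
    readout = nn.Linear(16, 1)
    opt = torch.optim.SGD(list(rnn.parameters()) + list(readout.parameters()),
                          lr=0.01)

    h = None  # nn.RNN uses a zero initial hidden state when None
    for chunk, target in data_chunks:  # hypothetical (chunk, target) pairs
        out, h = rnn(chunk, h)
        loss = ((readout(out) - target) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
        # Only this final hidden state crosses the chunk boundary; detaching
        # it cuts the graph, so the chunk's earlier activations can be freed.
        h = h.detach()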