CS480/680 Lecture 18: Recurrent and recursive neural networks

23,329 views

Pascal Poupart

1 day ago

Comments: 18
@VahidOnTheMove 4 years ago
This lecture is unbelievably easy to understand. The best explanation I have seen.
@tahamagdy4932 2 years ago
Teachers should strive to be like you. Thank you!
@gregpaokfc 1 year ago
Great lecture, really insightful and more relevant than ever. Thank you, Mr. Poupart.
@notanape5415 4 years ago
Really amazing explanation of attention! Thanks for making this lecture public.
@datafiasco4126 11 months ago
Great job explaining the concepts. Really liked the roll-out diagrams.
@hadisjalesiyan1558 4 years ago
Finally, I understood attention. Thank you so much!
@vitocorleone1991 3 years ago
Awesome, God bless you, professor.
@user-or7ji5hv8y 4 years ago
Best explanation of attention so far for understanding the intuition.
@rubenpartono 3 years ago
10:21 To someone unfamiliar with how automatic differentiation works, this feels absolutely magical.
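For the curious, here is a minimal reverse-mode automatic differentiation sketch (my own illustration, not the lecture's code): each variable records its parents and the local derivatives, and backward() pushes gradients through the graph by the chain rule.

```python
# Minimal reverse-mode autodiff sketch (illustrative only): each Var
# records (parent, local derivative) pairs; backward() applies the
# chain rule, accumulating gradients additively.
class Var:
    def __init__(self, value, parents=()):
        self.value = value        # forward value
        self.grad = 0.0           # accumulated d(output)/d(self)
        self.parents = parents    # pairs (parent Var, local derivative)

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, grad=1.0):
        self.grad += grad
        for parent, local in self.parents:
            parent.backward(grad * local)   # chain rule

x, y = Var(2.0), Var(3.0)
z = x * y + x          # z = xy + x
z.backward()           # dz/dx = y + 1 = 4, dz/dy = x = 2
print(x.grad, y.grad)  # 4.0 2.0
```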
@micknamens8659 2 years ago
1:33:40 A mathematical graph has no fixed geometrical layout. Hence the distinction between functions u and v for the "left" and "right" child seems odd.
@datafiasco4126 11 months ago
This is more akin to a tree in computing, hence the legacy "left child" and "right child" terminology.
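To make this concrete, here is a minimal sketch of the idea (my own illustration, not the lecture's exact formulation): the children of a node in a parse tree are ordered, so the left and right child are combined with distinct weight matrices, which is what the separate functions u and v capture.

```python
import numpy as np

# Sketch of a recursive neural network over a binary tree (illustrative):
# ordered children get distinct weight matrices, reused at every node.
d = 4
W_left = np.random.randn(d, d)    # plays the role of u
W_right = np.random.randn(d, d)   # plays the role of v
b = np.zeros(d)

def encode(node):
    """node is either a leaf embedding (ndarray) or a (left, right) pair."""
    if isinstance(node, np.ndarray):
        return node
    left, right = node
    return np.tanh(W_left @ encode(left) + W_right @ encode(right) + b)

# Encode the tree ((w1 w2) w3); the same W_left/W_right are shared everywhere.
w1, w2, w3 = (np.random.randn(d) for _ in range(3))
root = encode(((w1, w2), w3))
```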
@andrewmeowmeow 4 years ago
Great lecture!
@user-or7ji5hv8y 4 years ago
Does s3 depending on both s2 and the convex combination of all of the h create unnecessary duplication, since s2 also depends on all of the h?
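For reference, a minimal sketch of the step being asked about (standard encoder-decoder attention; the parameter names W_s, W_h, U are my own, and the same weights are reused for the state update only for brevity). Each decoder step recomputes its own attention weights from the previous state, so the convex combination of the h_i differs at every step rather than being a fixed quantity computed twice.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())   # subtract max for numerical stability
    return e / e.sum()

def decoder_step(s_prev, H, W_s, W_h, U):
    # H: (n, d) encoder states h_1..h_n; s_prev: (d,) previous decoder state.
    scores = np.array([U @ np.tanh(W_s @ s_prev + W_h @ h) for h in H])
    alpha = softmax(scores)                   # weights >= 0, summing to 1
    c = alpha @ H                             # convex combination of the h_i
    return np.tanh(W_s @ s_prev + W_h @ c)    # next decoder state

d, n = 4, 5
H = np.random.randn(n, d)
W_s, W_h, U = np.random.randn(d, d), np.random.randn(d, d), np.random.randn(d)
s1 = np.random.randn(d)
s2 = decoder_step(s1, H, W_s, W_h, U)
s3 = decoder_step(s2, H, W_s, W_h, U)   # new alphas, hence a new context
```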
@samlaf92 5 years ago
50:00 Is this really an LSTM? Where's the "c" cell? Seems like a weird mix of LSTM and GRU...
@samlaf92 5 years ago
Never mind, I should have waited until 59:00. I guess the "simple" version presented first around 50:00 would best be described as a peephole LSTM?
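For reference, the standard LSTM cell equations with the explicit cell state c_t (common notation, which may differ from the slides); in the peephole variant, the gate pre-activations additionally receive the cell state as an input.

```latex
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{(input gate)} \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{(forget gate)} \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{(output gate)} \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{(candidate cell)} \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{(cell state)} \\
h_t &= o_t \odot \tanh(c_t) && \text{(hidden state)}
\end{aligned}
```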
@user-or7ji5hv8y 4 years ago
I don't see why, under an HMM, Y depends on X, given the direction of the arrow. Do the states depend on the past observations under an HMM?
@thak456 4 years ago
Yes.
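To spell this out (assuming, as is standard, that X denotes the hidden states and Y the observations): generatively the arrows point from X to Y, but once observations are conditioned on, inference runs against the arrows, and the posterior over the state at time t depends on all observations so far via the filtering recursion.

```latex
\begin{aligned}
p(x_{1:T}, y_{1:T}) &= p(x_1)\, p(y_1 \mid x_1) \prod_{t=2}^{T} p(x_t \mid x_{t-1})\, p(y_t \mid x_t) \\
p(x_t \mid y_{1:t}) &\propto p(y_t \mid x_t) \sum_{x_{t-1}} p(x_t \mid x_{t-1})\, p(x_{t-1} \mid y_{1:t-1})
\end{aligned}
```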
@puyncharasr6963 4 years ago
Thumbs up