RNN From Scratch In Python

  26,524 views

Dataquest

A day ago

Comments: 27
@vikasparuchuri A year ago
Hi everyone! You can find the lesson for this video here - github.com/VikParuchuri/zero_to_gpt/blob/master/explanations/rnn.ipynb . And the full list of lessons in this series is here - github.com/VikParuchuri/zero_to_gpt .
@MohammadKhan-b6p 8 months ago
Amazing. Every RNN tutorial I've seen is just an implementation in PyTorch or TensorFlow with a quick, vague picture of a rolled and unrolled diagram (and this includes paid courses). This is the first video where I actually understand how the RNN processes the incoming hidden-layer data from the previous iteration.
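(Editor's note: the hidden-state handoff this comment describes can be sketched in a few lines of NumPy. The weight names `i_weight`, `h_weight`, and `h_bias` follow the lesson's code quoted later in the thread; the sizes here are illustrative assumptions.)

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4

i_weight = rng.normal(size=(input_size, hidden_size))   # input -> hidden
h_weight = rng.normal(size=(hidden_size, hidden_size))  # hidden -> hidden
h_bias = np.zeros((1, hidden_size))

x_t = rng.normal(size=(1, input_size))   # current input
h_prev = np.zeros((1, hidden_size))      # hidden state from the previous step

# One RNN step: the new hidden state mixes the current input with the
# previous step's hidden state, squashed by tanh.
h_t = np.tanh(x_t @ i_weight + h_prev @ h_weight + h_bias)
print(h_t.shape)  # (1, 4)
```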
@gk4539 8 months ago
Just a note for any subsequent videos: when you pointed at the screen, your pointer wasn't visible in the video. It would be helpful if we could see where you were pointing!
@goodmusic284 A year ago
Thank you so much. This is by far the best explanation of RNNs I have seen.
@Dataquestio A year ago
Thanks a lot! I'm planning to release some more deep learning vids soon :) -Vik
@CarlosPfister A year ago
Thanks for continuously offering up free content, even to non-students
@minkijung3 A year ago
Thank you for this amazing tutorial. I learned a lot about RNN🙏🏻
@wernerpjjacob6499 8 months ago
Very good didactics, very good man! I can only thank you.
@JennieNewt 9 months ago
Thanks for your high quality tutorial.
@Vova_learns_AI A month ago
Great video
@usernameispassword4023 5 months ago
Shouldn't it be 1 - hiddens**2 for the tanh derivative?
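(Editor's note: if `hiddens` stores the tanh activations, then `1 - hiddens**2` is indeed the tanh derivative, since d/dx tanh(x) = 1 − tanh(x)². A quick numeric check, independent of the lesson's variable names:)

```python
import numpy as np

x = np.linspace(-2, 2, 9)
hiddens = np.tanh(x)  # stored activations, as in the lesson's forward pass

# Analytic derivative expressed via the stored activations.
analytic = 1 - hiddens ** 2

# Central finite-difference approximation of d/dx tanh(x).
eps = 1e-6
numeric = (np.tanh(x + eps) - np.tanh(x - eps)) / (2 * eps)

print(np.max(np.abs(analytic - numeric)))  # tiny: the two agree
```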
@phamquang5535 7 months ago
thanks, literal life saver
@vubanchowdhury2204 A year ago
For multi-layer RNNs, isn't the output from the first layer supposed to be the input to the second layer and so on? From what I understand, the code is written in a way that multiple layers of RNNs will all take the same input sequence (from the original data) and not the output from the previous layer. Could you please elaborate on this?
@Dataquestio A year ago
Yeah, you're right - I was using single-layer RNNs in this video, so I didn't consider the multi-layer case closely. You would just need to adjust this loop to take in the previous layer's output instead of x:

for j in range(x.shape[0]):
    input_x = x[j,:][np.newaxis,:] @ i_weight
    hidden_x = input_x + hidden[max(j-1,0),:][np.newaxis,:] @ h_weight + h_bias
@vubanchowdhury2204 A year ago
@@Dataquestio Thanks for clarifying!
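(Editor's note: the adjustment described in the reply above can be sketched as a full multi-layer forward pass. The weight names mirror the lesson's `i_weight`/`h_weight`/`h_bias`; the stacking loop and the sizes are assumptions based on the reply, not the notebook's actual code.)

```python
import numpy as np

rng = np.random.default_rng(1)
seq_len, input_size, hidden_size, n_layers = 5, 3, 4, 2

x = rng.normal(size=(seq_len, input_size))

# Layer 0 reads the input sequence; each later layer reads the hidden
# states produced by the layer below, so its input width is hidden_size.
layers = []
for layer_idx in range(n_layers):
    in_size = input_size if layer_idx == 0 else hidden_size
    layers.append({
        "i_weight": rng.normal(size=(in_size, hidden_size)) * 0.1,
        "h_weight": rng.normal(size=(hidden_size, hidden_size)) * 0.1,
        "h_bias": np.zeros((1, hidden_size)),
    })

layer_input = x
for layer in layers:
    hidden = np.zeros((seq_len, hidden_size))
    for j in range(layer_input.shape[0]):
        input_x = layer_input[j, :][np.newaxis, :] @ layer["i_weight"]
        hidden_x = (input_x
                    + hidden[max(j - 1, 0), :][np.newaxis, :] @ layer["h_weight"]
                    + layer["h_bias"])
        hidden[j, :] = np.tanh(hidden_x)
    layer_input = hidden  # the next layer consumes this layer's hidden states

print(layer_input.shape)  # (5, 4)
```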
@nicolasndour9851 A year ago
Thanks for this presentation! Can I have a clear explanation of the dimensions of i_weight, h_weight and o_weight? Thanks in advance.
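(Editor's note: in a standard RNN of this form, the three shapes are fixed by the input, hidden, and output sizes. The sizes below are illustrative assumptions; the weight names follow the lesson.)

```python
import numpy as np

input_size, hidden_size, output_size = 3, 4, 1

# i_weight maps an input vector into hidden space.
i_weight = np.zeros((input_size, hidden_size))
# h_weight maps the previous hidden state back into hidden space, so it is square.
h_weight = np.zeros((hidden_size, hidden_size))
# o_weight maps a hidden state to the output.
o_weight = np.zeros((hidden_size, output_size))

x_t = np.zeros((1, input_size))
h_prev = np.zeros((1, hidden_size))
h_t = np.tanh(x_t @ i_weight + h_prev @ h_weight)  # shape (1, hidden_size)
y_t = h_t @ o_weight                               # shape (1, output_size)
print(h_t.shape, y_t.shape)  # (1, 4) (1, 1)
```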
@anfedoro A year ago
Which tool do you use to draw such fancy diagrams? 😀
@Dataquestio A year ago
I use a tool called Excalidraw! Highly recommend it.
@anfedoro A year ago
@@Dataquestio Thanks! I installed the Excalidraw extension in VS Code and can draw right there, with no need for the online web tool.
@waisyousofi9139 A year ago
Thanks! Where can I find the next video? I mean, where is the testing step where we can provide our own input data?
@BagusSulistyo A year ago
Thanks, this is awesome 🤟
@jonathanlowe2552 A year ago
Can you please indicate where the CSV file is found?
@Dataquestio A year ago
It's in the code I linked to - github.com/VikParuchuri/zero_to_gpt/blob/master/explanations/rnn.ipynb . If you check in the data folder (same directory it is opened from in the notebook), you'll find it - github.com/VikParuchuri/zero_to_gpt/tree/master/data .
@jonathanlowe2552 A year ago
@@Dataquestio Thank you!
@AurL_69 8 months ago
thank you !!!!!
@kadenabet A year ago
I found this insightful, but I'm very irritated with Python and its eccentricities. For instance, in the implementation section, what is params doing there? It looks like a completely useless variable. Should you not update the layers?