RNN From Scratch In Python

19,895 views

Dataquest

1 day ago

We'll build a recurrent neural network (RNN) in NumPy. RNNs can process sequences of data, like sentences. We'll start with the theory behind RNNs, then build the forward and backward passes in NumPy.
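For reference, here is a minimal sketch of the recurrence the video builds. The variable names i_weight, h_weight, and h_bias match the snippet Vik shares in the comments below; the layer sizes, the output weights, and the tanh/output lines are illustrative assumptions, not necessarily the notebook's exact code:

import numpy as np

# Hypothetical sizes for illustration: 3 input features, 4 hidden units, 1 output.
input_size, hidden_size, output_size = 3, 4, 1

rng = np.random.default_rng(0)
i_weight = rng.normal(scale=0.1, size=(input_size, hidden_size))   # input -> hidden
h_weight = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden
h_bias = np.zeros((1, hidden_size))
o_weight = rng.normal(scale=0.1, size=(hidden_size, output_size))  # hidden -> output
o_bias = np.zeros((1, output_size))

x = rng.normal(size=(6, input_size))        # one sequence with 6 time steps
hidden = np.zeros((x.shape[0], hidden_size))
outputs = np.zeros((x.shape[0], output_size))

for j in range(x.shape[0]):
    input_x = x[j, :][np.newaxis, :] @ i_weight
    # At step 0 there is no previous hidden state; hidden[0] is still zeros.
    hidden_x = input_x + hidden[max(j - 1, 0), :][np.newaxis, :] @ h_weight + h_bias
    hidden[j, :] = np.tanh(hidden_x)        # cache the activation for the backward pass
    outputs[j, :] = hidden[j, :][np.newaxis, :] @ o_weight + o_bias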
You can find a text version of this video here - github.com/VikParuchuri/zero_to_gpt/blob/master/explanations/rnn.ipynb .
And all of the previous lessons here - github.com/VikParuchuri/zero_to_gpt .
Chapters
0:00 RNN overview
6:32 Step by step forward pass
15:10 tanh activation function
19:23 Full forward pass
22:59 Full backward pass
39:43 Complete implementation
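One detail from the tanh and backward-pass chapters worth calling out: tanh's derivative can be computed from the tanh output alone, which is why caching the hidden states during the forward pass is enough for backpropagation through time. A minimal sketch (the function name is illustrative, not necessarily the notebook's):

import numpy as np

def tanh_deriv(tanh_output):
    # tanh'(z) = 1 - tanh(z)**2, so the activation cached during the
    # forward pass is all the backward pass needs at each time step.
    return 1 - tanh_output ** 2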
This video is part of our new course, Zero to GPT - a guide to building your own GPT model from scratch. By taking this course, you'll learn deep learning skills from the ground up. Even if you're a complete beginner, you can start with the prerequisites we offer at Dataquest.
If you're dreaming of building deep learning models, this course is for you.
Best of all, you can access the course for free while it's still in beta!
Sign up today!
bit.ly/4016NfK

Comments: 25
@vikasparuchuri
@vikasparuchuri 1 year ago
Hi everyone! You can find the lesson for this video here - github.com/VikParuchuri/zero_to_gpt/blob/master/explanations/rnn.ipynb . And the full list of lessons in this series is here - github.com/VikParuchuri/zero_to_gpt .
@user-pb9nc4by2k
@user-pb9nc4by2k 3 months ago
Amazing. Every tutorial I've seen on RNNs is just an implementation of the RNN in PyTorch or TensorFlow with a quick and vague picture of a rolled and unrolled diagram (and this includes paid courses). This is the first video I've seen where I understand how the RNN could process the incoming hidden layer data from the previous iteration.
@minkijung3
@minkijung3 8 months ago
Thank you for this amazing tutorial. I learned a lot about RNNs 🙏🏻
@CarlosPfister
@CarlosPfister 1 year ago
Thanks for continuously offering up free content, even to non-students
@goodmusic284
@goodmusic284 1 year ago
Thank you so much. This is by far the best explanation of RNNs I have seen.
@Dataquestio
@Dataquestio 1 year ago
Thanks a lot! I'm planning to release some more deep learning vids soon :) -Vik
@user-xe2xd2qi4u
@user-xe2xd2qi4u 3 months ago
Thanks for your high-quality tutorial.
@wernerpjjacob6499
@wernerpjjacob6499 3 months ago
Very good teaching, very good man! I can only thank you.
@phamquang5535
@phamquang5535 1 month ago
thanks, literal life saver
@gk4539
@gk4539 3 months ago
Just a note for any subsequent videos: if you were pointing at the screen, it wasn't visible in the video, and it would be helpful if we knew where you were pointing!
@BagusSulistyo
@BagusSulistyo 1 year ago
thanks this is awesome 🤟
@nicolasndour9851
@nicolasndour9851 1 year ago
Thanks for this presentation! Can I have a clear explanation of the dimensions of i_weight, h_weight, and o_weight? Thanks in advance.
@AurL_69
@AurL_69 2 months ago
thank you !!!!!
@vubanchowdhury2204
@vubanchowdhury2204 1 year ago
For multi-layer RNNs, isn't the output from the first layer supposed to be the input to the second layer and so on? From what I understand, the code is written in a way that multiple layers of RNNs will all take the same input sequence (from the original data) and not the output from the previous layer. Could you please elaborate on this?
@Dataquestio
@Dataquestio 11 months ago
Yeah, you're right - I was using single-layer RNNs in this video, so I didn't consider the multi-layer case closely. You would just need to adjust this loop to take in the previous layer's output instead of x:

for j in range(x.shape[0]):
    input_x = x[j,:][np.newaxis,:] @ i_weight
    hidden_x = input_x + hidden[max(j-1,0),:][np.newaxis,:] @ h_weight + h_bias
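Building on that reply, one way to wire up the stacking is to run the loop once per layer and feed each layer's hidden states to the next. This is an illustrative sketch, not the notebook's actual code; forward_multilayer and the (i_weight, h_weight, h_bias) tuples are assumed names:

import numpy as np

def forward_multilayer(x, layers):
    # layers: list of (i_weight, h_weight, h_bias) tuples, one per RNN layer.
    # The first layer's i_weight maps the raw inputs; each later layer's
    # i_weight maps the previous layer's hidden states.
    layer_input = x
    all_hiddens = []
    for i_weight, h_weight, h_bias in layers:
        hidden = np.zeros((layer_input.shape[0], h_weight.shape[0]))
        for j in range(layer_input.shape[0]):
            input_x = layer_input[j, :][np.newaxis, :] @ i_weight
            hidden_x = input_x + hidden[max(j - 1, 0), :][np.newaxis, :] @ h_weight + h_bias
            hidden[j, :] = np.tanh(hidden_x)
        all_hiddens.append(hidden)
        layer_input = hidden  # this layer's output becomes the next layer's input
    return all_hiddens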
@vubanchowdhury2204
@vubanchowdhury2204 11 months ago
@@Dataquestio Thanks for clarifying!
@waisyousofi9139
@waisyousofi9139 1 year ago
Thanks! Where can I find the next video? I mean the testing step, where we can provide our own input data.
@jonathanlowe2552
@jonathanlowe2552 1 year ago
Can you please indicate where the CSV file is found?
@Dataquestio
@Dataquestio 1 year ago
It's in the code I linked to - github.com/VikParuchuri/zero_to_gpt/blob/master/explanations/rnn.ipynb . If you check the data folder (the same directory the notebook opens it from), you'll find it - github.com/VikParuchuri/zero_to_gpt/tree/master/data .
@jonathanlowe2552
@jonathanlowe2552 1 year ago
@@Dataquestio Thank you!
@anfedoro
@anfedoro 10 months ago
Which tool do you use to draw such fancy diagrams? 😀
@Dataquestio
@Dataquestio 10 months ago
I use a tool called Excalidraw! Highly recommend it.
@anfedoro
@anfedoro 10 months ago
@@Dataquestio Thanks! I installed the Excalidraw extension in VS Code and draw right there, with no need for the online web tool.
@kadenabet
@kadenabet 8 months ago
I found this insightful but am very irritated with Python and its eccentricities. For instance, in the implementation section, what is params doing there? It looks like a completely useless variable. Shouldn't you update the layers?