Introduction to Coding Neural Networks with PyTorch and Lightning

66,147 views

StatQuest with Josh Starmer

A day ago

Comments: 205
@statquest
@statquest 2 жыл бұрын
NOTE: Lightning 2.0 changed the way the learning rate tuner is accessed. This has been updated in the jupyter notebook that you can download here: lightning.ai/lightning-ai/studios/statquest-introduction-to-neural-networks-with-pytorch-lightning To learn more about Lightning: lightning.ai/ Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
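For anyone who hits the change before downloading the updated notebook, here is a minimal sketch of the Lightning >= 2.0 pattern the note describes; `BasicLightningTrain` and `dataloader` are the names used in the video's notebook, so substitute your own LightningModule and DataLoader.

```python
# Minimal sketch of the Lightning >= 2.0 learning-rate-finder pattern:
# the Tuner now wraps the Trainer instead of living on it as trainer.tuner.
# "BasicLightningTrain" and "dataloader" are names from the video's notebook.
import lightning as L
from lightning.pytorch.tuner import Tuner

model = BasicLightningTrain()            # your LightningModule
trainer = L.Trainer(max_epochs=34)
tuner = Tuner(trainer)                   # Lightning 2.0: Tuner(trainer), not trainer.tuner

lr_find_results = tuner.lr_find(model,
                                train_dataloaders=dataloader,  # the training data
                                min_lr=0.001,                  # minimum candidate learning rate
                                max_lr=1.0,                    # maximum candidate learning rate
                                early_stop_threshold=None)     # None = test all 100 candidates

model.learning_rate = lr_find_results.suggestion()             # e.g., 0.00214 in the video
trainer.fit(model, train_dataloaders=dataloader)
```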
@reasonerenlightened2456
@reasonerenlightened2456 2 жыл бұрын
ANNs are about 60 years old, but only now that they've become profitable do we see stuff like this video. NO WONDER WE STILL DO NOT HAVE GENERAL AI. There is just not enough profit in it yet.
@andrea-mj9ce
@andrea-mj9ce 2 жыл бұрын
I don't see any code in the second link.
@statquest
@statquest 2 жыл бұрын
@@andrea-mj9ce Fill out the form and you'll get it in an email.
@xinfangmin3665
@xinfangmin3665 2 жыл бұрын
You are such an amazing guy! Thanks a lot! Love your video~
@statquest
@statquest 2 жыл бұрын
@@salilgupta9427 Thanks for the heads up! It should be working now. :)
@airpeguiV2
@airpeguiV2 2 жыл бұрын
Hi Josh, I am trying to get my PhD in EEE and as such, my background is in electrical and electronic engineering, not in machine learning or data science. Somehow, in my PhD I've ended up doing more ML and DS than EEE, and I can only infinitely thank you for these resources that you post on the internet for free, for the effort you put on them, and for your dedication. You are a marvel and you have helped me understand and apply concepts and models to my research, which hopefully will one day help society and the environment through a more efficient power grid, capable of accommodating more renewable energy sources and electrical machines. Thank you ∞!
@statquest
@statquest 2 жыл бұрын
Hooray!!! Thank you so much. I'm so glad to hear that my videos are helping you out. :)
@anthonyashwin3457
@anthonyashwin3457 Жыл бұрын
Triple Bam 💥
@rickymort135
@rickymort135 8 ай бұрын
Embrace, Extend, Extinguish? That's terrible
@rizkykiky7721
@rizkykiky7721 2 жыл бұрын
This channel is evolving from statistics and math to coding! I didn't expect that and I absolutely love it!
@statquest
@statquest 2 жыл бұрын
BAM! :)
@charlesrios8542
@charlesrios8542 Жыл бұрын
Well he’s already covered all the stats in all his years lol this is what’s next
@CHERKE_JEMA5575
@CHERKE_JEMA5575 2 жыл бұрын
On my way to finishing your book...I would definitely recommend it to everyone! Love from Ethiopia, Africa
@statquest
@statquest 2 жыл бұрын
Awesome! Thank you!
@perudevlabs
@perudevlabs 8 ай бұрын
How are you not mainstream? This is the best DX I've seen on ML so far... so focused on the important parts that need to be coded; it's like the FastAPI of deep learning.
@statquest
@statquest 8 ай бұрын
Thank you!
@jonahturner2969
@jonahturner2969 2 жыл бұрын
This video will really blow up in just a few months I think. The newest scene text recognition model I'm trying to implement uses Lightning extensively, more and more people will pick it up soon. Thank you for making such a clear explanation
@statquest
@statquest 2 жыл бұрын
Awesome!!! Thank you ! :)
@brockjohnson312
@brockjohnson312 2 жыл бұрын
yes jonah
@AslEroglu
@AslEroglu Жыл бұрын
Love your content, it helps me a lot! Very clear explanations, thank you. If anyone is struggling to import the lightning package: I wrote "import pytorch_lightning" instead of "import lightning" and the problem was solved.
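A small compatibility sketch of the two package names in play here (an assumption about your environment, not something from the video): the current package installs as `pip install lightning`, while `pytorch_lightning` is the legacy name mentioned in the deprecation warning later in this thread.

```python
# Prefer the current "lightning" package; fall back to the legacy
# "pytorch_lightning" name if that's the only one installed.
try:
    import lightning as L
except ImportError:                 # older environments only ship the legacy package
    import pytorch_lightning as L   # same API, legacy import name
```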
@statquest
@statquest Жыл бұрын
BAM! Thank you! :)
@AslEroglu
@AslEroglu Жыл бұрын
​@@statquest Warning from lightning installation page: "pip install pytorch-lightning has been deprecated and will stop being updated June 2023. Use pip install lightning instead." When you use the latter one, you can "import lightning".
@macknightxu2199
@macknightxu2199 Жыл бұрын
This NN series is tremendously amazing: easy to understand while teaching a lot of concepts and processes. The best thing I find is the rhythm it keeps with bam, double bam, triple bam, and tiny bam, because normally learners lose their minds when learning for a long time with puzzles like where am I, what do I know, where do I go. Good job! BR
@statquest
@statquest Жыл бұрын
Thank you! :)
@exxzxxe
@exxzxxe Жыл бұрын
Josh, you are a math genius in addition to being an outstanding singer!
@statquest
@statquest Жыл бұрын
Thank you so much! :)
@protovici1476
@protovici1476 2 жыл бұрын
The comic approach and extremely good content make this the best Lightning video I've ever seen.
@statquest
@statquest 2 жыл бұрын
Thanks!
@ClaseS-1010
@ClaseS-1010 Жыл бұрын
@@statquest BAM!
@Luxcium
@Luxcium Жыл бұрын
Wow 😮 I didn't know I had to watch *The StatQuest Introduction To PyTorch* before I can watch *Introduction to coding neural networks with PyTorch and Lightning* 🌩️ (it's something related to the cloud, I understand). I am genuinely so happy to learn about this stuff with you, Josh. I will go watch the other videos first and then I will back propagate to this video...
@statquest
@statquest Жыл бұрын
Thanks!
@ruchiraina2215
@ruchiraina2215 2 жыл бұрын
Finally I understood the basic code structure of neural networks using PyTorch. Thanks for that. I have a request: would you please create the same model using TensorFlow? That would be very helpful for comparing the frameworks.
@statquest
@statquest 2 жыл бұрын
I'll keep that in mind.
@zainabkhan2475
@zainabkhan2475 2 жыл бұрын
Thanks for all your videos they are all precious
@statquest
@statquest 2 жыл бұрын
Thank you!
@massimoc7494
@massimoc7494 4 ай бұрын
I thought I had finished watching your videos after I passed my statistics exam, and here we go again!
@statquest
@statquest 4 ай бұрын
bam! :)
@shaktishivalingam3880
@shaktishivalingam3880 Жыл бұрын
You are amazing, Thank you for helping us out with your videos it has helped me a lot
@statquest
@statquest Жыл бұрын
Thanks!
@nadavnesher8641
@nadavnesher8641 Жыл бұрын
Totally awesome!! Great explanations! I love your channel🚀 Thanks so much for your videos 🦾
@statquest
@statquest Жыл бұрын
Thank you!
@ArchithaKishoreSings
@ArchithaKishoreSings 2 жыл бұрын
Love the PyTorch content ❤
@statquest
@statquest 2 жыл бұрын
Thank you! :)
@fizipcfx
@fizipcfx 2 жыл бұрын
Thank you for this video, i would love to see more videos from pytorch ecosystem
@statquest
@statquest 2 жыл бұрын
More to come!
@anitasalamon9958
@anitasalamon9958 Жыл бұрын
Your NN videos are a gold mine - watched them all in 2 days. I only wish I had a mentor like you during my PhD journey. I've got an idea and would love your insights. Here's the brief: I want to train the NN on a fully annotated scRNA-seq dataset (A) with multi-dimensional inputs (genes) and 15 outputs - 15 different cell types/annotations/labels. Then I want to take a new scRNA-seq dataset (B) and use the trained model to annotate it (transfer the labels). Now I want to "improve" this model by adding an additional feature from dataset B. This feature contains information about the origin of each input - origin X or non-X. I would end up with 30 outputs - 15 different cell types x 2 (origin X or non-X). From here, I would like to take this new "improved" model and use it to annotate scRNA-seq dataset (C). Do you think this is feasible, and do you have any advice on how to "improve" the model? Thanks again for the amazing content!
@statquest
@statquest Жыл бұрын
What you want to do sounds reasonable. I did a quick search for "transformer genome annotation" and I found this: www.nature.com/articles/s41467-023-35923-4 which might be interesting to you. My video on Transformers should come out soon, so that might help as well.
@anitasalamon9958
@anitasalamon9958 Жыл бұрын
@@statquest would you suggest building the neural network for dataset A? If so which model? and then using transfer learning for dataset B?
@statquest
@statquest Жыл бұрын
@@anitasalamon9958 Start by building a transformer model for dataset A
@divelix2666
@divelix2666 Жыл бұрын
Great video, as always. Thank you, Josh, for your hard work! Btw, while following the video instructions I found some things that should be clarified:
- 3:11 - instead of `lightning` there should be `pytorch_lightning` (I installed it with `conda install pytorch_lightning -c conda-forge`)
- 15:10 - after we change the lr from 0.1 to 0.00214, we need much more than 34 epochs to get the desired -16 (more than 1000 epochs, so I can't understand how this lr can be considered better than the initial 0.1)
@statquest
@statquest Жыл бұрын
Did you use my code or your own? In the free jupyter notebook, I give instructions on how to install lightning (not pytorch_lightning, which is legacy and could be deprecated soon): "pip install lightning". And I just re-ran my notebook and after changing the learning rate, it converged in 19 epochs.
@divelix2666
@divelix2666 Жыл бұрын
​@@statquest My own code (I tried to follow video step by step by myself). You are right, after I reinstalled with pip it works as `import lighting`. Btw, learning rate point is still valid.
@bobotran7792
@bobotran7792 2 жыл бұрын
Our savior returns
@statquest
@statquest 2 жыл бұрын
bam! :)
@James-hb8qu
@James-hb8qu Жыл бұрын
Maybe just me, but I found the model wouldn't train fast enough to work so I compared my "type along" code with the code in the repository. The difference was at 10:24 and was fixed when I added the 'times 100' to the input and label tensors, as in inputs = torch.tensor([0., 0.5, 1.] * 100) and labels = torch.tensor([0., 1., 0.] * 100)
@statquest
@statquest Жыл бұрын
Nice! The code in the repository should be updated to have the * 100 multiplier. When did you download it?
@James-hb8qu
@James-hb8qu Жыл бұрын
@@statquest Ah, my comment was ambiguous. The repository code works. I like to code along as I watch your videos and the video didn't have the *100 so I had the initial problem.
@statquest
@statquest Жыл бұрын
@@James-hb8qu Hooray! That makes me feel a little better.
@giligili9923
@giligili9923 Жыл бұрын
I got the same problem. Could someone explain why the *100? Also, I can see that we need more epochs to arrive at the optimum value, but each epoch runs much slower than before. Is this worth it in another context, or what was the problem?
@statquest
@statquest Жыл бұрын
@@giligili9923 Are you using my notebook or your own typed in code? The *100 tricks the NN into thinking it has more data than it really has, and as a result, runs smoother.
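For readers following along, here is a sketch of that trick, assuming the TensorDataset/DataLoader setup used in the notebook:

```python
# Repeat the 3 training points 100 times so the learning-rate finder and the
# optimizer see 300 samples per epoch instead of 3.
import torch
from torch.utils.data import TensorDataset, DataLoader

inputs = torch.tensor([0., 0.5, 1.] * 100)
labels = torch.tensor([0., 1., 0.] * 100)

dataset = TensorDataset(inputs, labels)
dataloader = DataLoader(dataset)   # default batch_size=1, so 300 steps per epoch
```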
@younesselhamzaoui6783
@younesselhamzaoui6783 2 жыл бұрын
Excelent. Thank you so much!
@statquest
@statquest 2 жыл бұрын
Thanks!
@Celbe
@Celbe 10 ай бұрын
Hi Josh, first of all I would like to express my gratitude for the excellent material you have made available. I have a question: what are the criteria for some classes to have the audio track in another language? I love it when they exist in Portuguese.😊😊😊 Greetings from Brazil!!! 👋
@statquest
@statquest 10 ай бұрын
I'm trying to create Portuguese tracks for all of my neural network videos. It takes a lot of time, but I hope to finish sometime soon.
@mhalton
@mhalton 4 ай бұрын
16:21 I don't see the 'batch_idx' parameter/argument of the 'training_step' function being used at all within the function.
@statquest
@statquest 4 ай бұрын
true. We're not using it.
@ZahidHasan-cc8tf
@ZahidHasan-cc8tf 2 жыл бұрын
Triple Bam!!! Hooray!!
@statquest
@statquest 2 жыл бұрын
:)
@ramiwehbi-ni9kw
@ramiwehbi-ni9kw 4 ай бұрын
Hi Josh, thank you for these learning tutorials. A question: is there no other way to build a more complex model, rather than making all the links between the neuron nodes manually?
@statquest
@statquest 4 ай бұрын
There are lots and lots of easier ways to create neural networks. This was just an introduction. To learn other methods, check out the "coding neural networks" links on this page: statquest.org/video-index/
@ramiwehbi-ni9kw
@ramiwehbi-ni9kw 4 ай бұрын
​@@statquestthank you
@shamshersingh9680
@shamshersingh9680 7 ай бұрын
Hi Josh, can you please please please make a video on Autoencoders and Variational Autoencoders, especially from the anomaly detection perspective. I have searched YouTube and other channels enough but could not find an explanation on par with yours. I am pretty sure you must have a really busy schedule, but if you can find the time and make a video, I will be honestly obliged to you.
@statquest
@statquest 7 ай бұрын
I'll keep that in mind, but I can't promise anything in the near future.
@bhagatpandey369
@bhagatpandey369 Жыл бұрын
Thank you so much...!!!
@statquest
@statquest Жыл бұрын
You are welcome!
@romanemul1
@romanemul1 Жыл бұрын
Thanks for this video.
@statquest
@statquest Жыл бұрын
You bet!
@heeheehaha45
@heeheehaha45 Жыл бұрын
Dear Josh, thank you for your amazing video. I have an observation from the code: inside class BasicNN_train(nn.Module), I changed the requires_grad parameter of w11 to True:

#self.w11 = nn.Parameter(torch.tensor(2.7), requires_grad=False)
self.w11 = nn.Parameter(torch.tensor(2.7), requires_grad=True)

After this change, the graph of the training result becomes a flat line and the total loss becomes 1.0. And it seems that changing any variable other than the bias of the NN will have this result. Isn't it strange? Thank you!
@statquest
@statquest Жыл бұрын
This is such a simple model with such a small training dataset that, believe it or not, training it was super hard to do. I had to try about a million different starting conditions to get it to converge. Once I did, I wrote down all the optimized weights and biases, but I forgot to keep track of the original, random, starting values. So, unfortunately, I can't recreate the initial conditions that allow all of the weights and biases to be trained.
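For reference, this is the one-line change being discussed (values as in the video); flipping requires_grad simply asks the optimizer to update that parameter too, which this tiny model and dataset handle poorly from most starting points:

```python
import torch
import torch.nn as nn

# As in the video: w11 is frozen, so only final_bias gets trained.
w11_frozen = nn.Parameter(torch.tensor(2.7), requires_grad=False)

# The commenter's change: now the optimizer will also try to move w11.
w11_trainable = nn.Parameter(torch.tensor(2.7), requires_grad=True)
```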
@ileshdhall
@ileshdhall Ай бұрын
Hey Josh, I would like to express my heartfelt gratitude for this amazing lecture; you really make all the concepts interesting and easy to understand. BAM :) However, I have a doubt; it would be really appreciated if you could clear it up. I was trying to modify the code for this quest to backpropagate through all of the weights and biases. I used values from a standard normal distribution for all weights and set all biases initially to 0 (P.S. I used the exact values from your "going bonkers with the chain rule" backpropagation quest to try it out). But the output isn't coming out as expected. Can you please share some basic ideas about how to set up proper initial values for the weights and biases, along with what good hyperparameter ranges (epochs and learning rate) are for a neural network, so I know how to fine tune it to work as expected. Thanks in advance :)
@statquest
@statquest Ай бұрын
This neural network, with its incredibly simple design and training dataset, is actually very hard to train. I had to write a script that iterated through random initial values until I could find a set that worked. You might have to do the same thing.
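A hedged sketch of that kind of search; train_and_get_loss() is a hypothetical placeholder for building your model with fresh random weights, fitting it, and returning the final loss:

```python
import torch

def train_and_get_loss():
    # Hypothetical placeholder: build the model, run trainer.fit(...), and
    # return the final training loss. Replace with your own code.
    ...

best_seed, best_loss = None, float("inf")
for seed in range(100):
    torch.manual_seed(seed)              # new random initial weights/biases each attempt
    loss = train_and_get_loss()
    if loss is not None and loss < best_loss:
        best_seed, best_loss = seed, loss

print("best seed:", best_seed, "with loss:", best_loss)
```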
@TechAbabeel
@TechAbabeel 2 жыл бұрын
This Channel is awesome 😎
@statquest
@statquest 2 жыл бұрын
Thank you!
@0807tanguy
@0807tanguy 8 ай бұрын
Great video Josh, you helped me learn PyTorch and Lightning a LOT :) A note at 16:25: "And then it [the trainer] calls training step again and repeats for each epoch that we requested" --> didn't you mean for each batch of every epoch?
@statquest
@statquest 8 ай бұрын
Sure
@DrewMyersUk
@DrewMyersUk 2 жыл бұрын
I'm hoping the next in the series shows us an example of optimising all the things and not just the final bias.
@statquest
@statquest 2 жыл бұрын
Yep! That's exactly what we do in the next one.
@SaschaRobitzki
@SaschaRobitzki 10 ай бұрын
What's the new way of doing seed_everything(seed=42)? The old way throws the error ImportError: cannot import name 'seed_everything' from 'pytorch_lightning.utilities.seed'.
@SaschaRobitzki
@SaschaRobitzki 10 ай бұрын
Maybe it's not needed anymore; the Lightning documentation for the latest version (2.1.3) recommends using just torch.manual_seed(42).
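A quick sketch of both options (the Lightning import path is my assumption about the 2.x namespace; torch.manual_seed() always works):

```python
import torch
torch.manual_seed(42)                    # plain PyTorch seeding, as the docs suggest

# Lightning 2.x still ships the convenience wrapper, now under lightning.pytorch:
from lightning.pytorch import seed_everything
seed_everything(42)                      # seeds torch, numpy, and Python's random module
```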
@zhancao7909
@zhancao7909 Жыл бұрын
I tried to change another parameter to require training:

self.w00 = nn.Parameter(torch.tensor(1), requires_grad=True)

The result was not correct; even final_bias is not close to -16 after training. What did I do wrong?
@statquest
@statquest Жыл бұрын
In this example, you can only optimize the final bias.
@ThinAirElon
@ThinAirElon 2 жыл бұрын
Infinite BAMS !
@statquest
@statquest 2 жыл бұрын
Yes!
@dylanlebrun-laurent668
@dylanlebrun-laurent668 Жыл бұрын
Hi Josh, I'm currently following the video and unfortunately the code at "14:05 Using Lightning to find a good Learning Rate" no longer does the job; Lightning says a new version of it has been released... which makes this whole part of the lesson reeeeeeeaaaally hard to follow. Can you please help with this one? Thanks a lot for what you're doing with your channel; it's awesome BTW and has helped me more than you could ever imagine!
@statquest
@statquest Жыл бұрын
I've updated the jupyter notebook. Just download the latest version and you should be good to go: lightning.ai/pages/education/introduction-to-coding-neural-networks-with-pytorch-lightning/?
@dylanlebrun-laurent668
@dylanlebrun-laurent668 Жыл бұрын
@@statquest thanks a lot ! You're a life saver, and quick responding too ! Thanks again ! Keep up the good work
@pappoos2
@pappoos2 5 ай бұрын
Hi Josh, it took me 3000 epochs to get the final bias value to -16 when using Lightning. Anything to take note of in this case?
@statquest
@statquest 5 ай бұрын
Were you using my code or did you type it in yourself?
@yalymevorach3973
@yalymevorach3973 Ай бұрын
Aren't you missing the output in the forward method of the final model?
@statquest
@statquest Ай бұрын
Yep! Good eye! The good news is that's just an omission in the video. The code itself is correct (and has a return value).
@yalymevorach3973
@yalymevorach3973 Ай бұрын
@@statquest yes I found it thank you! I took your Neural Network course (all 29 videos). They are amazing and Really helped me in my PhD. thank you 🙏
@statquest
@statquest Ай бұрын
@@yalymevorach3973 bam! Congratulations on watching all the videos! :)
@imtim1243
@imtim1243 9 ай бұрын
Thanks Josh for the wonderful content! btw does anyone get tensor(-2.2926) as the final result for the final bias instead of -16? I did follow along the code...
@statquest
@statquest 9 ай бұрын
Are you using my code or did you type it in yourself? Mine is here: lightning.ai/lightning-ai/studios/statquest-introduction-to-neural-networks-with-pytorch-lightning
@imtim1243
@imtim1243 9 ай бұрын
I typed it myself, but let me try yours, thank you so much!@@statquest
@Erezavnil
@Erezavnil 8 ай бұрын
When creating the sample data, multiply by 100: torch.tensor([0.0, 0.5, 1.0] * 100)
@terp830
@terp830 Жыл бұрын
I have tried adjusting all of the weights and biases to 0 and changing requires_grad to True, but the results turned out to be 0 for every output in the loop. Why is that?
@statquest
@statquest Жыл бұрын
In this case, we have so little data that it's actually quite hard to converge on all of the optimal weights and biases. However, I show how to to optimize everything in this video: kzbin.info/www/bejne/iHmqmouGqtSSpqs
@katetanjin
@katetanjin Жыл бұрын
Hi Josh, thanks a lot for the super informative video! I tried it: with 34 epochs, learning_rate=0.002 results in final_bias=-2, which is still far from correct, while with learning_rate=0.1, 34 epochs gives final_bias=-16. That means 0.1 is a better learning rate value than 0.002, so why does the tuner give 0.002? A side observation: in the downloaded notebook the dataloader has 3 data points repeated 100 times, which is different from what's shown in the video, where the dataloader simply has 3 data points.
@statquest
@statquest Жыл бұрын
Hmmm... That strange. So, when you ran my code, which sets the learning rate to 0.002 it didn't work?
@katetanjin
@katetanjin Жыл бұрын
@@statquest Hi, Josh, thanks for checking out! The code in video didn't work out. More specifically, without *100 in dataloader, 0.002 learning rate and 34 epochs ends up with final_bias = -2 while 0.1 learning rate and 34 epochs ends up with final_bias=-16; with *100 in dataloader, both 0.1 and 0.002 learning rate end up with final_bias=-16. I feel like *100 in dataloader is somewhat cheating, because with 34 epochs, the model trainer in fact saw the data 3400 times? With such large number of iterations, I suspect most learning rate values will end up with final_bias=-16.
@statquest
@statquest Жыл бұрын
@@katetanjin I think the learning rate finder needs a lot of data in order to work. That's why we multiplied the data by 100 - to trick it into thinking we had more data than we really had. Anyway, it's just supposed to demonstrate how to use the tool and, in practice, you will probably have more than 3 data points and will not need to trick the learning rate finder.
@adh921ify
@adh921ify 9 ай бұрын
How do I optimize multiple parameters? It does not seem to work if I just set another parameter to requires_grad=True. Is there something else I am missing???
@statquest
@statquest 9 ай бұрын
This model is so simple it's actually very difficult to train. So, once we get to more complicated models it will be easier to train more parameters.
@karag4487
@karag4487 2 жыл бұрын
More of these please
@statquest
@statquest 2 жыл бұрын
I'm working on them :)
@Bulgolgii
@Bulgolgii 2 жыл бұрын
Hi Josh, will you be doing a walkthrough of how TabNet works in the future? Thank you!
@statquest
@statquest 2 жыл бұрын
I'll keep it in mind.
@dimabear
@dimabear Жыл бұрын
When you calculate the optimal learning rate, what is it that you're actually maximizing/minimizing? I get when you're minimizing loss, you're finding the optimal weights/biases that minimize the loss. But when you try to find the optimal learning rate, what are you maximizing/minimizing and what are you calculating with respect to? For example, the first parameter to lr_find is model, which is BasicLightningTrain(). And BasicLightningTrain() has fixed parameters, as well as the final bias which was changed from -16 to 0. So does this mean that lr_find() used the fixed parameters? If so, I'm assuming if we had set final bias to a random value (instead of 0) we'd arrive at a different optimal learning rate? thanks!
@statquest
@statquest Жыл бұрын
For each candidate learning rate, we do a few iterations of backpropagation to see which one reduces the loss in a better way.
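If you want to see that sweep yourself, the finder result exposes it (a sketch assuming lr_find_results came from the Tuner as in the notebook and that matplotlib is installed, since that is what .plot() uses):

```python
fig = lr_find_results.plot(suggest=True)   # loss recorded at each candidate learning rate,
                                           # with the suggested rate marked
fig.savefig("lr_sweep.png")                # or fig.show() in a notebook
print(lr_find_results.suggestion())        # the rate where the loss was dropping fastest
```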
@mahammadodj
@mahammadodj 2 жыл бұрын
Thank you very much but how do we initialize those weights and biases?
@statquest
@statquest 2 жыл бұрын
We'll talk about that in future videos.
@barberaTP
@barberaTP 2 жыл бұрын
As always, great video! So, does this also work fine with AMD GPUs and solve the problem that TensorFlow only works (well) with Nvidia graphics cards?
@statquest
@statquest 2 жыл бұрын
I'm pretty sure this will work well for any GPUs that PyTorch can work with.
@barberaTP
@barberaTP 2 жыл бұрын
@@statquest OK, I will check the documentation. Thanks 👍
@MargaridaBiscaia
@MargaridaBiscaia Жыл бұрын
Hello Josh! I'm not receiving any emails to download the code! Can you help please? Thank you so much!
@statquest
@statquest Жыл бұрын
Did you check your "spam" folder? I just tested it myself and the email went to my spam folder, so check there.
@MargaridaBiscaia
@MargaridaBiscaia Жыл бұрын
That's exactly it! Thank you@@statquest
@kaanzt
@kaanzt Жыл бұрын
in "lr_find_results" part I have written the same exact code that you wrote but when i write "trainer.tuner.lr_find" it does not recognize tuner when i type "trainer.". And when i run the code after writing the same code, it says Trainer object has no attribute "tuner". I am sure that i have everything updated. I also checked the documentations but couldn't find any solution. Can you help for solving this issue?
@statquest
@statquest Жыл бұрын
Please download the Jupyter Notebook that is paired with this video. Lightning has updated how this works, so I updated the notebook: lightning.ai/pages/education/introduction-to-coding-neural-networks-with-pytorch-lightning/?
@computerconcepts3352
@computerconcepts3352 2 жыл бұрын
Ooo0Oooo new video! Noice 👍
@statquest
@statquest 2 жыл бұрын
bam! :)
@clearwavepro100
@clearwavepro100 2 жыл бұрын
Nice!
@statquest
@statquest 2 жыл бұрын
Thanks!
@Rhine_e71
@Rhine_e71 8 ай бұрын
Sorry, but it seems that the tuner and trainer have been decoupled and the library has had a lot of changes. Could you show us the updated code for that? Really appreciate it.
@statquest
@statquest 8 ай бұрын
One of the benefits of downloading my code, rather than typing it in yourself, is that you get the updates. In case you missed the link for the code, here it is: lightning.ai/lightning-ai/studios/statquest-introduction-to-neural-networks-with-pytorch-lightning?view=public&section=all
@burakalkan4137
@burakalkan4137 Жыл бұрын
I think 34 epochs is not enough for a learning rate of 0.00214; in my tests I had to use way more epochs to get final_bias right.
@statquest
@statquest Жыл бұрын
Did you download my code or did you type in your own? I'll look into this because they've updated Lightning since I created this video.
@burakalkan4137
@burakalkan4137 Жыл бұрын
@@statquest I didn't download it, I just typed it in on my own. I had to give it 1000 epochs to get to the -16 value. What do you think the reason is? (BTW, great fan of your work, keep it up!)
@burakalkan4137
@burakalkan4137 Жыл бұрын
OK after setting the max_epochs to 1k I get -15.789505004882812 which is still not there, with 2k it gives out correct value (-16.xx) if I set lr to 0.1 34 epochs are enough, maybe they really changed something.
@statquest
@statquest Жыл бұрын
@@burakalkan4137 There could be a lot of things going on. I'll look into it.
@studynotslack
@studynotslack Жыл бұрын
Hi Josh, can you also teach Keras? It's a lot more beginner friendly.
@statquest
@statquest Жыл бұрын
I'll keep that in mind.
@vladmirbc8712
@vladmirbc8712 10 ай бұрын
I've got something strange here :) First of all, I had to add this to the forward method, because I got an error without it:

input_to_final_relu = scaled_top_relu_output + scaled_bottom_relu_output + self.final_bias
output = F.relu(input_to_final_relu)
return output

Then with

model = BasicLightningTrain()
trainer = L.Trainer(max_epochs=34)
tuner = L.pytorch.tuner.Tuner(trainer)

I got model.final_bias.data = -2.1706. Then I changed max_epochs=5000, and only after that did I get the correct model.final_bias.data = -16.0098, and I can't figure out why. The learning rate is the same as yours: "lr_find() suggests 0.00214 for the learning rate". However, an interesting fact is that I'm always suggested the same learning rate, regardless of whether I change the number of epochs or not.
@vladmirbc8712
@vladmirbc8712 10 ай бұрын
I've figured it out. I really can't understand how 34 epochs can be sufficient for training with a learning rate (lr) of 0.00214. It seems like you might not have applied model.learning_rate = 0.00214. With model.learning_rate = 0.1, 34 epochs are indeed sufficient, but it seems nearly impossible to find the global optimum with a learning rate of only 0.00214 over 34 epochs
@statquest
@statquest 10 ай бұрын
Are you using my code or did you type it in yourself?
@vladmirbc8712
@vladmirbc8712 10 ай бұрын
@@statquest Sorry, you're right! I was typing the code in from the video, but now I've checked your notebook and found some differences with my code (for example, inputs = torch.tensor([0., 0.5, 1.] * 100)). So everything is correct, thanks!
@RandyKumamoto
@RandyKumamoto Жыл бұрын
I think you missed one thing: the pip install pytorch-lightning command :)
@statquest
@statquest Жыл бұрын
That's a valid point! To be honest, I've always just assumed people would download the jupyter notebooks, which contain all of the installation instructions, but it's become clear that I should include them in the video as well.
@goncalofernandes1845
@goncalofernandes1845 2 жыл бұрын
Is anyone having problems with the final cells of the code? lr_find_results.suggestion() does converge to 0.00214, but then trainer.fit() gives a final_bias value of -2.1706. I've messed around with the code, so it might just be me, but I still can't understand what's going on :/ Anyway, great work as usual Josh! Thank you for all the hard work!
@statquest
@statquest 2 жыл бұрын
Try downloading a fresh copy and and then running it without any changes. Does it work or are you running into the same problem?
@junyuzhang4627
@junyuzhang4627 2 жыл бұрын
I have the same problem with you
@GG-fb1kz
@GG-fb1kz 2 жыл бұрын
Having the same problem, the final bias i got is -2.17, not -16. Appreciate if anyone else can try it and shed some light. Thanks.
@statquest
@statquest 2 жыл бұрын
@@GG-fb1kz I'll take a look and let you know if I update the code.
@statquest
@statquest 2 жыл бұрын
So, I just reran everything and I get -16...so this is a mystery. However, I added a few lines to set the random number generators, so this should take care of any oddities that result from SGD. So, please download the new code here: github.com/StatQuest/pytorch_lightning_tutorials/raw/main/building_nns_with_pytorch_and_lightning_v1.1.zip
@eshenwarawita1228
@eshenwarawita1228 3 ай бұрын
Do you have a paid course on PyTorch? If so, where can I purchase it?
@statquest
@statquest 3 ай бұрын
Not yet.
@Rahul-oy1vo
@Rahul-oy1vo 2 жыл бұрын
Want more on PyTorch , Josh😭.
@statquest
@statquest 2 жыл бұрын
Working on it! :)
@Rahul-oy1vo
@Rahul-oy1vo 2 жыл бұрын
@@statquest A little bit quicker please, Josh; you're the only savior we've got.
@MyChannel-xe5dl
@MyChannel-xe5dl 2 жыл бұрын
I am facing a problem while installing lightning in my conda environment; it's taking a lot of time. Can you please help me?
@statquest
@statquest 2 жыл бұрын
I think you figured out a work around.
@anshvashisht8519
@anshvashisht8519 Жыл бұрын
Where is the link to the notebook for this video?
@statquest
@statquest Жыл бұрын
bit.ly/3S9VdLu
@hsstp
@hsstp Жыл бұрын
Please provide the link to download the code, Thanks
@statquest
@statquest Жыл бұрын
The link to the code is in a pinned comment, but here it is as well: lightning.ai/pages/education/introduction-to-coding-neural-networks-with-pytorch-lightning/?
@hsstp
@hsstp Жыл бұрын
@@statquest Thanks very much. Another request, It would be great if you uploaded a video about the diffusion model with code.
@giovannimeono8802
@giovannimeono8802 2 жыл бұрын
Can we get one with Keras and TensorFlow?
@statquest
@statquest 2 жыл бұрын
Unfortunately, probably not in the near future. I'm going to go through a whole PyTorch series (from simple to super fancy and in the cloud) first.
@macknightxu2199
@macknightxu2199 Жыл бұрын
Hi, at 16:40, how do you set the epoch number? BR
@statquest
@statquest Жыл бұрын
I talk about setting epochs and steps in my video on coding Long Short-Term Memory neural networks: kzbin.info/www/bejne/iHmqmouGqtSSpqs
@macknightxu2199
@macknightxu2199 Жыл бұрын
@@statquest Looks good; this video is just added to the series. Cheers.
@gauravthakur9386
@gauravthakur9386 Жыл бұрын
Hi Josh, the link to the code in the description isn't opening for me. Is there a workaround?
@statquest
@statquest Жыл бұрын
Oops! I wonder what happened. I'll look into it.
@statquest
@statquest Жыл бұрын
OK. Try it again. I think it's working now.
@gauravthakur9386
@gauravthakur9386 Жыл бұрын
It works now, thanks!@@statquest
@bryanmccormack2836
@bryanmccormack2836 2 жыл бұрын
Is anyone getting the following error: "module 'lightning' has no attribute 'LightningModule'"?
@statquest
@statquest 2 жыл бұрын
Which version of lighting are you using? Also, feel free to contact me directly via my website: statquest.org/contact/
@bryanmccormack2836
@bryanmccormack2836 2 жыл бұрын
@@statquest Pytorch: '1.12.1' and the most recent Lightning: 3.2.0
@statquest
@statquest 2 жыл бұрын
@@bryanmccormack2836 Hmm... The version for Lightning seems a little strange since the latest version is 2022.10.7. Try "pip install lightning --upgrade" to see if you can get the new version.
@davidlu1003
@davidlu1003 Ай бұрын
I just wonder, if we had 1,000,000,000 weights and biases, would you then code from self.w1 to self.w1000000000? That would be crazy and a huge amount of duplicated work... I think there must be another way to code a large number of weights and biases.😁😁😁
@statquest
@statquest Ай бұрын
Yes. The most common way is to use nn.Linear(), which will create weights and biases for you. For more details, see my other PyTorch tutorials, which can be found here: statquest.org/video-index/
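A minimal sketch of that idea (the class name here is hypothetical; it just mirrors the tiny network from the video): nn.Linear creates a whole layer's weights and biases in one line instead of one nn.Parameter per value.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyNet(nn.Module):                                        # hypothetical name
    def __init__(self):
        super().__init__()
        self.hidden = nn.Linear(in_features=1, out_features=2)   # 2 weights + 2 biases
        self.output = nn.Linear(in_features=2, out_features=1)   # 2 weights + 1 bias

    def forward(self, x):
        return self.output(F.relu(self.hidden(x)))

model = TinyNet()
print(sum(p.numel() for p in model.parameters()))                # 7 parameters, all created for us
```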
@davidlu1003
@davidlu1003 Ай бұрын
@@statquest Thx💗💗💗
@fustigate8933
@fustigate8933 2 жыл бұрын
First
@statquest
@statquest 2 жыл бұрын
BAM! :)
@arhammehmood9963
@arhammehmood9963 7 ай бұрын
@statquest the updated code for finding new learning rate is not working
@statquest
@statquest 7 ай бұрын
Did you type in your own code or follow the link to my code? lightning.ai/lightning-ai/studios/statquest-introduction-to-neural-networks-with-pytorch-lightning I just re-ran my code and it worked fine.
@itsbxntley2970
@itsbxntley2970 Жыл бұрын
trainer = L.Trainer(max_epochs=34, accelerator="auto", devices="auto")

1) ## Now let's find the optimal learning rate
tuner = L.pytorch.tuner.Tuner(trainer)
lr_find_results = tuner.lr_find(model,
                                train_dataloaders=dataloader,  # the training data
                                min_lr=0.001,  # minimum learning rate
                                max_lr=1.0,  # maximum learning rate
                                early_stop_threshold=None)  # setting this to "None" tests all 100 candidate rates

2) # lr_find_results = trainer.tuner.lr_find(model,
   #                                         train_dataloaders=dataloader,  # the training data
   #                                         min_lr=0.001,  # minimum learning rate
   #                                         max_lr=1.0,  # maximum learning rate
   #                                         early_stop_threshold=None)  # setting this to "None" tests all 100 candidate rates

For some reason, finding the learning rate with method 2 brings up the error AttributeError: 'Trainer' object has no attribute 'lr_find'... What could be the issue? I've already tried updating lightning.
@statquest
@statquest Жыл бұрын
So they just released Lightning 2.0, which reorganized the code. I've updated the notebook to reflect this change. However, updating the video is much harder. I'll put a note in a pinned comment.