Get the code/Jupyter Notebook here: lightning.ai/lightning-ai/studios/statquest-long-short-term-memory-lstm-with-pytorch-lightning?view=public&section=all To learn more about Lightning: lightning.ai/ Support StatQuest by buying my book, The StatQuest Illustrated Guide to Machine Learning, or a Study Guide or Merch!!! statquest.org/statquest-store/
@duttaoindril Жыл бұрын
Still waiting on the last one in the series - attention.
@statquest Жыл бұрын
@@duttaoindril I'm still working on it.
@naruto-yy4xk Жыл бұрын
@@statquest LLM ? BAM??? Please.....😅😅
@statquest Жыл бұрын
@@naruto-yy4xk I'm working on it.
@naruto-yy4xk Жыл бұрын
@@statquest Bam....
@DeanRGAnderson Жыл бұрын
I am a 71-year-old engineering grad from UCLA in 1975. I binge-watched the first 21 videos from Josh Starmer's Neural Networks/Deep Learning playlist in 2 days. Wonderful experience. Josh is an excellent teacher. I had no previous experience with neural networks, but after watching these videos, I feel ready to experiment with NNs.
@statquest Жыл бұрын
BAM!!! Enjoy! :)
@DeanRGAnderson Жыл бұрын
@@statquest Just told my grandson (EE BYU and now grad student at BYU for MSEE) to watch this same playlist. Are you a univ. professor? My grandson is doing his 3rd summer intern with me. We will be doing on-device ASR with a new type of microphone enhancing SNR for lips to microphone distances up to 8 meters. (Scotty: "Hello computer" - Star Trek 4)
@statquest Жыл бұрын
@@DeanRGAnderson That's really cool!!! My next video (coming out on Monday) is about how neural networks can be used to translate one language (like English) to another language (like Spanish). I'm pretty excited about it. I'm not a professor - I used to be one - in genetics at UNC-Chapel Hill - but now I try to spend as much time making videos as possible. I visited Utah (Salt Lake City) for the first time last summer - it was one of the most beautiful places I've ever been. I loved hiking in the hills that surrounded the city. Good luck with your project! It sounds great.
@gustavoalves97405 күн бұрын
Joshua, you are an incredible person! I just found out about your channel and all of your songs. I'm addicted to the way you explain complex things in such simple ways. I'm from Brazil (Christmas in Rio!!), doing my PhD here in the USA, and now you are one of my inspirations. Thank you!
@statquest4 күн бұрын
Thank you very much! :)
@clementhironimus3 ай бұрын
Thanks, Josh! A great example of First-Principles Thinking: breaking down concepts to their core fundamentals. I truly appreciate the clarity and have been enjoying your Neural Network series!
@statquest3 ай бұрын
TRIPLE BAM!!! Thank you very much for supporting StatQuest!!! It really means a lot to me that you care enough to contribute.
@markfchapmani9 ай бұрын
This is just great Josh. You have a real ability to explain these complex concepts in an understandable way.
@statquest9 ай бұрын
Thank you!
@hbb21st Жыл бұрын
StatQuest also works well as an English-teaching video series; it has been running successfully for a long time, is clear and logical, and my son and I like it. :)
@statquest Жыл бұрын
BAM! :)
@ppradhan Жыл бұрын
Thank you Josh. I owe you a lot. My binge watching ML series ended today. Holy smoke, I almost spent the whole month of March! Soon I will buy your book "The StatQuest Illustrated Guide To Machine Learning". I am inspired by your work. e^BAM!!!
@statquest Жыл бұрын
Thank you very much! I hope you enjoy the book. I'm super proud of it because it incorporates a lot of lessons I learned from making the videos and also provides more of a coherent flow from topic to topic.
@wjchicago Жыл бұрын
cannot be cleaner and clearer than this!
@statquest Жыл бұрын
Thanks!
@jessicas2978 Жыл бұрын
This tutorial is amazing, and I finally know how to code LSTM. Super helpful to my projects. Thank you so much!
@statquest Жыл бұрын
Glad it helped!
@caiyu538 Жыл бұрын
Great, great, great. Thank you so much for your great lectures.
@statquest Жыл бұрын
Glad you like them!
@SaschaRobitzki10 ай бұрын
Great video! Especially the LSTMbyHand; I need more of that.
@statquest10 ай бұрын
Thanks!
@wesleyfman Жыл бұрын
What a nice surprise! Your videos are incredible, keep it up! A big hug from Brazil! DOUBLE BAM!
@statquest Жыл бұрын
Thank you very much!!! :)
@DanteNoguez Жыл бұрын
An incredible video, and the Spanish audio version sounds really good as well. DOUBLE BAM!!
@statquest Жыл бұрын
HOORAY!!! I'm so glad that is working out. I hope to add that to more of my videos.
@danielbadawi5623 Жыл бұрын
What a video WOW. Very useful. Please never ever stop making videos.
@statquest Жыл бұрын
More to come!
@danielbadawi5623 Жыл бұрын
@@statquest A special request: could you please do a tutorial on ConvLSTM?
@statquest Жыл бұрын
@@danielbadawi5623 I'll keep that in mind! However, for now the next steps are word embedding, then attention, and then transformers.
@aliaamir9778 Жыл бұрын
Just loving every one of your videos. You clearly explain every concept of ML and DL. Please make a video on the GRU algorithm as well.
@statquest Жыл бұрын
Thanks! I'll keep that in mind.
@laythherzallah3493 Жыл бұрын
You make life easy. Thank you, thank you, thank you!
@statquest Жыл бұрын
Thanks!
@supermandrew88 Жыл бұрын
Hi Josh! In your introduction to the sample problem you mention that we're looking at stock prices for two different companies: A and B. If A and B didn't share the same values for days 2, 3, and 4, would you still use one LSTM model for both stocks? In other words, when would you instead choose to create a separate model for each stock? I'm sort of looking at a similar example problem: sales reps' performances over time. Would you include all the reps' performances in your dataset, or would you create a different model for each rep? Thank you for your time! Your book just arrived at my house a few days ago.😄
@statquest Жыл бұрын
It really depends on what you want. One thing that is nice about fitting the LSTM to multiple stocks is that it helps prevent overfitting, and this can help with making predictions in the long run.
@bleakmess Жыл бұрын
Hi Josh! Thank you for such a wonderful video series. Now that I am familiar with all these concepts, what's my next step? I have gone through your book as well. I wish there were some problem sheets or coding exercises to get a feel for the methods you discussed. Thanks a lot, man, and guide me if possible, please.
@statquest Жыл бұрын
I'm working on more PyTorch videos.
@bleakmess Жыл бұрын
@@statquest Awesome.
@kanakornАй бұрын
Please explain the GRU too. Thanks 🙏🙏🙏
@statquestАй бұрын
I'll keep that in mind.
@arshia1308Ай бұрын
Thank you for all your amazing videos, they've really helped me a lot. I even bought your book for reference and to support! While trying to get TensorBoard working, though, unfortunately I get this error: AttributeError: `np.string_` was removed in the NumPy 2.0 release. Use `np.bytes_` instead. Did you mean: 'strings'?
@statquestАй бұрын
Are you using my code, or did you type it in yourself?
@DumplingWarrior2 ай бұрын
Could we add something to get an idea of the best learning rate to use for training the LSTM model, just like we did in another video with a simple neural net? Just curious if it's doable here as well, and I wonder how that's going to affect the result in terms of speed and accuracy.
@statquest2 ай бұрын
I'll keep that in mind.
@chandanbp Жыл бұрын
All this content for free. Triple BAM!!!
@statquest Жыл бұрын
Yes!
@brahimmatougui1195 Жыл бұрын
Hope to see a new video soon, and hope you are doing well too.
@statquest Жыл бұрын
The next video in this series comes out... right now! :)
@brahimmatougui1195 Жыл бұрын
@@statquest I have that feeling that the new video is coming out hhhhh
@statquest Жыл бұрын
@@brahimmatougui1195 Here's the link: kzbin.info/www/bejne/h5eTZ4t6jr12jqs
@Eduardo_Trader_Investidor9 ай бұрын
Very good!
@statquest9 ай бұрын
Thank you very much!
@harryzheng654Ай бұрын
Great videos! Just one question: why is the LightningLSTM so much better than LSTMbyHand with far fewer epochs?
@statquestАй бұрын
I'm not exactly certain why it trains faster, but I do know that nn.LSTM has extra bias terms - I'm not sure why - and they are undocumented - you have to look at the code to know they are there - but maybe those play a role.
@panayiotisgeorgiou16096 ай бұрын
Hey, I have just watched a bunch of your videos. Very, very nice work. Really to the point and super informative. I am currently doing a statistical model analysis for a machine learning algorithm, but I am super confused about what I should consider a True Negative and a False Negative, because the algorithm only makes positive claims. A patient comes in and has disease A, but the system shows disease B. What's that, a FN or a TN? I am absolutely confused, and I'm taking a long shot here.
@statquest6 ай бұрын
To be honest, an algorithm that only makes positive claims doesn't sound very useful. Instead, you should consider 3 possible outputs - disease A, disease B and neither disease A nor B. To learn how to interpret 3 possible outputs, see: kzbin.info/www/bejne/gZXWoWmppNZ0bdEsi=Yei-8SWZiY42tcwt&t=288
@shamshersingh96806 ай бұрын
Hi Josh, in this line of code in the training_step method --> output_i = self.forward(input_i[0]), why have we passed input_i[0] and not input_i? I presume we are passing data from both companies in a single batch. If we pass input_i[0], we are passing only the first input into the forward method in each training step.
@statquest6 ай бұрын
Why do you presume that we are passing data from both companies in a single batch?
@statquest6 ай бұрын
Adding "print(input_i)" to the training_step() shows that each batch consists of just the values from a single company. See... batch_idx: 0 tensor([[0.0000, 0.5000, 0.2500, 1.0000]]) batch_idx: 1 tensor([[1.0000, 0.5000, 0.2500, 1.0000]]) To be honest, the best way to answer these questions is to fiddle with the code. You can learn a lot more that way much faster. The reason why I've put the code in a Lightning Studio is that you can make a copy of it and then run it and play around with it. And if it completely breaks, you just make another copy. It's super easy.
@shamshersingh96806 ай бұрын
@@statquest Thanks a lot, Josh. To be honest, when I started to dive deep into deep learning it scared me, but your videos have ignited my curiosity all over again. Thanks a lot, Josh. Thanks a lot.
@statquest6 ай бұрын
@@shamshersingh9680 Happy to help! But now you need to get your feet wet and dive in. :) (Just trying to encourage you!)
@shamshersingh96806 ай бұрын
@@statquest Yeah, done that already. Happy to say that I'm now pretty comfortable with neural networks and related work. I recently created image classification models (although pretty basic ones, using the MNIST, CIFAR10 and CIFAR100 datasets). Just an update: if someone is using a Jupyter notebook, then the following commands can load TensorBoard in the notebook itself:
%load_ext tensorboard
%tensorboard --logdir=lightning_logs
There is no need to go to Home --> File --> New --> Terminal --> tensorboard --logdir=lightning_logs. The magic commands will display TensorBoard in the notebook itself. Hope it helps.
@nancyboukamel4428 ай бұрын
Thank you, Josh Starmer!! Can you please do a video on a transformer encoder using an LSTM?
@statquest8 ай бұрын
Hmm...Transformers don't use LSTMs...so are you thinking of something else? Here's a video about transformers: kzbin.info/www/bejne/sKm0qoeBbdaor7s
@magicfox94 Жыл бұрын
Infinite BAM for you!
@statquest Жыл бұрын
Thank you! :)
@supermandrew88 Жыл бұрын
Hi Josh! Sorry for another question. I've got a dataset that is slightly more complex than the stock example data you used here. I'm trying to use LSTM to predict the result of a sales lead before it runs. For each data point, I've got: the sales rep, the date of the lead, the product pitched, the zip code, and the result (sold or not sold). I've already got my program set up to evaluate different encoding methods for each of those variables. After watching this video, I'm trying to visualize how I would use my data in an LSTM. You created two arrays of data: one for company A and one for company B, where each index in the array corresponded to a date in your dataset. So for my dataset, I can create n arrays of data, where n is my number of sales reps. My issue arises in the fact that I have an uneven amount of data points per date. For instance, sales rep Bob was able to run 3 appointments on May 1 and 1 appointment on May 2. Likewise, sales rep Alice ran 0 appointments on May 1 (I guess she's off on Mondays), and 4 appointments on May 2. How would I preprocess this data for an LSTM? I'm assuming each data point would be an array of encodings: [sales rep, product, ...], but if one rep ran 4 leads on a day and another ran 1 (or even 0), do I just "pad" my data with a value like -1? So on day 2, Bob might be: [[encoded data point 1],-1,-1,-1], and Alice is: [[data point 1],[data point 2],...] ? Finally, I noticed sequence length wasn't really included in this video. Is that because our sequence length is 1 in this case? Do we still capture temporal relationships with a sequence length equal to 1? Sorry for the lengthy question. I appreciate any insights! P.S. I hope you create a 2nd StatQuest book that includes topics like LSTMs, Transformers, etc. I'd love to buy it! 😊
@statquest Жыл бұрын
The whole idea of using LSTMs (or any recurrent neural network) is to allow for different sequence lengths for the input values. So "Bob" can have a sequence of 10 values and "Alice" can have a sequence of 5 values. That's fine. What I would encourage you to do is to look at my word embedding video: kzbin.info/www/bejne/rJq9o4Kkf8ifj5I and consider adding an embedding layer to the inputs to the LSTMs. You could have one embedding layer encode the employee etc. NOTE: You don't have to pre-train the embedding layer as suggested in the video. You can just add the embedding layer (with nn.Embedding()) to your model and train everything at once.
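As a rough illustration of that idea, here is a minimal sketch of an embedding layer feeding an LSTM; the class name, layer sizes, and input layout are made up for this example, not taken from the StatQuest notebook:

```python
import torch
import torch.nn as nn

class RepLSTM(nn.Module):  # the same idea works inside a LightningModule
    def __init__(self, num_reps=10, embed_dim=2):
        super().__init__()
        # each sales-rep id (0 .. num_reps-1) gets its own learnable embedding vector
        self.rep_embedding = nn.Embedding(num_embeddings=num_reps, embedding_dim=embed_dim)
        # the LSTM sees the embedding plus one numeric feature per time step
        self.lstm = nn.LSTM(input_size=embed_dim + 1, hidden_size=1)

    def forward(self, rep_ids, values):
        # rep_ids: (seq_len,) integer ids; values: (seq_len, 1) numeric feature
        embedded = self.rep_embedding(rep_ids)          # (seq_len, embed_dim)
        lstm_in = torch.cat([embedded, values], dim=1)  # (seq_len, embed_dim + 1)
        lstm_out, _ = self.lstm(lstm_in)
        return lstm_out[-1]                             # prediction from the last time step

model = RepLSTM()
print(model(torch.tensor([3, 3, 7]), torch.tensor([[0.2], [0.9], [0.4]])))
```

Because the embedding is just another layer in the model, it is trained together with the LSTM, which is the "train everything at once" point made above.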
@supermandrew88 Жыл бұрын
@@statquest Thanks, I will take a look now! I actually used your encoding video in order to implement the weighted-mean target encoding for the names of the reps. You mentioned that it's okay for Bob to have a sequence of 10 values and Alice to have a sequence of 5. Just to confirm, this would be like Stock A having multiple values, and a different number of them, on day 1 compared to Stock B. This should be okay?
@kareemullaashrafali747610 ай бұрын
simply wow
@statquest10 ай бұрын
Thanks!
@henryhsu95178 ай бұрын
Thank you Josh. This tutorial is amazing. I have a question about the number of parameters. In the LSTMbyHand model, there are 12 parameters. In contrast, there are 16 parameters in the LightningLSTM model. My understanding is that [wlr1, wpr1, wp1, wo1] could be viewed as lstm.weight_ih_l0, [wlr2, wpr2, wp2, wo2] could be viewed as lstm.weight_hh_l0, and [blr1, bpr1, bp1, bo1] could be viewed as lstm.bias_ih_l0. Is that correct? If so, how do we realize lstm.bias_hh_l0 in the LSTMbyHand model?
@statquest8 ай бұрын
Instead of adding the input and short-term memories (the input and hidden state) together before adding the bias terms, you can have each one have its own bias term before adding them together.
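To see those extra bias terms for yourself, you can print the parameters of an nn.LSTM directly; this is standard PyTorch behavior, independent of the video's notebook:

```python
import torch.nn as nn

lstm = nn.LSTM(input_size=1, hidden_size=1)
for name, param in lstm.named_parameters():
    print(name, param.shape)
# weight_ih_l0 torch.Size([4, 1])  <- 4 weights applied to the input
# weight_hh_l0 torch.Size([4, 1])  <- 4 weights applied to the hidden state (short-term memory)
# bias_ih_l0   torch.Size([4])     <- 4 biases on the input side
# bias_hh_l0   torch.Size([4])     <- 4 biases on the hidden-state side (the "extra" ones)
```

That is 4 + 4 + 4 + 4 = 16 parameters, which matches the 16 in the LightningLSTM model.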
@henryhsu95178 ай бұрын
@@statquest BAMs!!! Thanks Josh. I learned a lot of details about LSTM from your tutorial!
@Tomas-kv7mw Жыл бұрын
Great videos! Will there be a transformer/attention video?
@statquest Жыл бұрын
Yes, soon!
@NelsonPeace-l4f Жыл бұрын
Another incredible video by this incredible educator. Josh... any videos on transformer models in the pipeline?
@statquest Жыл бұрын
They are definitely in the pipeline.
@NelsonPeace-l4f Жыл бұрын
@@statquest Looking forward to it!!
@duttaoindril Жыл бұрын
Literally waiting with bated breath since my assignment is due soon 😂
@statquest Жыл бұрын
@@duttaoindril I'm working as quickly as I can, but it's still at least a month away.
@HaozheJiang Жыл бұрын
Hey Josh, Thanks for the video. I got one question: in the training step, for the self.forward(), why do you take input_i[0] as input instead of just input_i?
@statquest Жыл бұрын
That makes sure that the "dimensions" of the tensor are correct. I hope to cover this topic soon.
@HaozheJiang Жыл бұрын
Does it mean you take only one feature (e.g. price) of the input? @@statquest
@statquest Жыл бұрын
@@HaozheJiang In this case, the whole LSTM is set up to only accept a single input, however, an LSTM can accept multiple inputs if we configure it that way. No, the reason we have input_i[0] is simply to remove some extra brackets from the data so that the tensor has the correct dimension (and I'll explain this better in a new video soon).
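A tiny sketch of what those "extra brackets" look like, assuming the Company A values used in the video:

```python
import torch

# what the DataLoader hands to training_step: a batch containing one sequence
input_i = torch.tensor([[0., 0.5, 0.25, 1.]])
print(input_i.shape)     # torch.Size([1, 4]) -> a batch dimension plus the 4 days
print(input_i[0].shape)  # torch.Size([4])    -> just the 4 days, the shape forward() expects
```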
@HaozheJiang Жыл бұрын
well understood. Thank you Josh!@@statquest
@terryliu36357 ай бұрын
Thanks, Josh, for another great video! Another quick question: why is it that with the 2nd approach we are able to get good prediction accuracy with only 300 epochs, while the 1st approach takes many more epochs?
@statquest7 ай бұрын
This question is answered at 30:51
@Bbdu75yg Жыл бұрын
AWESOMEEEEEEE
@statquest Жыл бұрын
You're welcome 😊
@huseyngorbani6544 Жыл бұрын
Hi, thanks for the video. I've been waiting for a video about transformers and their implementation. Please kindly share.
@statquest Жыл бұрын
I'm working on it.
@reeljojo9229 Жыл бұрын
Thanks Josh!!! You are incredible! I learned a lot from your tutorial! Now I understand how to use PyTorch and Lightning to optimize an LSTM by hand, but I couldn't use these methods well for my dataset. Do you have any tutorials to recommend? Thanks again!
@statquest Жыл бұрын
What dataset are you referring to?
@reeljojo9229 Жыл бұрын
@@statquest Like a stock dataset, with open, low, high, close columns, etc.
@statquest Жыл бұрын
@@reeljojo9229 Ok. I'll keep that in mind.
@reeljojo9229 Жыл бұрын
@@statquest You are the best!!!!
@ChristosKaskouras Жыл бұрын
What you are doing out there cannot be described! Thanks a lot for all the videos! I have a couple of questions though. The first, regarding model training: let's suppose that I want to create a list of the losses for every epoch. Is that possible? Can I somehow have the trainer in a for loop? The second: when I try to load the TensorBoard, an error page appears saying "No dashboards are active for the current data set."
@statquest Жыл бұрын
You should be able to make a list of the losses...and I'm bummed you are getting an error. Are you using my specific jupyter notebook or have you written your own code?
@ChristosKaskouras Жыл бұрын
@@statquest Even when I am using your code I am getting the error. I guess it might be something with the installation of Lightning, since I cannot execute the line "from pytorch_lightning.utilities.seed import seed_everything"; I receive an ImportError. I tried to reinstall the packages and to run the code from an IDE (Spyder), but it still did not work.
@ChristosKaskouras Жыл бұрын
@@statquest Finally I managed to open the TensorBoard when I ran Anaconda as administrator.
@ChristosKaskouras Жыл бұрын
@@statquest I am trying to modify your code to adapt it to my problem. I need to design a NN which will take a time series as input and will predict another time series, which means that for each input value there should be an output value (I have data for both to do the training). But when I change the inputs and labels, it seems that it works, yet all the predictions are 'tensor([0.])'. What I am doing is that I set as inputs a nested list of my input data and as labels a list of outputs. Is that something you can help me with?
@statquest Жыл бұрын
@@ChristosKaskouras The way I debug stuff like this is that I add a bunch of print statements to the "forward" method and call it directly with some data and make sure everything is working as anticipated.
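For example, something as simple as this can reveal a lot (it assumes the LSTMbyHand class from the video has already been defined, and reuses the example call that appears later in these comments):

```python
import torch

model = LSTMbyHand()  # or LightningLSTM(); assumes the class from the video is defined
# call forward() directly with known data and inspect what comes out
prediction = model(torch.tensor([0., 0.5, 0.25, 1.])).detach()
print("prediction for Company A:", prediction)
# adding temporary print() statements inside forward() for the long- and short-term
# memories after each day then shows exactly where the values stop making sense
```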
@z4br4k98 Жыл бұрын
Would it not be better to initialize the parameters using Xavier? Otherwise we might introduce vanishing gradients
@statquest Жыл бұрын
Perhaps. However, things work as-is in this example.
@jonaskarlsson590113 күн бұрын
DOUBLE TRIPLE MEGA ULTRA SUPER BAAAAAM!!!!
@statquest13 күн бұрын
:)
@michaeldouglas7641 Жыл бұрын
When will we hear from the GOAT ML/DS educator on the topics of 1) transformers, 2) GNNs and 3) VAEs (in descending order of importance)...?
@statquest Жыл бұрын
My video on transformers is currently available for early access to channel members and patreon supporters.
@ethansmith7608 Жыл бұрын
can you make a video breaking down what constitutes a BAM, and what characteristics can qualify said BAM as DOUBLE or even TRIPLE BAM?
@statquest Жыл бұрын
Sure! Here it is: kzbin.info/www/bejne/n2XMhqmgqKx2g8U
@ethansmith7608 Жыл бұрын
@@statquest legend!
@namyashah8 ай бұрын
In this exact same code, what changes would I have to make if I want a bidirectional LSTM and I want to predict more than 2 classes?
@statquest8 ай бұрын
If you want to predict more than 2 classes, you can run the output through a fully connected layer and then through a softmax layer.
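A minimal sketch of that idea; the layer sizes and the 3-class setup here are made up for illustration, not taken from the video:

```python
import torch
import torch.nn as nn

num_classes = 3
lstm = nn.LSTM(input_size=1, hidden_size=8)             # 8 output values per time step
fc = nn.Linear(in_features=8, out_features=num_classes) # fully connected layer on top

four_days = torch.tensor([[0.], [0.5], [0.25], [1.]])   # (seq_len=4, input_size=1)
lstm_out, _ = lstm(four_days)
logits = fc(lstm_out[-1])                               # use the output from the last time step
probabilities = torch.softmax(logits, dim=0)
print(probabilities)                                    # one probability per class
```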
@suzhenkang Жыл бұрын
Could you make a transformer video? Can't wait to see it.
@statquest Жыл бұрын
I'm working on it.
@suzhenkang Жыл бұрын
@@statquest Cool cant wait
@suzhenkang Жыл бұрын
@@statquest And GAN
@suzhenkang Жыл бұрын
@@statquest More complicated, your explanation is better.
@x11y22z33me Жыл бұрын
Hi Josh. I have a question about why these predictions work so well. I have seen your LSTM video as well, but don't have an intuitive feel for this. For example, company A goes up from 0 to 1 in 4 steps. Why would a model expect it to come down to zero the next day? If I were to guess, I would expect the value to remain high for day 5, and even if it reduces maybe go to 0.5 from 1 since the previous jump down was from 0.5 to 0.25.
@statquest Жыл бұрын
The model was trained specifically with this data, so it's just replaying what it was trained on.
@x11y22z33me Жыл бұрын
@@statquest Oh okay, makes sense. Thanks for your reply, and thanks for all you do.
@azmyin Жыл бұрын
Dr. Starmer, I manually wrote the code following your tutorial, but when I get to 19:50 I am getting the "grad can be implicitly created only for scalar outputs" error and it's stopping the training process. I have PyTorch with CUDA 12.1 support and the latest version of Lightning installed.
@statquest Жыл бұрын
Please download and try the code that I wrote first.
@macknightxu2199 Жыл бұрын
Hi, will there be new videos in this series of NN? BR
@statquest Жыл бұрын
Yes, a lot more.
@jehannemottier76972 ай бұрын
Hi! Do I need a GPU available, or is a CPU sufficient?
@statquest2 ай бұрын
This tutorial runs just fine on a CPU.
@asjadnabeel Жыл бұрын
BAMs!! Has a StatQuest video on transformers been uploaded? I couldn't locate one in the playlists. Waiting for it...
@statquest Жыл бұрын
Not yet. I'm working on it.
@asjadnabeel Жыл бұрын
@@statquest Ok .. Thanks brother..
@cheynin Жыл бұрын
what "temp" using for in line "lstm_out, temp = self.lstm(input_trans)"?
@statquest Жыл бұрын
It's a tuple that contains the final hidden state (short-term memory) and the final cell state (long-term memory). In other words, we could have used the first part of that tuple instead of lstm_out[-1] if we wanted to.
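A small sketch showing that equivalence; this is standard nn.LSTM behavior for a single-layer, unidirectional LSTM, with the day values from the video:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=1, hidden_size=1)
input_trans = torch.tensor([[0.], [0.5], [0.25], [1.]])  # (seq_len=4, input_size=1)

lstm_out, (h_n, c_n) = lstm(input_trans)  # "temp" is really the tuple (h_n, c_n)
print(lstm_out[-1])  # the output at the last time step...
print(h_n[-1])       # ...is the same value as the final hidden state (short-term memory)
print(c_n[-1])       # the final cell state (long-term memory)
```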
@SaschaRobitzki10 ай бұрын
Why is the LSTMbyHand's training_step not using the batch_idx?
@statquest10 ай бұрын
Because we don't need to know the index.
@SaschaRobitzki10 ай бұрын
@@statquest I was just wondering because you specifically mentioned batch_idx in the video, so I thought you actually had planned making use of it.
@statquest10 ай бұрын
@@SaschaRobitzki I mention it because you need to include it, not because you need to use it.
@jingwentang676811 ай бұрын
Thank you for making the video. Does anyone know where I can download the Jupyter notebook? (I did not find it at the given link.)
@statquest11 ай бұрын
Sorry about that. Everything just recently changed and I need to update things today. For now, you can find the code here: github.com/StatQuest/pytorch_lightning_tutorials/blob/main/README.md
It is amazing how you promptly replied to me. Thank you so much!@@statquest
@ptcita16 Жыл бұрын
Amazing video, thanks Josh! I wanted to work with the TensorBoard but I keep getting an error :s It does not generate any URL. Does that ever happen to you? Thanks in advance!
@statquest Жыл бұрын
I'm sorry you're having trouble... Are you using my notebook or have you written your own code? Have you correctly navigated to where the "lightning_logs" directory is before running tensorboard?
@ptcita16 Жыл бұрын
@@statquest Thanks for replying! I was using your code and made sure I was in the folder that has "lightning_logs". I was searching and it seems to be related to a TensorFlow issue on my machine. Is there a specific version I must have for it to work?
@statquest Жыл бұрын
@@ptcita16 That I don't know, but you can try to update tensorboard with "pip install tensorboard --upgrade"
@statquest Жыл бұрын
@@ptcita16 Also, can you give me the command line that you are using to get tensorboard running?
@statquest Жыл бұрын
@@ptcita16 Is it possible that you don't have TensorBoard installed to begin with? This seems strange, but it might be the case. You can type "which tensorboard" on the command line to find out.
@BruceHartpence Жыл бұрын
Nice video as always, Josh. I have a quick question: the example seems to break the dimensions of nn.LSTM ("input must have 3 dimensions, got 2"), and the expected shape is [batch_size, seq_len, nb_features]. Any thoughts?
@statquest Жыл бұрын
Are you using my jupyter notebook or your own code?
@BruceHartpence Жыл бұрын
@@statquest Good morning! I am using your code but in a standard Python39 Windows install with the latest torch and lightning. As one might expect, LSTMbyHand works fine without the call to nn.LSTM. I was just puzzling through creating a single example as in your model(torch.tensor([0.,.5,.25,1.])).detach(). Training has a similar problem.
@statquest Жыл бұрын
@@BruceHartpence The code you copied and pasted looks different from mine, which is model(torch.tensor([0., 0.5, 0.25, 1.])).detach() (where I explicitly add the 0's before each decimal point.) I know that's not the problem with your code, but it suggests that you are writing your own rather than using my jupyter notebook. Is that correct?
@BruceHartpence Жыл бұрын
@@statquest Well, it's your code from the video. If I typed something wrong I am going to feel silly all day. I'll check later today.
@statquest Жыл бұрын
@@BruceHartpence Let me know - I use a mac and it's possible there are differences that need to be worked out between OSs (I hope not).
@tribuiduonguc7788 Жыл бұрын
Is it true that for the number of output time steps (e.g., if I want to predict values for the next 5 days), we need to specify the corresponding hidden size, as the video mentioned (hidden size = 5)?
@statquest Жыл бұрын
If you want to predict a value for the next 5 days, you just need to unroll the LSTM 5 times. This is different from wanting 5 outputs per day, which is what the hidden size parameter determines.
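To make the distinction concrete, here is a quick sketch of what hidden_size actually controls; this is plain nn.LSTM behavior, not code from the video:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=1, hidden_size=5)             # 5 output values per time step
four_days = torch.tensor([[0.], [0.5], [0.25], [1.]])   # (seq_len=4, input_size=1)
lstm_out, _ = lstm(four_days)
print(lstm_out.shape)  # torch.Size([4, 5]) -> 5 outputs for each day, not 5 future days
# to forecast 5 future days with hidden_size=1, you would instead feed each new
# prediction back in and unroll the LSTM 5 more times
```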
@tribuiduonguc7788 Жыл бұрын
@@statquest thank you for your rapid response
@Shahawir Жыл бұрын
Hello. Suppose you have panel data, and an imbalanced panel at that - like the companies you have, but with 500 companies whose data are not equal in length (some have 11 data points, some have 20, some have 34). Will an LSTM be good here? And what if you want to incorporate some other categorical variables - does an LSTM allow this? I need help with this. If anyone knows any resources, keywords, or anything that can help me solve this problem, please do not hesitate to comment. Thanks in advance!
@statquest Жыл бұрын
The whole idea of LSTMs (and all Recurrent Neural Networks) is to be able to work with different amounts of data associated with each sample or company or whatever it is you are using. If you have categorical data, then you probably need to one-hot-encode the data. For details, see: kzbin.info/www/bejne/a2mcn3Z9mrx6Z9k
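For the categorical variables, a minimal one-hot-encoding sketch; the category names and ids here are hypothetical:

```python
import torch
import torch.nn.functional as F

# suppose a categorical feature has 3 levels: "retail"=0, "tech"=1, "energy"=2
category_ids = torch.tensor([0, 2, 1, 1])              # one value per time step
one_hot = F.one_hot(category_ids, num_classes=3).float()
print(one_hot)
# these columns can then be concatenated with the numeric features
# before being passed to the LSTM (so input_size grows by 3)
```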
@Shahawir Жыл бұрын
@@statquest Thanks a lot for taking the time to answer my question. You saved me, literally…🤝🤝
@statquest Жыл бұрын
@@Shahawir bam!
@prashlovessamosa Жыл бұрын
Hey man, can you update your playlists? Some of your latest videos aren't in the playlists.
@statquest Жыл бұрын
I'll do that today.
@ZealotfeathersGorgonoth Жыл бұрын
I've been trying to get TensorBoard to work in VSCode and am having trouble. Does this setup not work for newer versions of Python?
@statquest Жыл бұрын
Hmmm.... It should work, even with newer versions of Python.
@ZealotfeathersGorgonoth Жыл бұрын
@@statquest I got it to work! I am new to the data science / machine learning world and I was wondering if you would make a video about your coding setup: what IDEs you recommend, how to set up a productive environment, and the choice between using Jupyter Notebooks, VSCode, etc. It would be really interesting to see how you do it!
@statquest Жыл бұрын
@@ZealotfeathersGorgonoth Awesome! BAM! I'll keep that in mind for a topic. Most people I know use VSCode (I use notebooks since I think they are good for teaching and learning.) But there are a ton of other things that go into a good environment.
@luizcarlosazevedo9558 Жыл бұрын
I could only import Lightning with pytorch_lightning; is it the same as the lightning package?
@statquest Жыл бұрын
Hmm... Try it and see if it works. However, I suspect it is different. You may need to update Lightning and PyTorch Lightning.
@Zoro_Onigiri03 Жыл бұрын
Hello StatQuest, the StatQuest website seems to be down. It says "error establishing data connection".
@statquest Жыл бұрын
Yep. It went down last night. It's back up.
@Zoro_Onigiri03 Жыл бұрын
@@statquest Thanks 👍
@NguyễnTiệp-c8c3 ай бұрын
"Why don't we see the stage of forget?"
@statquest3 ай бұрын
That's the first stage. I call it the "% Long Term to Remember" because that is what that section does - it calculates the % of the long term memory that should be remembered. Why people call it a "forget" stage is a mystery. Percentage to forget = 1 - percentage to remember, so forgetting and remembering are two different things, and in that stage we remember.
@Derelicty5 ай бұрын
Hey!! I'm doing a forecast prediction. My dataset's timestamps were collected every 30 seconds, and the last line of the data is the "now", but how could I use it to predict the "now + 1"? I searched for something like a lookahead but I couldn't figure it out. So, is it possible to train the model to always predict the next line? If so, I would love to read your explanation ❤
@Derelicty5 ай бұрын
Oh, the expected behavior when I dump this model could be to only retrieve the next value (now + 1). Is the video already doing that? If so, how could I feed in a full dataset and return the prediction for each timestamp, so I could plot it (train and test, predicted and real)? Would it be a loop through each timestamp: predict the next value, append it to the predictions, and when the next iteration starts, use the real value of the last predicted timestamp (which was unknown in the previous iteration) to predict the next, and so on?
@statquest5 ай бұрын
I'm not sure I fully understand your problem, however, typically you decide how many time stamps you want to use to make predictions, then separate your data into training and testing datasets, then train the model and then test it.
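In case it helps, here is a generic sliding-window sketch of that idea with made-up numbers (not code from the video): decide how many past time stamps feed each prediction, build (input, label) pairs, and split them chronologically into training and testing sets.

```python
import torch

# hypothetical series sampled every 30 seconds; use 4 past values to predict the next one
series = torch.tensor([0.10, 0.12, 0.11, 0.15, 0.14, 0.18, 0.17, 0.20, 0.22, 0.21])
window = 4

inputs, labels = [], []
for i in range(len(series) - window):
    inputs.append(series[i : i + window])  # "now - 3" ... "now"
    labels.append(series[i + window])      # "now + 1"

inputs = torch.stack(inputs)
labels = torch.stack(labels)

# simple chronological split: earlier windows for training, later ones for testing
split = int(0.8 * len(inputs))
train_x, test_x = inputs[:split], inputs[split:]
train_y, test_y = labels[:split], labels[split:]
print(train_x.shape, test_x.shape)
```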
@Derelicty5 ай бұрын
@@statquest I'm sorry, I was traveling at the time. I wanted to ask if your implementation only returned the prediction of the next value, but I saw that you decided to take predictions[-1]. I already have an LSTM implemented, and I thought your teaching was very good! Thanks for your answer!
@beegdigit98112 ай бұрын
You should note that what you're doing here is overfitting the model to the training data, so that people don't get lost.
@statquest2 ай бұрын
Ok.
@kimjong-un4521Ай бұрын
I can't download your code
@statquestАй бұрын
Are you unable to make an account on Lightning AI?
@kimjong-un4521Ай бұрын
@@statquest I downloaded it. Thank you.
@IvarGarnes-o3o10 ай бұрын
Hi Josh, I have now watched your 20+ videos on NNs and I have learned a lot. Thanks a lot for a very good setup!!! As I am new to this, there are still many things I do not understand or cannot figure out myself. So I will ask you: when I use your nn.LSTM model on the 'stock' data and print the daily LSTM output after 300 epochs, I see that the output values are very different from the input data, i.e. [0, 0.5, 0.25, 1], [1.0, 0.5, 0.25, 1.0]. The output I get for the same days is: Epoch 299: 0%| | 0/2 [00:00
@statquest10 ай бұрын
The goal is to only predict what happens on day 5, so that is the only value we use in the loss function.
@LuizHenrique-qr3lt Жыл бұрын
Hey Josh send a "salve" tô the cível people ouro squad ia called darthcivel. Great vídeo like everyone else, congratulations!
@statquest Жыл бұрын
Thank you! However, I can't quite make out what your comment is. What is a "salve"?
@LuizHenrique-qr3lt Жыл бұрын
@@statquest "Salve" is like a greeting in Brazil, when you arrive somewhere with other people you say "salve" it's like "hi, how are you?"