I don't usually comment on videos, but this series deserves a big THANK YOU; you deserve far more subscribers for your good content. Please keep it up. The videos are great and concise, good job :).
@HeatonResearch 5 years ago
Thanks! Glad they are helpful.
@rchuso 5 years ago
Thanks again. Looking forward to the next one. I'm getting two separate "minimum" values from my early stopping, and I'm taking the value halfway between them. I'm also following your earlier work and using sklearn to train 5 times, each time on a different 80% of the data, then taking an average value, slightly weighted toward the higher numbers, for training on the entire dataset.
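A minimal sketch of that kind of loop, assuming x and y are NumPy arrays of features and targets, and build_model is a hypothetical helper that returns a compiled Keras regression network:

import numpy as np
from sklearn.model_selection import KFold
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.callbacks import EarlyStopping

def build_model(input_dim):
    # Hypothetical small regression network
    model = Sequential()
    model.add(Dense(25, activation='relu', input_dim=input_dim))
    model.add(Dense(10, activation='relu'))
    model.add(Dense(1))
    model.compile(loss='mean_squared_error', optimizer='adam')
    return model

kf = KFold(n_splits=5, shuffle=True, random_state=42)
fold_scores = []
for train_idx, val_idx in kf.split(x):
    model = build_model(x.shape[1])
    monitor = EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True)
    history = model.fit(x[train_idx], y[train_idx],
                        validation_data=(x[val_idx], y[val_idx]),
                        callbacks=[monitor], epochs=1000, verbose=0)
    # Best validation loss reached in this fold
    fold_scores.append(min(history.history['val_loss']))

# Plain mean of the fold scores; any weighting toward particular folds is a separate choice
print('mean validation loss:', np.mean(fold_scores))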
@HeatonResearch 5 years ago
Very good advice, thanks for sharing.
@DanielWeikert 5 years ago
Well explained. Thank you so much for your videos Jeff. Highly appreciated!
@miguelchevres4669 4 years ago
Hey Prof. Jeff! I wish I could attend your lectures at Washington University! You are a great professor and patient with your explanations. Thank you!
@Diamond_Hanz 5 years ago
Thanks again, sir!
@yepnah3514 4 years ago
Good tutorial. Question: after training and saving a model, is there a way to multiply the trained weights before loading them into a new model, so I can see how it performs without training again? For example: (trained weights) * 0.95
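A minimal sketch of that idea using Keras get_weights/set_weights; the filename and the x array here are assumptions, not taken from the video:

from tensorflow.keras.models import load_model

model = load_model('my_model.h5')  # hypothetical filename of the saved model

# Multiply every weight matrix and bias vector by 0.95
scaled_weights = [w * 0.95 for w in model.get_weights()]
model.set_weights(scaled_weights)

# Predict with the scaled weights, no retraining needed
pred = model.predict(x)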
@chaitanyaharde6238 5 years ago
Thank you so much, Sir.
@dutchy5752 3 years ago
Hi, if I knew nothing about the model setup and just loaded the model, could I also extract the feature setup, in this case x? Is that also stored in the h5 file?
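As far as I know, the h5 file stores the architecture and weights, so you can recover the expected input shape, but not the original feature (column) names. A quick sketch, with a hypothetical filename:

from tensorflow.keras.models import load_model

model = load_model('my_model.h5')  # hypothetical filename
model.summary()                    # layer-by-layer architecture and parameter counts
print(model.input_shape)           # e.g. (None, 7) means the model expects 7 input features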
@gurudaki 5 years ago
Great tutorials, Mr. Jeff. One question... why don't you scale the x (input) values?
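For anyone who does want to scale the inputs, a minimal sketch with scikit-learn's StandardScaler (x is the assumed feature array):

from sklearn.preprocessing import StandardScaler

scaler = StandardScaler()
x_scaled = scaler.fit_transform(x)  # zero mean, unit variance per column

# Train on x_scaled instead of x, and apply scaler.transform() to any
# new data before calling model.predict() on it.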
@ronaldssebadduka6837 3 years ago
Thank you for this video. Now, how about loading the model and using a different dataset, not the same one? I want to test it on a different dataset.
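A minimal sketch of loading a saved model and predicting on a new dataset; the filenames and column names here are hypothetical:

import pandas as pd
from tensorflow.keras.models import load_model

model = load_model('my_model.h5')      # hypothetical filename of the saved model

# The new dataset must provide the same features, in the same order and
# encoding, that the model was trained on.
df_new = pd.read_csv('new_data.csv')   # hypothetical file
x_new = df_new[['feature1', 'feature2']].values  # hypothetical column names
pred = model.predict(x_new)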
@btdtech2449 5 years ago
Hi Jeff, thank you for the great content! I do have a question about the dataset, though. I thought it was bad practice to predict on the same dataset you train the model with. Perhaps the code is written like that for simplicity's sake, but in real life, should the dataset be split into train_X, train_Y, test_X, and test_Y using the 80/20 rule you mentioned in section 2.2? Thanks again!
@HeatonResearch 5 years ago
Yes, it is. However, this is a VERY simple example of how to save a neural network, load it again, and prove that it was done correctly. Everything else I left out for simplicity.
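For reference, a typical 80/20 split with scikit-learn looks roughly like this (a sketch; x and y are the feature and target arrays, and model is an already compiled Keras network):

from sklearn.model_selection import train_test_split

# Hold out 20% of the data for testing, train on the remaining 80%
x_train, x_test, y_train, y_test = train_test_split(
    x, y, test_size=0.20, random_state=42)

model.fit(x_train, y_train, validation_data=(x_test, y_test),
          verbose=2, epochs=100)
pred = model.predict(x_test)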
@gauravbhandup 5 years ago
Hi Jeff, does your kaggl-util package work perfectly?
@HeatonResearch 5 years ago
It works well; I would never claim anything to be perfect. :)