Optimizing Neural Network Structures with Keras-Tuner

96,868 views

sentdex

Comments
@bhaskersriharshasuri7359
@bhaskersriharshasuri7359 5 жыл бұрын
"A paper bag can solve MNIST ". That should be a quote on a T-shirt.
@AD-bz2ci
@AD-bz2ci 5 жыл бұрын
I would buy it. Please make.
@sparshgupta2931
@sparshgupta2931 4 жыл бұрын
What does that mean??
@asdfasdfuhf
@asdfasdfuhf 4 жыл бұрын
@@sparshgupta2931 It basically means that anything can be trained to recognize the handwritten digits in the dataset named MNIST: www.google.com/search?q=what+is+mnist&oq=what+is+mnist+&aqs=chrome..69i57.2387j0j7&sourceid=chrome&ie=UTF-8
@Quitoss
@Quitoss 3 жыл бұрын
I’m a paper bag
@deojeetsarkar2006
@deojeetsarkar2006 5 жыл бұрын
Thanks for everything sentdex, your name'll always find a folder in my PC.
@JordanMetroidManiac
@JordanMetroidManiac 3 жыл бұрын
26:30 The number of possible hyperparameter combinations with that search space is 8^2 + 8^3 + 8^4 + 8^5 = 37440. So, of course, a random search on that could take up to 37440 trials to find the best possible combination of hyperparameters. There are usually subsets of combinations that are "alike" and would achieve similar performance, so you wouldn't need to set max_trials = 37440, but more like max_trials = 100.
@abrahamowos
@abrahamowos 2 жыл бұрын
I actually came to the comment section to find this. Thank you for posting this.
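The arithmetic in the thread above checks out, and the count can be reproduced in a line of Python. The RandomSearch call shown in the comment is illustrative only (argument names from keras-tuner, values assumed), not the exact code from the video:

    # 2 to 5 layers, 8 possible unit counts per layer:
    n_combinations = sum(8 ** n_layers for n_layers in range(2, 6))
    print(n_combinations)  # 37440

    # In practice a far smaller budget is typical, e.g.:
    # tuner = RandomSearch(build_model, objective="val_accuracy", max_trials=100, ...)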
@hassenzarroug9159
@hassenzarroug9159 4 жыл бұрын
Seriously man, you are the reason why I love machine learning! You make it look easy and fun, which is the exact opposite of what my teachers are doing! Thank you so much and God bless you!
@ajaysingh8887
@ajaysingh8887 5 жыл бұрын
Finally, this is what I was looking for.
@Accarvd
@Accarvd 4 жыл бұрын
Probably one of the best YouTube videos (on this topic)
@riadhgharbi7985
@riadhgharbi7985 4 жыл бұрын
keep up the likes and comments lads, we need more of his content, support our guy here xD
@Neogohan1
@Neogohan1 4 жыл бұрын
Both Kite AND Keras Tuner were things I've been wanting for awhile as part of learning TF, and you managed to knock em both out in the one vid. Very useful stuff! Thanks!
@beyhan9191
@beyhan9191 5 жыл бұрын
Zero dislikes! You're doing great things
@bhuvaneshs.k638
@bhuvaneshs.k638 5 жыл бұрын
Thanks for this... very helpful... you're the guy for machine learning in Python. Thanks!
@TheRedProject
@TheRedProject 4 жыл бұрын
I started using Kite a month ago. I love it.
@lilprotakeit
@lilprotakeit 5 жыл бұрын
Hi Sir, Your videos are the reason why i am continuing and surviving as a data engineer. I would be grateful if you can create a series on Apache Airflow as its a heavily used framework for data engineering. Please do consider.
@_nttai
@_nttai 5 жыл бұрын
I'm glad I found your channel
@fcolecumberri
@fcolecumberri 4 жыл бұрын
You should add this to your keras tutorial playlist. Thanks for this and for that tutorial
@Evan_242
@Evan_242 4 жыл бұрын
This Kite thing looks awesome, I will definitely check it out. Thanks Harrison, hope you're doing well :)
@Evan_242
@Evan_242 4 жыл бұрын
I downloaded it, it's awesome! :)
@taylormcclenny1416
@taylormcclenny1416 5 жыл бұрын
Doing God's work, my friend!
@fuba44
@fuba44 5 жыл бұрын
This was great! I will go play with it right now. thank you!
@jumpthecagemma4987
@jumpthecagemma4987 5 жыл бұрын
Last comment - this tuner only works if you call Keras directly through TensorFlow, e.g. tf.keras.layers.Dense(...). Calling standalone Keras on its own (e.g. keras.layers.Dense(...)) gives an error about compiling a model. Hope this helps
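A minimal sketch of the point above, assuming a build_model written against tf.keras; the layer sizes are placeholders and the import path for RandomSearch varies by keras-tuner version:

    import tensorflow as tf
    from kerastuner.tuners import RandomSearch  # newer versions: import keras_tuner as kt

    def build_model(hp):
        # Everything comes from tf.keras, not the standalone keras package,
        # so the tuner and the model use the same Keras implementation.
        model = tf.keras.models.Sequential([
            tf.keras.layers.Flatten(input_shape=(28, 28)),
            tf.keras.layers.Dense(hp.Int("units", 32, 256, step=32), activation="relu"),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model  # the tuner expects a single compiled tf.keras model back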
@GauravKumar-ch3xn
@GauravKumar-ch3xn 4 жыл бұрын
There is default hyper-parameter tuning available in TensorFlow that does the same thing, with some pretty visualization as well when attached to TensorBoard. What would be interesting to see is whether any of these packages also apply Bayesian optimization; that would be nicer.
@programerahmed4470
@programerahmed4470 2 жыл бұрын
Great video: How can I force Keras Tuner to use default hyperparameter values for the first optimization iteration
@MrLiquimatter
@MrLiquimatter 5 жыл бұрын
sold on Kite!
@usamatahir7091
@usamatahir7091 4 жыл бұрын
I love you Harrison!
@amirmasoudkiakojouri6655
@amirmasoudkiakojouri6655 3 жыл бұрын
Thank you for your clear description. I have problems with the kerastuner installation and with importing it for tuning. Would you please let me know how to install it? When I try to install kerastuner in the terminal, I see an error like the one below:
ERROR: Could not find a version that satisfies the requirement kerastuner (from versions: none)
ERROR: No matching distribution found for kerastuner
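For anyone hitting the same error: it often just means pip was given a package name that doesn't exist on PyPI; the published name has a hyphen. A sketch of what usually works (the import name depends on which version you end up with):

    # In a terminal / shell, not inside Python:
    #   pip install keras-tuner
    #
    # Then, depending on the installed version, one of these imports applies:
    import keras_tuner as kt                      # recent releases
    # from kerastuner.tuners import RandomSearch  # older releases, as used in the video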
@kacperkubicki1101
@kacperkubicki1101 5 жыл бұрын
Woah, first time my uni classes were faster to teach me something than sentdex. I might reconsider my lack of faith in their purpose ;) have you tried talos for hyperparameters optimization? We've been using it during classes and tbh it seems nicer to me than keras tuner.
@sentdex
@sentdex 5 жыл бұрын
Nice, I'll check out Talos.
@mrfizzybubbs3909
@mrfizzybubbs3909 5 жыл бұрын
​@@sentdex It might also be worthwhile to also check out the hyperopt library.
@Manu-jc2sx
@Manu-jc2sx 3 жыл бұрын
What optimization method is the best one? There are many, like keras tuner, Hyperopt, Talos etc..
@neatpolygons8500
@neatpolygons8500 4 жыл бұрын
oh yeah, Kite. It's fricking genius and I use it with vim
@nileshmishra3796
@nileshmishra3796 5 жыл бұрын
Awesome man, you never disappoint :)
@kerolesmonsef4179
@kerolesmonsef4179 5 жыл бұрын
you are great . Thank you
@meandkg
@meandkg 3 жыл бұрын
What about cross validation? Does it support optimizing for the average score of say 5 fold cross validation? Or does it just optimize on one fold?
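Out of the box the tuner scores each trial on the single validation split you give it, not on an average over folds. One workaround (a manual sketch, not a built-in keras-tuner feature) is to let the tuner rank candidates on one split and then cross-validate the top few hyperparameter sets yourself; this assumes `tuner` has already run search() and `build_model` is the function that was handed to it:

    import numpy as np
    from sklearn.model_selection import KFold

    for hp in tuner.get_best_hyperparameters(num_trials=3):
        fold_scores = []
        for train_idx, val_idx in KFold(n_splits=5, shuffle=True).split(x_train):
            model = build_model(hp)  # fresh weights for every fold
            model.fit(x_train[train_idx], y_train[train_idx], epochs=3, verbose=0)
            _, acc = model.evaluate(x_train[val_idx], y_train[val_idx], verbose=0)
            fold_scores.append(acc)
        print(hp.values, "mean val accuracy:", np.mean(fold_scores))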
@sankamadushan7940
@sankamadushan7940 5 жыл бұрын
Good job Sentdex. This is great. Saves a lot of time.
@rchuso
@rchuso 3 жыл бұрын
I've been using Bayesian-Optimization, and this looks a lot like that.
@eranfeit
@eranfeit 2 жыл бұрын
Thank you for great video
@moniquemarinslp
@moniquemarinslp 5 жыл бұрын
Great stuff! Thumbs up for the tutorial and Kite (also quite cool)!
@ankitganeshpurkar
@ankitganeshpurkar 3 жыл бұрын
Hi sir, this tutorial is simple and effective. I have a query: when I apply this random search the code runs well, but the number of layers reported by the search and the number of layers actually in the model are different, and the two numbers don't tally most of the time. For example, the number of layers chosen for a model is 7 but the total layers shown is 18. What could be the problem?
@interpro
@interpro 4 жыл бұрын
Great tutorial! Thanks much!
@RojinaPanta1
@RojinaPanta1 3 жыл бұрын
How can we carry out the search on a dataset that is fed with train_on_batch?
@kaustubhkulkarni
@kaustubhkulkarni 4 жыл бұрын
How do we save and checkpoint the kerastuner random models?
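The tuner already checkpoints as it goes: every trial's weights and hyperparameters are written under the directory/project_name passed to the constructor, and the best models can be pulled back out of those checkpoints. A sketch with placeholder argument values rather than the video's exact ones:

    from kerastuner.tuners import RandomSearch  # or: import keras_tuner as kt

    tuner = RandomSearch(
        build_model,
        objective="val_accuracy",
        max_trials=20,
        directory="tuner_logs",      # trial checkpoints and metadata land here
        project_name="mnist_search",
    )
    tuner.search(x_train, y_train, epochs=3, validation_data=(x_test, y_test))

    best_model = tuner.get_best_models(num_models=1)[0]  # reloaded from the trial checkpoints
    best_model.save("best_model.h5")                     # optional standalone copy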
@alberto.polini
@alberto.polini 5 жыл бұрын
Thank you sentdex i love your videos
@jorgeespinoza3938
@jorgeespinoza3938 5 жыл бұрын
Pharmaceutical companies should be dreaming of having an actual physical tuner for their compounds, although I believe their testing takes a bit more than just 19 seconds.
@maliksalman1907
@maliksalman1907 2 жыл бұрын
Sir, I need to ask you about the firefly algorithm to optimize CNN model.
@oliverpolden
@oliverpolden 4 жыл бұрын
How does keras-tuner compare with Tensorboard's hparams? Seems hparams would be better for analysis within Tensorboard?
@felixmuller9062
@felixmuller9062 2 жыл бұрын
First of all, thank you very much for this amazing video. Helped me a lot! I still have a question. Is it possible to give the Choice function "none" as a value? I'm aiming for an HP optimization where I want to try different regularizers. One option should be that I don't use any regularizer at all. Is this possible with keras_tuner?
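hp.Choice only accepts int, float, str or bool values, so the usual trick is to choose a string and map it to a regularizer (or to None) inside build_model. A sketch under that assumption; the 1e-4 factors and layer sizes are placeholders:

    import tensorflow as tf
    from tensorflow.keras import regularizers

    def build_model(hp):
        # "none" stands in for the missing None option.
        reg_name = hp.Choice("dense_regularizer", ["none", "l1", "l2"])
        reg = {"none": None,
               "l1": regularizers.l1(1e-4),
               "l2": regularizers.l2(1e-4)}[reg_name]
        model = tf.keras.Sequential([
            tf.keras.layers.Flatten(input_shape=(28, 28)),
            tf.keras.layers.Dense(64, activation="relu", kernel_regularizer=reg),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])
        model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model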
@angelazhang9082
@angelazhang9082 2 жыл бұрын
Thanks for the thorough video. I've been trying to figure out a way to find batch_size that the tuner found the best results with, but I've been unsuccessful. Can you comment on that? I watched your video several times and don't think you mentioned anything about batch size, which is a very common parameter to test with. I looked up several articles and haven't found any information on that either. I also haven't found any information on how to add batch size as a parameter for the tuner. So the only thing I can think of is to run the tuner multiple times for the varying batch sizes, but I'm sure there's a better way.
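Newer keras-tuner releases (roughly 1.1 and later) let training arguments such as batch size be tuned by subclassing HyperModel and overriding fit; this is a sketch of that pattern, not something shown in the video:

    import tensorflow as tf
    import keras_tuner as kt

    class MyHyperModel(kt.HyperModel):
        def build(self, hp):
            model = tf.keras.Sequential([
                tf.keras.layers.Flatten(input_shape=(28, 28)),
                tf.keras.layers.Dense(hp.Int("units", 32, 256, step=32), activation="relu"),
                tf.keras.layers.Dense(10, activation="softmax"),
            ])
            model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                          metrics=["accuracy"])
            return model

        def fit(self, hp, model, *args, **kwargs):
            # batch_size becomes just another searched hyperparameter.
            return model.fit(*args,
                             batch_size=hp.Choice("batch_size", [16, 32, 64, 128]),
                             **kwargs)

    tuner = kt.RandomSearch(MyHyperModel(), objective="val_accuracy", max_trials=10)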
@meandkg
@meandkg 3 жыл бұрын
so.... Keras Tuner is better than writing for loops and testing manually? Can it get stuck in local optima?
@chaitanyasharma6270
@chaitanyasharma6270 3 жыл бұрын
Why did you remove max pooling? Is there a way to add some max pooling layers?
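You can also let the search decide: gate the pooling layer behind a boolean hyperparameter (hp.Boolean is available in recent keras-tuner versions). A sketch with placeholder conv settings, not the video's exact values:

    import tensorflow as tf

    def build_model(hp):
        model = tf.keras.Sequential()
        model.add(tf.keras.layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)))
        for i in range(hp.Int("n_conv_layers", 1, 3)):
            model.add(tf.keras.layers.Conv2D(hp.Int(f"conv_{i}_filters", 32, 128, step=32),
                                             (3, 3), activation="relu"))
            if hp.Boolean(f"pool_after_conv_{i}"):
                # Only some trials will include max pooling here.
                model.add(tf.keras.layers.MaxPooling2D((2, 2)))
        model.add(tf.keras.layers.Flatten())
        model.add(tf.keras.layers.Dense(10, activation="softmax"))
        model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model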
@wadyn95
@wadyn95 5 жыл бұрын
Dear Sentdex, could you introduce the TensorFlow Object Detection API? TF has updated to 2.0 and there is no fully working tutorial now... I got too many errors while trying to use that stuff.
@sentdex
@sentdex 5 жыл бұрын
Yeah I would like to revisit the object detection stuff, but other topics keep getting in the way :D ...one day...
@Yisi.voyager
@Yisi.voyager 4 жыл бұрын
Does the Keras tuner tell you how many layers are optimal?
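Not directly as a single number, but the winning trial's hyperparameters, including whatever layer-count parameter was registered, can be read back afterwards. A sketch, assuming the layer count was registered under a name like "n_layers":

    best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]
    print(best_hp.values)           # every chosen hyperparameter for the best trial
    print(best_hp.get("n_layers"))  # assumed name; use whatever was passed to hp.Int(...)

    # tuner.results_summary() also prints the top trials with their hyperparameter values.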
@siddheshwarimishra8042
@siddheshwarimishra8042 4 жыл бұрын
Respected sir, please tell me how to use a swarm optimization technique with a pre-trained model. Also, please suggest whether I can use multiple pre-trained networks with multiple nature-inspired optimization techniques for multiple inputs.
@Zifox20
@Zifox20 5 жыл бұрын
Interesting feature, thanks !
@iskrabesamrtna
@iskrabesamrtna 3 жыл бұрын
I still can't figure out how it is even possible to have -1 in reshaping while creating the x and y train and test sets.
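For what it's worth, the -1 is a NumPy reshape convention rather than anything Keras-specific: it means "infer this dimension from the total number of elements". For example:

    import numpy as np

    x = np.zeros((60000, 28, 28))
    x_flat = x.reshape(-1, 28 * 28)   # the -1 is inferred as 60000
    print(x_flat.shape)               # (60000, 784)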
@ggpopa1319
@ggpopa1319 5 жыл бұрын
But then why not use an optimizer like Adam or SGD to optimize the hyperparameters too?
@joeboyle7390
@joeboyle7390 5 жыл бұрын
Because evaluating the function (training an entire model) is incredibly computationally expensive compared to evaluating a single epoch, and many hyperparameters (layer counts, unit choices) are discrete, so there is no gradient to follow. TL;DR: it's too slow and the function is probably not convex!
@gouki1001
@gouki1001 4 жыл бұрын
Is it the norm to use Keras Tuner and Keras callbacks together to optimize, or are these two methods that don't need to be used with each other?
@sriadityab4794
@sriadityab4794 3 жыл бұрын
Can you tell me how to perform cross-validation/hyper parameter tuning for time series forecasting using LSTM?
@MultiNarutoGamer
@MultiNarutoGamer 4 жыл бұрын
@sentdex Is it possible to tell the model to try it with and without max pooling? Or with different activation functions?
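Yes on both counts: an optional layer can sit behind an hp.Boolean (see the max-pooling sketch a few comments up), and the activation can simply be an hp.Choice. A minimal sketch for the activation part, with placeholder layer sizes:

    import tensorflow as tf

    def build_model(hp):
        activation = hp.Choice("activation", ["relu", "tanh", "elu"])
        model = tf.keras.Sequential([
            tf.keras.layers.Flatten(input_shape=(28, 28)),
            tf.keras.layers.Dense(hp.Int("units", 32, 256, step=32), activation=activation),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])
        model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model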
@deepakkumarjoshi
@deepakkumarjoshi 4 жыл бұрын
Thanks for the great work. How do we plot the results to compare the actual and predicted datasets after using the tuner?
@rogervaldivia7033
@rogervaldivia7033 3 жыл бұрын
Thanks for the video! Do you know if its possible to optimize to cross validation error?
@jumpthecagemma4987
@jumpthecagemma4987 5 жыл бұрын
What playlist will this be added to?
@marmar321
@marmar321 3 жыл бұрын
I forgot to save the pickle file for my test. Is there any way to load the summary of a previous Keras Tuner run without the pickle? Thanks
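If the tuner was constructed with a directory and project_name, the run is already on disk and the pickle isn't needed: re-creating the tuner with the same arguments picks the old trials back up. A sketch with placeholder values:

    from kerastuner.tuners import RandomSearch  # or: import keras_tuner as kt

    tuner = RandomSearch(
        build_model,
        objective="val_accuracy",
        max_trials=20,
        directory="tuner_logs",       # must match the original run
        project_name="mnist_search",  # must match the original run
    )
    # With a matching directory/project_name the previous trials are reloaded from disk
    # (explicitly via tuner.reload() on some versions).
    tuner.results_summary()
    best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]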
@yoannrey5286
@yoannrey5286 3 жыл бұрын
Hello ! Thanks for the video :) One question, did you manage to use Keras-tuner with Tensorboard ?
@pushkarajpalnitkar1695
@pushkarajpalnitkar1695 4 жыл бұрын
Great video! Can anyone please suggest the number of epochs to use in the search? More specifically, does using more epochs help the search, or are a few epochs, say 1-3, sufficient for comparing model performance?
@FrostEnceladus
@FrostEnceladus 4 жыл бұрын
How do you know when you are using too many or too few neurons? And how do you work out the number of neurons per layer versus the number of layers needed? That's my problem.
@51nibbler
@51nibbler 2 жыл бұрын
Thanks for the good explanation
@mattb9823
@mattb9823 4 жыл бұрын
This is awesome. I've been learning ML for about a month, paid for a couple courses on Udemy but I seem to be learning more from your channel when trying to debug and optimize things. Quick question, is there any way to integrate TensorBoard with RandomSearch?
@oliverpolden
@oliverpolden 4 жыл бұрын
I have exactly this question. I'm just about to try but I assume you can just assign each hyperparameter to a variable and construct your Tensorboard name from those and of course remember to use the variables in your model definition. I don't see why that wouldn't work.
@nirbhay_raghav
@nirbhay_raghav 2 жыл бұрын
I believe tensorboard has a "what-if" option. You need to provide your model with data directories. It would not exactly be a random search but it is better than nothing. Check it out , you may find it useful.
@nmana9759
@nmana9759 4 жыл бұрын
Can this tuner be used for an RNN? Please answer, thank you.
@patrickduhirwenzivugira4729
@patrickduhirwenzivugira4729 3 жыл бұрын
Thank you for the great video. How can I also tune the optimizers (let's say ['Adam', 'RMSprop']) with dynamic learning rates? Many tutorials keep them fixed. Thank you.
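One way is to make the optimizer a string choice and the learning rate a log-scaled float, then build the optimizer inside build_model. A sketch, not the video's code; ranges and layer sizes are placeholders:

    import tensorflow as tf

    def build_model(hp):
        model = tf.keras.Sequential([
            tf.keras.layers.Flatten(input_shape=(28, 28)),
            tf.keras.layers.Dense(hp.Int("units", 32, 256, step=32), activation="relu"),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])
        # Both the optimizer type and its learning rate are searched.
        lr = hp.Float("learning_rate", 1e-4, 1e-2, sampling="log")
        opt_name = hp.Choice("optimizer", ["adam", "rmsprop"])
        optimizer = (tf.keras.optimizers.Adam(learning_rate=lr) if opt_name == "adam"
                     else tf.keras.optimizers.RMSprop(learning_rate=lr))
        model.compile(optimizer=optimizer, loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model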
@walisyed4625
@walisyed4625 5 жыл бұрын
Very useful, thanks
@Yourbitchiscrazy
@Yourbitchiscrazy 4 жыл бұрын
Can you use TensorBoard and Keras Tuner together, and if so, how?
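They do work together: whatever is passed as callbacks to tuner.search is forwarded to each trial's model.fit, so TensorBoard logging (and, by the same mechanism, early stopping) applies per trial. A sketch, assuming `tuner` and the arrays are the objects from the video:

    import tensorflow as tf

    callbacks = [
        tf.keras.callbacks.TensorBoard(log_dir="tb_logs/tuner"),
        tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=3),
    ]
    tuner.search(x_train, y_train,
                 epochs=10,
                 validation_data=(x_test, y_test),
                 callbacks=callbacks)
    # Afterwards, in a terminal: tensorboard --logdir tb_logs/tuner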
@jakaseptiadi1752
@jakaseptiadi1752 4 жыл бұрын
I'm thinking about changing keras optimizer algorithm during training. Is it possible in keras?
@guermouimawloud1782
@guermouimawloud1782 4 жыл бұрын
How can we define Dropout for each layer?
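Give each layer's dropout rate its own hyperparameter name inside the layer loop; a sketch with placeholder ranges:

    import tensorflow as tf

    def build_model(hp):
        model = tf.keras.Sequential()
        model.add(tf.keras.layers.Flatten(input_shape=(28, 28)))
        for i in range(hp.Int("n_layers", 1, 3)):
            model.add(tf.keras.layers.Dense(hp.Int(f"units_{i}", 32, 256, step=32),
                                            activation="relu"))
            # A separate dropout rate is searched for every layer index.
            model.add(tf.keras.layers.Dropout(hp.Float(f"dropout_{i}", 0.0, 0.5, step=0.1)))
        model.add(tf.keras.layers.Dense(10, activation="softmax"))
        model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model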
@shayekhbinislam
@shayekhbinislam 5 жыл бұрын
What is the best counterpart of keras tuner for pytorch?
@leonshamsschaal
@leonshamsschaal 5 жыл бұрын
@sentdex can we have a video on building a NN from scratch?
@sentdex
@sentdex 5 жыл бұрын
It's coming!
@nano7586
@nano7586 5 жыл бұрын
I ALWAYS wondered why there is no optimizer for hyperparameters. People work with neural networks and machine learning, but talk about "trial and error" when it comes to HYPER and not HYPO parameters. This always really confused me. It's basically like applying a neural network to the neural network. Sure, it takes a long time and is CPU/GPU expensive, but if needed you can run it overnight or even longer. But that also overfits your model to the validation data you are using for optimization, right? Anyways, thanks so much for sharing!
@1991kushagra
@1991kushagra 4 жыл бұрын
That was really an awesome video. Hats off. I have an additional doubt about this. What if we want to use cross validation together with random search? In scikit-learn we can do that with RandomizedSearchCV; is there any way to do it in Keras as well?
@alberro96
@alberro96 Жыл бұрын
How could I implement this with a CNN? I'm working with my own dataset and it seems like the Keras tuners don't like tf.data.Datasets yet. They're still expecting (x_train, y_train), (x_test, y_test). Is my thinking correct there? Essentially I'm loading my data using tf.keras.preprocessing.image_dataset_from_directory and would like to feed this into the tuner. How could I split my own data into (x_train, y_train), (x_test, y_test)?
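Recent keras-tuner versions forward the data arguments of tuner.search straight to model.fit, so a tf.data.Dataset can be passed instead of (x, y) arrays, and image_dataset_from_directory can do the train/validation split itself. A sketch; the directory path, image size and seed are placeholders:

    import tensorflow as tf

    train_ds = tf.keras.preprocessing.image_dataset_from_directory(
        "data/", validation_split=0.2, subset="training", seed=42,
        image_size=(180, 180), batch_size=32)
    val_ds = tf.keras.preprocessing.image_dataset_from_directory(
        "data/", validation_split=0.2, subset="validation", seed=42,
        image_size=(180, 180), batch_size=32)

    tuner.search(train_ds, validation_data=val_ds, epochs=5)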
@coder3652
@coder3652 2 жыл бұрын
Thanks for video
@Harriswilliam94
@Harriswilliam94 5 жыл бұрын
Can you change the random search objective to an f-test?
@francescaalfieri5187
@francescaalfieri5187 4 жыл бұрын
Thanks for this video!!! I have a question, is there a way to check the value assumed by the variable hp.Int("inputs_unit") in every step? I have already tried to use debug with no success.
@12mkamran
@12mkamran 5 жыл бұрын
Yesss. 😍😍😍
@mdashad439
@mdashad439 5 жыл бұрын
Best Python Tutorial ever very understandable.
@abhishek_ar97
@abhishek_ar97 5 жыл бұрын
GridSearchCV?
@tingyizhu3691
@tingyizhu3691 4 жыл бұрын
An R package has a plot_tune function to give a nice visualization of the tuning results. Does Python have a similar thing?
@gianlucavernia9444
@gianlucavernia9444 5 жыл бұрын
Hey Sentdex are you going to continue the quantum programming series or is it finished?
@paulzimmer914
@paulzimmer914 4 жыл бұрын
Every time this code is run it performs 1 trial with 1 set of parameters, right? And then all the trials are saved as pickle files.
@paulzimmer914
@paulzimmer914 4 жыл бұрын
Never mind, it seems that since my max_trials was set to 1 it would only do one test.
@cargouvu
@cargouvu 5 жыл бұрын
Hey guys... not sure where to post this. I hope someone from the community can help. How come the random seeds change when we change the number? Like, the dataset is different when we do randomseed ==5 and when we do randomseed==10.
@sentdex
@sentdex 5 жыл бұрын
Random number generators work off a seed. We can set that seed to get repeatable results with random.
@TheMaytschi
@TheMaytschi 3 жыл бұрын
Great video!! @sentdex or anyone else: I am using the tuner for RNN with stacked LSTM layers, but for some reason the tuner does not converge whereas if I try the same architecture during normal fitting, it converges. Any idea why this could happen?
@lakeguy65616
@lakeguy65616 4 жыл бұрын
Is there a tuner for pytorch? Thank you
@jm10oct
@jm10oct 3 жыл бұрын
WOW!!! that might have just made my project 3 months shorter!!!!
@riyabanerjee2656
@riyabanerjee2656 4 жыл бұрын
I get the error "RuntimeError: Model-building function did not return a valid Keras Model instance, found ". Any idea what I should do? I googled it, and this was written:
"If you want to return more than one Keras Model, you'll have to override Tuner or BaseTuner. In this case, I recommend overriding BaseTuner, since Tuner assumes a single Keras Model but BaseTuner works for any arbitrary object(s). The methods you'll need to override are BaseTuner.run_trial, BaseTuner.save_model, and BaseTuner.load_model. The docstring of BaseTuner.run_trial should have enough info to get you started with how to do this; if not please let me know: github.com/keras-team/keras-tuner/blob/master/kerastuner/engine/base_tuner.py#L134"
I did not quite understand the error. Any idea?
@princeofexcess
@princeofexcess 4 жыл бұрын
could anyone give me a link to the old video with loops?
@minazulkhan8287
@minazulkhan8287 5 жыл бұрын
Hi dear, I'm working on tkinter. I used your code for multiple windows using tkinter... The code works fine, but when I used the built-in function to display the current time in the second window it gave the error "module tkinter has no attribute time". The code line is:
localtime = time.asctime(time.localtime(time.time()))
and the next line includes a label with text=localtime. Please give a solution soon.
@taylormcclenny1416
@taylormcclenny1416 5 жыл бұрын
Are you going to do any more videos in this vein/series?
@sentdex
@sentdex 5 жыл бұрын
I will likely be making use of Keras Tuner in future videos, so probably not another dedicated video on just tuning, but more so just using it inside of tutorials.
@taylormcclenny1416
@taylormcclenny1416 5 жыл бұрын
@@sentdex Right on! Thanks again man!
@chaimaaessayeh8929
@chaimaaessayeh8929 4 жыл бұрын
Very interesting!! Is there a way to apply this same technique on a reinforcement learning model? like the one you build in another video series?
@luispintoc
@luispintoc 2 жыл бұрын
You'd use the bayesian optimizer instead of the random search
@rezan6971
@rezan6971 5 жыл бұрын
Would you please take a look at FastAPI and make a tutorial? Maybe a todo app with React (for someone who already knows React), or at least the back end of it without the frontend.
@andris788
@andris788 4 жыл бұрын
Would this work if you have a mixed-input NN? I'm trying to implement this for mine. It has a CNN and an MLP combined in a final dense layer. Keras-Tuner doesn't like it if I split X_train into [X_train_cnn, X_train_mlp].
@edeneden97
@edeneden97 5 жыл бұрын
Is it random search or does it use some genetic algorithm / other RL stuff?
@david-vr1ty
@david-vr1ty 4 жыл бұрын
Nice tutorial! While watching I came up with some questions regarding overfitting/generalization: 1. Does Keras-Tuner search for the best model considering overfitting? We specify the parameters for training (epochs & batch size), so is Keras-Tuner somehow considering overfitting in the model comparison, or is it just comparing the accuracy of each model after the specified epochs, regardless of whether that number of epochs leads to overfitting or not? 2. If it does not, is the tuner still useful? 3. If it does, can we show the number of epochs used for each model in the model report? Thx in advance ;)
@omarabobakr2292
@omarabobakr2292 4 жыл бұрын
david I don’t know about whether or not Keras tuner does that, but callbacks in keras might help with this task. You can let your model train with a high number of epochs, but after each epoch the model will save its weights to a ckpt file in your drive. When training is done you could load the weights of each epoch to your model and evaluate your test data.
@pushkarajpalnitkar1695
@pushkarajpalnitkar1695 4 жыл бұрын
@@omarabobakr2292 Agree but callbacks argument is only available while executing fit, predict or evaluate methods. We are not using none of these methods here. So how and where can I use earlystopping while using tuner?
@spitfire-dragonboatita9610
@spitfire-dragonboatita9610 5 жыл бұрын
I have a problem: when I put "hp" into the build_model function's arguments it gives an error: "NameError: name 'hp' is not defined". I've already imported keras and I've followed your tutorial step by step... but it doesn't work :(
@matt_t937
@matt_t937 3 жыл бұрын
Hi! Thank you for the quality of your videos, you are doing an awesome job! I wanted to ask if you know how to tune Keras model hyperparameters using sklearn's TimeSeriesSplit cross-validation method and not just a shuffling cross-validation like in your model. I tried to use the sklearn tuner but it doesn't work with my deep learning model, and I really really need that CV option... help me please, I need to finish up my Bachelor thesis, I can pay :)
@luizhenriquesilvajunior5449
@luizhenriquesilvajunior5449 5 жыл бұрын
I'm getting the following error: tensorflow.python.eager.core._FallbackException: This function does not handle the case of the path where all inputs are not already EagerTensors.
@luizhenriquesilvajunior5449
@luizhenriquesilvajunior5449 5 жыл бұрын
I solved it by changing objective='accuracy' to objective='val_acc'.
@manikanta3977
@manikanta3977 5 жыл бұрын
Hi, can you make a video on data science and how to start learning data science?
@rafaelstevenson
@rafaelstevenson 3 жыл бұрын
Hello, I seem to have a problem using Keras Tuner where the results show a disagreement. If you understand and care to help, here is the detailed issue on Stack Overflow: questions/66783048/keras-tuner-uses-for-i-in-rangehp-intn-layers-1-3-but-does-not-show-agre