"A paper bag can solve MNIST ". That should be a quote on a T-shirt.
@AD-bz2ci4 жыл бұрын
I would buy it. Please make.
@sparshgupta29314 жыл бұрын
What does that mean??
@asdfasdfuhf 4 years ago
@@sparshgupta2931 It means that basically anything can be trained to recognize the handwritten digits in the dataset named MNIST: www.google.com/search?q=what+is+mnist&oq=what+is+mnist+&aqs=chrome..69i57.2387j0j7&sourceid=chrome&ie=UTF-8
@Quitoss 2 years ago
I’m a paper bag
@deojeetsarkar2006 4 years ago
Thanks for everything sentdex, your name will always have a folder on my PC.
@JordanMetroidManiac 2 years ago
26:30 The number of possible hyperparameter combinations with that search space is 8^2 + 8^3 + 8^4 + 8^5 = 37440. So, of course, a random search on that could take up to 37440 trials to find the best possible combination of hyperparameters. There are usually subsets of combinations that are "alike" and would achieve similar performance, so you wouldn't need to set max_trials = 37440; something more like max_trials = 100 is usually enough.
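A quick sanity check of that count, assuming 8 options per hyperparameter and 2 to 5 tunable layers as described above:

# 8 choices for each hyperparameter, models with 2 to 5 tunable layers
total = sum(8 ** n_layers for n_layers in range(2, 6))
print(total)  # 37440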
@abrahamowos 2 years ago
I actually came to the comment section to find this. Thank you for posting it.
@hassenzarroug9159 4 years ago
Seriously man, you are the reason why I love machine learning! You make it look easy and fun, which is the exact opposite of what my teachers do! Thank you so much and God bless you!
@Accarvd 4 years ago
Probably one of the best YouTube videos (on this topic)
@ajaysingh8887 4 years ago
Finally, this is what I was looking for.
@Neogohan1 4 years ago
Both Kite AND Keras Tuner were things I'd been wanting for a while as part of learning TF, and you managed to knock them both out in one video. Very useful stuff! Thanks!
@jumpthecagemma4987 4 years ago
Last comment: this tuner only works if you call Keras through TensorFlow, e.g. model.add(tf.keras.layers.Dense(...)). Building the model with the standalone keras package, e.g. model.add(keras.layers.Dense(...)), gives an error about compiling the model. Hope this helps.
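A minimal sketch of the working pattern described above, with everything pulled from tensorflow.keras rather than the standalone keras package (layer sizes and names here are placeholders, not the video's exact model):

import tensorflow as tf

def build_model(hp):
    model = tf.keras.models.Sequential()
    model.add(tf.keras.layers.Flatten(input_shape=(28, 28, 1)))
    # tf.keras layers work with the tuner; standalone keras layers do not
    model.add(tf.keras.layers.Dense(hp.Int("units", 32, 256, step=32),
                                    activation="relu"))
    model.add(tf.keras.layers.Dense(10, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model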
@riadhgharbi7985 4 years ago
Keep up the likes and comments, lads, we need more of his content. Support our guy here xD
@lilprotakeit 4 years ago
Hi Sir, your videos are the reason why I am continuing and surviving as a data engineer. I would be grateful if you could create a series on Apache Airflow, as it's a heavily used framework for data engineering. Please do consider it.
@fcolecumberri 4 years ago
You should add this to your Keras tutorial playlist. Thanks for this and for that tutorial.
@beyhan9191 4 years ago
Zero dislikes! You're doing great things.
@TheRedProject 4 years ago
I started using Kite a month ago. I love it.
@Evan_242 4 years ago
This Kite thing looks awesome, I will definitely check it out. Thanks Harrison, hope you're doing well :)
@Evan_242 4 years ago
I downloaded it, it's awesome! :)
@kacperkubicki1101 4 years ago
Woah, first time my uni classes were faster to teach me something than sentdex. I might reconsider my lack of faith in their purpose ;) Have you tried Talos for hyperparameter optimization? We've been using it during classes and tbh it seems nicer to me than Keras Tuner.
@sentdex 4 years ago
Nice, I'll check out Talos.
@mrfizzybubbs3909 4 years ago
@@sentdex It might also be worthwhile to check out the hyperopt library.
@Manu-jc2sx 3 years ago
Which optimization method is the best one? There are many, like Keras Tuner, Hyperopt, Talos, etc.
@GauravKumar-ch3xn 4 years ago
There is default hyperparameter tuning available in TensorFlow that does the same thing, with some pretty visualization when attached to TensorBoard. What would be interesting to see is whether any of these packages also apply Bayesian optimization; that would be nicer.
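keras-tuner does ship a Bayesian tuner alongside RandomSearch. A rough sketch, reusing the video's build_model and data variables (the directory and project_name are arbitrary, and the import path is keras_tuner in newer releases):

from kerastuner.tuners import BayesianOptimization

tuner = BayesianOptimization(
    build_model,                 # same model-building function as before
    objective="val_accuracy",
    max_trials=50,
    directory="bayes_logs",
    project_name="fashion_mnist",
)
tuner.search(x_train, y_train, epochs=3, validation_data=(x_test, y_test))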
@bhuvaneshs.k638 4 years ago
Thanks for this... very helpful... you are the guy for machine learning in Python. Thanks!
@_nttai 4 years ago
I'm glad I found your channel.
@fuba44 4 years ago
This was great! I will go play with it right now. Thank you!
@rchuso 3 years ago
I've been using Bayesian-Optimization, and this looks a lot like that.
@meandkg 3 years ago
What about cross validation? Does it support optimizing for the average score of, say, 5-fold cross validation? Or does it just optimize on one fold?
@amirmasoudkiakojouri6655 3 years ago
Thank you for your clear description. I have problems installing kerastuner and importing it for tuning. Would you please let me know how to install it? When I try to install kerastuner in the terminal, I see the errors below:
ERROR: Could not find a version that satisfies the requirement kerastuner (from versions: none)
ERROR: No matching distribution found for kerastuner
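The usual cause of that error is the package name: on PyPI it is hyphenated. A small sketch of the likely fix (the exact import name depends on the installed version):

# pip install keras-tuner
try:
    import keras_tuner as kt   # newer releases
except ImportError:
    import kerastuner as kt    # older releases, as used in the video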
@taylormcclenny1416 4 years ago
Doing God's work, my friend!
@ankitganeshpurkar 3 years ago
Hi sir, this tutorial is simple and effective. I have a query: when I apply this random search the code runs well, but the number of layers reported is something else, and the actual number of layers in the model is different. The two numbers don't tally most of the time. For example, the number of layers in a model is 7 but the total layers shown is 18. What could be the problem?
@wadyn95 4 years ago
Dear sentdex, could you cover the TensorFlow Object Detection API? TF updated to 2.0 and there is no fully working tutorial now... I got too many errors while trying to use that stuff.
@sentdex 4 years ago
Yeah, I would like to revisit the object detection stuff, but other topics keep getting in the way :D ...one day...
@nileshmishra3796 4 years ago
Awesome man, you never disappoint :)
@moniquemarinslp 4 years ago
Great stuff! Thumbs up for the tutorial and Kite (also quite cool)!
@usamatahir7091 4 years ago
I love you Harrison!
@jorgeespinoza3938 4 years ago
Pharmaceutical companies should be dreaming of having an actual physical tuner for their compounds, although I believe their testing takes a bit more than just 19 seconds.
@sankamadushan7940 4 years ago
Good job, sentdex. This is great. Saves a lot of time.
@interpro 4 years ago
Great tutorial! Thanks much!
@MrLiquimatter 4 years ago
Sold on Kite!
@yoannrey5286 3 years ago
Hello! Thanks for the video :) One question: did you manage to use Keras Tuner with TensorBoard?
@programerahmed4470 2 years ago
Great video. How can I force Keras Tuner to use the default hyperparameter values for the first optimization iteration?
@kerolesmonsef4179 4 years ago
You are great. Thank you.
@angelazhang9082 2 years ago
Thanks for the thorough video. I've been trying to figure out a way to find the batch_size that gives the tuner the best results, but I've been unsuccessful. Can you comment on that? I watched your video several times and don't think you mentioned anything about batch size, which is a very common parameter to tune. I looked up several articles and haven't found any information on it either, nor on how to add batch size as a parameter for the tuner. The only thing I can think of is to run the tuner multiple times with varying batch sizes, but I'm sure there's a better way.
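One way to fold batch size into the search, assuming a reasonably recent keras_tuner release that lets a HyperModel override fit (a sketch, not the video's code; the data variables are the ones from the video):

import tensorflow as tf
import keras_tuner as kt

class TunableModel(kt.HyperModel):
    def build(self, hp):
        model = tf.keras.Sequential([
            tf.keras.layers.Flatten(input_shape=(28, 28, 1)),
            tf.keras.layers.Dense(hp.Int("units", 32, 256, step=32),
                                  activation="relu"),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    def fit(self, hp, model, *args, **kwargs):
        # batch_size becomes just another tunable hyperparameter
        return model.fit(*args,
                         batch_size=hp.Choice("batch_size", [32, 64, 128]),
                         **kwargs)

tuner = kt.RandomSearch(TunableModel(), objective="val_accuracy", max_trials=20)
tuner.search(x_train, y_train, epochs=3, validation_data=(x_test, y_test))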
@felixmuller9062 2 years ago
First of all, thank you very much for this amazing video. It helped me a lot! I still have a question: is it possible to give the Choice function None as a value? I'm aiming for an HP optimization where I want to try different regularizers, and one option should be that I don't use any regularizer at all. Is this possible with keras_tuner?
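As far as I know hp.Choice does not accept None directly, so a common workaround is a string sentinel that gets mapped to "no regularizer". A sketch (the layer sizes are placeholders):

import tensorflow as tf
from tensorflow.keras import regularizers

def build_model(hp):
    # use a string sentinel instead of None
    reg_name = hp.Choice("dense_regularizer", ["none", "l1", "l2"])
    reg = None if reg_name == "none" else regularizers.get(reg_name)

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28, 1)),
        tf.keras.layers.Dense(64, activation="relu", kernel_regularizer=reg),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model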
@eranfeit 2 years ago
Thank you for the great video.
@mattb9823 4 years ago
This is awesome. I've been learning ML for about a month and paid for a couple of courses on Udemy, but I seem to be learning more from your channel when trying to debug and optimize things. Quick question: is there any way to integrate TensorBoard with RandomSearch?
@oliverpolden 4 years ago
I have exactly this question. I'm just about to try, but I assume you can just assign each hyperparameter to a variable and construct your TensorBoard name from those, and of course remember to use the variables in your model definition. I don't see why that wouldn't work.
@nirbhay_raghav 2 years ago
I believe TensorBoard has a "what-if" option. You need to provide your model with the data directories. It would not exactly be a random search, but it is better than nothing. Check it out, you may find it useful.
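For the TensorBoard question above, one approach that should work: tuner.search forwards its keyword arguments to model.fit for every trial, so a TensorBoard callback can simply be passed in (a sketch reusing the video's tuner and data variables; the log directory name is arbitrary):

import tensorflow as tf

tuner.search(x_train, y_train,
             epochs=3,
             validation_data=(x_test, y_test),
             callbacks=[tf.keras.callbacks.TensorBoard(log_dir="tb_logs")])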
@oliverpolden 4 years ago
How does Keras Tuner compare with TensorBoard's HParams? It seems HParams would be better for analysis within TensorBoard?
@riyabanerjee2656 4 years ago
I get the error "RuntimeError: Model-building function did not return a valid Keras Model instance, found ". Any idea what I should do? I googled it, and this was written: "If you want to return more than one Keras Model, you'll have to override Tuner or BaseTuner. In this case, I recommend overriding BaseTuner, since Tuner assumes a single Keras Model but BaseTuner works for any arbitrary object(s). The methods you'll need to override are BaseTuner.run_trial, BaseTuner.save_model, and BaseTuner.load_model. The docstring of BaseTuner.run_trial should have enough info to get you started with how to do this, if not please let me know: github.com/keras-team/keras-tuner/blob/master/kerastuner/engine/base_tuner.py#L134" I did not quite understand the error. Any ideas?
@neatpolygons8500 4 years ago
Oh yeah, Kite. It's fricking genius and I use it with vim.
@nano7586 4 years ago
I ALWAYS wondered why there was no optimizer for hyperparameters. People work with neural networks and machine learning but talk about "trial and error" when it comes to HYPER and not HYPO parameters. This always really confused me. It's basically like applying a neural network to the neural network. Sure, it takes a long time and is CPU/GPU expensive, but if needed you can run it overnight or even longer. But that also overfits your model to the validation data you are using for optimization, right? Anyway, thanks so much for sharing!
@gouki1001 4 years ago
Is it the norm to use Keras Tuner and Keras callbacks together to optimize, or are these two methods that don't need to be used with each other?
@1991kushagra 4 years ago
That was really an awesome video, hats off. I have an additional doubt: what if we want to use cross validation together with random search? In scikit-learn we can do that with RandomizedSearchCV; is there any way to do it in Keras too?
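keras-tuner does not do K-fold cross validation for Keras models out of the box, as far as I know. One manual option is to let the tuner pick its best configuration on a single split and then re-score that configuration with K-fold (a sketch reusing the video's tuner and data variables):

import numpy as np
from sklearn.model_selection import KFold

best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]

scores = []
for train_idx, val_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(x_train):
    model = tuner.hypermodel.build(best_hp)      # fresh weights for every fold
    model.fit(x_train[train_idx], y_train[train_idx], epochs=3, verbose=0)
    _, acc = model.evaluate(x_train[val_idx], y_train[val_idx], verbose=0)
    scores.append(acc)

print(f"5-fold accuracy: {np.mean(scores):.4f} +/- {np.std(scores):.4f}")

This only cross-validates the chosen configuration rather than every trial, but it is often enough to catch a lucky split.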
@MultiNarutoGamer 4 years ago
@sentdex Is it possible to tell the model to try it with and without max pooling? Or with different activation functions?
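Both of those can be expressed as hyperparameters inside the build function, e.g. with a boolean flag for the pooling layer and a choice over activations (a sketch; older keras-tuner releases may need hp.Choice("use_max_pooling", [True, False]) instead of hp.Boolean):

import tensorflow as tf

def build_model(hp):
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Conv2D(
        hp.Int("conv_units", 32, 128, step=32), (3, 3),
        activation=hp.Choice("activation", ["relu", "tanh", "elu"]),
        input_shape=(28, 28, 1)))

    # let the tuner decide whether to include pooling at all
    if hp.Boolean("use_max_pooling"):
        model.add(tf.keras.layers.MaxPooling2D((2, 2)))

    model.add(tf.keras.layers.Flatten())
    model.add(tf.keras.layers.Dense(10, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model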
@mdashad439 4 years ago
Best Python tutorial ever, very understandable.
@ggpopa1319 4 years ago
But then why not use an optimizer like Adam or SGD to optimize the hyperparameters too?
@joeboyle7390 4 years ago
Because evaluating the function (training an entire model) is incredibly computationally expensive compared to evaluating a single epoch. TL;DR: it's too slow, and the function is probably not convex!
@francescaalfieri5187 4 years ago
Thanks for this video!!! I have a question: is there a way to check the value taken by the hp.Int("inputs_unit") variable at every step? I have already tried using the debugger with no success.
@Yisi.voyager 4 years ago
Does the Keras Tuner tell you how many layers is optimal?
@meandkg 3 years ago
So... Keras Tuner is better than writing for loops and testing manually? Can it get stuck in local optima?
@gianlucavernia9444 4 years ago
Hey sentdex, are you going to continue the quantum programming series or is it finished?
@marmar321 3 years ago
I forgot to save the pickle file for my test. Is there any way to load the summary of a previous Keras Tuner run without the pickle? Thanks.
@tingyizhu3691 4 years ago
An R package has a plot_tune function that gives a nice visualization of the tuning results. Does Python have something similar?
@andris788 4 years ago
Would this work if you have a mixed-input NN? I'm trying to implement it for mine. It has a CNN and an MLP combined in a final dense layer. Keras Tuner doesn't like it when I split X_train into [X_train_cnn, X_train_mlp].
@patrickduhirwenzivugira4729 3 years ago
Thank you for the great video. How can I also tune the optimizers (let's say ['Adam', 'RMSprop']) with dynamic learning rates? Many tutorials keep them fixed. Thank you.
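One way to tune the optimizer family and its learning rate together inside the build function (a sketch, not the video's code; the model and the ranges are placeholders):

import tensorflow as tf

def build_model(hp):
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28, 1)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    # learning rate sampled on a log scale, optimizer picked by the tuner
    lr = hp.Float("learning_rate", 1e-4, 1e-2, sampling="log")
    if hp.Choice("optimizer", ["adam", "rmsprop"]) == "adam":
        opt = tf.keras.optimizers.Adam(learning_rate=lr)
    else:
        opt = tf.keras.optimizers.RMSprop(learning_rate=lr)

    model.compile(optimizer=opt,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model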
@siddheshwarimishra8042 4 years ago
Respected sir, please tell me how to use a swarm optimization technique with a pre-trained model. And please advise whether I can use multiple pre-trained networks with multiple nature-based optimization techniques for multiple inputs. Please...
@FrostEnceladus 4 years ago
How do you know when you are using too many or too few neurons? And how do you work out the number of neurons per layer from the number of layers needed? That's my problem.
@alberto.polini 4 years ago
Thank you sentdex, I love your videos.
@maliksalman1907 2 years ago
Sir, I need to ask you about the firefly algorithm for optimizing a CNN model.
@Zifox20 4 years ago
Interesting feature, thanks!
@coder3652 2 years ago
Thanks for the video.
@sriadityab4794 3 years ago
Can you tell me how to perform cross-validation/hyperparameter tuning for time series forecasting using LSTM?
@neoblackcyptron 2 years ago
Sorry, I am worse than a paper bag. I could not solve the Fashion-MNIST problem manually by trying different layer sizes and depths for a non-CNN FCN; I could not cross 90% val_accuracy. That is why I am going to use the Keras Tuner.
@deepakkumarjoshi 4 years ago
Thanks for the great work. How do we plot the results to compare the actual and predicted datasets after using the tuner?
@jumpthecagemma4987 4 years ago
Also, can someone properly explain to me what .reshape(-1, 28, 28, 1) does? I know the 28, 28 reshapes the x and y sizing, and I thought the 1 at the end makes it all one dimensional, but am I missing something, @sentdex?
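A small standalone NumPy example of what that reshape does:

import numpy as np

x = np.zeros((60000, 28, 28))      # e.g. the raw MNIST training images
x = x.reshape(-1, 28, 28, 1)

# -1 tells NumPy to infer that dimension (the number of samples) from the
# rest, and the trailing 1 adds a single channel axis that Conv2D expects.
print(x.shape)  # (60000, 28, 28, 1)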
@TheMaytschi 3 years ago
Great video!! @sentdex or anyone else: I am using the tuner for an RNN with stacked LSTM layers, but for some reason the tuner does not converge, whereas if I try the same architecture during normal fitting, it converges. Any idea why this could happen?
@david-vr1ty 4 years ago
Nice tutorial! While watching I came up with some questions regarding overfitting/generalization: 1. Does Keras Tuner search for the best model with overfitting in mind? We specify the training parameters (epochs and batch size), so is Keras Tuner somehow accounting for overfitting in the model comparison, or is it just comparing the accuracy of each model after the specified epochs, regardless of whether that number of epochs leads to overfitting or not? 2. If it does not, is the tuner still useful? 3. If it does, can we show the number of epochs used for each model in the model report? Thanks in advance ;)
@omarabobakr2292 4 years ago
@david I don't know whether Keras Tuner does that, but callbacks in Keras might help with this task. You can let your model train for a high number of epochs, but after each epoch the model will save its weights to a ckpt file on your drive. When training is done you can load the weights from each epoch into your model and evaluate on your test data.
@pushkarajpalnitkar1695 4 years ago
@@omarabobakr2292 Agreed, but the callbacks argument is only available when executing the fit, predict or evaluate methods, and we are not calling any of those directly here. So how and where can I use early stopping while using the tuner?
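tuner.search passes its keyword arguments on to model.fit for each trial, so early stopping can be attached there (a sketch reusing the video's tuner and data variables):

import tensorflow as tf

tuner.search(x_train, y_train,
             epochs=20,
             validation_data=(x_test, y_test),
             callbacks=[tf.keras.callbacks.EarlyStopping(
                 monitor="val_loss", patience=3, restore_best_weights=True)])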
@alberro96 a year ago
How could I implement this with a CNN? I'm working with my own dataset and it seems like the Keras tuners don't like tf.data.Datasets yet; they're still expecting (x_train, y_train), (x_test, y_test). Is my thinking correct there? Essentially I'm loading my data using tf.keras.preprocessing.image_dataset_from_directory and would like to feed this into the tuner. How could I split my own data into (x_train, y_train), (x_test, y_test)?
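In practice tuner.search forwards whatever it is given to model.fit, which accepts tf.data.Dataset objects, so an explicit (x_train, y_train) split may not be needed. A sketch using two directory-based splits (the path, image size and batch size are placeholders, and tuner is assumed to be an already-built Keras Tuner instance):

import tensorflow as tf

train_ds = tf.keras.preprocessing.image_dataset_from_directory(
    "data/images", validation_split=0.2, subset="training",
    seed=123, image_size=(180, 180), batch_size=32)
val_ds = tf.keras.preprocessing.image_dataset_from_directory(
    "data/images", validation_split=0.2, subset="validation",
    seed=123, image_size=(180, 180), batch_size=32)

tuner.search(train_ds, epochs=5, validation_data=val_ds)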
@51nibbler 2 years ago
Thanks for the good explanation.
@walisyed4625 4 years ago
Very useful, thanks.
@jm10oct 3 years ago
WOW!!! That might have just made my project 3 months shorter!!!!
@jakaseptiadi1752 4 years ago
I'm thinking about changing the Keras optimizer algorithm during training. Is that possible in Keras?
@chaitanyasharma6270 2 years ago
Why did you remove max pooling? Is there a way to add some max pooling layers?
@rogervaldivia7033 3 years ago
Thanks for the video! Do you know if it's possible to optimize for cross-validation error?
@edeneden97 4 years ago
Is it random search, or does it use some genetic algorithm / other RL stuff?
@yazanmajzoub6582 3 years ago
Really... thanks!
@pushkarajpalnitkar1695 4 years ago
Great video! Can anyone suggest the number of epochs to use in the search? More specifically, does using more epochs help the search, or are 1-3 epochs sufficient for comparing model performance?
@spitfire-dragonboatita9610 4 years ago
I have a problem: when I put "hp" into the build_model function's arguments it gives an error: "NameError: name 'hp' is not defined". I've already imported keras and I've followed your tutorial step by step... but it doesn't work :(
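That NameError usually means hp is referenced outside the build function, or build_model() was called directly instead of being handed to the tuner (which supplies hp itself). A minimal working pattern, assuming the older kerastuner import used in the video:

import tensorflow as tf
from kerastuner.tuners import RandomSearch

def build_model(hp):                 # hp only exists inside this function
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28, 1)),
        tf.keras.layers.Dense(hp.Int("units", 32, 256, step=32), activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# pass the function itself (no parentheses); the tuner calls it with hp per trial
tuner = RandomSearch(build_model, objective="val_accuracy", max_trials=5,
                     directory="tuner_dir", project_name="debug_example")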
@Gavinnnnnnnnnnnnnnn 4 years ago
This is just like grid search cross validation, which has existed for years.
@sentdex 4 years ago
I don't think I or anyone claimed it was a novel concept, just that many people are likely doing it either manually or with their own new code every time, and there's a lib to help.
@chaimaaessayeh8929 4 years ago
Very interesting!! Is there a way to apply this same technique to a reinforcement learning model, like the one you built in another video series?
@luispintoc 2 years ago
You'd use the Bayesian optimizer instead of the random search.
@matt_t937 3 years ago
Hi! Thank you for the quality of your videos, you are doing an awesome job! I wanted to ask if you know how to tune Keras model hyperparameters using sklearn's TimeSeriesSplit cross-validation method rather than just a shuffling cross-validation like in your model. I tried to use the sklearn tuner but it doesn't work with my deep learning model, and I really need that CV option... help me please, I need to finish my Bachelor thesis. I can pay :)
@davidcristobal7152 4 years ago
Hi sentdex, nice video. Is there any way to integrate this Keras Tuner with keras-rl (reinforcement learning) and custom environments with the OpenAI Gym interface?
@sentdex 4 years ago
I am not sure, but I don't think so at this stage.
@kaustubhkulkarni 4 years ago
How do we save and checkpoint the Keras Tuner random models?
@ahsanrao6164 4 years ago
If someone has issues running this code, remove this line from the model: model.add(MaxPooling2D(pool_size=(2, 2))). My code was not working, and when I removed this line it worked fine.
@thomhomsma5474 3 years ago
How do you use TensorBoard after running the search?
@rafaelstevenson 3 years ago
Hello, I seem to have a problem using Keras Tuner where the results show a disagreement. If you understand and care to help, the detailed issue statement is on Stack Overflow: questions/66783048/keras-tuner-uses-for-i-in-rangehp-intn-layers-1-3-but-does-not-show-agre
@plutoboy8827 4 years ago
Thank you!!!
@iskrabesamrtna 3 years ago
I still can't figure out how it is even possible to have -1 in the reshape while creating the x and y train and test sets.
@erosennin950 4 years ago
I would enjoy a Kaggle-challenge playlist :D What do you think?
@sentdex 4 years ago
I've thought about it a few times, but Kaggle comps often have overly burdensome rules associated with their datasets... and that tends to scare me off from doing a series.
@erosennin950 4 years ago
@@sentdex Got ya :(
@leonshamsschaal 4 years ago
@sentdex Can we have a series on building a NN from scratch?
@sentdex 4 years ago
It's coming!
@RojinaPanta1 3 years ago
How can we carry out the search on a dataset trained with train_on_batch?
@rezan6971 4 years ago
Would you please take a look at FastAPI and make a tutorial? Maybe a todo app with React (for someone who already knows React), or at least the backend of it without the frontend.
@minazulkhan8287 4 years ago
Hi dear, I am working on tkinter. I used your code for multiple windows using tkinter. The code works fine, but when I used the built-in function to display the current time in the second window it gave the error "module tkinter has no attribute time". The code line is: localtime = time.asctime(time.localtime(time.time())) and the next line includes a label with text=localtime. Please give a solution soon.
@varunnagpal2258 4 years ago
I ran it with a varying number of layers, but it shows a strange mismatch between the number of layers reported and the value of num_layers:

[Trial summary]
|-Trial ID: 79cd7bb6146b4c243eb2bc51f19985de
|-Score: 0.8444444537162781
|-Best step: 0
> Hyperparameters:
|-Conv2D_0: 448
|-Conv2D_1: 448
|-Conv2D_2: 512
|-learning_rate: 0.0001
|-num_layers: 1
|-rate: 0.5

You can see there are three Conv2D layers and yet it shows num_layers as 1... why?

def build_model(hp):
    model = tf.keras.Sequential()
    model.add(base_model)
    for i in range(hp.Int('num_layers', 1, 2)):
        model.add(tf.keras.layers.Conv2D(
            filters=hp.Int('Conv2D_' + str(i), min_value=32, max_value=512, step=32),
            kernel_size=3, activation='relu'))
        model.add(tf.keras.layers.Dropout(hp.Choice('rate', [0.3, 0.5])))
    model.add(tf.keras.layers.GlobalAveragePooling2D())
    model.add(tf.keras.layers.Flatten())
    model.add(tf.keras.layers.Dropout(0.2))
    model.add(tf.keras.layers.Dense(5, activation='softmax'))
    model.compile(optimizer=tf.keras.optimizers.RMSprop(hp.Choice('learning_rate', [1e-4, 1e-5])),
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])
    return model
@nmana9759 4 years ago
Can this tuner be used for RNNs? Please answer, thank you.
@Yourbitchiscrazy 4 years ago
Can you use TensorBoard and Keras Tuner together, and if so, how?
@jumpthecagemma4987 4 years ago
What playlist will this be added to?
@shayekhbinislam 4 years ago
What is the best counterpart of Keras Tuner for PyTorch?
@jasonproconsult9525 4 years ago
I'm using Spyder, which prints tuner.results_summary() as an . Seems there's no workaround from the Spyder devs yet :(