Deep Learning Hyperparameter Tuning in Python, TensorFlow & Keras

23,316 views

Greg Hogg

1 day ago

Comments
@GregHogg · 1 year ago
Take my courses at mlnow.ai/!
@masteronepiece6559 · 7 months ago
Next time, summarize the results in a table at the end of the video. We're too busy to watch the whole video.
@billybobandboshow · 3 years ago
Thank you for this video! I have been learning about deep learning algorithms over the holiday break! Hope we see more videos from you! I love your channel and content! Keep up the awesome work, happy holidays and happy new year! :)
@GregHogg · 3 years ago
You're very welcome and thanks so much for the kind words! Awesome work, happy new year!!
@iftekharanam8980 · 6 months ago
That was excellent. Need more videos on DL.
@tomaszzielonka9808 · 1 year ago
@GreggHogg Hi, I got stuck with Keras Tuner. It seems the code below only creates the 'model_builder' function once. If I change anything, like adding a Dropout layer, and rerun the function, it keeps displaying the message shown below the code, as if it were consistently reaching the first version of the function. Any clues on how to fix that? I would like to experiment with the 'model_builder' function (add/remove layers, dropouts, etc.) and then observe what parameters the tuner generates.

def model_builder(hp):
    model = Sequential()
    hp_activation = hp.Choice('activation', values=['relu', 'tanh'])
    hp_layer_1 = hp.Int('layer_1', min_value=2, max_value=32, step=2)
    hp_layer_2 = hp.Int('layer_2', min_value=2, max_value=32, step=2)
    hp_learning_rate = hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])
    model.add(Dense(units=hp_layer_1, activation=hp_activation))
    model.add(Dense(units=hp_layer_2, activation=hp_activation))
    model.add(Dense(units=1, activation='sigmoid'))
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=hp_learning_rate),
                  loss='binary_crossentropy',
                  metrics=[tf.keras.metrics.Recall()])
    return model

tuner = kt.Hyperband(model_builder,
                     objective=kt.Objective("val_recall", direction="max"),
                     max_epochs=50,
                     factor=3,
                     seed=42)

Message: Reloading Tuner from .\untitled_project\tuner0.json
@tigjuli · 2 years ago
Simple explanation, awesome video!
@GregHogg · 2 years ago
Thank you!
@rudrathakkar56 · 3 years ago
Thank you. I am learning deep learning. This helped me a lot.
@GregHogg · 3 years ago
Perfect - Really glad to hear it!
@arsheyajain7055 · 3 years ago
Awesome video!!
@GregHogg · 3 years ago
Thanks a bunch Arsheya! Hope you're having a great holiday break :)
@dakshbhatnagar · 2 years ago
Great video man, but tbh I was actually expecting some sort of automation of the hyperparameter tuning.
@GregHogg · 2 years ago
kzbin.info/www/bejne/bH_JYqttprmbiJo
@dakshbhatnagar · 2 years ago
@@GregHogg thanks
@prabinbasyal1049 · 3 years ago
Can you suggest a data science course? I've already learned NumPy, pandas, and Matplotlib.
@GregHogg · 3 years ago
Awesome! IBM Data science is a great intro. Big big fan of Andrew Ng's deep learning as well.
@haneulkim4902 · 2 years ago
Thanks for an amazing video! Is there a way to tune hyperparameters like in sklearn, without using keras-tuner?
@GregHogg · 2 years ago
You're very welcome! I'm sure there is, although I don't believe I've done it before
@tigjuli · 2 years ago
Yes, there is. You have to define the model as a function and use KerasClassifier as a wrapper so it works with sklearn's GridSearchCV or RandomizedSearchCV. I'm sure there are videos on YouTube.
@luisalbertoburbano9295 · 1 year ago
Good afternoon. I have a task and I have not been able to create the Keras Tuner for 5,000 rows with 4 columns; in each column the numbers are random from 0 to 9, and I need an output of only 4 numbers. This is the code:

# Initialising the RNN
model = Sequential()
# Adding the input layer and the first LSTM layer
model.add(Bidirectional(LSTM(neurons1, input_shape=(window_length, number_of_features), return_sequences=True)))
# Adding a first Dropout layer
model.add(Dropout(0.2))
# Adding a second LSTM layer
model.add(Bidirectional(LSTM(neurons2, input_shape=(window_length, number_of_features), return_sequences=True)))
# Adding a second Dropout layer
model.add(Dropout(0.2))
# Adding a third LSTM layer
model.add(Bidirectional(LSTM(neurons3, input_shape=(window_length, number_of_features), return_sequences=True)))
# Adding a fourth LSTM layer
model.add(Bidirectional(LSTM(neurons4, input_shape=(window_length, number_of_features), return_sequences=False)))
# Adding a fourth Dropout layer
model.add(Dropout(0.2))
# Adding the first output layer with ReLU activation function
model.add(Dense(output_neurons, activation='relu'))
# Adding the last output layer with softmax activation function
model.add(Dense(number_of_features, activation='softmax'))

Thank you very much
@AllanAlmeidaOficial · 1 year ago
GPT, Google, Stack Overflow...
@BB-2383 · 8 months ago
Side comment - we divide x by 255 because the image is grayscale. An RGB of white is (255, 255, 255), so dividing converts it to (1, 1, 1), leaving black as (0, 0, 0). So, an important note when training on images is to first convert them to grayscale.
@GregHogg · 8 months ago
Yes thank you ☺️
@no-name168 · 1 month ago
It has nothing to do with RGB. RGB is 3 channels, grayscale is 1. You scale both to get a "normal" value range, because the model's learning process works better on scaled values. Also, you do not always convert to grayscale.
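To illustrate the scaling point: dividing by 255 maps 8-bit pixel values into [0, 1] regardless of the number of channels. A tiny sketch with a single made-up RGB pixel:

```python
import numpy as np

# One RGB pixel with 8-bit channel values; shape (1, 1, 3).
rgb = np.array([[[255, 128, 0]]], dtype=np.uint8)

# Dividing by 255 scales each channel into [0, 1]; the same line
# works unchanged for a grayscale (H, W) array.
scaled = rgb.astype('float32') / 255.0
print(scaled)  # channels become 1.0, ~0.502, 0.0
```

The shape and channel count are untouched; only the value range changes, which is what helps training.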