An Introduction to Distributed Hybrid Hyperparameter Optimization- Jun Liu | SciPy 2022

473 views

Enthought

A day ago

Hyperparameter optimization in machine learning is commonly performed on a single search space, with the same search method applied to every parameter. We introduce the concept of a hybrid search space, which combines multiple existing tuning methods in a single optimization task through Fugue-tune. We will also demo how it simplifies the use of Optuna and scales out on Spark.
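The core idea of a hybrid search space — applying different search methods to different parameter subspaces within one optimization task — can be sketched in plain Python. This is an illustration of the concept only, not Fugue-tune's actual API; `hybrid_search` and the toy objective are hypothetical names invented for this example. Here, categorical parameters are grid-searched exhaustively while continuous parameters are randomly sampled:

```python
import itertools
import random

def hybrid_search(objective, grid_space, random_space, n_random=20, seed=0):
    """Minimize `objective` over a hybrid search space: every grid
    combination is paired with `n_random` uniform random draws from
    the continuous subspace. (Illustrative sketch, not fugue-tune.)"""
    rng = random.Random(seed)
    best_params, best_score = None, float("inf")
    keys, values = zip(*grid_space.items())
    for combo in itertools.product(*values):       # grid-searched subspace
        for _ in range(n_random):                  # randomly sampled subspace
            params = dict(zip(keys, combo))
            for name, (lo, hi) in random_space.items():
                params[name] = rng.uniform(lo, hi)
            score = objective(params)
            if score < best_score:
                best_params, best_score = params, score
    return best_params, best_score

# Toy objective: a quadratic whose optimal learning rate depends on the model.
def toy_objective(p):
    target = {"linear": 0.3, "tree": 0.7}[p["model"]]
    return (p["lr"] - target) ** 2

best, score = hybrid_search(
    toy_objective,
    grid_space={"model": ["linear", "tree"]},   # grid method for categoricals
    random_space={"lr": (0.0, 1.0)},            # random method for continuous
)
```

In a real workflow each subspace could instead be handed to a dedicated tuner (e.g. Optuna's Bayesian sampler for the continuous part), and the outer loop of independent trials is what the talk distributes across a Spark cluster.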
