32. Bayesian Optimization

2,831 views

Taylor Sparks
A day ago

Comments
@franciscochagas-hp8yy · 3 months ago
Professor, I'm studying industrial chemistry at a university in Brazil, and I want to learn more about this. I think it's very interesting for the chemical industry. I learned about this from articles out of Cambridge; it's so useful.
@TaylorSparks · 3 months ago
@franciscochagas-hp8yy It is such a great topic. Best of luck learning it and applying it to your research.
@vrhstpso · 5 months ago
Thank you, professor 🙏. If we knew the objective function (for example, the mean squared error), would it make sense to use Bayesian optimization to find its minimum over the parameters being optimized in a given domain?
@TaylorSparks · 5 months ago
@vrhstpso Sure, but more often we don't know the objective function, and we instead try to learn it using a surrogate model such as a Gaussian process.
@vrhstpso · 5 months ago
@TaylorSparks Thank you.
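The reply above describes the core Bayesian-optimization loop: fit a Gaussian-process surrogate to the points sampled so far, then use an acquisition function to decide where to sample next. A minimal NumPy sketch of that loop, using an RBF-kernel GP and expected improvement on a hypothetical 1-D objective (the function, kernel settings, and grid here are illustrative assumptions, not from the video):

```python
import numpy as np
from math import erf, sqrt, pi

def objective(x):
    # The "unknown" black-box function; BO only ever sees sampled values.
    return np.sin(3.0 * x) + 0.5 * x ** 2

def rbf_kernel(a, b, length=0.3):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Standard GP regression: posterior mean and std dev at test points Xs.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def norm_cdf(z):
    return np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in z])

def norm_pdf(z):
    return np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)

grid = np.linspace(-2.0, 2.0, 201)   # candidate points
X = np.array([-1.5, 0.0, 1.5])       # small initial design
y = objective(X)

for _ in range(15):
    mu, sigma = gp_posterior(X, y, grid)
    # Expected improvement (for minimization) over the candidate grid.
    z = (y.min() - mu) / sigma
    ei = (y.min() - mu) * norm_cdf(z) + sigma * norm_pdf(z)
    x_next = grid[np.argmax(ei)]     # sample where expected improvement is largest
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

print("best x:", X[np.argmin(y)], "best f:", y.min())
```

Expected improvement balances exploitation (low posterior mean) against exploration (high posterior uncertainty), which is why the loop homes in on the global basin rather than greedily refining the first dip it finds.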
@jinlongsu7308 · 5 months ago
An excellent video, thank you so much professor! 😀 May I ask: if I want to optimize two or more properties together (e.g., strength and ductility), would it be better to use a multi-task Gaussian process (MTGP) model rather than several individual GP models, since the MTGP can learn the trade-off between the properties? Also, can the iterations of the Bayesian optimization process be termed active learning? Many thanks for your attention. 🙏
@TaylorSparks · 5 months ago
@jinlongsu7308 I would simply do a multi-objective optimization. Multi-task models are typically used when one property has more statistical strength than the other and the two properties are related, thereby letting you lean on the statistical strength of the property with the larger data set.
@jinlongsu7308 · 5 months ago
@TaylorSparks Noted. Many thanks for your prompt reply! 😀
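For the multi-objective optimization suggested above, one simple approach is weighted-sum scalarization: combine the objectives with a weight, minimize the scalarized objective, and sweep the weight to trace out the trade-off (Pareto-optimal) designs. A toy sketch with hypothetical quadratic "strength" and "ductility" loss models over a 1-D design variable (the models and grid are illustrative assumptions, not from the video):

```python
import numpy as np

# Hypothetical property losses over a design variable x in [0, 1]:
# strength is best near x = 0, ductility near x = 1, so they trade off.
def strength_loss(x):
    return x ** 2

def ductility_loss(x):
    return (x - 1.0) ** 2

x = np.linspace(0.0, 1.0, 101)
pareto = []
for w in np.linspace(0.0, 1.0, 11):
    # Each weight w picks out one Pareto-optimal design.
    total = w * strength_loss(x) + (1.0 - w) * ductility_loss(x)
    pareto.append(x[np.argmin(total)])

print(pareto)  # minimizers sweep from one extreme design to the other
```

Weighted sums only recover points on convex parts of the Pareto front; for non-convex fronts, acquisition functions such as expected hypervolume improvement are the usual choice in multi-objective BO.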