Evolutionary algorithm: basic components and the main loop

1,091 views

Maciej Komosinski

Comments
@preetamsharma5220 · 4 years ago
Question: what if I am interested in finding the local minima on that 3D plot (the multi-dimensional landscape)? How can I find them?
@Maciej-Komosinski · 4 years ago
You use local search algorithms ("greedy" or "steepest") as explained in earlier lectures. Did you watch these earlier videos?
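Below is a minimal sketch (mine, not the lecturer's code) of the two local-search variants named in this reply, assuming a discrete neighborhood and a toy fitness function to be minimized; the helper names are hypothetical:

```python
import random

def local_search(x0, fitness, neighbors, variant="greedy", max_steps=10_000):
    """Minimize fitness(x) by local search.
    variant="greedy":   accept the first improving neighbor found.
    variant="steepest": examine all neighbors and move to the best one.
    Stops at a local minimum (no improving neighbor) or after max_steps."""
    x, fx = x0, fitness(x0)
    for _ in range(max_steps):
        candidates = list(neighbors(x))
        random.shuffle(candidates)       # avoid bias from a fixed neighbor order
        best, best_f = None, fx
        for y in candidates:
            fy = fitness(y)
            if fy < best_f:
                best, best_f = y, fy
                if variant == "greedy":  # greedy: jump at the first improvement
                    break
        if best is None:                 # no neighbor improves -> local minimum
            return x, fx
        x, fx = best, best_f
    return x, fx

# Example: a toy multimodal fitness over bitstrings (single-bit-flip neighborhood).
def bit_neighbors(bits):
    for i in range(len(bits)):
        yield bits[:i] + (1 - bits[i],) + bits[i + 1:]

f = lambda b: -(sum(b) % 7)              # toy fitness to minimize
print(local_search((0,) * 12, f, bit_neighbors, variant="steepest"))
```

Both variants stop as soon as no neighbor improves the current solution, which is exactly the local-minimum trap that the population-based methods in this lecture are meant to escape.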
@preetamsharma5220 · 4 years ago
@Maciej-Komosinski Yes, I have watched them. The greedy algorithm involves a lot of calculus calculations, and we have to choose an arbitrary starting point to solve it. Do we really care about solving or trying to find local minima manually?
@Maciej-Komosinski · 4 years ago
@preetamsharma5220 If you have a continuous function, then you can indeed use gradient methods. But here we speak about a more general approach to optimization where the space of solutions does not have to be continuous. Metaheuristic algorithms, as discussed in this class, will also work for combinatorial optimization problems where calculus will not help. No, we definitely will not look for local minima manually (though I am not sure what you mean by "manually").
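To illustrate the point that no calculus is needed, here is a minimal sketch (my own, not code from the course) of a (1+1) evolutionary algorithm on a purely combinatorial search space, using the standard OneMax toy problem; all names are illustrative:

```python
import random

def one_max(bits):                      # toy combinatorial fitness: number of 1s
    return sum(bits)

def one_plus_one_ea(n=50, generations=5_000, p_mut=None):
    """(1+1) EA: one parent, bitwise mutation with probability p_mut per bit,
    and the child replaces the parent if it is at least as good.
    Only fitness evaluations are used; no gradients or derivatives exist here."""
    p_mut = p_mut if p_mut is not None else 1.0 / n
    parent = [random.randint(0, 1) for _ in range(n)]
    best = one_max(parent)
    for _ in range(generations):
        child = [1 - b if random.random() < p_mut else b for b in parent]
        f = one_max(child)
        if f >= best:                   # selection step
            parent, best = child, f
    return best

print(one_plus_one_ea())                # typically reaches 50, the optimum
```

The same loop works for permutations, graphs, or any other discrete encoding, as long as a mutation operator and a fitness function are defined.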
@preetamsharma5220 · 4 years ago
@Maciej-Komosinski Now I am clear. By "manually" I meant without the help of programming. Thanks for the response.
@preetamsharma5220 · 4 years ago
If we use different fitness functions and increase the mutation rate to maintain a diverse population of solutions, we are sacrificing short-term fitness to obtain longer-term fitness. Is there any chance that this sacrifice never pays off in longer-term fitness, given that the "no free lunch" theorem proves there is no general solution to this problem?
@Maciej-Komosinski · 4 years ago
The more intense the mutation, the more diversification we introduce into the population, thus preventing (or delaying) convergence. Larger mutations increase exploration and decrease exploitation.
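A small sketch (my illustration, not from the video) of how mutation intensity controls the exploration/exploitation balance: a Gaussian mutation whose standard deviation sigma sets how far offspring scatter around the parent:

```python
import random

def gaussian_mutation(x, sigma):
    """Add N(0, sigma) noise to each coordinate.
    Small sigma -> exploitation: offspring stay close to the parent.
    Large sigma -> exploration: offspring scatter widely, delaying convergence."""
    return [xi + random.gauss(0.0, sigma) for xi in x]

parent = [0.0, 0.0]
for sigma in (0.01, 1.0):
    offspring = [gaussian_mutation(parent, sigma) for _ in range(3)]
    print(f"sigma={sigma}:", [[round(v, 3) for v in child] for child in offspring])
```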
@preetamsharma5220 · 4 years ago
@Maciej-Komosinski Thanks.