This is the best series of videos for actually understanding the optimization procedure.
@anniehamalian3077 2 years ago
You can check www.youtube.com/@decisionbrain to learn more about optimization.
@grimonce 7 years ago
Really helpful and neat explanation, thanks for the video :)
@mitjadrab6529 6 years ago
What is the simulation program shown at 1:30?
@alphaopt2024 6 years ago
Hi Mitja, I used Algodoo for the simulation: www.algodoo.com/
@domaminakoi5630 3 years ago
Do you have experience on when to use which of the gradient-free algorithms? PSO has worked best for me in the past. I haven't been successful implementing simulated annealing with good results yet.
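For anyone in the same spot, here is a minimal simulated-annealing sketch in Python; the 1-D objective, neighbour step, and cooling schedule are my own illustrative choices, not anything from the video:

    import math
    import random

    def objective(x):
        # Toy objective with two basins; annealing can hop between them.
        return x**4 - 3*x**2 + x

    def anneal(x, temp=10.0, cooling=0.99, steps=5000):
        best, best_f = x, objective(x)
        f = best_f
        for _ in range(steps):
            candidate = x + random.gauss(0, 0.5)  # random neighbour
            cf = objective(candidate)
            # Always accept improvements; accept worse moves with
            # probability exp(-(cf - f) / temp) so we can escape local minima.
            if cf < f or random.random() < math.exp(-(cf - f) / temp):
                x, f = candidate, cf
                if f < best_f:
                    best, best_f = x, f
            temp *= cooling  # geometric cooling: temperature shrinks each step
        return best, best_f

    print(anneal(x=2.0))  # usually lands near the global minimum around x ≈ -1.30

The key knob in my experience is the cooling rate: cool too fast and it behaves like plain hill climbing, too slow and it wanders forever.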
@adelsayyahi9665 1 year ago
Thank you, what is the name of the Algodoo toolbox you used for simulated annealing?
@twisties.seeker 4 years ago
Thank you for an amazing explanation.
@alvarorodriguez8575 6 years ago
Hello, thank you for the video. I have a question: does the same classification apply to multi-objective optimization, or is it different? I'm especially interested in multi-disciplinary and multi-objective optimization problems for buildings. Thank you!
@andrea-mj9ce 2 years ago
The Nelder-Mead method is not explained in enough detail to understand it.
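For anyone else stuck: the gist is that Nelder-Mead keeps a simplex of n+1 points in n dimensions and repeatedly reflects the worst vertex through the centroid of the others, expanding, contracting, or shrinking the simplex depending on how the new point compares. No gradients are needed. A minimal SciPy example (the Rosenbrock test function is my own choice, not from the video):

    from scipy.optimize import minimize

    def rosenbrock(p):
        # Classic banana-shaped test function with its minimum at (1, 1).
        x, y = p
        return (1 - x)**2 + 100 * (y - x**2)**2

    result = minimize(rosenbrock, x0=[-1.0, 2.0], method="Nelder-Mead")
    print(result.x)  # converges close to [1.0, 1.0] using only function evaluations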
@metaprog46and2 4 years ago
Nice video. Your explanation syncs well with the graphics (which are awesome themselves). Which design/video-making software did you use?
@alphaopt2024 4 years ago
PowerPoint, if you can believe it. You can do a lot with the morph transition.
@metaprog46and2 4 years ago
@@alphaopt2024 Wow. Color me surprised. I'll have to get over my natural disdain for PPT lol. Thanks for the response!
@vi5hnupradeep 3 years ago
Thank you so much 💯
@where-is-my-mind. 6 years ago
Gradient-based optimisation also doesn't guarantee an optimal solution.
@where-is-my-mind. 6 years ago
@Dat Boi When you say "guarantee an optimal solution", I presume you mean the global optimum. I don't know where you learnt that, but it's not correct. If you have a paper to back it up, please reference it so I can take a look too. To start with, there are an infinite number of optimisation problems, and gradient-based optimisation can only solve a handful of them, because real-world problems are rarely differentiable; that is why derivative-free (non-gradient) optimisation algorithms emerged. Now, going back to what you said about second-order optimisation: it is more "efficient" in terms of convergence compared with first-order optimisation, but the optimality of the solution has nothing to do with the speed of convergence. Just like first-order methods, second-order methods are very likely to get stuck in local minima, so they don't guarantee an optimal solution either. In fact, I've seen many studies that obtained better results optimising a specific problem with gradient descent instead of a second-order method. So to wrap up, it's not as simple as you've stated.
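To make the local-minimum point concrete, here's a tiny gradient-descent sketch; the objective, step size, and starting points are my own illustrative choices:

    def grad(x):
        # Derivative of f(x) = x**4 - 3*x**2 + x, which has two minima:
        # a local one near x ≈ 1.13 and the global one near x ≈ -1.30.
        return 4*x**3 - 6*x + 1

    def gradient_descent(x, lr=0.01, steps=1000):
        for _ in range(steps):
            x -= lr * grad(x)
        return x

    print(gradient_descent(x=2.0))   # stuck in the local minimum near x ≈ 1.13
    print(gradient_descent(x=-2.0))  # finds the global minimum near x ≈ -1.30

Same update rule, same function; only the starting point differs, and one run never sees the better basin. A second-order method started at x = 2.0 would get stuck in exactly the same way.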
@parg2244 4 years ago
@@where-is-my-mind. Hi! Could you recommend a book about optimization?