Brent's Minimization Method

8,698 views

Oscar Veliz

A day ago

Comments: 18
@AJ-et3vf 3 years ago
Awesome video, man! As a numerical analysis enthusiast, I love your videos about numerical analysis! You cover algorithms for solving nonlinear equations (and minimization, its related cousin) that aren't usually covered on most numerical analysis channels.
@martingarciavazquez7366 3 years ago
Thank you very much for posting this video, I found it incredibly useful!
@АлександрКрупин-й3м 3 years ago
Thanks for your videos! I like the way you explain things with visual illustrations; that's really helpful. It's also great that you cover the history of the algorithms you explore. Are you planning to record videos about multidimensional minimization methods, starting from the basics and then exploring, for example, the Broyden-Fletcher-Goldfarb-Shanno algorithm?
@OscarVeliz 3 years ago
I'm glad you've found these videos helpful. I do plan on covering multivariable minimization and optimization, although there is a lot in the queue ahead of them.
@alexandrevachon541 3 years ago
All of the minimization techniques that were described in this series work not just with unimodal functions, but also with *multimodal* functions. They can also be used to find critical points.
@slobodanjenko6701 2 years ago
In that case you can expect to get one of many local minima, not necessarily the global one?
@guillaumeleclerc3346 6 months ago
Thanks!
@OscarVeliz 6 months ago
Thank you sooo much @guillaumeleclerc3346 for my first ever Super Thanks!!
@guillaumeleclerc3346 6 months ago
@@OscarVeliz You deserve it for sure; this channel has replaced my reference textbook whenever I have to implement solvers 🙂
@alexandrevachon541 4 years ago
I found something else related to minimization: the Nelder-Mead method and Powell's method. I think those would be worth exploring. I also wonder about combining successive parabolic interpolation not only with golden-section search, but also with dichotomous search, ternary search, or Fibonacci search... Those three methods are also guaranteed to find the result, though they are slower than successive parabolic interpolation or Brent's minimization method. Still, it might be worth a try.
@OscarVeliz 4 years ago
I might add those two... we'll see if there is more interest from folks. I don't see much use in combining ternary search with SPI since GSS is just better hands-down. Fibonacci search is slightly more efficient than GSS but adds the overhead of needing the sequence, so it may not be worth it. Dichotomous search will save on iterations at the expense of function calls, but if the method works similarly to Brent's then it would eventually just switch to using SPI.
@alexandrevachon541 3 years ago
@@OscarVeliz There's also Newton's method: it finds the minimum of a function by finding the roots/zeroes of its derivative. Newton's method also extends to several variables by using the Hessian matrix. If the Hessian is unavailable or too expensive to compute, quasi-Newton methods such as Broyden's method, the Davidon-Fletcher-Powell (DFP) method, the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm, or the symmetric rank-one (SR1) method can be used to approximate it at each iteration. Definitely something worth checking out.
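(For reference, here is a minimal sketch of that one-dimensional idea in Python: Newton's method applied to the derivative, so it converges to a critical point of f. The function, names, and tolerances below are just for illustration, not from the video or its repo.)

def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=100):
    # Newton's method on f': iterate x <- x - f'(x)/f''(x).
    # Converges to a critical point of f; check f''(x) > 0 to confirm a minimum.
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: f(x) = x**4 - 3*x**2, so f'(x) = 4x^3 - 6x and f''(x) = 12x^2 - 6.
x_min = newton_minimize(lambda x: 4*x**3 - 6*x, lambda x: 12*x**2 - 6, x0=1.0)
# x_min is approximately sqrt(3/2) ~ 1.2247, where f'' > 0, so it is a minimum.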
@alexandrevachon541 3 years ago
@@OscarVeliz Function calls are the least of my worries.
@tszhinchan7631 2 years ago
Thank you for your great tutorial. May I ask how we can compute the p and q stated in the video?
@OscarVeliz 2 years ago
Let p be the numerator from 3:22 and q be the denominator. The 2 from the ½ goes with q. You'll notice that some of those terms are repeated in both, so it is useful to save those to variables to avoid recomputing, which is what I did for the example code in the linked repo.
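(A minimal sketch of that computation in Python, following the description above: p is the numerator, q is the denominator with the 2 from the ½ folded in, and the repeated terms are saved once. Variable names are my own; the actual repo code may differ.)

def parabolic_step(v, w, x, fv, fw, fx):
    # Trial point u = x - p/q from the parabola through (v, fv), (w, fw), (x, fx).
    r = (x - w) * (fx - fv)        # shared term, computed once
    s = (x - v) * (fx - fw)        # shared term, computed once
    p = (x - v) * s - (x - w) * r  # numerator
    q = 2.0 * (s - r)              # denominator, including the 2 from the 1/2
    if q == 0.0:
        return None                # degenerate (collinear) points: fall back to golden-section
    return x - p / q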
@johnphilmore7269 2 years ago
Hey Oscar! Question: Where does the (3-sqrt(5))/2 come from at 2:50? That would mean that v, w, x are all bigger than b when initializing the method. That doesn't seem right, but I may be wrong.
@johnphilmore7269 2 years ago
Never mind, I got the math wrong. We're good.
@OscarVeliz 2 years ago
That is from golden-section search. Take a look at 4:30 to see where c is placed. The factor (3-sqrt(5))/2 is about 0.381 -- v, w, and x go where c would normally be, while u goes where d would normally be.
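(In code, that initial placement might look like this Python sketch, assuming a bracketing interval [a, b]; the variable names are my own.)

import math

def brent_init(a, b):
    # Place the tracked points for Brent's minimization on the bracket [a, b]:
    # v, w, and x all start at the golden-section point, about 38.2% into the interval.
    c = (3 - math.sqrt(5)) / 2   # ~0.381966
    x = a + c * (b - a)
    v = w = x
    return v, w, x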