L1.2 - Introduction to unconstrained optimization: first- and second-order conditions (vector case)

21,632 views

aa4cc


Comments: 15
@Dominus_Ryder 7 years ago
This lecture series was just what I needed, at a time when I needed it most. Appreciate it!
@AlenT6969 14 days ago
Thank you for this informative video; it'll help me with my final exam this Friday.
@welidbenchouche 5 years ago
Hey, great lectures. Just one thing: I think you forgot to add the square on the Hessian matrix at 5:30, the one in the box.
@aa4cc 5 years ago
True, the upper index 2 is missing in the box. Note that it should not be interpreted as squaring; it is just one possible notation for the matrix of second (mixed) partial derivatives.
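For readers following along, the boxed symbol is the Hessian; the standard definition behind the ∇² notation (a generic statement, not a transcription of the slide) is:

```latex
\nabla^2 f(x) =
\begin{bmatrix}
\dfrac{\partial^2 f}{\partial x_1^2} & \cdots & \dfrac{\partial^2 f}{\partial x_1 \partial x_n}\\[4pt]
\vdots & \ddots & \vdots\\[4pt]
\dfrac{\partial^2 f}{\partial x_n \partial x_1} & \cdots & \dfrac{\partial^2 f}{\partial x_n^2}
\end{bmatrix}
```

The superscript 2 thus counts derivative order; it is not a matrix power.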
@xiaohaoyuan3658 4 years ago
Thanks for the great lecture. I have a small question about the second-order dominance condition at 6:55 (the right-hand side of the inequality): should it be O(alpha^3) or O(alpha^2)? I noticed that in the previous lecture it seemed to be O(alpha^3).
@aa4cc 4 years ago
You are perfectly right. It should be O(alpha^3).
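For context, the inequality under discussion comes from the second-order Taylor expansion of f along a direction d; with the corrected remainder it reads as follows (standard notation, which may differ from the exact symbols on the slide):

```latex
f(x^\star + \alpha d) = f(x^\star) + \alpha\, \nabla f(x^\star)^{T} d
  + \tfrac{\alpha^2}{2}\, d^{T} \nabla^2 f(x^\star)\, d + O(\alpha^3)
```

Since the remainder is O(alpha^3), the quadratic term dominates it for sufficiently small alpha.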
@xiaohaoyuan3658 4 years ago
@@aa4cc Thanks for your reply. Have a nice day!
@sharachchandrabhat8428 2 years ago
Great lecture! I have a question about the caveat. The value alpha''(0) = d1^2 + ..., where d = (d1, d2) and alpha''(0) means alpha'' evaluated at alpha = 0. Since alpha''(0) = 0 for some d, namely when d1 = 0, the sufficient condition is not satisfied, so alpha = 0 need not be a minimum. Why, then, was the assertion made that alpha = 0 is a minimum when checked using the directional derivative?
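The caveat function itself is not quoted in this thread, so as a hypothetical stand-in, here is a Python sketch of the classic example with exactly this behavior (the function choice is an assumption, not necessarily the one from the lecture):

```python
import numpy as np

# Classic caveat example (assumed, not taken from the lecture):
#     f(x, y) = (y - x^2) * (y - 2*x^2)
# has a local minimum at alpha = 0 along EVERY line through the origin,
# yet (0, 0) is not a local minimum of f itself.
def f(x, y):
    return (y - x**2) * (y - 2 * x**2)

# Along any direction d = (d1, d2), phi(alpha) = f(alpha*d1, alpha*d2)
# is nonnegative near alpha = 0 and equals zero at alpha = 0 ...
for d1, d2 in [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, -2.0)]:
    for a in np.linspace(-1e-3, 1e-3, 5):
        assert f(a * d1, a * d2) >= 0.0

# ... yet along the curve y = 1.5*x^2 the function is strictly negative,
# so the origin is not a local minimum: f(x, 1.5*x^2) = -0.25*x^4.
x = 1e-3
print(f(x, 1.5 * x**2))  # -2.5e-13 < 0
```

Whatever the lecture's exact function, the point of the caveat is the same: a minimum along every direction d does not by itself certify a local minimum of f.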
@ahmedgailani533 5 years ago
Thanks. What is meant by saying we stay within a distance of epsilon from the critical point?
@MaksymCzech 4 years ago
Exactly that: you select epsilon and then stay in a neighborhood of the critical point consisting of points within a distance less than epsilon from it.
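Spelled out, the epsilon phrasing is just the definition of a local minimum (a standard statement, not a quotation from the lecture):

```latex
x^\star \text{ is a local minimum of } f
\iff
\exists\, \varepsilon > 0 \ \text{such that}\
f(x^\star) \le f(x) \ \ \text{for all } x \ \text{with}\ \lVert x - x^\star \rVert < \varepsilon
```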
@lachlanpage7819 5 years ago
Why did you at first say the Hessian needed to be positive semidefinite and then in your final statement say it needs to be positive definite? Was this the difference between a necessary condition and a sufficient condition?
@aa4cc 5 years ago
Indeed, the Hessian matrix being positive semidefinite is a necessary condition of optimality, while the stricter requirement of the Hessian being positive definite is a sufficient condition of optimality.
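As a numerical companion to this answer, here is a minimal Python sketch (the helper name, tolerance, and example Hessians are made up for illustration) that separates the two conditions via the eigenvalues of the Hessian:

```python
import numpy as np

def classify_critical_point(hessian, tol=1e-10):
    """Classify a critical point (gradient already zero) via its Hessian.

    All eigenvalues > 0  -> sufficient condition: strict local minimum.
    All eigenvalues >= 0 -> only the necessary condition holds: inconclusive.
    Some eigenvalue < 0  -> the point is not a local minimum.
    """
    eigvals = np.linalg.eigvalsh(hessian)  # assumes a symmetric Hessian
    if np.all(eigvals > tol):
        return "strict local minimum (sufficient condition holds)"
    if np.all(eigvals >= -tol):
        return "inconclusive (only the necessary condition holds)"
    return "not a local minimum"

# Hessians at the critical point x = 0 of three toy functions:
print(classify_critical_point(np.array([[2.0, 0.0], [0.0,  2.0]])))  # x1^2 + x2^2
print(classify_critical_point(np.array([[2.0, 0.0], [0.0,  0.0]])))  # x1^2
print(classify_critical_point(np.array([[2.0, 0.0], [0.0, -2.0]])))  # x1^2 - x2^2
```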
@malekhammou7572 4 years ago
What can we conclude if the first-order condition is satisfied and the second-order condition is not?
@aa4cc 4 years ago
Do you mean the second-order necessary condition, or the second-order sufficient one? In the former case, the point cannot be a minimum; if the analogous necessary condition for a maximum also fails, it is a saddle point.

To explain the latter case, consider two functions, f(x) = x^3 and g(x) = x^4, both at x = 0. Both satisfy the first-order necessary condition and the second-order necessary condition at x = 0 (first derivative equal to zero and second derivative nonnegative). So far so good. But both fail to satisfy the second-order sufficient condition (second derivative strictly positive), and yet one of them is minimized at x = 0 while the other is not (just picture the graphs). To detect this, we would have to study higher-order derivatives. For f(x), the third derivative is nonzero. Recall that in the Taylor expansion, the third derivative multiplies odd powers of the independent variable, so the corresponding contribution can have either sign, and the point can be neither a minimum nor a maximum. For g(x), it is the fourth derivative that is nonzero, and the corresponding Taylor term only contains even powers of the independent variable, hence for a positive fourth derivative the function is minimized. Hope this helps.
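The higher-order test sketched in this reply can be automated symbolically; here is a minimal Python sketch with sympy (the library choice and code are illustrative, not from the lecture):

```python
import sympy as sp

x = sp.symbols('x')
for func in (x**3, x**4):
    # Walk up the derivatives until the first one that is nonzero at x = 0.
    k, dk = 1, sp.diff(func, x)
    while dk.subs(x, 0) == 0:
        k, dk = k + 1, sp.diff(dk, x)
    value = dk.subs(x, 0)
    # First nonzero derivative of odd order -> neither min nor max;
    # even order with a positive value there -> local minimum.
    verdict = ("neither min nor max" if k % 2 == 1
               else "local minimum" if value > 0
               else "local maximum")
    print(func, "-> first nonzero derivative at 0 has order", k, "->", verdict)
```

Running it reports x**3 as neither a minimum nor a maximum (first nonzero derivative of order 3) and x**4 as a local minimum (order 4), matching the reply above.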
@malekhammou7572 4 years ago
@@aa4cc Thank you. Now I see things better!