Lecture 15 | Convex Optimization I (Stanford)

53,867 views

Stanford

A day ago

Comments: 20
@shiv093
@shiv093 5 years ago
2:11 Unconstrained minimization
4:00 Initial point and sublevel set
8:15 (10:15) Strong convexity and implications
13:45 Descent methods
16:46 Line search types
23:56 Gradient descent method
26:40 Quadratic problem in R^2
29:26 Nonquadratic example
29:42 A problem in R^100
32:03 Steepest descent method
34:23 Examples
37:50 Choice of norm for steepest descent
43:52 Newton step
48:31 Newton decrement
49:51 Newton's method
57:21 Classical convergence analysis
1:02:58 Damped Newton phase
1:04:51 Conclusion
1:09:03 Examples in R^2
1:10:42 Example in R^10000
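The gradient descent method with backtracking line search that the outline above points to (16:46, 23:56) can be sketched as follows. This is a minimal illustration, not the lecture's code; the quadratic objective mirrors the ill-conditioned R^2 example discussed around 26:40, and the parameter values (gamma = 10, alpha = 0.3, beta = 0.8) are assumptions chosen for demonstration:

```python
import math

GAMMA = 10.0  # condition-number parameter (assumed value)

def f(x):
    # ill-conditioned quadratic: f(x) = 0.5 * (x1^2 + GAMMA * x2^2)
    return 0.5 * (x[0] ** 2 + GAMMA * x[1] ** 2)

def grad(x):
    return [x[0], GAMMA * x[1]]

def gradient_descent(f, grad, x, alpha=0.3, beta=0.8, tol=1e-8, max_iter=1000):
    """Gradient descent with backtracking line search."""
    for _ in range(max_iter):
        g = grad(x)
        gnorm2 = sum(gi * gi for gi in g)
        if math.sqrt(gnorm2) <= tol:  # stop when the gradient is small
            break
        dx = [-gi for gi in g]  # descent direction: negative gradient
        # backtracking: shrink t until the sufficient-decrease condition
        # f(x + t*dx) <= f(x) + alpha * t * grad(x)^T dx  holds
        t = 1.0
        while f([xi + t * di for xi, di in zip(x, dx)]) > f(x) - alpha * t * gnorm2:
            t *= beta
        x = [xi + t * di for xi, di in zip(x, dx)]
    return x

x_star = gradient_descent(f, grad, [10.0, 1.0])  # converges to the minimizer (0, 0)
```

With GAMMA far from 1 the iterates zig-zag across the narrow sublevel sets, which is exactly the slow-convergence behavior the R^2 example in the lecture illustrates.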
@mrweisu
@mrweisu 10 months ago
The most beneficial lecture in the whole series so far! Super great!
@JaydeepDe
@JaydeepDe 12 years ago
Beautifully explained the concept of steepest descent for other norms. Thank you, professor.
@divyabhatia9916
@divyabhatia9916 4 years ago
Dr. Boyd, you are just so amazing, funny, super intelligent, and down to earth. Thanks so much for making these lectures open source. I learned a lot from them. Thank you so very much.
@michaelmellinger2324
@michaelmellinger2324 2 years ago
39:03 You want the norm to be consistent with the geometry of your sublevel sets
@TheProgrammer10
@TheProgrammer10 4 years ago
Very nice. This course never fails to intimidate me, but it makes sense with time.
@heizilyu
@heizilyu 13 years ago
The point that if a constraint is never active the problem is treated as unconstrained (07:54) is excellent!
@saeedbonab4246
@saeedbonab4246 6 years ago
32:09 should it be argmax instead of argmin?
@peiqiwang9284
@peiqiwang9284 5 years ago
argmin
@EulerGauss13
@EulerGauss13 8 months ago
We're looking for the unit-norm direction most aligned with -∇f(x), i.e. the one that makes the directional derivative ∇f(x)^T v as negative as possible, so it should be argmin of ∇f(x)^T v.
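To settle the argmin-vs-argmax question in this thread, the normalized steepest descent direction at 32:09 is defined as

```latex
\Delta x_{\mathrm{nsd}} \;=\; \operatorname*{argmin}_{v}\,\bigl\{\, \nabla f(x)^{T} v \;:\; \|v\| = 1 \,\bigr\}
```

The directional derivative $\nabla f(x)^{T} v$ is what we want to make as negative as possible, so we minimize it over unit-norm $v$; equivalently, $v$ maximizes alignment with $-\nabla f(x)$, and the optimal value is $-\|\nabla f(x)\|_{*}$, the negative dual norm of the gradient. So argmin is correct.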
@smolboii1183
@smolboii1183 1 year ago
great lecture :d
@junhochoi156
@junhochoi156 4 years ago
so impressed
@swagatopablo
@swagatopablo 11 years ago
Even Stanford guys copy codes from the web?
@hotamohit
@hotamohit 6 years ago
Yup, but then they fail the course if they get caught.
@meetsaiya5007
@meetsaiya5007 2 years ago
Everyone does. But the profs usually know about these, so the questions are set in such a way that you still can't do it. Remember the take-home final exam announced in the first lecture? That's basically the prof taunting: let's see what you can do. I've been a victim of those xD
@annawilson3824
@annawilson3824 10 months ago
1:03:00
@luiswilbert2377
@luiswilbert2377 1 year ago
Genius
@toddflanagan5531
@toddflanagan5531 4 years ago
Ch. 9
@muratcan__22
@muratcan__22 6 years ago
perfect
@jingXD1228
@jingXD1228 12 years ago
Most hilarious prof ever