Lecture 3 | Convex Optimization I (Stanford)

215,379 views

Stanford


Comments: 59
@shiv093 5 years ago
10:10 Convex functions
12:50 Examples on R
15:23 Examples on R^n and R^(m x n)
20:09 Restriction of a convex function to a line
28:43 Extended-value extension
31:09 First-order condition
35:39 Second-order conditions
37:20 Examples
49:38 Epigraph and sublevel set
52:10 Jensen's inequality
57:21 Operations that preserve convexity
59:17 Positive weighted sum & composition with affine function
1:02:05 Pointwise maximum
1:04:39 Pointwise supremum
1:08:13 Composition with scalar functions
1:13:31 Vector composition
@nileshdixit9672 4 years ago
You really deserve more than just likes
@shiv093 4 years ago
@@nileshdixit9672 Honestly, I put them up so that they help me revise topics quickly. Happy that it is helping others too.
@mariomariovitiviti 4 years ago
keep liking this one to keep it up
@jamalahmedhussein1341 10 years ago
start from 10:10
@rogeryau3115 7 years ago
No problem. I usually play at 1.5 speed so I will get ready after 10 minutes.
@samw.6550 6 years ago
15:50 Norm
18:00 Trace (inner product)
28:43 Extended-value extension
31:12 Differentiable functions
@jackeown 4 years ago
In case anyone else was confused: at 41:20 the "softmax" he describes there is different from the "softmax" in deep learning. The deep learning softmax should probably be called something like "softargmax" instead.
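For anyone who wants to see the distinction concretely, here is a minimal NumPy sketch (the function names are my own): the lecture's "soft max" is the scalar log-sum-exp approximation of max(x), while the deep-learning "softmax" returns a probability vector that softly selects the argmax.

```python
import numpy as np

def soft_max(x):
    """Log-sum-exp: a smooth, convex approximation of max(x) (the lecture's usage)."""
    m = np.max(x)                        # shift for numerical stability
    return m + np.log(np.sum(np.exp(x - m)))

def soft_argmax(x):
    """Deep-learning 'softmax': a probability vector peaked at argmax(x)."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

x = np.array([1.0, 2.0, 5.0])
print(np.max(x), soft_max(x))            # 5.0 vs about 5.07: a scalar near the max
print(np.argmax(x), soft_argmax(x))      # 2 vs about [0.017, 0.047, 0.936]
```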
@jaimelima2420 3 years ago
The real 'inspirational' essence is what he says at 32:50 and keeps saying for two minutes or so. Thanks for sharing.
@mhpt74 4 years ago
Great teacher and wonderful sense of humor!
@benjamingoldstein14 a month ago
Professor Boyd mentions around 20:00 that the spectral norm of a matrix X is a very complicated function of X: the square root of the largest eigenvalue of X^T X. I would mention, however, that this norm has a very simple geometric interpretation: it is the maximum factor by which X can "stretch" a vector through multiplication. Just as the largest eigenvalue of a matrix is the maximum factor by which the matrix can stretch a vector if you don't allow for rotation, the largest singular value is the most that the matrix can stretch any vector if you do allow for rotation. It therefore also has the interpretation of the magnitude of the largest axis of the ellipse that is the image of the unit L2 ball under (left) multiplication by X.
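A quick numerical check of that "maximum stretch" interpretation (a NumPy sketch under my own setup; the sampled maximum only approaches the true norm from below):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3))

# Spectral norm = largest singular value = sqrt of the largest eigenvalue of X^T X.
sigma_max = np.linalg.norm(X, 2)
eig_max = np.sqrt(np.max(np.linalg.eigvalsh(X.T @ X)))

# "Maximum stretch factor": sup of ||X v|| over unit vectors v, approximated by sampling.
V = rng.standard_normal((3, 10000))
V /= np.linalg.norm(V, axis=0)
stretch = np.linalg.norm(X @ V, axis=0).max()

print(sigma_max, eig_max, stretch)  # first two agree exactly; the sampled value is slightly below
```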
@janiceliu5473 3 years ago
Shouldn't the determinant of the Hessian of the quadratic-over-linear function be exactly 0, not just greater than or equal to 0? 41:00
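Working it out from the slide's formula seems to confirm this: for f(x, y) = x^2/y with y > 0, the Hessian is a rank-one outer product, so its determinant is exactly zero while the matrix is still positive semidefinite, which is all that convexity needs.

```latex
\nabla^2 f(x,y)
  = \frac{2}{y^3}
    \begin{bmatrix} y^2 & -xy \\ -xy & x^2 \end{bmatrix}
  = \frac{2}{y^3}
    \begin{bmatrix} y \\ -x \end{bmatrix}
    \begin{bmatrix} y \\ -x \end{bmatrix}^{T}
  \succeq 0,
\qquad
\det \nabla^2 f(x,y) = \frac{4x^2}{y^4} - \frac{4x^2}{y^4} = 0 .
```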
@grunder20 13 years ago
Craving more of this kind of stuff.
@moxopal5681 4 years ago
1:05:50 It is extremely useful to know if you are studying control theory.
@DarkDomnu 12 years ago
this guy is a winner.
@rogeryau3115 7 years ago
2:06 That blink, that grimace.
@abhimanyu3244 6 years ago
Thank you, Prof. Boyd!
@manueljenkin95 2 years ago
13:58, the condition R++ is important. x^3 is not convex on R, I think.
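Right: the slide states the power functions x^a on R++, and a quick second-derivative check shows why the restriction matters here:

```latex
f(x) = x^3, \qquad f''(x) = 6x,
\qquad
f''(x) < 0 \ \text{for } x < 0, \quad f''(x) \ge 0 \ \text{for } x \ge 0,
```

so x^3 is convex on R++ but not on all of R.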
@rabeamahfoud3225 7 years ago
I'm a Ph.D. student in electrical engineering. I'm studying this book by myself, and these lectures are so helpful for getting started with convex optimization. How can I get the homework Professor Boyd is talking about?
@guoweih7339 5 years ago
You can find the textbook, assignments and solutions on this page: see.stanford.edu/Course/EE364A/94
@revooshnoj4078 5 years ago
@@guoweih7339 thanks man
@happydrawing7309 5 years ago
@@guoweih7339 thank you sir.
@rodfloripa10 3 years ago
Did they change the assignments, since the answers are available?
@kenahoo 2 years ago
@@rodfloripa10 No - students just have to realize that at this level, assignments are a tool for learning, not a tool for getting grades.
@personalchannel6382 11 years ago
Professor Boyd is super smart and definitely a researcher who has done a large number of rigorous proofs. But even then, when it comes to conveying mathematics to engineering students, no one comes close to David Snider, my professor at the University of South Florida, not even Prof. Boyd. Snider is retired now, and he wrote all the math books we studied in grad school. This course is awesome, though :). I love how rigorous it is; that is very helpful for PhD students coming up with proofs for their theorems.
@izzyece707 9 years ago
abuhajara, do you suggest a specific book that conveys it well enough to understand it very well?
@annawilson3824 a year ago
24:17 If you have no idea whether the function is convex or not: generate a few lines, plot, and look!
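A minimal sketch of that trick (NumPy/Matplotlib; the example function and the number of lines are my own choices): restrict f to random lines t -> f(x0 + t*v) and eyeball the resulting 1-D plots.

```python
import numpy as np
import matplotlib.pyplot as plt

def f(x):
    # Example candidate: log-sum-exp, which happens to be convex; swap in any function of interest.
    return np.log(np.sum(np.exp(x)))

rng = np.random.default_rng(1)
n = 5
t = np.linspace(-2.0, 2.0, 200)
x0 = rng.standard_normal(n)

for _ in range(6):                       # a few random lines through x0
    v = rng.standard_normal(n)
    plt.plot(t, [f(x0 + ti * v) for ti in t])

plt.xlabel("t")
plt.ylabel("f(x0 + t v)")
plt.show()
```

As noted in the lecture, convex-looking traces prove nothing, but a single non-convex 1-D trace is a certificate that f is not convex.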
@engr.aliarsalan2628 7 years ago
Really Helpful.
@anamericanprofessor 6 years ago
Is there an active link to the class notes that are presented in these lectures? It would be more leisurely to watch the videos and then write down notes afterwards.
@fexbinder 5 years ago
web.stanford.edu/~boyd/cvxbook/ The book and the lecture slides are publicly available on the Stanford website.
@gamalzayed2247 a year ago
Thank you for this nice lecture ❤
@7nard 15 years ago
Good lectures. However, if you are just dropping by like me and want to skip the chatter about admin and classroom issues, start at the 10:10 mark.
@emilywong4601 6 years ago
I studied optimization, including linear algebra techniques, at Golden Gate University in a major called Computational Decision Analysis from 1999 to 2003. We used SAS, Unix and Excel Solver.
@filippovannella4957 5 years ago
so?
@JTMoustache a year ago
Composition rule mnemonic:
1) The rule is for determining whether f has the SAME convexity as h; there is no rule for f to be the opposite of h.
2) h has to be monotone.
3) The monotonicity should be the equality test of the convexities: if convexity of g == convexity of h, then it should be increasing, else decreasing.
Outside of that, no simple rule.
@JTMoustache a year ago
Hm maybe not 😅
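For reference, the scalar composition rules as stated on the slides (with f = h(g(x)) and h-tilde the extended-value extension of h), which resist a one-line mnemonic:

```latex
% f(x) = h(g(x)), with \tilde{h} the extended-value extension of h
f \text{ is convex if }
\begin{cases}
g \text{ convex and } \tilde{h} \text{ convex, nondecreasing, or} \\
g \text{ concave and } \tilde{h} \text{ convex, nonincreasing;}
\end{cases}
\qquad
f \text{ is concave if }
\begin{cases}
g \text{ concave and } \tilde{h} \text{ concave, nondecreasing, or} \\
g \text{ convex and } \tilde{h} \text{ concave, nonincreasing.}
\end{cases}
```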
@ismailelezi 7 years ago
I feel like a noob. I am understanding the main points, but still, the examples are totally non-obvious. Of course, it is a '300' class, so I should have expected that.
@shiladityabiswas2803 4 years ago
If Ryan Reynolds became a prof
@cherishnguyen506 8 years ago
I understand that A*x = \lambda*x means \lambda is an eigenvalue of A. So how could X^(-1/2) V X^(-1/2) have the eigenvalue \lambda?
@akshayramachandran7857 5 years ago
Didn't understand your question. Is it: 1. Why is the eigenvalue of X the same as that of X^(-1/2) V X^(-1/2)? Or 2. Why does an eigenvalue exist for X^(-1/2) V X^(-1/2)?
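For context, these eigenvalues come from the concavity-of-log-det example around 37:20. Restricting f(X) = log det X to a line X + tV (a sketch of the textbook argument, with X positive definite and V symmetric):

```latex
g(t) = \log\det(X + tV)
     = \log\det\!\left( X^{1/2} \bigl( I + t\, X^{-1/2} V X^{-1/2} \bigr) X^{1/2} \right)
     = \log\det X + \sum_{i=1}^{n} \log(1 + t \lambda_i),
```

where the lambda_i are the eigenvalues of the symmetric matrix X^(-1/2) V X^(-1/2), not of X itself; each log(1 + t*lambda_i) is concave in t, so g is concave along every line and f is concave.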
@Alex-if6mv a year ago
He says 'very painful' as if he knows a lo-o-ot about pain! 😀
@Abuwesamful 7 years ago
Someone is asking what diag(z) is? Then how can the professor be sure that the students are following, if they do not actually understand this very primitive item in that equation?
@hayderatrah 11 years ago
The pace is ridiculously fast. The book (which is well written) must be read first before one can cope with these videos.
@SequinBrain 3 years ago
what book?
@rashilalamichhane9750 2 years ago
49:48 epigraphs
@000HakunaMatata000 13 years ago
@10:20
@DarkDomnu 12 years ago
hilarious prof.
@saiftazir 6 years ago
extended value extensions kzbin.info/www/bejne/oZSyoJeweayJasU
@summerland232 12 years ago
A lefty :D
@muratcan__22 6 years ago
32:00 wow
@learningsuper6785 7 years ago
This guy is the kind of professor I would avoid taking classes from at all costs. He spends too much time talking about things that don't help with understanding the subject, like debating with himself whether a concept is obvious or not, and there is a lot of hand-waving when he really should have drawn a graph on a piece of paper. His Coursera convex opt. course is even worse. I'd recommend reading a book rather than watching his videos to learn the subject.
@maxwellstrange4572 6 years ago
I think he's super entertaining and makes stuff make sense and seem interesting when it would otherwise seem dry.
@ElektrikAkar 6 years ago
Like it or not, he is the superhero of convex optimization and has the best material. Even the trivial examples he gives may help broaden one's perspective.
@sridharthiagarajan479 6 years ago
Disagree, it's quite entertaining, engaging and doesn't skimp on the important stuff
@wuzhai2009 5 years ago
Disagree. Knowing that it is hard to make things precise shows that he knows way too much. At the points where you say he is 'hand-waving', I suggest you delve deeper and you will appreciate why he said what he said.
@daweiliu6452 7 years ago
The first ten minutes is total crap.