Lecture 11 | Semidefinite Programming (SDP) | Convex Optimization by Dr. Ahmad Bazzi

  58,709 views

Ahmad Bazzi

1 day ago

Comments: 45
@AhmadBazzi 5 years ago
Please make sure you subscribe to the channel and hit the notification button to receive further notifications about the channel. Thank you all.
@belldemasi3870 4 years ago
Subscription done!! Thank you for your useful lecture :)
@deshawndarwin8901 4 years ago
agree
@wiltonhardesty8588 4 years ago
subscribed :) been supporting you since day 1.
@jacobyaubrey4109 4 years ago
I have subscribed sir !!
@nautboermans2367 5 years ago
You just spoon feed my brain with your clear explanation, thanks man!
@AhmadBazzi 5 years ago
You are most welcome, Elena :)
@leonardocortese8293 5 years ago
Thanks for this awesome lecture! It is the first time I really understood how SDPs work.
@tuongnguyen9391 4 years ago
This channel is so underrated ... your pedagogy is right to the point. I really love your teaching style. Will you cover Difference of Convex programming in the future? :)
@AhmadBazzi 4 years ago
Thanks a lot Tuong!! I appreciate it. I think DCPs are really cool, with many applications in machine learning (see kernel selection and SVMs) as well as economics and finance. I will try my best to dedicate some lectures to them. 😊
@tuongnguyen9391 4 years ago
@@AhmadBazzi wow, you really know a lot about it. Although DC programming was invented by Vietnamese people, not many books in my country even mention its applications :)))))
@shubhamsharma2202 5 years ago
I know Schur's complement... what I like is how easy you make it look at 26:11!! Thank you.
@AhmadBazzi 5 years ago
you are welcome !!
@ElizabethMartinez-vj9fi 5 years ago
Sir, your videos are terrific, especially this series along with the MATLAB ones. I always do very well in my math and statistics courses, but I am rarely satisfied with just getting the computations right; I seek an understanding of the math that extends beyond the numbers, and an understanding of how the math applies in the real world. It usually takes time to answer all of these questions on my own, but your videos clarify so much and extend my interest, so thanks.
@maid6191 5 years ago
Just Brilliant!! Ahmad Bazzi- You are a genius!
@AhmadBazzi 5 years ago
Thank you for the support :)
@blairstatler4634 5 years ago
I wish my school UT Dallas had professors like you, so I need not struggle on my own; thankfully I found you.
@shubhamsharma2202 5 years ago
Thank you for the lecture, sir.
@vicenteisabelle5357 5 years ago
Excellent! Your videos are different; please continue posting videos.
@AhmadBazzi 5 years ago
Sure thing, Vincente :)
@batigol1234567890 4 years ago
33:28 the squared norm of a matrix (or more specifically the Frobenius norm) is the trace of what you are mentioning at 33:28; i.e., ||A||^2 = trace(A^T A) = trace(A A^T), where A^T denotes the transpose of A.
@AhmadBazzi 4 years ago
I forgot the trace here. You're right.
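As a quick numerical sanity check of that identity, here is a minimal MATLAB sketch (the test matrix is arbitrary):

  % Verify ||A||_F^2 = trace(A'*A) = trace(A*A') on a random matrix
  A = randn(4,3);                % any rectangular matrix works
  nF2 = norm(A,'fro')^2;         % squared Frobenius norm
  t1 = trace(A'*A);              % trace of the 3x3 Gram matrix
  t2 = trace(A*A');              % trace of the 4x4 product, same value
  disp([nF2 t1 t2])              % all three agree up to round-off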
@hankpark4510 4 years ago
Brilliantly done, otherwise one of the hardest concepts to explain! Just a quick question for the Eigenvalue Minimization Problem. At 32:19, V'_max * A(x) * V_max = lambda_max * ||V_max||^2, which makes me curious. I think the "lambda_max" should be derived from the whole matrix tI - A(x), not from A(x) alone. Thus, it's confusing that you decomposed

V'_max * ( tI - A(x) ) * V_max = t * ||V_max||^2 - V'_max * A(x) * V_max = t * ||V_max||^2 - lambda_max * ||V_max||^2

I'd rather think

V'_max * ( tI - A(x) ) * V_max = lambda_max * ||V_max||^2, because ( tI - A(x) ) * V_max = lambda_max * V_max

So, if we stick to your approach, it would be more helpful to explain how A(x) can share the same eigenvalue "lambda_max" with the whole matrix tI - A(x). I don't understand why A(x) can be a similar matrix to ( tI - A(x) ). Thanks again!
@chick3n71 3 years ago
The identity Id is invariant under any unitary or orthogonal transform U, i.e., U Id U^\dagger = Id.
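To spell out how this answers the question above (a sketch, assuming A(x) is symmetric with spectral decomposition A(x) = U \Lambda U^T, U orthogonal):

\[ tI - A(x) = t\,UU^T - U\Lambda U^T = U\,(tI - \Lambda)\,U^T , \]

so tI - A(x) has the same eigenvectors as A(x), with eigenvalues t - \lambda_i. In particular, for the top eigenvector v_max of A(x),

\[ v_{\max}^T \big( tI - A(x) \big) v_{\max} = (t - \lambda_{\max}) \, \| v_{\max} \|^2 , \]

which is nonnegative exactly when t >= \lambda_max(A(x)). The two matrices share eigenvectors, not eigenvalues, which resolves the confusion in the question.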
@sandeepayyagari 3 years ago
Hello Ahmad, can you please shed some light on the efficient MISDP solvers available? I am currently using YALMIP branch & bound. Thanks for your efforts. 🥰
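As context for the question, here is a hedged sketch of how a small mixed-integer SDP might be set up with YALMIP's built-in branch & bound; the model and constraints are placeholders invented for illustration, and the inner SDP solver name is an assumption:

  % Hypothetical MISDP sketch in YALMIP (placeholder model, not from the lecture)
  n = 3;
  X = sdpvar(n,n);                           % symmetric matrix variable (YALMIP default)
  z = binvar(n,1);                           % binary decision variables
  F = [X >= 0, diag(X) <= z, sum(z) <= 2];   % X PSD plus an illustrative coupling
  ops = sdpsettings('solver','bnb','bnb.solver','sdpt3');  % BNB over an SDP solver (assumed installed)
  optimize(F, -trace(X), ops);               % maximize trace(X) by minimizing its negative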
@Scott_Raynor 3 years ago
How do you go about doing SDPs in CVX in MATLAB?
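In case it helps, a minimal sketch of what an SDP can look like in CVX's sdp mode; the data matrices C, A1 and the scalar b1 are placeholders invented for illustration:

  % Minimal SDP in CVX: minimize trace(C*X) s.t. trace(A1*X) == b1 and X PSD
  n = 3;
  C = randn(n);  C = (C + C')/2;     % symmetric cost matrix (placeholder)
  A1 = randn(n); A1 = (A1 + A1')/2;  % symmetric constraint matrix (placeholder)
  b1 = 1;
  cvx_begin sdp
      variable X(n,n) symmetric
      minimize( trace(C*X) )
      subject to
          trace(A1*X) == b1;
          X >= 0;                    % in sdp mode this constrains X to be PSD
  cvx_end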
@zyzhang1130 1 year ago
32:09 you claim that "larger than in the semidefinite sense" and "larger than or equal" are equivalent, but it seems the proof only goes one direction: at 32:09, why is the reverse direction (the inequality holds for v_max implies it holds for all alpha) also true?
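A sketch of the direction being asked about, assuming A(x) is symmetric so it has an orthonormal eigenbasis {u_i} with eigenvalues \lambda_i <= \lambda_max: write any vector v as v = \sum_i \alpha_i u_i. Then

\[ v^T A(x)\, v = \sum_i \lambda_i \alpha_i^2 \le \lambda_{\max} \sum_i \alpha_i^2 = \lambda_{\max} \|v\|^2 \le t \, \|v\|^2 , \]

so v^T ( tI - A(x) ) v >= 0 for every v, i.e. tI - A(x) is positive semidefinite. The inequality for v_max therefore does propagate to all directions.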
@rosiebennett512 5 years ago
Ahmad Bazzi, the next big thing ?
@sucramgnat8157 3 years ago
23:08 should that general inequality sign be reversed?
@zyzhang1130 1 year ago
The content is top tier, but it is totally compromised by the recording quality. I'm not even kidding.
@AhmadBazzi 1 year ago
Thank you for the support and kind comment. I would like to apologize - this video was recorded when I started the channel, with very basic and poor equipment, and little video editing knowledge. The latest videos - especially on optimization - are higher quality.
@zyzhang1130 1 year ago
​@@AhmadBazzi I see. It was just frustrating because I was trying to follow the reasoning step by step and then encountered multiple glitches. I still learned a lot from it and I appreciate your effort.
@santhuathidi5987 4 years ago
Sir, how does V_max multiplied by I give the norm of V_max? (32:11)
@abdowaraiet2169 3 years ago
The squared norm of v_max is the result of multiplying by v_max^T from the other side of the bracket: v_max^T * I * v_max = v_max^T * v_max = ||v_max||^2.
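A quick MATLAB check of that step (random vector, identity of matching size):

  v = randn(5,1);
  disp([v' * eye(5) * v, norm(v)^2])   % v'*I*v equals ||v||^2 up to round-off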
@lglgunlock 4 years ago
I think at 26:21 the formula is incorrect.
@AhmadBazzi 4 years ago
which one ? Schur's complement ?
@lglgunlock 4 years ago
Yes. Schur's complement. If A is positive definite, then "M is positive semi-definite" is equivalent to "M/A is positive semi-definite".
@AhmadBazzi 4 years ago
Schur complement is correct, please see here en.wikipedia.org/wiki/Schur_complement. As for the statement I mentioned, check the properties in the above link where it says "M is a positive-definite symmetric matrix if and only if D and the Schur complement of D are positive-definite."
@lglgunlock 4 years ago
Your writing and your verbal description are inconsistent; however, I think both are incorrect.

Your writing claims: M is PSD (positive semi-definite) <=> A is PD (positive definite) and M/A is PD. This is incorrect, because "M is PSD" is not a sufficient condition for "A is PD and M/A is PD". To get to "A is PD and M/A is PD", you need "M is PD"; "M is PSD" is not enough.

Your verbal description claims: M is PSD <=> A is PSD and M/A is PSD. This is also incorrect, because from "M is PSD" you cannot get to "A is PSD and M/A is PSD".

The Schur complement only says: if "A is PD", then "M is PSD" <=> "M/A is PSD". Note "A is PD" is a premise; it is not a conclusion you can get from "M is PSD". I am not sure whether "A is PSD" is a conclusion you can get from "M is PSD"; it seems you can, based on the generalized Schur complement. Anyway, you cannot get to "A is PSD and M/A is PSD" from "M is PSD".
@AhmadBazzi 4 years ago
Checking the second property in the Wikipedia link en.wikipedia.org/wiki/Schur_complement: it says that M is a P.D. symmetric matrix <=> A and the Schur complement of A are P.D. M is the block matrix as shown at 26:21; it is symmetric and assumed to be P.S.D., so this is equivalent to A and the Schur complement of A being P.S.D. Where is the issue?
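For reference, a precise statement of the property under discussion (following the Wikipedia page cited above), for a symmetric block matrix:

\[ M = \begin{bmatrix} A & B \\ B^T & C \end{bmatrix} , \qquad M/A := C - B^T A^{-1} B . \]

If A is positive definite, then M is positive semidefinite if and only if M/A is positive semidefinite; and M is positive definite if and only if both A and M/A are positive definite. Positive definiteness of A is a hypothesis in the semidefinite version, which is exactly the point of contention in this thread.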
@virebovipole6657 5 years ago
My professor does the worst job at explaining convex optimization, and in particular semidefinite programming. This guy is doing quite the opposite :)
@AhmadBazzi 5 years ago
I’m glad the explanation is of benefit to someone :)