Please subscribe to the channel to be notified about future videos 🙏🏻 🙏🏻 🙏🏻
@jeanishimwe2120 · 5 years ago
Thanks, Dr. You brought it out at just the right time. It helps with understanding duality.
@AhmadBazzi · 4 years ago
you are most welcome :)
@aleocho774 · 3 years ago
@AhmadBazzi thanks a lot, sir
@hbasu8779 · 2 years ago
In the two-way partitioning example at 24:25, when W + V is not PSD, how do you get the cost -infinity? I am asking because x can only take the values +1 or -1. If your answer is that, once the constrained problem becomes unconstrained thanks to the Lagrange function, x no longer comes from the feasible set of the original problem (which here only contains ±1 vectors) but from the Euclidean space R^n, then isn't the domain of the Lagrange function given at 3:38 wrong? If dom L(x, ·, ·) = D × R^m × R^p, then in the two-way partitioning example the cost cannot be -infinity.
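For context, a sketch of the derivation as it appears in Boyd & Vandenberghe's two-way partitioning example, assuming the lecture follows that setup and that V stands for diag(ν). The key point is that D in the Lagrangian's definition is the intersection of the domains of the objective and constraint functions (here all of R^n), not the feasible set {-1, +1}^n:

```latex
% Two-way partitioning (sketch, following Boyd & Vandenberghe, Ch. 5;
% assuming the lecture's V means diag(nu)):
%   minimize  x^T W x   subject to  x_i^2 = 1,  i = 1, ..., n
L(x,\nu) = x^T W x + \sum_{i=1}^n \nu_i (x_i^2 - 1)
         = x^T \big(W + \mathrm{diag}(\nu)\big)\, x - \mathbf{1}^T \nu
% The infimum is over all x in R^n (the domain of f_0, not the feasible set):
g(\nu) = \inf_{x \in \mathbb{R}^n} L(x,\nu) =
\begin{cases}
  -\mathbf{1}^T \nu & W + \mathrm{diag}(\nu) \succeq 0, \\
  -\infty           & \text{otherwise.}
\end{cases}
```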
@SaikSaketh · 5 years ago
Thanks. Right on time, matching the lectures at my university.
@AhmadBazzi · 5 years ago
You are most welcome!!
@iamdrfly · 1 year ago
At 31:52, should it be sup{y^T x - ||x||} instead of ... + ||x||?
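For reference, the conjugate is defined with a minus sign, which is presumably what the slide intends; a sketch of the standard result for f(x) = ||x||, with ||·||_* the dual norm:

```latex
f^*(y) = \sup_{x} \big( y^T x - \|x\| \big)
       = \begin{cases} 0 & \|y\|_* \le 1, \\ +\infty & \text{otherwise,} \end{cases}
\qquad \text{where } \|y\|_* = \sup_{\|x\| \le 1} y^T x .
```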
@positivobro8544 · 2 years ago
At 20:15, isn't the condition actually c - λ - A^T v ≥ 0? Because if all entries of the gradient of the Lagrangian are positive, and your feasible x's are constrained to be nonnegative, then you can only increase the function, so the minimal x is x = 0, giving the same dual function value as the = 0 case. Right?
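For context, the usual standard-form LP derivation; note the infimum in the dual function is taken over all x in R^n, because x ≥ 0 has already been moved into the Lagrangian through λ (sign conventions here may differ from the slide's):

```latex
% Standard form LP: minimize c^T x  s.t.  Ax = b,  x >= 0
L(x,\lambda,\nu) = c^T x - \lambda^T x + \nu^T (Ax - b)
                 = -b^T \nu + (c - \lambda + A^T \nu)^T x
% A linear function of x over R^n is unbounded below unless its slope vanishes:
g(\lambda,\nu) = \inf_{x \in \mathbb{R}^n} L(x,\lambda,\nu) =
\begin{cases}
  -b^T \nu & c - \lambda + A^T \nu = 0, \\
  -\infty  & \text{otherwise.}
\end{cases}
```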
@mohammadsarkhosh935 · 2 years ago
At 25:11, why isn't inf x^T(W+V)x found from the gradient in x, which equals 2(W+V)x? Where did λ come from when you wrote λ_min?
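For context, a sketch of why the gradient alone doesn't settle this infimum and where λ_min enters: for a symmetric matrix M,

```latex
\inf_{x \in \mathbb{R}^n} x^T M x =
\begin{cases}
  0       & \lambda_{\min}(M) \ge 0 \;\; (\text{i.e. } M \succeq 0), \\
  -\infty & \text{otherwise: take } x = t\, u_{\min}, \text{ so } x^T M x = t^2 \lambda_{\min}(M) \to -\infty .
\end{cases}
```

Setting the gradient 2Mx to zero only locates stationary points; when M has a negative eigenvalue there is no minimizer at all, which is why the sign of λ_min, not the gradient, decides the value.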
@localcarbuyer8704 · 4 years ago
well described!!!
@AhmadBazzi · 3 years ago
Glad you think so!
@jhonatancollazosramirez8513 · 2 years ago
This exercise at 15:35: how would it be programmed in MATLAB? Thank you.
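Not necessarily the exercise at 15:35, but a minimal MATLAB sketch of the general recipe with made-up data A, b: form the dual of min ||x||² subject to Ax = b, maximize it numerically with fminunc, then recover the primal point:

```matlab
% Minimal sketch with hypothetical data (not the lecture's numbers).
% Primal: minimize x'*x subject to A*x = b.
% Lagrangian: L(x,v) = x'*x + v'*(A*x - b); minimizing over x gives
% x = -A'*v/2, so the dual is g(v) = -v'*(A*A')*v/4 - b'*v.
A = [1 2 1; 0 1 3];
b = [1; 2];
negg = @(v) 0.25*v'*(A*A')*v + b'*v;   % fminunc minimizes, so negate g
v0 = zeros(2, 1);
vstar = fminunc(negg, v0);             % maximize the dual function
xstar = -A'*vstar/2;                   % recover the primal minimizer
disp(norm(A*xstar - b))                % should be ~0 (strong duality holds)
```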
@hbasu8779 · 2 years ago
Why is g(λ, μ) concave at 6:46?
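For context, the standard argument from Boyd & Vandenberghe, Ch. 5: for each fixed x the Lagrangian is affine in the multipliers, and a pointwise infimum of affine functions is concave, so g is concave even when the primal problem is not convex:

```latex
% For each fixed x, the bracketed expression is affine in (lambda, nu);
% g is the pointwise infimum of this family of affine functions, hence concave.
g(\lambda,\nu) = \inf_{x \in D} \Big( f_0(x) + \sum_{i} \lambda_i f_i(x) + \sum_{j} \nu_j h_j(x) \Big)
```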
@AK-yf4dp · 3 years ago
Thank you, Dr. Ahmad, for an amazing course!! I have a question: in the MATLAB example, the value of x obtained by the fminunc solver was 0.1574, which is clearly not the optimal value (x star). What is the reason for that?
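One possible culprit worth ruling out (a guess, not a diagnosis of the lecture's run): fminunc stops at its default tolerances and reports whatever iterate it reached, so it helps to tighten the tolerances and check the exit flag. A sketch with a stand-in objective; the lecture's function and starting point will differ:

```matlab
% Stand-in objective and starting point (hypothetical, not the lecture's):
fun = @(x) (x - 1)^2;
x0 = 0;
% Tighter stopping criteria than the defaults:
opts = optimoptions('fminunc', ...
    'OptimalityTolerance', 1e-10, ...
    'StepTolerance',       1e-12);
[x, fval, exitflag] = fminunc(fun, x0, opts);
disp(exitflag)   % 1 => first-order optimality satisfied to tolerance
```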
@danmcgloven8169 · 3 years ago
In my experience, it's better to leave a timestamp with the exact minute so you get faster responses from Ahmad. Notice how he responds to timestamped questions.
@santhuathidi5987 · 4 years ago
Hello sir, at 34:49, how does f0*(y) = sum(exp(yi - 1)) follow from f0(x)? Please explain.
@aleocho774 · 3 years ago
The conjugate function of sum(xi log xi) is sum(exp(yi-1))
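Sketch of that computation, coordinate by coordinate (maximize over x_i > 0 by setting the derivative to zero):

```latex
f^*(y) = \sup_{x > 0} \sum_i \big( y_i x_i - x_i \log x_i \big), \qquad
\frac{\partial}{\partial x_i}\big( y_i x_i - x_i \log x_i \big)
  = y_i - \log x_i - 1 = 0
\;\Rightarrow\; x_i = e^{\,y_i - 1}
% Substituting back:
y_i e^{y_i - 1} - e^{y_i - 1}(y_i - 1) = e^{y_i - 1}
\;\Rightarrow\; f^*(y) = \sum_i e^{\,y_i - 1}
```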
@saumyachaturvedi9065 · 2 years ago
Thanks, sir. How should we proceed if we have more than one Lagrange multiplier, especially in MATLAB? How do we find the optimal values of the Lagrange multipliers in that case?
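For constrained problems, fmincon returns all the multipliers it found as a struct in its fifth output, one field per constraint type. A minimal sketch on a made-up problem with one equality constraint and bound constraints:

```matlab
% Made-up problem: minimize x1^2 + x2^2  s.t.  x1 + x2 = 1,  x >= 0.
f   = @(x) x(1)^2 + x(2)^2;
Aeq = [1 1];  beq = 1;        % one equality constraint
lb  = [0; 0];                 % bound constraints x >= 0
x0  = [0.5; 0.5];
[x, fval, exitflag, output, lambda] = ...
    fmincon(f, x0, [], [], Aeq, beq, lb, []);
disp(lambda.eqlin)   % multiplier for the equality constraint
disp(lambda.lower)   % multipliers for the lower bounds
```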
@chrisliou2323 · 4 years ago
Hi, at 8:39 and 8:45, did you mean "upper bounded by" rather than "lower bounded by"?
@AhmadBazzi · 4 years ago
yes you are right, Chris. Thanks for pointing this out !!
@ArmanAli-ww7ml · 2 years ago
What happens if we expand the infimum term with the Lagrangian function inside?
@thomasjefferson6225 · 1 year ago
Anyone know why he converted x >= 0 to -x <= 0?
@sepidehghasemi8631 · 2 years ago
Thank you for the amazing course, now I understand better 😊 I need help with one example from my Optimization course. Can you help me, please?
@momofrozen4431 · 5 years ago
Can you please explain why the derivative of v^T(Ax - b) is A^T v?
@AhmadBazzi · 5 years ago
Hey Momo, the derivative of your expression is the same as the derivative of v^T A x, since v^T b is independent of x. So let's deal with v^T A x: write the expression as a summation, then differentiate with respect to the kth entry of x, x_k. Finally, stack all the derivatives one on top of the other.
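Writing that out explicitly (the summation-then-stack step):

```latex
\frac{\partial}{\partial x_k}\, v^T A x
  = \frac{\partial}{\partial x_k} \sum_{i} v_i \sum_{j} A_{ij} x_j
  = \sum_{i} v_i A_{ik}
  = (A^T v)_k
% Stacking over k = 1, ..., n:
\nabla_x\, v^T (Ax - b) = A^T v
```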
@sepidehghasemi8631 · 2 years ago
If I join your group, can I send you my question about Optimization?
@imtryinghere1 · 2 years ago
Is it just me, or does he go through the MATLAB code really fast? I guess because I'm only a Python scrub, it's a bit fast for me to follow the code...
@ehsaneshaghi3613 · 3 years ago
There are too many distracting advertisements!
@danmcgloven8169 · 3 years ago
Why the hate, man? I think it's pretty okay if there are ads when Ahmad dedicates his time and gets rewarded for it.