“You’re not going to be solving it by hand.” *laughs then cries in graduate student*
@BrianBDouglas a year ago
😂😭
@vnagamohankrishnap1596 a year ago
You are one of a kind, bro. The way you explain the intuition gets me excited every time.
@BrianBDouglas a year ago
I appreciate it!
@3d_chip 5 months ago
My god, two weeks of lectures explained in one video. You are great, man.
@ryanfeng a year ago
Most inspiring video I have ever seen. I got two takeaways: transform an unsolvable problem into an equivalent solvable one, and the gradient is a good way to do it.
@AnythingGoesCodes 8 months ago
I had an undergrad professor so determined to stop cheaters that he only allowed scientific calculators, which didn't bother me until he expected us to do regression.
@Joshjson a year ago
Wish this was the way it was explained in university. Liked and subbed
@BrianBDouglas a year ago
Thanks!
@MrPepto93 6 months ago
I really have to learn to try out ideas and equations with simple examples. I was so intimidated by Lagrange multipliers, the Lagrange equation, and what they mean that I just dropped the topic. How lucky that I caught, out of the corner of my eye, a thumbnail on my recommendation list with that characteristic Brianish drawing style and the word "Lagrangian" in the title. I knew before watching that you would help, as always. Gosh, you are a great educator, man.
@faraway27 a year ago
Thanks Brian, I always look forward to new Tech Talks! Could you do a video on MPC? That would be awesome!
@BrianBDouglas a year ago
I appreciate it! MathWorks already has a Tech Talk series on MPC, so I doubt I'll make one in the near future. kzbin.info/aero/PLn8PRpmsu08ozoeoXgxPSBKLyd4YEHww8. Perhaps one day when we revisit some of the older videos.
@griffinbur1118 a year ago
Great video. In the interest of being precise and thinking about what might trip up new learners, someone who's paying really close attention will find 2:45 confusing, since you can't have "*the* partial derivative with respect to both x_1 and x_2". Instead, the gradient is a vector of all of the partial derivatives, plural, of f(*x*), where the ith element of the gradient is the partial derivative of f with respect to the ith element of *x*. Sorry for the pedantry, but from my own experience, we often ask math students to pay close attention to exactly that kind of fine distinction in other contexts, so a description of the gradient that, taken literally, can't exist is likely to cause minor confusion for talented students. That said, phenomenal video. It would be a very efficient way to teach one of the most important ideas in multivariable calculus to someone who only has a knack for scalar calculus.
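To make that concrete, here's a tiny sketch with a function of my own choosing (not the one from the video), using Python's sympy just to spell out that the gradient is one partial derivative per element of x:

```python
import sympy as sp

# Hypothetical example function (not from the video): f(x1, x2) = x1**2 + 3*x1*x2
x1, x2 = sp.symbols('x1 x2')
f = x1**2 + 3*x1*x2

# The gradient collects all the partial derivatives, one per element of x.
grad_f = [sp.diff(f, x1), sp.diff(f, x2)]
print(grad_f)  # [2*x1 + 3*x2, 3*x1]
```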
@BrianBDouglas a year ago
Thanks for the clarification. I appreciate hearing this type of feedback because it helps me change the way I present future videos. Cheers!
@SarahImeneKhelil a year ago
Brian, could you do a summer school course for control engineers? I'll be the first one to attend if it's you talking about the intuition behind control!
@harrytsai0420 a year ago
Nice video! Looking forward to the nonlinear constrained optimization part!
@nitinjotwani69 a year ago
Hey, could you recommend any nonlinear constrained optimization videos?
@Curious_Southerner 2 months ago
Thanks for this great video! 6:56 - I am a bit confused about interpreting the gradient of the constraint as it does not reflect the direction of maximum ascent of j(x) or c(x). So, how should I think about this?
@BrianBDouglas 2 months ago
Hello! It is pointing in the direction of maximum ascent of c(x). The black line is where c(x) = 0. Every combination of x1 and x2 below that black line gives a negative value of c(x), and every combination above the black line gives a positive value. Therefore, if you are standing on the black line and want to ascend the slope, you'd walk up and to the right to increase the value of c(x).
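A minimal numeric sketch of that picture, with a made-up constraint c(x) = x2 - x1**2 standing in for the one in the video: the sign of c flips as you cross the zero level set, and the gradient points from the negative side toward the positive side.

```python
# Hypothetical constraint (not the one in the video): c(x) = x2 - x1**2,
# so the "black line" is the level set c(x) = 0, i.e. x2 = x1**2.
def c(x1, x2):
    return x2 - x1**2

def grad_c(x1, x2):
    # Analytic gradient of c: [dc/dx1, dc/dx2] = [-2*x1, 1]
    return (-2.0 * x1, 1.0)

print(c(1.0, 1.0))   # 0.0  -> the point (1, 1) lies on the constraint line
print(c(1.0, 1.1))   # +0.1 -> just above the line, c is positive
print(c(1.0, 0.9))   # -0.1 -> just below the line, c is negative

# The gradient at (1, 1) points from the negative side toward the positive
# side, i.e. in the direction of steepest ascent of c.
print(grad_c(1.0, 1.0))  # (-2.0, 1.0)
```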
@AngeloYeo a year ago
Great as always! 🎉
@BrianBDouglas a year ago
Thanks!
@blower05 4 months ago
I am confused about the slopes obtained by differentiation. They are the slopes dz/dx(i), not their projection onto the x-y plane, so I cannot understand how they can be parallel. They would be parallel if the "projected" slope, i.e. dx(2)/dx(1), were calculated and used. However, that is just 0 and was not used in the calculation.
@kmishy a year ago
Great teaching❤
@BrianBDouglas a year ago
Thanks!
@Pedritox0953 a year ago
Great video!
@user-dp9yn7zf4l 4 months ago
5:45 the visual illusion makes the dark line look curved... XD
@saisatyam3314 22 days ago
Super intuitive 😊❤
@MATLAB 22 days ago
Glad you liked it.
@razakawuni2138 3 months ago
This is very helpful
@MATLAB 3 months ago
Glad you like it!
@DeepakRawat-t6s a year ago
Can't see the video
@HansScharler a year ago
It's working for me. What do you see?
@DeepakRawat-t6s a year ago
@@HansScharler I just see a black screen
@BrianBDouglas a year ago
Did you get it figured out?
@MrPepto93 6 months ago
How do you type with your eyes closed? :O
@acc3095 a year ago
❤❤❤❤❤ 🎉
@HeavenlyGodlyAngelic 4 months ago
I love this
@Mohdvaqui 2 months ago
nice
@MATLAB 2 months ago
Thanks for watching!
@Maarten-p8s 7 days ago
The conclusion at 2:18 is not constructed correctly. Take x⁴ and you'll see that its second derivative at x = 0 also evaluates to 0, while we know it has a minimum there. You ought to inspect an ε-neighborhood and draw the conclusion from that. In the video it is the fact that f'' is an odd function that lets you conclude it's a saddle rather than an extremum.
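A quick numerical check of the counterexample (x⁴ is from the comment above; x³ is my own addition for contrast): both have a zero second derivative at x = 0, so f'' alone can't tell the minimum from the saddle, but looking at a small ε-neighborhood can.

```python
# f(x) = x**4: f''(0) = 0, yet x = 0 is a minimum.
# g(x) = x**3: g''(0) = 0, and x = 0 is a saddle (inflection), not an extremum.
f = lambda x: x**4
g = lambda x: x**3

eps = 1e-3
print(f(-eps), f(0.0), f(eps))  # 1e-12 0.0 1e-12  -> both neighbors larger: minimum
print(g(-eps), g(0.0), g(eps))  # -1e-09 0.0 1e-09 -> one lower, one higher: no extremum
```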