Quadratic Form Minimization: A Calculus-Based Derivation

50,920 views

MathTheBeautiful

1 day ago

Comments: 49
@MathTheBeautiful 4 years ago
Go to LEM.MA/LA for videos, exercises, and to ask us questions directly.
@snnwstt 1 year ago
1:18 Just as an observation: while it is usual to see the quadratic form as presented here, I find the following a little bit more ... elegant:

0.5 * <x y z 1> [W] {x y z 1}

with < > a line (row) vector, { } a column vector, and [ ] a matrix. Here

W = [  4   1   2  -2 ]
    [  1   8   5  -3 ]
    [  2   5   4  -4 ]
    [ -2  -3  -4   0 ]

which is symmetric if A is symmetric. Note that the minus signs in the last column and the last line are due to the original subtraction. The 0 stands where the constant term is ... zero.
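A quick numpy check that the augmented form agrees with the standard one (a minimal sketch; the A and b below are inferred from W and from the solution (-1, -2, 4) mentioned elsewhere in this thread, not quoted from the video itself):

```python
import numpy as np

# Standard form f(x) = 0.5 * x^T A x - b^T x (A, b inferred from the thread).
A = np.array([[4.0, 1.0, 2.0],
              [1.0, 8.0, 5.0],
              [2.0, 5.0, 4.0]])
b = np.array([2.0, 3.0, 4.0])

# Augmented symmetric matrix from the comment above: -b fills the last
# row/column, and the 0 corner holds the (absent) constant term.
W = np.array([[ 4.0,  1.0,  2.0, -2.0],
              [ 1.0,  8.0,  5.0, -3.0],
              [ 2.0,  5.0,  4.0, -4.0],
              [-2.0, -3.0, -4.0,  0.0]])

rng = np.random.default_rng(0)
x = rng.standard_normal(3)        # any point works
v = np.append(x, 1.0)             # homogeneous vector {x y z 1}

print(0.5 * x @ A @ x - b @ x)    # standard form
print(0.5 * v @ W @ v)            # augmented form: same value
```

The two prints match for every x, since the cross terms of v with the last row and column of W contribute exactly the -b^T x part.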
@vothiquynhyen09 6 years ago
I have to say that I love your voice, and the passion you have for the subject.
@joshuaronisjr 5 years ago
He talks a little like Feynman
@ekandrot 7 years ago
For your gradient descent, do you need the -b in there, e.g. x -> x - a(Ax - b)? It seemed that without the -b, and with a positive definite matrix A, zero is the only solution. But with the -b, (-1, -2, 4) is the solution.
@MathTheBeautiful 7 years ago
Yes, you are correct!
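For anyone who wants to try the corrected iteration x -> x - a(Ax - b), here is a minimal sketch (the 3×3 example is the one inferred from this thread, and the step size 0.05 is an assumption, not a value from the video):

```python
import numpy as np

# Gradient descent on f(x) = 0.5 x^T A x - b^T x, whose gradient is Ax - b.
A = np.array([[4.0, 1.0, 2.0],
              [1.0, 8.0, 5.0],
              [2.0, 5.0, 4.0]])
b = np.array([2.0, 3.0, 4.0])

x = np.zeros(3)
a = 0.05                        # fixed step size (assumed); must satisfy a < 2/lambda_max
for _ in range(20000):
    x = x - a * (A @ x - b)     # the corrected iteration from the comment above

print(np.round(x, 6))           # converges to [-1. -2.  4.], i.e. the solution of Ax = b
```

Without the -b the iteration is x -> (I - aA)x, which indeed collapses to zero for positive definite A and a small enough step, just as the comment observes.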
@ijustneedaname47 3 years ago
This video really helped tie these concepts together for me. I really appreciate your posting it.
@omedomedomedomedomed 4 years ago
I checked this to understand the least squares derivation. Super helpful!!!
@jesalshah81 4 days ago
Omg, I have never learnt math like this before! So great!
@MathTheBeautiful 4 days ago
I'm glad you feel this way!
@bryan-9742 5 years ago
This is so cool. Love this channel. I'm learning so much that I should have learned years ago.
@sora290762594 3 years ago
Great way of explaining quadratic optimization.
@joaquingiorgi5133 2 years ago
Made this concept easy to understand, thank you!
@Userjdanon 2 years ago
Great video. This was explained very intuitively.
@TuNguyen-ox5lt 7 years ago
Gradient descent is a technique used in machine learning nowadays to optimize a loss function. This video is great.
@gerardogutierrez4911 4 years ago
Why does he talk like hes trying to get me to recapture the means of production from the bourgeoisie?
@MathTheBeautiful 4 years ago
Because he is lenin in that direction
@jjgroup.investments 2 years ago
Thanks for this awesome video
@devrimturker 3 years ago
Is there a relation between positive-definite matrices and convex sets?
@MathTheBeautiful 3 years ago
Yes, excellent intuition. The sublevel set of a positive-definite quadratic form (the solid ellipsoid) is a convex shape.
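The standard one-line argument (ordinary convex analysis, not spelled out in the video): with A positive definite, the Hessian of f(x) = ½xᵀAx − bᵀx is A, so f is convex, and any sublevel set of a convex function is convex. For x, y with f(x) ≤ c and f(y) ≤ c:

```latex
\[
\begin{aligned}
f\bigl(\theta x + (1-\theta)\,y\bigr)
  &\le \theta f(x) + (1-\theta) f(y)
     && \text{(convexity of } f\text{, since } \nabla^2 f = A \succ 0\text{)} \\
  &\le \theta c + (1-\theta)\,c = c
     && \text{for all } \theta \in [0,1],
\end{aligned}
\]
```

so every point on the segment between x and y stays inside the set {x : f(x) ≤ c}, which is exactly convexity.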
@DiegoAToala 2 years ago
Thank you, so clear!
@ibrahimalotaibi2399 5 years ago
Monster of Math.
@s25412 3 years ago
7:15 what if your matrix is positive semi-definite? Wouldn't there be a minimum?
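The semi-definite case splits in two, depending on whether b lies in the range of A: a whole flat valley of minimizers, or no minimum at all. A small numpy illustration (the 2×2 matrices here are chosen for illustration, not taken from the video):

```python
import numpy as np

# With A positive SEMI-definite, f(x) = 0.5 x^T A x - b^T x may or may not
# have a minimum.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])               # PSD, with a zero eigenvalue

f = lambda x, b: 0.5 * x @ A @ x - b @ x

b1 = np.array([1.0, 0.0])                # b in range(A): a line of minimizers
print([f(np.array([1.0, t]), b1) for t in (0.0, 10.0, -10.0)])   # all -0.5

b2 = np.array([0.0, 1.0])                # b not in range(A): unbounded below
print([f(np.array([0.0, t]), b2) for t in (0.0, 10.0, 100.0)])   # 0.0, -10.0, -100.0
```

So a minimum can still exist, but it is no longer unique, and if b has a component along a flat direction of A, the form decreases without bound.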
@bobstephens97 1 year ago
Awesome. Thank you.
@MathTheBeautiful 1 year ago
Thank you!
@ashwinkraghu1646 4 years ago
Excellent teacher! And a lifesaver.
@user-xt9js1jt6m 4 years ago
Nice explanation, sir. You look like Jason Statham ❤️❤️❤️ I felt like an action star was giving a lecture on matrices ❤️❤️🙏
@MathTheBeautiful 4 years ago
I get that a lot when I wear a tight t-shirt.
@kreechapuphaiboon4886 6 years ago
Great lecture; he explains so well.
@la_kukka 4 years ago
Great explanation!! Thank You
@MathTheBeautiful 4 years ago
Glad it was helpful!
@AliVeli-gr4fb 7 years ago
Thank you, it was a beautiful course.
@MathTheBeautiful 7 years ago
Thank you, Ali, I'm glad you're enjoying our videos. But why "was"?
@AliVeli-gr4fb 7 years ago
@MathTheBeautiful It is normal to say it in the past tense in my language, so I thought in it but wrote in English. So, no real reason.
@MathTheBeautiful 7 years ago
:) I just wanted to convey that the course is ongoing!
@TheTacticalDood 5 years ago
@MathTheBeautiful Is it still ongoing? This channel is amazing, it would be sad to see it stop!
@みの-c5c 4 years ago
This really helps a lot in understanding matrix derivative, and it's so clear. Thanks!!!
@somekindofbluestuff 3 years ago
Thank you!
@serkangoktas5502 4 years ago
I always knew that something was off with this derivation. I am relieved that this wasn't because of my lack of talent in math.
@MathTheBeautiful 4 years ago
It's **never** you. It's always the textbook.
@johnfykhikc 7 years ago
Where can I find the statement? I did an unsuccessful search.
@kaursingh637 5 years ago
Sir, you are very clear. Please give short lectures.
@joshuaronisjr 5 years ago
This is just a comment for me to look at in the future, but at some point he says that A will mostly be filled with zeroes before we start Gaussian elimination. A will be the covariance matrix (XᵀX) (see the next video, on the least squares solution). That it's mostly filled with zeroes indicates that most of the random variables (each column of X is a different random variable of the dataset) are independent of one another (or at least, if they ARE independent, then their covariance will be 0). However, Gaussian elimination involves linearly combining rows, so the matrices in between may NOT be sparse! As for computer storage... I don't know much about it, but maybe computers store zeroes in a different way, so that sparse matrices are easier to store? Actually, I guess this comment is for more than just me: why can computers store sparse matrices well? (See the sketch below.)
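On the storage question: sparse formats such as compressed sparse row (CSR) store only the nonzero values plus two index arrays, instead of every entry. A small scipy sketch (the matrix and sizes are illustrative; scipy's CSR format itself is real):

```python
import numpy as np
from scipy import sparse

n = 2000
dense = np.eye(n)                  # mostly zeros, but all n*n entries are stored
csr = sparse.csr_matrix(dense)     # keeps only the n nonzeros + index arrays

print(dense.nbytes)                                               # 32,000,000 bytes
print(csr.data.nbytes + csr.indices.nbytes + csr.indptr.nbytes)   # ~32,000 bytes
                                                                  # (exact size depends on index dtype)

# The caveat in the comment above is real: Gaussian elimination can create
# "fill-in", turning zeros into nonzeros, so the intermediate factors of a
# sparse matrix need not stay sparse.
```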
@telraj 3 years ago
Why skip the matrix calculus? It's not rocket science
@roaaabualgasim4882 3 years ago
I want examples or material to illustrate the idea of the method of maximization and minimization of a function with constraints (Lagrange multipliers) and with no constraints (by quadratic form and Hessian matrix) 😭
@marshall7253 6 years ago
I love this guy
@darrenpeck156 2 years ago
Absolute value has a minimum.
@gustavoexel5569 5 years ago
At 13:15 my chin literally felt
@ElizaberthUndEugen 5 years ago
*dropped