7:10 Small correction. You can't reorder the matrices inside the trace however you want; you can only permute them cyclically. So, for instance, from PJP⁻¹ you can get JP⁻¹P or P⁻¹PJ, but not JPP⁻¹.
@slavinojunepri7648 · 3 months ago
Can the rearrangement be arbitrary if the matrices in the product (whose trace we're taking) all have the same dimensions? Cyclic rearrangement is obviously required when the dimensions differ.
@APaleDot · 3 months ago
@@slavinojunepri7648 No, even if they are all square matrices, they have to be rearranged cyclically (unless they otherwise commute).
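The cyclic-invariance point in this thread can be checked numerically. A minimal plain-Python sketch (not from the video; the 2x2 example matrices are chosen arbitrarily for illustration):

```python
# Check that the trace is invariant under cyclic rearrangement,
# tr(ABC) = tr(BCA) = tr(CAB), while a non-cyclic swap such as
# tr(ACB) can give a different value.

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(X):
    return sum(X[i][i] for i in range(len(X)))

# Arbitrary 2x2 example matrices.
A = [[1, 2], [3, 4]]
B = [[0, 1], [0, 0]]
C = [[0, 0], [1, 0]]

t_abc = trace(matmul(matmul(A, B), C))  # cyclic class of ABC
t_bca = trace(matmul(matmul(B, C), A))
t_cab = trace(matmul(matmul(C, A), B))
t_acb = trace(matmul(matmul(A, C), B))  # NOT a cyclic rearrangement of ABC

print(t_abc, t_bca, t_cab, t_acb)  # 1 1 1 4
```

All three cyclic rearrangements give the same trace; the non-cyclic one does not.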
3 months ago
From Morocco: thank you very much, son. You took me back 40 years to my university studies.
@aremathukr · 2 months ago
I actually have this formula tattooed on my left forearm :) I remember seeing it for the first time in a 3b1b video and immediately trying to prove it in my head. I managed to do it for the case of a diagonal matrix as I was falling asleep, and when I woke up I knew how to generalize it to any square matrix. After I got the tattoo, this formula started appearing everywhere in my studies, which is funny. Thanks for the video!
@Geenimetsuri · 2 months ago
Thanks. I understood it. Excellent presentation.
@i.h.i.d9725 · 3 months ago
I just started linear algebra this semester, and this video makes me excited about the subject.
@tomkerruish2982 · 3 months ago
Alternatively, tr = ln ∘ det ∘ exp.
@BenfanichAbderrahmane · 3 months ago
det = exp ∘ tr ∘ ln
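The identity det(e^A) = e^(tr A) behind these compositions can be sanity-checked numerically. A small plain-Python sketch (matrix exponential via a truncated Taylor series; the example matrix is arbitrary, not from the video):

```python
import math

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(M, terms=60):
    # exp(M) = sum_{k>=0} M^k / k!, truncated after `terms` terms.
    n = len(M)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = [[x / k for x in row] for row in matmul(term, M)]
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

def det2(M):  # determinant of a 2x2 matrix
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[1.0, 2.0], [3.0, 4.0]]       # arbitrary example matrix, tr A = 5
lhs = det2(mat_exp(A))             # det(e^A)
rhs = math.exp(5.0)                # e^(tr A)
print(lhs, rhs)                    # both ~148.413
```

Taking ln of the printed det(e^A) recovers tr(A) = 5, which is the tr = ln ∘ det ∘ exp direction.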
@considerthehumbleworm · 2 months ago
Ah, what an elegant way of calculating trace!
@CrazyShores · 2 months ago
Excellent explanation! Only the notation for the elements of a matrix is a bit off; most books use lowercase letters.
@alejrandom6592 · 3 months ago
Note that f(PJP⁻¹) = P f(J) P⁻¹ for any f(·) that has a Taylor series.
@khastakachori123 · 2 months ago
What a beautiful result!
@JohnSmall314 · 3 months ago
Very nice and clear explanation
@thomasjefferson6225 · 3 months ago
This is a very good video. I really enjoyed it.
@shehbazthakur1 · 2 months ago
superb, loving it. subscribing.
@coshy2748 · 2 months ago
Very clear. Good presentation.
@berlinisvictorious · 2 months ago
Always loved me some linear algebra
@alejrandom6592 · 3 months ago
Very nice and easy to follow
@andreagourion9104 · 2 months ago
I think you can prove it with this method too: (lambda, v) is an eigenvalue/eigenvector pair of A iff (e^lambda, v) is an eigenvalue/eigenvector pair of exp(A) (provable by applying the definition of an eigenvalue to the Taylor expansion; the converse is a bit harder). The other ingredients are det(A) = lambda_1 · … · lambda_n and tr(A) = lambda_1 + … + lambda_n (provable from the root factorization of the characteristic polynomial of A). Finally, plug exp(A) into the det, use the last two equalities, and you have the result.
@flyingtiger123 · 2 months ago
Good work ❤
@Ajay-ib1xk · 2 months ago
Nice collection of topics
@nabla_mat · 3 months ago
You’re back!
@nikkatalnikov · 2 months ago
very clear, thank you very much
@ironbutterfly3701 · 2 months ago
Order matters when there are more than two factors inside the trace; only cyclic rotation is allowed.
@soyoltoi · 3 months ago
Very clear and easy to follow
@ulychun · 2 months ago
Is there a geometric explanation to this equation?
@pahom2 · 3 months ago
Does it only work for e or for any base?
@MuPrimeMath · 3 months ago
The result holds for any positive base.
@ehudkotegaro · 3 months ago
Take the function det(e^(At)). The derivative of det at a matrix A, applied to a matrix X, is det(A)·tr(A⁻¹X). So the derivative of this function is det(e^(At))·tr(e^(-At)·A·e^(At)) = det(e^(At))·tr(A). Also, det(e^(A·0)) = 1. So det(e^(At)) = e^(t·tr(A)), and substituting t = 1 gives det(e^A) = e^(tr(A)).
@98danielray · 3 months ago
I suppose you are using the fact that both functions satisfy the same linear differential equation: f'(t) = tr(A)·f(t) and g'(t) = tr(A)·g(t) with f(0) = g(0) = 1? Otherwise, I don't see where you show that their derivatives are equal.
@ehudkotegaro · 3 months ago
@@98danielray Yes, they are equal because e^(t·tr(A)) is the unique solution of the differential equation that the left-hand function satisfies.
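The conclusion of this sub-thread, det(e^(At)) = e^(t·tr(A)) for all t, can be spot-checked for a few values of t. A minimal plain-Python sketch (helper functions and the example matrix are illustrative, not from the video):

```python
import math

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(M, terms=60):
    # Truncated Taylor series for exp(M).
    n = len(M)
    result = [[float(i == j) for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = [[x / k for x in row] for row in matmul(term, M)]
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[0.3, -1.2], [0.7, 0.5]]   # arbitrary example matrix
tr_A = A[0][0] + A[1][1]

checks = []
for t in (0.0, 0.5, 1.3):
    At = [[t * x for x in row] for row in A]
    checks.append((det2(mat_exp(At)), math.exp(t * tr_A)))

for t, (lhs, rhs) in zip((0.0, 0.5, 1.3), checks):
    print(t, lhs, rhs)   # the two columns agree for every t
```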
@ilheussauer323 · 2 months ago
Thank you!! How I wish you had a video on Hermitian matrices :(
@jceepf · 2 months ago
I believe you can prove this directly using the formula exp(xA) = lim_{N→∞} (1 + xA/N)^N and expanding the determinant in x/N for small x/N. The leading order is related to the trace, and the rest goes away as N → ∞. Of course, the Jordan decomposition provides an elegant proof if you assume the existence of the Jordan decomposition.
@m4gh3 · 2 months ago
The Jordan decomposition is always possible over an algebraically closed field, for example the complex numbers.
@jceepf · 2 months ago
@@m4gh3 Yes, I did not say it was wrong.
@dominiquelarchey-wendling5829 · 2 months ago
Then you have to explain how to get the algebraic closure, which is not as easy as this exercise...
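The limit formula suggested in this sub-thread, det((1 + A/N)^N) → e^(tr A), can also be observed numerically (x = 1 here; the example matrix is arbitrary, not from the video):

```python
import math

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[0.3, -1.2], [0.7, 0.5]]   # arbitrary example matrix, tr A = 0.8
target = math.exp(0.8)          # e^(tr A)

errs = []
for N in (10, 100, 1000, 10000):
    B = [[(1.0 if i == j else 0.0) + A[i][j] / N for j in range(2)]
         for i in range(2)]     # B = 1 + A/N  (1 = identity matrix)
    P = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(N):          # P = B^N by repeated multiplication
        P = matmul(P, B)
    errs.append(abs(det2(P) - target))

print(errs)  # shrinks roughly like 1/N
```

The error decreases as N grows, consistent with the sub-leading terms vanishing in the limit.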
@matteovissani1071 · 2 months ago
Love this topic
@ManishSingh-gc5fv · 1 month ago
It is only valid for square matrices, not for rectangular ones.
@arpitdwivedi9175 · 2 months ago
It would be shorter if we let eigenvalues into the picture: det(e^A) = product of the eigenvalues of e^A = e^(L1)·…·e^(Ln) = e^(L1+…+Ln) = e^(tr A), where L1, …, Ln are the eigenvalues of A.
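This eigenvalue argument can be spot-checked on a symmetric 2x2 matrix, where the eigenvalues come from the quadratic formula on the characteristic polynomial (plain-Python sketch; the example matrix is arbitrary, not from the video):

```python
import math

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(M, terms=60):
    # Truncated Taylor series for exp(M).
    n = len(M)
    result = [[float(i == j) for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = [[x / k for x in row] for row in matmul(term, M)]
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[2.0, 1.0], [1.0, 3.0]]            # symmetric => real eigenvalues
tr_A, det_A = 5.0, 5.0
disc = math.sqrt(tr_A**2 - 4 * det_A)
l1, l2 = (tr_A + disc) / 2, (tr_A - disc) / 2   # eigenvalues L1, L2 of A

lhs = det2(mat_exp(A))              # det(e^A)
rhs = math.exp(l1) * math.exp(l2)   # e^L1 * e^L2 = e^(L1 + L2) = e^(tr A)
print(lhs, rhs)                     # both ~e^5 ~ 148.413
```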
@aquamanGR · 2 months ago
Hmm... I don't think J is *always* upper-triangular, as you have assumed. For example, if A is 2x2 with complex eigenvalues, then I don't think J can be upper-triangular.
@MuPrimeMath · 2 months ago
In fact the Jordan canonical form is always upper triangular. We permit J to have complex entries, as the rest of the proof still holds.
@aquamanGR · 2 months ago
@@MuPrimeMath OK I missed the "complex entries" thing although you did state it in the video. Makes perfect sense now, thanks!
@sirshabiswas3010 · 3 months ago
Sir, this may be too basic a question for you, but I was wondering if you could give me a tip on how to find the limits of integration when computing an area. Please don't mind me asking; I really have a hard time figuring out the limits. Lots of love and support! :-)
@MuPrimeMath · 3 months ago
Sorry, the question is not clear. It's hard for me to give advice on such general topics. I wish you the best of luck.
@sirshabiswas3010 · 3 months ago
@@MuPrimeMath it's okay, no worries. Thanks for your reply. Keep uploading. Lots of love!
@alejrandom6592 · 3 months ago
@@sirshabiswas3010 If the limits aren't given, you might be looking for intersection points. Set the functions equal and solve for the intersections, is my guess. There are good online resources for these topics. Good luck!
@sirshabiswas3010 · 3 months ago
@@alejrandom6592 thanks! good luck too!
@csilval18 · 3 months ago
Very cool video. Interesting, to the point, well explained... You should get more views
@licks1_ · 3 months ago
When we write A as PJP^-1, you said P is _any_ matrix. Just to clarify, I'm assuming you meant P is any _invertible_ matrix?
@MuPrimeMath · 3 months ago
More specifically, P is a change-of-basis matrix, which is always invertible.
@alejrandom6592 · 3 months ago
Yeah it's a slight mistake
@sagnikbiswas3268 · 2 months ago
I'm not great at linear algebra, but isn't P just the matrix of eigenvectors?
@wargreymon2024 · 2 months ago
Dude, the P⁻¹ already tells you it's invertible.
@jakeaustria5445 · 3 months ago
Thank You
@Mini_Wolf. · 2 months ago
Fabulous
@spaceman392001 · 2 months ago
The infinite sum for e^A starts with I, the identity matrix, not the number 1
@MuPrimeMath · 2 months ago
Yes. Here 1 denotes the multiplicative identity in the ring of matrices, which is the identity matrix.
@dgsndmt4963 · 2 months ago
Nice...
@MrNerst · 2 months ago
Further assumptions are needed. For example, that A is a square matrix is never explicitly mentioned in the video; otherwise det(e^A) doesn't make sense, as the determinant is defined only for square matrices. Also needed: a proof that the power expansion of e^A converges, which holds because A is a bounded linear operator in finite dimensions.
@wargreymon2024 · 2 months ago
No, he did it perfectly. Whatever you said isn't in the domain of the determinant to begin with; your demand is excessive.
@MrNerst · 2 months ago
@@wargreymon2024 First study a little about endomorphisms, and then share your opinion on social media.
@wargreymon2024 · 2 months ago
ambiguous and excessive
@antormosabbir4750 · 2 months ago
Wow!
@wargreymon2024 · 2 months ago
👍🏻👍🏻👍🏻👍🏻👍🏻🔥🔥🔥🔥
@jaimeduncan6167 · 2 months ago
He keeps saying "any matrix", but the power of a matrix is not well defined in general. For square matrices it is (A^n always exists), but not for the more general m×n matrices one can define. This is a major flaw of the video, particularly for a formal mathematics one.
@MyOneFiftiethOfADollar · 3 months ago
Note that we could call this a Simply Beautiful solution, BUT not as beautiful as a Cowboy cheerleader.
@edofarmer812 · 2 months ago
Me on a first date
@newdayrising · 2 months ago
The series expansion of e^A starts with the identity matrix, not 1.
@MuPrimeMath · 2 months ago
Yes, here 1 denotes the multiplicative identity of the matrix ring.
@harrypewpew901 · 2 months ago
I do not know why, but these math/cs geeks creep me the f out, they give serial killer vibe.
@kingfrozen4257 · 2 months ago
This proof is wrong in so many respects that I can't cover them in a comment, so I will just leave a dislike and tell viewers not to use this proof in their homework or any actual work they are doing!
@louis8041 · 2 months ago
I 100% agree, and I'll do it myself: given a matrix A, considered as a complex matrix, its characteristic polynomial has n roots when counted with multiplicities (fundamental theorem of algebra), so A is triangularizable over C and the diagonal coefficients of the triangular form are its eigenvalues lambda. Thus exp(A) is similar to a triangular matrix whose diagonal coefficients are exp(lambda). Taking the determinant gives the result. No need to invoke Jordan form (an absolutely non-trivial result which can't be used like it's nothing!!)