Matrix Proof: det(exp A) = exp(Tr A)

  35,228 views

Mu Prime Math

1 day ago

Comments: 75
@APaleDot 3 months ago
7:10 Small correction. You can't order the matrices inside the trace function however you want. You can only rearrange them cyclically. So, for instance, you can do JP⁻¹P or P⁻¹PJ but not JPP⁻¹.
@slavinojunepri7648 3 months ago
Can the rearrangement be arbitrary if the matrices in the product (whose trace we are taking) all have the same dimensions? The cyclic rearrangement is obviously required if the dimensions are different.
@APaleDot 3 months ago
@@slavinojunepri7648 No, even if they are all square matrices, they have to be rearranged cyclically (unless they otherwise commute).
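A quick numerical illustration of the cyclic property discussed in this thread (a minimal sketch, not from the video; it assumes NumPy is available, and the matrices are just random test data):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3))
    C = rng.standard_normal((3, 3))

    # Cyclic rearrangements of the product leave the trace unchanged...
    print(np.isclose(np.trace(A @ B @ C), np.trace(B @ C @ A)))   # True
    print(np.isclose(np.trace(A @ B @ C), np.trace(C @ A @ B)))   # True

    # ...but an arbitrary reordering generally changes it.
    print(np.isclose(np.trace(A @ B @ C), np.trace(A @ C @ B)))   # False in general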
3 months ago
From Morocco, thank you very much, son. You took me back 40 years to my university studies.
@aremathukr 2 months ago
I actually have this formula tattooed on my left forearm:) I remember seeing it for the first time in 3b1b video, and immediately trying to prove it in my head. I managed to do it for the case of the diagonal matrix when I was falling asleep, and when I woke up I knew how to generalize it for any square matrix. After I got the tattoo, this formula started appearing everywhere in my studying, which is funny) Thanks for the video
@Geenimetsuri 2 months ago
Thanks. I understood it. Excellent presentation.
@i.h.i.d9725 3 months ago
I just started linear algebra this semester, and this video makes me excited about the subject.
@tomkerruish2982 3 months ago
Alternatively, tr = ln ∘ det ∘ exp.
@BenfanichAbderrahmane 3 months ago
det=exp(tr(ln))
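These identities are easy to check numerically. A minimal sketch (assuming NumPy and SciPy are available; expm and logm are SciPy's matrix exponential and logarithm):

    import numpy as np
    from scipy.linalg import expm, logm

    rng = np.random.default_rng(1)
    A = rng.standard_normal((4, 4))
    M = expm(A)                       # any matrix of the form e^A

    # det(exp A) = exp(tr A)
    print(np.isclose(np.linalg.det(M), np.exp(np.trace(A))))        # True

    # Read backwards: tr = ln ∘ det ∘ exp (det(e^A) is always positive, so ln is defined)
    print(np.isclose(np.trace(A), np.log(np.linalg.det(M))))        # True

    # And det = exp ∘ tr ∘ ln, for a matrix that has a logarithm
    print(np.isclose(np.linalg.det(M), np.exp(np.trace(logm(M)))))  # True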
@considerthehumbleworm 2 months ago
Ah, what an elegant way of calculating trace!
@CrazyShores 2 months ago
Excellent explanation! Only the notation for the elements of a matrix is a bit off; most books use lowercase letters.
@alejrandom6592 3 months ago
Note that f(PJP^-1) = P f(J) P^-1 for any f(•) that has a Taylor series.
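A small numerical check of that similarity property for f = exp (a minimal sketch, assuming NumPy and SciPy; here P and J are just arbitrary test matrices rather than an actual Jordan decomposition):

    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(2)
    P = rng.standard_normal((3, 3))      # a random matrix is almost surely invertible
    J = rng.standard_normal((3, 3))
    P_inv = np.linalg.inv(P)

    # f(P J P^-1) = P f(J) P^-1 holds term by term in the power series of f = exp
    print(np.allclose(expm(P @ J @ P_inv), P @ expm(J) @ P_inv))   # True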
@khastakachori123 2 months ago
What a beautiful result!
@JohnSmall314 3 months ago
Very nice and clear explanation
@thomasjefferson6225 3 months ago
This is a very good video. I really enjoyed it.
@shehbazthakur1 2 months ago
superb, loving it. subscribing.
@coshy2748 2 months ago
Very clear. Good presentation.
@berlinisvictorious 2 months ago
Always loved me some linear algebra
@alejrandom6592 3 months ago
Very nice and easy to follow
@andreagourion9104 2 months ago
I think you can prove it with this method too: (lambda, v) is an eigenvalue/eigenvector pair of A iff (e^lambda, v) is an eigenvalue/eigenvector pair of exp(A) (provable by applying the definition of an eigenvalue to the Taylor expansion; the converse is a bit more difficult). The other ingredients are det(A) = lambda_1⋯lambda_n and tr(A) = lambda_1 + … + lambda_n (provable from the root factorization of the characteristic polynomial of A). Finally, plug exp(A) into the determinant, use the last two equalities, and you have the result.
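Written out, that eigenvalue argument fits in one displayed equation (a LaTeX sketch, assuming A has eigenvalues λ_1, …, λ_n listed with multiplicity):

    \[
      \det\bigl(e^{A}\bigr)
        = \prod_{i=1}^{n} e^{\lambda_i}        % the eigenvalues of e^{A} are e^{\lambda_i}
        = e^{\lambda_1 + \cdots + \lambda_n}   % a product of exponentials is the exponential of the sum
        = e^{\operatorname{tr} A}.             % \operatorname{tr} A = \lambda_1 + \cdots + \lambda_n
    \]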
@flyingtiger123 2 months ago
Good work ❤
@Ajay-ib1xk 2 months ago
Nice collection of topics
@nabla_mat 3 months ago
You’re back!
@nikkatalnikov 2 months ago
very clear, thank you very much
@ironbutterfly3701 2 months ago
Order matters if there are more than two factors inside the trace; only cyclic rotation is allowed.
@soyoltoi 3 months ago
Very clear and easy to follow
@ulychun 2 months ago
Is there a geometric explanation to this equation?
@pahom2 3 months ago
Does it only work for e or for any base?
@MuPrimeMath 3 months ago
The result holds for any positive base.
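Concretely, for a base b > 0 one can take b^A = e^((ln b)A), so det(b^A) = e^((ln b)·tr A) = b^(tr A). A minimal numerical sketch (assuming NumPy and SciPy):

    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(3)
    A = rng.standard_normal((3, 3))
    b = 2.0                                  # any positive base

    b_to_A = expm(np.log(b) * A)             # definition: b^A = exp((ln b) A)
    print(np.isclose(np.linalg.det(b_to_A), b ** np.trace(A)))   # True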
@ehudkotegaro 3 months ago
Take the function det(e^(At)). The derivative of det at a matrix A, applied to a matrix X, is det(A)·tr(A^-1·X). So the derivative of this function is det(e^(At))·tr(e^(-At)·A·e^(At)) = det(e^(At))·tr(A). Also, det(e^(A·0)) = 1, so we get det(e^(At)) = e^(t·tr(A)). Substituting t = 1, we get det(e^A) = e^(tr A).
@98danielray 3 months ago
I suppose you are using the fact that both functions satisfy the same linear differential equation? f'(t) = tr(A)·f(t) and g'(t) = tr(A)·g(t) with f(0) = g(0) = 1. Otherwise, I don't see where you show that their derivatives are equal.
@ehudkotegaro 3 months ago
@@98danielray Yes, they are equal because the second function is the unique solution of the differential equation that the first function satisfies.
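The ODE argument in this thread can also be sanity-checked numerically: both t ↦ det(e^(At)) and t ↦ e^(t·tr A) solve f'(t) = tr(A)·f(t) with f(0) = 1, so they coincide for all t. A minimal sketch (assuming NumPy and SciPy):

    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(4)
    A = rng.standard_normal((3, 3))

    for t in np.linspace(0.0, 2.0, 5):
        f = np.linalg.det(expm(A * t))       # det(e^{At})
        g = np.exp(t * np.trace(A))          # e^{t tr A}
        print(np.isclose(f, g))              # True for every t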
@ilheussauer323 2 months ago
Thank you!! How I wish you had a vid on Hermitian matrices :(
@jceepf 2 months ago
I believe you can prove this directly by using the formula exp(xA) = lim_(N->infinity) (1 + xA/N)^N and expanding the determinant in x/N for small x/N. The leading order is related to the trace, and the rest goes away as N -> infinity. Of course, the Jordan decomposition provides an elegant proof if you assume the existence of the Jordan decomposition.
@m4gh3 2 months ago
The Jordan decomposition is always possible over an algebraically closed field, for example the complex numbers.
@jceepf 2 months ago
@@m4gh3 Yes, I did not say it was wrong.
@dominiquelarchey-wendling5829 2 months ago
Then you have to explain how to get the algebraic closure, which is not as easy as this exercise...
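The limit argument a few comments above can be checked numerically as well: det((I + A/N)^N) = det(I + A/N)^N ≈ (1 + tr(A)/N)^N → e^(tr A). A minimal sketch (assuming NumPy):

    import numpy as np

    rng = np.random.default_rng(5)
    A = rng.standard_normal((3, 3))
    I = np.eye(3)

    for N in (10, 1_000, 100_000):
        approx = np.linalg.det(np.linalg.matrix_power(I + A / N, N))
        print(N, approx)                      # approaches exp(tr A) as N grows

    print("exp(tr A) =", np.exp(np.trace(A)))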
@matteovissani1071 2 months ago
Love this topic
@ManishSingh-gc5fv 1 month ago
It is only valid for square matrices and not for rectangular matrices.
@arpitdwivedi9175 2 months ago
It would be shorter if we allow eigenvalues into the picture: det(e^A) = product of the eigenvalues of e^A = e^(L1)·…·e^(Ln) = e^(L1 + … + Ln) = e^(Tr A), where L1, …, Ln are the eigenvalues of A.
@aquamanGR 2 months ago
Hmm... I don't think J is *always* upper-triangular, as you have assumed. For example, if A is 2x2 with complex eigenvalues, then I don't think J can be upper-triangular.
@MuPrimeMath 2 months ago
In fact the Jordan canonical form is always upper triangular. We permit J to have complex entries, as the rest of the proof still holds.
@aquamanGR 2 months ago
@@MuPrimeMath OK I missed the "complex entries" thing although you did state it in the video. Makes perfect sense now, thanks!
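To make the point concrete: a real 2x2 rotation matrix has complex eigenvalues, so it has no real triangular form, but over the complex numbers it is diagonalizable, and its Jordan form is diagonal (hence upper triangular). A minimal sketch (assuming NumPy; the eigendecomposition stands in for the Jordan form here because the eigenvalues are distinct):

    import numpy as np

    theta = 0.7
    A = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])      # eigenvalues e^{±i·theta}

    eigvals, P = np.linalg.eig(A)
    J = np.diag(eigvals)                                  # diagonal with complex entries
    print(np.allclose(A, P @ J @ np.linalg.inv(P)))       # True: A = P J P^{-1} over C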
@sirshabiswas3010 3 months ago
Sir? This might be a trivial question for you, but I was wondering if you could give me a tip on how to find the limits when finding an area using integration. Please don't mind me asking. I really have a hard time figuring out the limits. Lots of love and support! :-)
@MuPrimeMath 3 months ago
Sorry, the question is not clear. It's hard for me to give advice on such general topics. I wish you the best of luck.
@sirshabiswas3010 3 months ago
@@MuPrimeMath it's okay, no worries. Thanks for your reply. Keep uploading. Lots of love!
@alejrandom6592 3 months ago
@@sirshabiswas3010 If the limits aren't given, you might be referring to intersection points. Equate the functions and solve for the intersection, is my guess. For these topics there are good resources online. Good luck!
@sirshabiswas3010 3 months ago
@@alejrandom6592 thanks! good luck too!
@csilval18 3 months ago
Very cool video. Interesting, to the point, well explained... You should get more views
@licks1_ 3 months ago
When we write A as PJP^-1, you said P is _any_ matrix. Just to clarify, I'm assuming you meant P is any _invertible_ matrix?
@MuPrimeMath 3 months ago
More specifically, P is a change-of-basis matrix, which is always invertible.
@alejrandom6592 3 months ago
Yeah it's a slight mistake
@sagnikbiswas3268 2 months ago
I'm not great at linear algebra, but isn't P just the matrix of eigenvectors?
@wargreymon2024 2 months ago
Dude, you see P^-1 in the formula; that already says P is invertible.
@jakeaustria5445 3 months ago
Thank You
@Mini_Wolf. 2 months ago
Fabulous
@spaceman392001 2 months ago
The infinite sum for e^A starts with I, the identity matrix, not the number 1
@MuPrimeMath 2 months ago
Yes. Here 1 denotes the multiplicative identity in the ring of matrices, which is the identity matrix.
@dgsndmt4963 2 months ago
Nice...
@MrNerst 2 months ago
Further assumptions are needed. For example, that A is a square matrix is never explicitly mentioned in the video; otherwise det(e^A) doesn't make any sense, as the determinant is defined only for square matrices. Also missing is a proof that the power-series expansion of e^A converges, which it does because A is a bounded linear operator in finite dimensions.
@wargreymon2024 2 months ago
No, he did it perfectly. What you describe isn't in the domain of the determinant to begin with; your objection is excessive.
@MrNerst 2 months ago
@@wargreymon2024 First, study a little about endomorphisms and then share your opinion on social media.
@wargreymon2024 2 months ago
ambiguous and excessive
@antormosabbir4750 2 months ago
Wow!
@wargreymon2024 2 months ago
👍🏻👍🏻👍🏻👍🏻👍🏻🔥🔥🔥🔥
@jaimeduncan6167 2 months ago
He keeps saying "any matrix", but the power of a matrix is, in general, not well defined. For square matrices it is, and A^n will always exist, but that's not the case for arbitrary rectangular (m x n) matrices. This is a major flaw of the video, in particular for a formal mathematics one.
@MyOneFiftiethOfADollar 3 months ago
Note that we could call this a Simply Beautiful solution, BUT not as beautiful as a Cowboy cheerleader.
@edofarmer812 2 months ago
Me on a first date
@newdayrising 2 months ago
The series expansion of e^A starts with the identity matrix, not 1.
@MuPrimeMath 2 months ago
Yes, here 1 denotes the multiplicative identity of the matrix ring.
@harrypewpew901 2 months ago
I do not know why, but these math/cs geeks creep me the f out, they give serial killer vibe.
@kingfrozen4257 2 months ago
This proof is so wrong in so many aspects that I can't resolve it in a comment, so I will just leave a dislike and tell viewers not to use this proof in their HW or any actual work they are doing!
@louis8041 2 months ago
I 100% agree, and I'll do it myself: Given a matrix A considered as a complex matrix, its characteristic polynomial has n roots counted with multiplicities (Gauss's theorem), so A is triangularizable over C and the diagonal coefficients of that triangular form are its eigenvalues lambda. Thus exp(A) is similar to a triangular matrix whose diagonal coefficients are exp(lambda). Taking the determinant gives the result. No need to talk about Jordan's form (an absolutely non-trivial result which can't just be used like it's nothing!!)