7: Power Method for Eigenvalues - Learning Linear Algebra

21,099 views

Mu Prime Math

5 years ago

Full Learning Linear Algebra playlist: • Learning Linear Algebra Explanation / derivation / proof of the power method for finding eigenvectors and eigenvalues numerically. The way it works is so cool!
Finding eigenvalues directly: • 6: Eigenvalues: Why de...
New math videos every Monday and Friday. Subscribe to make sure you see them!
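As a companion to the description, here is a minimal sketch of the power method as usually stated (iterate x → Ax, rescaling each step so the entries stay bounded). The function name, normalization choice, and example matrix are illustrative, not taken from the video:

```python
import numpy as np

def power_method(A, x, iters=100):
    """Iterate x -> A x, rescaling each step, to approach the dominant eigenvector."""
    for _ in range(iters):
        x = A @ x
        x = x / np.abs(x).max()        # rescale so entries neither blow up nor vanish
    lam = (x @ (A @ x)) / (x @ x)      # Rayleigh quotient estimates the eigenvalue
    return lam, x

A = np.array([[2.0, 1.0], [1.0, 2.0]])
lam, v = power_method(A, np.array([1.0, 0.0]))
# dominant eigenvalue of this matrix is 3, eigenvector proportional to (1, 1)
```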

Comments: 41
@laneellisor7113 5 years ago
Love your channel! Keep up the great work
@matthewholmes280 3 years ago
You are a legit lifesaver. Anyone got recommendations for channels this good?
@reislerjul 3 years ago
Incredibly helpful, thank you!
@ishaaq5506 3 years ago
Bro!! Have a blast.. Can't express my understanding.. Hats off🔥
@Lakesaltwalker 3 years ago
Yes; this guy is good. So easy to understand
@mayfu6508 3 years ago
Thank you so much!!! Clearly explained!!!
@pablorc8091 2 years ago
5:22 is where it clicked for me. Thank you very much
@lazergurka-smerlin6561 1 year ago
Thank you! This brought a lot of clarity to the method. I found myself guessing how it worked halfway through, so yeah, I enjoy actually understanding the programming methods I'm meant to use lol
@PyroEscapeVehicle 1 year ago
Brilliant video!!! Thank you for the help : )
@crazybrembo 4 years ago
Thank you, you made me understand.
@archianosohliya3436 4 years ago
I love the thumbnail as much as I love the explanation
@AbdulWaheed-ue4np 2 years ago
Thank you sir very much 💖
@akshatagarwal6776 4 years ago
earned a sub.
@diyanair214 5 months ago
thank you..
@zahraakhalife9150 1 year ago
Thanks a lot🙂🙂 Why don't we raise the largest eigenvalue to the power (1/k) so that we evaluate lambda_1?
@VibingMath 5 years ago
Great video with a funny thumbnail from Dragon Ball 😂 A lot of power gained!
@rob876 4 years ago
Great video. We never learnt this method way back when. I suppose that's because it is fairly inconvenient to apply if you don't have a tool like Excel; with Excel it becomes a fairly simple task. Is it possible to use some kind of extrapolation method to make a good estimate of what the eigenvalue will converge to without having to do so many matrix multiplications?
@MuPrimeMath 4 years ago
The Gershgorin circle theorem can give general ranges for eigenvalues, but if we want precision, the two options are either matrix multiplication or using row reduction to solve the system Ax = v, which I talk about in video 8 on the shifted inverse power method. You can see my video on Gershgorin circles here: kzbin.info/www/bejne/qnTZg313bLlqp8k
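The solve-instead-of-multiply idea from this reply can be sketched as follows (a minimal hypothetical version: the shift σ is chosen near the target eigenvalue, and the function name and normalization are my own, not the video's):

```python
import numpy as np

def shifted_inverse_power(A, sigma, x, iters=50):
    """Inverse iteration: solving (A - sigma*I) y = x each step (e.g. by row
    reduction) converges to the eigenvector whose eigenvalue is closest to sigma."""
    M = A - sigma * np.eye(A.shape[0])
    for _ in range(iters):
        x = np.linalg.solve(M, x)      # row reduction instead of inverting M
        x = x / np.abs(x).max()        # rescale to keep entries bounded
    lam = (x @ (A @ x)) / (x @ x)      # Rayleigh quotient recovers the eigenvalue
    return lam, x

A = np.array([[2.0, 1.0], [1.0, 2.0]])
lam, v = shifted_inverse_power(A, 0.9, np.array([1.0, 0.2]))
# the eigenvalue of A closest to 0.9 is 1, eigenvector proportional to (1, -1)
```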
@abdullahjhatial2614 3 years ago
amazing
@mertcihangiroglu7402 3 years ago
At 7:22 you say the biggest number is 3 when it's obviously 6; just wanted to mention that. Thanks for the video, very clearly explained. Do you have any videos related to Lagrange interpolation or the PageRank algorithm?
@lucassmith8717 2 years ago
it says 0.6, not 6 :)
@nenadilic9486 3 years ago
Aren't we multiplying the vector x over and over again by A? What you said about scaling in between matrix-vector multiplications implies multiplying the scaled right-hand-side vector by A (instead of x) in the next iteration and so on, which would totally screw up the equation.
@bergamobobson9649 3 years ago
Great explanation, you have the gift to teach. But at 5:27 I think you meant to say they will keep decreasing, because they are actually less than 1
@MuPrimeMath 3 years ago
In that case I was saying that the exponent was increasing, not the numbers themselves
@bergamobobson9649 3 years ago
@@MuPrimeMath Ahh okay, my bad
@holyshit922 1 year ago
Can you record something on the QR method, but from a programmer's perspective rather than a mathematician's? What I mean by a programmer's perspective: we don't need to construct Householder matrices or Givens rotation matrices and multiply them in the standard way. We can multiply a matrix by a Householder matrix using only O(n^2) time and O(n) space, and by a Givens matrix using only O(n) time and O(1) space. Mathematicians don't care about this, but programmers do. Moreover, I haven't seen a correct choice of shift so far. I would also like to see how to program the block QR method (we work on a chosen block of the matrix instead of the whole matrix). I found how to reduce a matrix to Hessenberg form via Gaussian elimination, and I wrote out how multiplication by rotation matrices looks: we need multiplication by rotation matrices from the left to get the matrix R from A, and multiplication by rotation matrices from the right to get the matrix Q from I.
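For readers curious about the O(n) Givens update this comment mentions: applying a rotation in rows i and j touches only those two rows, so there is no need to form or multiply a full rotation matrix. A small illustrative sketch (function name and example are mine):

```python
import numpy as np

def apply_givens_left(A, i, j, c, s):
    """Apply the Givens rotation G(i, j; c, s) to A from the left, in place.

    Only rows i and j change, so the cost is O(n) per rotation instead of
    the O(n^3) of forming the full rotation matrix and multiplying.
    """
    ri, rj = A[i].copy(), A[j].copy()
    A[i] = c * ri + s * rj
    A[j] = -s * ri + c * rj
    return A

# Zero out A[1, 0] of a 2x2 example, as in one step of a QR factorization
A = np.array([[3.0, 4.0], [4.0, 3.0]])
r = np.hypot(A[0, 0], A[1, 0])
c, s = A[0, 0] / r, A[1, 0] / r
apply_givens_left(A, 0, 1, c, s)
# A[1, 0] is now (numerically) zero, and A[0, 0] equals r = 5
```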
@aleksanderasimov4899 2 years ago
i like the shirt!
@bombster98 4 years ago
this channel is lit and all, but like what r u wearing lmao, is that a shirt from a paint rave?
@MuPrimeMath 4 years ago
It's a tie-dye of a vector!
@casinarro 3 months ago
Okay, so this helps get only one, the dominant eigenvalue
@adrenochromeaddict4232 1 year ago
you're actually a good teacher; I regret that I avoided your video because of the anime character
@yoosong2000 3 years ago
where is the X suddenly coming from?
@MuPrimeMath 3 years ago
X is an arbitrary vector! The power method will approach an eigenvector of the matrix for any choice of X. On the other hand, if we want the dominant (biggest-magnitude) eigenvector, then we must choose an X such that, when we express it as a linear combination of the eigenvectors like at 1:31, the coefficient for the dominant eigenvector is nonzero.
@holyshit922 1 year ago
Eigenvalues are more complicated to calculate than you presented in this video. 1. We reduce the matrix to Hessenberg form via transformations which preserve similarity (the reduction accelerates the QR decomposition of the matrix). 2. QR decomposition via reflections or rotations. 3. The QR method should work on a chosen block of the matrix instead of the whole matrix. 4. We should choose the shift properly, which isn't explained well. In fact, multiplication by reflection matrices and by rotation matrices also isn't explained well.
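For context on what this comment is critiquing, a bare-bones version of the QR iteration looks like this (no Hessenberg reduction, no shifts, just the textbook loop, using numpy's built-in `qr` for brevity; a sketch, not production code):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
T = A.copy()
for _ in range(50):
    Q, R = np.linalg.qr(T)
    T = R @ Q            # R @ Q = Q.T @ T @ Q stays similar to A
eigs = np.sort(np.diag(T))
# the diagonal of T approaches the eigenvalues of A: 1 and 3
```

The optimizations in the comment (Hessenberg reduction, cheap rotation updates, shifts, deflation to blocks) all aim to make each pass of this loop cost far less than a dense QR factorization.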
@Palmer9297 4 years ago
Where is Vegeta!!!!!!
@bombster98 4 years ago
you missed that the initial vector cannot be an eigenvector! Otherwise the matrix will just scale that vector and you won't get the vector associated with the dominant eigenvalue!
@MuPrimeMath 4 years ago
Yes, that's a good point!
@bergamobobson9649 3 years ago
Really good point
@binarysaiyan9389 3 years ago
Wtf is Vegeta doing here?
@MuPrimeMath 3 years ago
Vegeta, what's the scouter say about his power level?
@binarysaiyan9389 3 years ago
@@MuPrimeMath It comes as no surprise, it's over 9000