How to compute the dominant eigenvalue using the power method. Join me on Coursera: imp.i384100.ne... Lecture notes at www.math.ust.hk... Paperback at www.amazon.com... Subscribe to my channel: www.youtube.com...
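For anyone who wants to try the algorithm from the lecture, here is a minimal sketch of the power method in Python/NumPy. The function name, tolerance, and stopping rule are my own illustrative choices, not taken from the video:

```python
import numpy as np

def power_method(A, num_iters=1000, tol=1e-10):
    """Estimate the dominant eigenvalue and eigenvector of A by power iteration (a sketch)."""
    x = np.random.rand(A.shape[0])    # random starting vector
    x /= np.linalg.norm(x)            # keep the iterate at unit length to avoid overflow
    lam = 0.0
    for _ in range(num_iters):
        x_new = A @ x                 # one matrix-vector multiply per step
        lam_new = x @ x_new           # eigenvalue estimate: x^T A x, since x has unit norm
        if abs(lam_new - lam) < tol:  # stop once the estimate settles
            return lam_new, x
        x = x_new / np.linalg.norm(x_new)
        lam = lam_new
    return lam, x

# Example: the dominant eigenvalue of a small symmetric matrix
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam, v = power_method(A)
print(lam)  # ~3.618, i.e. (5 + sqrt(5))/2, the larger eigenvalue of A
```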
Comments: 22
@simonebozzardi1301 • 2 years ago
Our Lord and Savior Jeffrey Chasnov. There's no way I'd be able to understand all of this by myself. You truly are a blessing to all of us self-taught students out here
@lifelyrics5659 • 1 year ago
What alien language did he just say?
@nerium1440 • 1 year ago
The convergence part was super helpful, way better explained than other sources
@AjdinKocan • 4 months ago
Indeed, this man is single-handedly responsible for me understanding this; everyone else explains it like it's Defense Against the Dark Arts.
@mariomariovitiviti • 2 days ago
The "trick" is used to reduce the vectors to a scalar :)
@marcellocali8985 • 1 year ago
Why is X_p transposed when we solve for lambda_1? I can't understand this part.
@OlehAbramov • 1 year ago
It took me some time to figure it out, but the idea is that we simply take the projection of X_p+1 onto X_p, so the transposed X_p is just part of the projection formula. The projection finds the coefficient which, when multiplied by X_p, produces the scaled version of X_p that is closest to X_p+1. When p is sufficiently large, both X_p+1 and X_p are simply multiples of e1, so they are collinear, which means X_p+1 can be expressed as X_p times some coefficient, and that is exactly the coefficient the projection gives us. Now if we look at the formulas for X_p+1 and X_p, the difference between them is just an extra factor of lambda, so lambda equals the projection coefficient of X_p+1 onto X_p.
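In symbols (a sketch using the comment's notation, where x_{p+1} = A x_p), the projection coefficient is the usual least-squares formula, which is where the transpose comes from:

$$\lambda_1 \approx \frac{x_p^{\top} x_{p+1}}{x_p^{\top} x_p} = \frac{x_p^{\top} A\, x_p}{x_p^{\top} x_p}$$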
@TheTylerdurden152 • 3 years ago
Did you learn how to write backwards just so you could lecture from behind a glass pane without blocking the camera/student view? If so, I commend educators like yourself. Thank you!
@ProfJeffreyChasnov • 3 years ago
Haha!
@minahany5894 • 2 years ago
I think it's more that he has the glass pane between himself and the camera, and the video simply mirrors everything he writes.
@bellacrazyable • 1 year ago
A genius, thank you so much!
@omerjunedi5874 • 1 year ago
When did Mike Ehrmantraut become a math professor?
@osamaelzubair1203 • 10 months ago
How did you get the lambda_1 formula? Where did the X-transpose come from?
@mariomariovitiviti • 2 days ago
In order to reduce to a scalar, you use the dot product :)
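A tiny numerical illustration of that reply (the matrix and starting vector here are my own, chosen so the iterate is already near the dominant eigenvector):

```python
import numpy as np

A = np.diag([5.0, 2.0, 1.0])       # dominant eigenvalue is 5
x_p = np.array([1.0, 0.1, 0.05])   # nearly aligned with the dominant eigenvector
x_p1 = A @ x_p                     # x_{p+1} = A x_p, still a vector in R^3

lam_est = (x_p @ x_p1) / (x_p @ x_p)  # both dot products are scalars
print(lam_est)                     # ~4.96, close to lambda_1 = 5
```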
@Ralster • 10 months ago
At 2:15 you say that the n eigenvectors are linear combinations of the other n vectors, but they're also linearly independent? How can they be both?
@lucrece4836 • 8 months ago
As far as I understand, he says that the column vectors can be written as linear combinations of the eigenvectors, since the eigenvectors span R^n.
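Spelled out (a sketch, writing the eigenvectors as $e_1,\dots,e_n$ as in the comments above, with $|\lambda_1| > |\lambda_2| \ge \cdots \ge |\lambda_n|$ and assuming $c_1 \neq 0$): expand the starting vector in the eigenvector basis and apply $A$ repeatedly; the ratios $(\lambda_i/\lambda_1)^p$ kill every term but the first:

$$x_0 = c_1 e_1 + \cdots + c_n e_n \;\Longrightarrow\; A^p x_0 = \sum_{i=1}^{n} c_i \lambda_i^p e_i = c_1 \lambda_1^p \Bigl(e_1 + \sum_{i=2}^{n} \frac{c_i}{c_1}\Bigl(\frac{\lambda_i}{\lambda_1}\Bigr)^{p} e_i\Bigr) \longrightarrow c_1 \lambda_1^p e_1$$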
@sunnytian4765 • 10 months ago
Literally watching this minutes before my exam because I couldn't understand my lecture.
@SvetiK1324 • 2 years ago
thank you!!
@jaholt11 • 1 year ago
Are you writing backwards on a glass pane you're standing behind, or is it some camera trickery?
@megdutton7327 • 1 year ago
Pretty sure he's writing normally, and then when he edits the video he mirrors it so the writing faces us.
@JohnJTraston • 1 year ago
Great! I can see your head really well. Just not the text in front of you.
@ashwiniashu254 • 2 years ago
Hello sir... I have a doubt about the power method... so can I talk with you, sir?