Powers of Matrices and Markov Matrices

Views: 49,048

MIT OpenCourseWare


Comments: 28
@georgesadler7830
@georgesadler7830 3 years ago
Dr. Strang really wants each student to understand linear algebra in all phases, from top to bottom and from left to right. After the lectures are over, he wants you to retain this information forever. These lectures really show the student the passion Dr. Strang has for teaching.
@AmandaSMaria
@AmandaSMaria 7 years ago
The best teacher in the world!! Thank you, Professor Strang!!
@galolgedi9314
@galolgedi9314 6 years ago
Amanda Maria
@yujing7544
@yujing7544 7 years ago
“the best teacher in the world!! Thank you professor Strang!! ”, the same words from my heart.
@ffvgdsg5584
@ffvgdsg5584 5 years ago
As a student at an international university outside the US, I've seen that most professors use Prof. Strang's lectures as a template, and the book we use in particular is Introduction to Linear Algebra by G. Strang. That is more evidence that he is currently the best professor in the world.
@Jeshtroy
@Jeshtroy 7 years ago
He teaches in a way that a 3rd grader could understand!
@jimnewton4534
@jimnewton4534 3 years ago
Computing the nth power of a square matrix A requires at most about log(n) matrix multiplications, i.e. roughly m^3·log(n) work for an m×m matrix (assuming cubic-time matrix multiplication). Why? Because if n is even I can simply square A^{n/2}, and if n is odd I can multiply A by A^{n-1}. If I have the eigendecomposition, then raising to the nth power costs about m^3: just compute the nth powers along the diagonal of the diagonal matrix, then multiply on the left and right by V and V^{-1}.
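A minimal sketch of both approaches in NumPy (the function names and the 2×2 matrix below are made up for illustration, not taken from the lecture):

```python
import numpy as np

def power_by_squaring(A, n):
    """A^n using about log2(n) matrix multiplications (binary exponentiation)."""
    result = np.eye(A.shape[0])
    base = A.copy()
    while n > 0:
        if n % 2 == 1:        # odd exponent: peel off one factor of A
            result = result @ base
        base = base @ base    # square the base
        n //= 2
    return result

def power_by_eig(A, n):
    """A^n via the eigendecomposition A = V diag(lam) V^{-1}."""
    lam, V = np.linalg.eig(A)
    return V @ np.diag(lam ** n) @ np.linalg.inv(V)

A = np.array([[0.8, 0.3],
              [0.2, 0.7]])    # hypothetical 2x2 Markov matrix
print(power_by_squaring(A, 50))
print(power_by_eig(A, 50).real)   # same answer, up to round-off
```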
@dengdengkenya
@dengdengkenya 5 years ago
Fantastic! I'm really enlightened by this lecture, though I can't say Professor Strang always gives such a clear explanation of every linear algebra topic.
@edwardhartz1029
@edwardhartz1029 4 years ago
That was superb; I love how thorough you are. Now I understand why p_ii^(n) = a·(lambda_1)^n + b·(lambda_2)^n + ... (for some a, b, c, ... which can be determined using simultaneous equations).
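For context, that form comes from the eigenvector expansion used in the lecture; here it is as a short sketch, assuming A is diagonalizable and the starting vector expands as u_0 = c_1 x_1 + c_2 x_2 + ...:

```latex
u_0 = c_1 x_1 + c_2 x_2 + \cdots
\quad\Longrightarrow\quad
A^n u_0 = c_1 \lambda_1^n x_1 + c_2 \lambda_2^n x_2 + \cdots
```

So any single entry of A^n u_0 is a fixed combination a·lambda_1^n + b·lambda_2^n + ... of the eigenvalue powers, with coefficients determined by u_0 and the eigenvectors.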
@paulchan6818
@paulchan6818 a year ago
A good lecturer who connects with his students.
@tgx3529
@tgx3529 3 years ago
At 12:24, is it really a Markov matrix from a Markov process? 0,8+0,31 and 0,2+0,71
@arvindvishwakarma4257
@arvindvishwakarma4257 6 years ago
Best teacher in the world
@jonahansen
@jonahansen 6 years ago
What a great teacher!
@saeida.alghamdi1671
@saeida.alghamdi1671 4 years ago
Quite an interesting implication of the presentation!
@dengdengkenya
@dengdengkenya 5 years ago
What if all the eigenvalues were less than one in the Markov matrix example? Or is there any theorem that proves at least one lambda is greater than one here?
@jordanhansen6649
@jordanhansen6649 5 years ago
There is always an eigenvalue that equals 1 in the case of a Markov matrix.
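A sketch of the standard one-line argument (assuming each column of A sums to 1, and writing 1 for the all-ones vector):

```latex
\mathbf{1}^{\mathsf T} A = \mathbf{1}^{\mathsf T}
\;\Longrightarrow\;
\mathbf{1}^{\mathsf T}(A - I) = \mathbf{0}^{\mathsf T}
\;\Longrightarrow\;
A - I \text{ is singular}
\;\Longrightarrow\;
\lambda = 1 \text{ is an eigenvalue of } A.
```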
@jesusglez09
@jesusglez09 4 years ago
I love this. Thank you, Prof. Thank you, MIT.
@devrimturker
@devrimturker 4 years ago
Excellent explanations
@sauravnagar1745
@sauravnagar1745 6 years ago
How does he write down the values of the eigenvectors so easily? I mean, he doesn't even perform any mental calculations. Does anyone have any clue?
@checkout8352
@checkout8352 5 years ago
Thank you very much.
@engineershmily
@engineershmily 6 years ago
Can anyone please guide me on how, if U(k+1) = A·U(k), then U(k) = A^k·U(0)? (At 3:25.)
@jancirani2748
@jancirani2748 6 years ago
Start with u(1):
1. U(1) = A·U(0)  (eq. 1)
2. U(2) = A·U(1) = A·A·U(0)  (from eq. 1), so U(2) = A^2·U(0)
Proceeding like this gives U(k) = A^k·U(0). Hope it's clear.
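A quick numerical check of that induction (a sketch in NumPy with a made-up matrix and starting vector, not the lecture's numbers):

```python
import numpy as np

A = np.array([[0.8, 0.3],
              [0.2, 0.7]])      # hypothetical example matrix
u = np.array([1.0, 0.0])        # u(0)
k = 5

# Step forward one multiplication at a time: u(k+1) = A u(k)
u_step = u.copy()
for _ in range(k):
    u_step = A @ u_step

# Jump straight to u(k) = A^k u(0)
u_power = np.linalg.matrix_power(A, k) @ u

print(np.allclose(u_step, u_power))   # True
```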
@shrinivasiyengar5799
@shrinivasiyengar5799 6 years ago
@@jancirani2748 Can this be said to be similar to how, in the continuous-time system, X' = AX gives the solution X(t) = (e^(At))X(0)?
@WolfixDwell
@WolfixDwell 4 years ago
@@shrinivasiyengar5799 Hi, probably a little late, but: yes, the solution of that differential equation (your system dynamics) is x(t) = (e^(At))x(0), so that solution corresponds to the differential equation. And solving e^(At) leads to terms e^(lambda_1 t), e^(lambda_2 t), ... because of the characteristic polynomial equation.
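Written out as a sketch (assuming A = XΛX^{-1} is diagonalizable and x(0) = c_1 x_1 + c_2 x_2 + ...):

```latex
e^{At} = X e^{\Lambda t} X^{-1},
\qquad
x(t) = e^{At} x(0) = c_1 e^{\lambda_1 t} x_1 + c_2 e^{\lambda_2 t} x_2 + \cdots
```

This is the continuous-time counterpart of u_k = A^k u_0 = c_1·lambda_1^k·x_1 + c_2·lambda_2^k·x_2 + ... in the discrete case.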
@yohanshailu5620
@yohanshailu5620 3 years ago
make it 1.75x
@wuoshiwzm001
@wuoshiwzm001 7 years ago
Professor Strang is getting old... but he will always be a giant to me.
@videofountain
@videofountain 7 years ago
Everyone is getting older or the other alternative. Including you.
@justinkim7743
@justinkim7743 4 years ago
Thank you so much