Eigenvalues and Eigenvectors

  45,631 views

Steve Brunton

A day ago

This video explores the eigenvalues and eigenvectors of a matrix "A". This is one of the most important concepts in linear algebra. The eigenvectors define a change of coordinates in which the "A" matrix becomes diagonal, with the eigenvalues as its diagonal entries. Because a diagonal system decouples into independent scalar equations, this lets us solve arbitrarily large linear systems of differential equations (see the NumPy sketch after the chapter list).
Playlist: • Engineering Math: Diff...
Course Website: faculty.washington.edu/sbrunto...
@eigensteve on Twitter
eigensteve.com
databookuw.com
This video was produced at the University of Washington
%%% CHAPTERS %%%
0:00 Overview and Eigenvalue Equation
3:15 Eigenvalues and Eigenvectors are "Special"
6:00 Example 2x2 Matrix
14:55 Computing Eigenvalues and Eigenvectors for any Matrix
21:18 The Determinant Measures Area of a Transformation
25:50 Determinant of 3x3 Matrix
28:35 Revisit 2x2 Matrix Example
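
A minimal NumPy sketch as a numerical companion to the chapters above (my own illustration, not code from the video; the matrix A is an assumed stand-in, chosen to be consistent with the lambda = 4, [1, -1] eigenpair discussed in the comments below, and the lecture's exact example may differ):

```python
import numpy as np

# Assumed stand-in for the worked 2x2 example (the video's exact matrix
# may differ): symmetric, with eigenvalues 6 and 4, and eigenvector
# [1, -1] paired with lambda = 4 as discussed in the comments.
A = np.array([[5.0, 1.0],
              [1.0, 5.0]])

lam, V = np.linalg.eig(A)   # eigenvalues, and eigenvectors as columns of V
print("eigenvalues:", lam)
print("eigenvectors (columns):\n", V)

# Each eigenpair satisfies the eigenvalue equation A x = lambda x.
for i in range(len(lam)):
    assert np.allclose(A @ V[:, i], lam[i] * V[:, i])
```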

Comments: 64
@dennislui2938 a year ago
"Not a perfectionist. But I do have standards." -- eigensteve 🤣🤣🤣
@et4493 a year ago
I'm dead LMFAO
@TNTsundar a year ago
This is such an Eigen lecture! Thank you for teaching us!
@pierricbross a year ago
Just want you to know these videos are great. If you have any self-doubt about anything, like the time you spend thinking on camera, know that it makes the videos better, not worse. Love the time you spend making sure everything is legible and coloured, and the explanations. You'll be out-earning Organic Chemistry Tutor in no time.
@potter-otter a year ago
"Not a perfectionist but I do have standards" love it! 😂
@giovanniminelli5590 a year ago
Love that graphical approach in the Cartesian plane. These are concepts usually taught as abstract properties, but having an image in my mind helped me a lot!! Thank you!
@StaticMusic a year ago
I love this, eigenSteve! Incredibly grateful for your lectures and the huge effort you put in
@press2701 a year ago
I took controls 30 years ago in engineering (chemical) from Prof MF Doherty, a time-series, stochastics, and control expert. Excellent class, excellent prof. The BEST part, which I'm patiently waiting for you to get into, was closed-loop control, mostly in the frequency domain (s-transforms): PID control design, moving poles and zeroes, control figures-of-merit. Super stuff; brings e-values/e-vectors and eng math to real life. Controllers (PID for sure) are everywhere. I also took a course in z-transforms, digital process control. I wonder, does nobody use z-transforms anymore? Is that math dead, replaced by computation? I've never seen/heard of z-transforms again. Kind of like linear programming: I took one course and never heard of it again. PS: I hope your YT vids boost your tenure status. It's a fine piece of work you're collecting. Especially love that blackboard you got. There's a physics prof on YT who uses the same (ouch, I forget his name). Impressive piece of kit.
@rajendramisir3530 a year ago
Brilliant lecture, Professor Brunton. I really enjoyed it. Interesting for me to learn that eigenvectors and eigenvalues are used to solve ordinary differential equations. Just fascinating stuff!
@patriksandahl4052 11 months ago
By following your videos, and by carefully working out all the examples myself, I finally understand things that I've been struggling with for years. Your IRL students are privileged to have you as their professor.
@hoseinzahedifar1562 a year ago
Your method of teaching these mathematical concepts is amazing... Thank you.❤❤❤
@anirbanchel430 a year ago
Thank you sir for doing this.... I live in a third-world country and have never been taught linear algebra the way you teach it...
@wolfisr a year ago
Dear Prof Brunton, thanks for this detailed series - I was never taught the deep connection between linear algebra and differential equations the way you do it here. I think it is worth mentioning that all the differential equations you use here (in all the chapters of the series up till here) are not only linear but also HOMOGENEOUS. Thanks!
@hasinabrar3263 a year ago
I completed my master's in mechatronics and automatic control recently, and a huge portion of my knowledge is from your videos and from your book on data-driven engineering. If I ever do a PhD in the United States, it would be an honour for me to perhaps work with you or be under your wing somehow.
@USFJUM a year ago
So I do.
@rasjon5224 a year ago
Great lecture! Another cool insight: the A matrix from the example is a symmetric matrix, and the eigenvectors of a symmetric matrix are always orthogonal, as one can also see in the drawn coordinate system. This is important whenever A is a covariance matrix (a covariance matrix is always symmetric); for PCA, this means that the PCs are always orthogonal.
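
A quick check of the orthogonality claim above, as a sketch (the data here is made up for illustration): a covariance matrix is symmetric, so np.linalg.eigh applies and returns orthonormal eigenvectors, meaning V^T V is the identity.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))   # toy data: 200 samples, 3 features

C = np.cov(X, rowvar=False)         # covariance matrix, always symmetric
lam, V = np.linalg.eigh(C)          # eigh is for symmetric/Hermitian matrices

# The eigenvectors (columns of V) are orthonormal: V^T V = I.
assert np.allclose(V.T @ V, np.eye(3))

# In PCA terms, the columns of V are the principal directions and
# lam is the variance captured along each of them.
print("variances along principal directions:", lam)
```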
@niz_T 3 months ago
Finally I got a clear understanding of the topic - thanks a lot!! 👍
@GabriellaVLara 6 months ago
Why did no one teach us like this at university? :') Thank you for an amazing video!
@abdulwasayikhlaq8013 6 months ago
Amazing lecture!
@michaelpotter3418 7 months ago
Superb lecture. Very clear.
@jones1351 a year ago
Thanks for the memories. I took linear algebra a lifetime ago. It was one semester, if memory serves, and we got as far as eigenvectors and eigenvalues. At the time I had no idea what the goal of it all was, except how to find the values and vectors. I was in the tall weeds. So, diff equations, huh?
@lioneloddo a year ago
For me, the magical thing appears when the vectors are waves! And then the magnitudes of these waves are frequencies: a high magnitude means a high frequency. It's so amazing! How can we imagine a vector having not a length, but a frequency?
@danielsmb2635 a year ago
Prof Brunton, I sensed something deep inside "Eigen" (historically, linguistically and technically) many years ago. I have studied it almost every day for about 20 years and still don't understand it fully. Due to the symmetry of the birthdate, I named my son Eigen. My translation of Eigen is: the fundamental law necessary to keep equilibrium under dynamic propagation. Hope to see more videos on Eigen in explaining the universe. In all chaos there is a cosmos; in all disorder, a secret order (Carl Jung) ❤
@carultch 9 months ago
It's essentially like the prefix "auto-" in English. Had this concept been coined in Britain instead of Germany, they likely would be called autovalues and autovectors.
@lalaland9509 2 months ago
great video, thank you!
@andreich8274 a year ago
Brilliant video
@alexcook4851 a year ago
Perfect for soft iron compass calibration
@mauYair a year ago
It seems to me that "eigen" means self/own. I imagine the unmoved directions associated with the transformation: all other directions end up somewhere else; the eigenvectors are the transformation's own (self) directions. I speak Portuguese and Hebrew, and in both languages eigen is translated as self: autovetor/autovalor (pt); ערך עצמי / וקטור עצמי (he). Awesome videos, Steve! You rock!
@tupublicoful a year ago
Self transformation sounds like an appropriate term.
@xtrabass646 a year ago
In Greece we have the word ιδιο in linear algebra where eigen is used. And the meaning is self/own.
@danieljulian4676 6 months ago
Also relevant is the "proprietary" semantics of "eigen", also tending toward "own", or even "hidden", as fundamental properties of the matrix to which they are eigen-val/vec. In the first applications of the concepts, they were translated as "proper values" and "proper vectors" in English.
@hassanlaqrabti4036 a year ago
Professor, I bought your book, but I'm wondering where I can get good exercises, for example in image processing?
@skaska24361 9 months ago
Thank you, Prof. Brunton. My interest is to characterize which solution is more stable and (!) more probable. Is there any correlation between the eigenvalue and the attractor's attractiveness? To me, the eigenvalue clearly gives the depth and stability property, but does it correlate with the "width" of the attractor, e.g. how many random initial conditions can "fall down" into it? Would appreciate any ideas and responses from the community.
@curtpiazza1688 3 months ago
Great lecture! 😂
@byronwatkins2565 a year ago
Eigen means own or belonging to. It can also mean peculiar, inherent, or distinctive.
@AI-DJ a year ago
Thank you. I want to know how to treat scoliosis without surgery...
@cerbahsamir5119 a year ago
But if the lambda is negative, then x kinda changes direction
@dtentes-Bivouac-974 a year ago
First vector (1, 1) or (2, 2) etc.... second vector (1, -1) or (2, -2): linear combination, no? Great video
@alial-ghanimi8357 a year ago
Hi Steve, why are you not using the word "unique" instead of "special" when you refer to eigens?
@benjaminpommer628 a year ago
The formula for diagonalizing a matrix and the one for the eigenvectors do look similar, but they are not the same, since matrix multiplication is not commutative, right? Or how are they linked to each other?
@APaleDot a year ago
They are linked! The motivation for eigenvectors is to find a basis where the transformation matrix boils down to simply scaling along certain axes. If the problem you're working on can be expressed as just stretching or squishing in certain directions, then you have a bunch of vectors multiplied by scalars rather than transforming in all sorts of arbitrary ways, which is what happens with matrix multiplication, and that's a lot easier to deal with.

How is this stretching/shrinking expressed as a matrix? It's a diagonal matrix. The first entry on the diagonal stretches the first axis, the second entry stretches the second axis, and so on...

So what does this have to do with eigenvectors? Well, the eigenvectors tell you the axes along which the stretching occurs. An eigenvector is defined as a vector which is simply scaled by λ when multiplied by its matrix, so if you decompose a vector into components that lie in the same directions as the eigenvectors, each of those components will only get scaled by some λ when multiplied by the given matrix. In other words, the given matrix _looks_ diagonal if you take the eigenvectors as your basis vectors.
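
A small numerical sketch of that last point, reusing the assumed 2x2 stand-in from the earlier sketch (not the video's verified example): in the eigenvector basis, multiplying by A reduces to elementwise scaling by the eigenvalues.

```python
import numpy as np

A = np.array([[5.0, 1.0],   # assumed example matrix from the earlier sketch
              [1.0, 5.0]])
lam, V = np.linalg.eig(A)

# V^{-1} A V is diagonal, with the eigenvalues on the diagonal.
D = np.linalg.inv(V) @ A @ V
assert np.allclose(D, np.diag(lam))

# Multiplying by A = map into eigencoordinates, scale each component
# by its eigenvalue, then map back to the original coordinates.
x = np.array([2.0, 3.0])
x_eig = np.linalg.inv(V) @ x          # x expressed in the eigenvector basis
assert np.allclose(A @ x, V @ (lam * x_eig))
```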
@user-vb3ly2yn9l 10 months ago
After going through 3Blue1Brown's linear algebra series, I can see what this chapter says...
@matthiascarlstedt6037 a year ago
"These aren't the Eigenvalues 🤖you're looking for." 🤣
@muthukamalan.m6316 a year ago
Waiting to get videos on UMAP and t-SNE.
@shashidharreddy2959 a year ago
@33:13 for lambda = 4, x could be [1 -1] or [-1 1]. Does this mean there could be two eigenvectors corresponding to one eigenvalue? But [1 -1] is just a 180-degree rotation of [-1 1], so they are parallel but in opposite directions. Are these considered two different vectors or the same vector?
@csam952 a year ago
Yes, technically xi could be [1 -1] or [-1 1]. In fact there are an infinite number of eigenvectors that have the same eigenvalue. These eigenvectors have the same span (i.e. are on the same line) so they each get stretched/squashed by the same amount (eigenvalue). [1 -1] and [-1 1] are different vectors but they are both vectors with the same span, so they are both in the set of eigenvectors that have an eigenvalue of 4.
@rosskious7084 18 hours ago
Everything on the same line as (1, -1) and (4, -4) (going both ways) will be scaled by a factor of 4 (whether it points in the negative or positive direction). This includes (-1, 1). That is the relationship between the eigenvalue and the eigenvector set.
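
A tiny numerical illustration of this thread's point, again using the assumed 2x2 stand-in from above: every nonzero multiple of [1, -1], including [-1, 1], is scaled by the same eigenvalue 4, so they all describe a single eigendirection.

```python
import numpy as np

A = np.array([[5.0, 1.0],   # assumed example matrix from the earlier sketch
              [1.0, 5.0]])

x = np.array([1.0, -1.0])   # an eigenvector for lambda = 4
for c in [1.0, -1.0, 2.0, -3.5]:    # points along the same line through 0
    assert np.allclose(A @ (c * x), 4.0 * (c * x))
```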
@francescoghizzo a year ago
Why is professor Steve Brunton doing a kamehameha? 😄
@fabiofarina9579 a year ago
it's a kamehame-Ax
@enisten a year ago
I want to see him do a shoryuken on the Schrodinger equation in his next lecture. 😂
@Anorve a year ago
I wonder what they use as a mainboard
@danielsmb2635 a year ago
Transparent glass maybe….
@derekapolloalambatin8136 a month ago
neat.
@lawrencechan3131 a year ago
I wish to do my undergraduate again....😢
@enisten a year ago
I feel you, my friend.
@sahriarhossain360 2 months ago
Something is not making sense to me. The topic feels very complex to me
@et4493 a year ago
Me and ksi are long-time friends 😂 We need a backstory, Mr. Brunton
@davidmurphy563 a year ago
Ok, I got 90% of that, which for an uneducated moron is encouraging. The definition of an eigenvector is straightforward: a vector which is only scaled when transformed by the matrix. Computing the determinant wasn't derived but, hey, it's an area/volume calculation of a parallelogram; base * height, and an identity matrix has an area/volume of 1, so fine.

Where I got lost was with lambda; it looks like that's just a scalar. I _think_ he was just re-expressing the transformed vector as a scalar times the identity matrix. So, 5 * [1, 0] and 4 * [0, 1].

22:33 Where I'm lost is at the top he says vector x * matrix m = vector x * scalar. Oh, that's just the definition of an eigenvector; the transform is the vector scaled. Ok, that's fine. Which in turn equals the scalar * identity (ok, a scaled identity), times that by x. That's x scaled. Ok, that's just saying the same thing once again. Fine.

Then it gets weird... (the matrix * the scaled identity) Why?! Ok, that scales the matrix. * a vector = zero. What?! This is witchcraft, burn him!

Ok, listened to it again. You multiply a vector by the identity, nothing happens. Multiply it by a scalar and it scales. Ohhhh.... Obviously if you scale the identity matrix and multiply that by the transform matrix then you're just scaling the transform matrix. Times that by x and you're scaling x. Why does it equal zero? Zero of what?
@davidmurphy563 a year ago
@@hyperadapted I appreciate the tip but I can't stand that channel. Ugh. Plus, what's happening spatially is crystal clear to me. It's just the final bit of the vector maths to calculate the eigenvector. I'm almost there, I think. I've got hands-on experience coding games, graphics engines and neural nets, but my maths knowledge is below high school, which means I have to learn everything from scratch. Khan Academy has a linear algebra course that's absolutely outstanding. I briefly watched but didn't rigorously work through the eigens with it, so I'll grab a pen and paper and ram it into my thick skull. Thanks again for the suggestion!
@APaleDot a year ago
I know this was commented 3 months ago but I figured I'd try to explain:

We have the fundamental idea behind eigenvectors: when they are multiplied by their matrix, they only get scaled by some amount λ. This is represented in the eigenvector equation Ax = λx, or for our purposes we'll write Ax = λIx, because multiplying by I doesn't change x at all, but it's nice for us because now we have a matrix on both sides of the equation.

Because both sides have matrices we can add/subtract them, so we can rewrite the equation as Ax - λIx = 0 and factor out the x to the right, which becomes (A - λI)x = 0. This is expressing exactly the same truth as the original equation: the eigenvector x will be scaled by λ when transformed by A, but now expressed as "there is no difference between the eigenvector scaled by λ and the eigenvector transformed by A". But this new form of the equation is handy because now we have some matrix M = (A - λI) which sends x to 0. If we can figure out which particular λs satisfy that equation, we've got our answer.

Any transformation that sends x to 0 must collapse one of the dimensions of the space; in other words, it must not only send x to zero but all multiples of x to zero as well. That entire line that lies on x is just completely smashed into the origin. This follows from the linear properties of matrix multiplication, but I won't go into detail on that.

Given that you understand determinants, you'll notice that any transformation which completely smashes one of the dimensions into a point will also completely flatten any volume down to nothing as well. So, we know that the determinant of M must be 0, because we know that M flattens the space by at least one dimension. And the rest is just algebra as shown in the video (solving for the roots of a polynomial).
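
The closing algebra can be checked symbolically. A minimal SymPy sketch (my illustration, again assuming the [[5, 1], [1, 5]] stand-in rather than the video's verified matrix): form det(A - λI), solve for its roots, and read the eigenvectors off the nullspace of A - λI.

```python
import sympy as sp

lam = sp.symbols("lambda")
A = sp.Matrix([[5, 1],
               [1, 5]])

# Characteristic polynomial: det(A - lambda*I).
char_poly = sp.expand((A - lam * sp.eye(2)).det())
print(char_poly)                      # lambda**2 - 10*lambda + 24

# Its roots are the eigenvalues: the lambdas making A - lambda*I singular.
eigenvalues = sp.solve(sp.Eq(char_poly, 0), lam)
print(eigenvalues)                    # [4, 6]

# For each eigenvalue, (A - lambda*I) x = 0: eigenvectors span the nullspace.
for val in eigenvalues:
    print(val, (A - val * sp.eye(2)).nullspace())
```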
@puzu9202 a year ago
Kame-mathe-HA!
@AlessandroZir 3 months ago
Wouldn't it be much simpler and clearer to say that eigenvectors have to do with magnitude and not direction?! I don't understand why all that juggling to say something really simple...
@JoannaPirieHill a year ago
No, I don't know that x is a vector and A is a matrix. They are both shown as scalars in your equations. Using inconsistent notation leads to confusion and prevents learning. Choose a notation that is consistent throughout mathematics. Not doing so is lazy at best and elitist at worst.
@YRBYD 2 months ago
His notation is perfectly clear. "lazy at best and elitist at worst"? He is providing hundreds of hours of high quality educational content for free. That's neither lazy nor elitist.