Okay, if anyone is like me, interested in Linear Algebra and not necessarily Differential Equations, but still wants to understand this lecture for the sake of completeness in this course, here is the list of all the Differential Equations theory you need for this lecture and where you can find content to learn it:
1. Separable differential equations - Khan Academy - this is for v(t) = e^(Lambda t) v(0)
2. 2nd order linear homogeneous equations - Khan Academy - you need this if you want to understand one calculation in the next item on this list (quick to learn anyway)
3. Systems of differential equations - MIT OCW 18.03, lectures 24 and 25 - basically does the first example in this lecture, but actually goes through the steps of explaining what is happening; highly recommended to watch both videos
As someone who is unlikely to apply a whole lot of differential equations in the future, I still found this lecture useful for reinforcing the properties of eigenstuff from the previous lecture. Also, the very last bit of the lecture is very cool.
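The separable-equation fact in item 1 (that dv/dt = λv has solution v(t) = e^(λt) v(0)) is easy to sanity-check numerically. A minimal sketch (Python with numpy assumed; the values of λ, v(0), and t are arbitrary choices for illustration):

```python
import numpy as np

# dv/dt = lam * v has the closed-form solution v(t) = exp(lam * t) * v(0)
lam, v0, t = -3.0, 2.0, 0.5
v_exact = np.exp(lam * t) * v0

# crude forward-Euler integration to check that the exponential really solves the ODE
n = 100_000
dt = t / n
v = v0
for _ in range(n):
    v += dt * lam * v  # dv = lam * v * dt

print(v, v_exact)  # the two agree to several decimal places
```

The Euler estimate converges to the exponential as the step size shrinks, which is the whole point of the Khan Academy material above.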
@mitrus4 Жыл бұрын
the comment I needed, thank you! xD
@lq_129 ай бұрын
gracias
@mightyentity34948 ай бұрын
Thank you, much appreciated
@AnkitAgarwal-gw6qw8 ай бұрын
Thanks a lot buddy! This was really helpful.
@fuchunhsieh73963 ай бұрын
Thanks for the really helpful resources!
@qbtc5 жыл бұрын
This lecture and the one before are some of the hardest in the whole course. Stay the course, everyone.
@projetforme14414 жыл бұрын
@Sirin Kalapatuksice People at MIT probably learn ODEs at the same time as this lecture, in their analysis course.
@snowy01104 жыл бұрын
To whoever thinks it is complicated: you are missing the essentials from 3Blue1Brown. His explanations will give you a solid basis to watch MIT lectures with joy and pleasure.
@liuhsin-yu95934 жыл бұрын
@@snowy0110 Thank you so much, I'll check that out later. I did calculus years ago but forgot all of it, and I'm struggling with the course.
@SahilZen42 Жыл бұрын
But both seem interesting to me 👍
@seventyfive75976 жыл бұрын
For those confused by this lecture: don't worry, you should be. It's not a failure of yours; it's a mistake in the syllabus description. I'm happy to watch these videos as a review of things I ALREADY LEARNED years ago, because if I were first exposed to this lecture with only the prerequisites the course mentions, it wouldn't have been possible to understand it. While the course specifies 18.02 as a prereq (and then immediately dismisses it in its syllabus as not important), this lecture requires both 18.02 and 18.03, and the latter is not mentioned in the syllabus at all. Universities usually run these kinds of courses in parallel with corequisites such as 18.03, but OCW does not hint at that on their site or here. While for me it was nice, I recognize that if you don't know diff. eqs., this would not be possible for you to learn. The good news is that you can skip one or two lectures and be right back on track, as this is just a nice demonstration of an application and does not block the Lin. Alg. learning.
@ashutoshtiwari43985 жыл бұрын
Thanks for stating that man. Although I figured that out, reading this reassured me.
@gordonlim23225 жыл бұрын
The textbook actually says: "This section is mostly linear algebra, but to read it you need one fact from calculus: the derivative of e^(lambda t) is lambda e^(lambda t)", which I learnt before in junior college. However, I have not yet learnt differential equations at the university level, and this lecture has been especially hard for me. The textbook also does not follow the same flow or use the same examples as the video. At the same time, I'm hesitant to skip a whole section for fear that I might actually already know the prerequisites and merely stumbled over how the lecture is conducted.
@RAJATTHEPAGAL4 жыл бұрын
I think I did this in first year. I have seen this kind of e^t solution for systems with multiple ODE constraints. Is this the same thing, just solved more elegantly using matrices? I remember we used to assign the differentials some sort of variable, form a polynomial out of it, solve it, and then build a solution. Is the matrix method doing the same thing, just more elegantly? I just wanted to know... it's been years since I studied this.
@AdamOmidpanah3 жыл бұрын
Most unis have a whole course dedicated to diff eq. However, most engineering programs throw students into a statics course at the same time as they're taking diff eq. Having elementary exposure to diff eq prior to statics makes students much quicker, and they will be quicker to move on to numerical methods for non-linear diff eq.
@mississippijohnfahey71752 жыл бұрын
@@gordonlim2322 unfortunately the textbook doesn't follow the lectures very well at all... Best to watch the lectures first (both 18.03 and 18.06), take notes, then use that to study the textbook. It's taken me about a year to tackle all that on my own, but I was taking a full time course load at my university as well. Now I'm reviewing select lectures for the second time and it's helping solidify everything nicely
@tachyon77775 жыл бұрын
The two dimensional sound channel matrix has only one independent vector. Bad one.
@puneetgarg4731 Жыл бұрын
You can use projection Matrix Mono Sound to solve the problem
@danishji2172 Жыл бұрын
@@puneetgarg4731 You seem like a smart man. Been following your comments around for a while here.
@puneetgarg4731 Жыл бұрын
@@danishji2172 thank you
@yazhouhao70867 жыл бұрын
The greatest instructor I have ever met!! Thank you, Dr. Strang!
@JDBolt19587 жыл бұрын
An awesome lecture indeed! I studied elementary differential equations at university 40 years ago; it took about two weeks of lectures to present the same material. Linear algebra clarifies the process very well. The information relayed here conveys the cumulative effort of a lifetime of study and concentration... as a hobby... yet some students will soak this up in minutes, work problems for a couple of weeks, and have it down... Awesome!!!!
@ozzyfromspace4 жыл бұрын
This lecture was all about flexing the computational power of eigenstuffs. I'm definitely impressed!
@imagedezach5 жыл бұрын
My right ear learned a lot
@AmolDeometaphys5 жыл бұрын
Turn on mono sound in Ease of Access if you use Windows.
@aryajane64244 жыл бұрын
@@AmolDeometaphys thanks for your advice :)
@mskiptr4 жыл бұрын
You can also do the same on Android
@ze24114 жыл бұрын
hahahahahahha awesome comment
@ChangAi-jp9ivАй бұрын
Thanks for your comment, I can give my left ear a chance to study!😎
@molraah90467 жыл бұрын
I watched this lecture maybe three times, and I'm still screaming. Thank you, Prof. Gilbert Strang.
@starriet2 жыл бұрын
(note for myself)
23:46 - If the two eigenvalues are complex numbers, they'll be conjugates of each other (so the sum becomes a real number).
31:47, 33:26 - Look up what a 'matrix exponential' is. It is *NOT* just a notation for the element-wise exponential.
* Also note that this lecture does *not* cover all the rigorous mathematical proofs, like the other lectures of Prof. Strang (at least in this Linear Algebra series so far).
* The solution using the 'matrix exponential' notation (31:55) can be understood from 44:30, given that the Taylor series of the matrix is the _definition_ of the matrix exponential notation.
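The warning in the note above is easy to verify: the matrix exponential is not the element-wise exponential. A quick check (Python with numpy and scipy assumed; the nilpotent matrix is an arbitrary example):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # nilpotent: A @ A = 0

elementwise = np.exp(A)      # exponentiates each entry separately
matrix_exp = expm(A)         # e^A = I + A + A^2/2! + ... = I + A here, since A^2 = 0

print(elementwise)
print(matrix_exp)            # the two disagree in the off-diagonal and lower-left entries
```

Because A is nilpotent, the Taylor series truncates after two terms, so e^A = I + A exactly, while np.exp(A) puts e in the (0, 1) slot and 1 everywhere A has a zero.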
@damnit2585 жыл бұрын
He just keeps pushing the boundaries!! It's not that I don't understand, but I never thought you could use such tools this way!
@dunsacc9 жыл бұрын
This one lecture covers most of my math and control engineering module. Guy's a genius.
@computername9 жыл бұрын
+dunsacc Same here. You really know someone gets it when he can explain something others refer to as hard in genuinely simple words.
@jessstuart7495 Жыл бұрын
Dr. Strang effortlessly incorporates little nuggets of wisdom in his lectures to keep the more advanced students interested, at the same time keeping most of the lecture simple enough so other students don't get lost. Bravo Sir!
@IntegralMoon9 жыл бұрын
This lecture is amazing! Its amazing how well he communicates these concepts
@pengjiang52817 жыл бұрын
Amazing lecture! It will be better to learn this course with the textbook also by Strang. The videos provide big pictures and kind of whet your appetite for reading the book. The book gives detailed explanation and just satisfies your curiosity.
@rongrongmiao46387 жыл бұрын
OK guys, don't complain about the difficulty of this section. You're going to need some differential equations knowledge in order to fully understand this one and the next one (FFT), e.g. solving a system of differential equations. Prof. Strang did a fantastic job here; I had never used a linear algebra point of view to understand systems of differential equations and stability.
@waverly24686 жыл бұрын
It helps to have Strang's textbook. But yes, you would need to have taken a course in Diff. equations. Knowing a little bit about control systems theory would help, too.
@joshkhorsandi49692 жыл бұрын
For anyone looking for some more visual intuition, 3blue1Brown has an amazing video on raising matrices to powers in his DE series.
@henryzhu73094 жыл бұрын
Awesome lecture. Neat and consistent. The stability part is much better than what i have learned in linear system class.
@georgesadler78303 жыл бұрын
Although the audio is low in this video, DR. Strang relay the information on differential equations and exp(At) very well. Matrix exponential is very important in linear systems.
@archishmanchaudhuri63113 ай бұрын
A note for those considering [-1 1]^T as the second eigenvector: we get c1 = 1/3 and c2 = -1/3.
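The coefficients quoted above can be checked by solving Sc = u(0) with the lecture's A and initial condition; a sketch (numpy assumed, using the eigenvector [-1, 1]^T as in the comment):

```python
import numpy as np

# Lecture's first example: du/dt = A u, with u(0) = [1, 0]
A = np.array([[-1.0, 2.0],
              [1.0, -2.0]])
u0 = np.array([1.0, 0.0])

# eigenpairs: lambda1 = 0 with x1 = [2, 1]; lambda2 = -3 with x2 = [-1, 1]
S = np.array([[2.0, -1.0],
              [1.0,  1.0]])   # eigenvectors as columns

# u(0) = c1*x1 + c2*x2, so solve S c = u(0)
c = np.linalg.solve(S, u0)
print(c)  # c1 = 1/3, c2 = -1/3, matching the comment
```

Flipping the sign of the second eigenvector just flips the sign of c2, which is why the lecture's choice of [1, -1]^T gives c2 = +1/3 instead.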
@TheDareDevil251013 жыл бұрын
These lectures are awesome. Strang makes Linear Algebra seem so easy, the complete opposite of his "colleagues" at my university. I've stopped attending their lectures in favour of these. Thanks, MIT (W. Gilbert Strang)!
@siddharthkailasam91706 жыл бұрын
high five broo (same situation even after 7 years lol)
@grantguo9399 Жыл бұрын
i think this lecture is probably the most difficult one in this series, but as usual well explained!
@Leonugent20123 жыл бұрын
Professor Strang is the Columbo of Linear Algebra
@kstahmer10 жыл бұрын
@hypnoticpoisons Use Euler's formula with x = 6t:
e^(ix) = cos x + i sin x
|e^(ix)| = |cos x + i sin x| = sqrt((cos x + i sin x)(cos x - i sin x)) = sqrt(cos^2 x + sin^2 x) = sqrt(1) = 1
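The computation above can also be confirmed numerically (Python with the stdlib cmath module; the sample time t is arbitrary):

```python
import cmath

t = 0.37                 # arbitrary sample time
z = cmath.exp(6j * t)    # e^{6it} = cos(6t) + i sin(6t)
print(abs(z))            # modulus is 1 for every t
```

Whatever t you pick, abs(z) comes out as 1 up to rounding, which is why the oscillating term in the lecture neither grows nor decays.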
@mind-blowing_tumbleweed Жыл бұрын
Thank you, sir. It was very valuable for those of us who aren't yet too comfortable with imaginary stuff.
@anmol_dot_ninja4 жыл бұрын
i have never been so lost in my entire life.....
@projetforme14414 жыл бұрын
Keep learning, this is worth it
@thovinh53866 жыл бұрын
Took me almost 3 hours to get close to "fully understand" the lecture.
@ashutoshtiwari43985 жыл бұрын
Same.
@danieljulian46765 жыл бұрын
Yeah, I watched the lecture twice and it all gelled for me. There's a beautiful thing lurking in here, I think, which is the situation where only one eigenvalue is relevant.
@muyuanliu31754 ай бұрын
Me too, but on the third try.
@ozzyfromspace4 жыл бұрын
For that first example, I don't think assuming the solution was a linear combination of exponentials was the right way to go. Of course, it's correct, but it's not very instructive. I solved this example problem on my own, and this is what I did: I broke the system of ODEs into a system of difference equations (finite-difference approximations of the ODEs). Then I solved for the vector one node away in terms of the first node by reapplying something like u_(n+1) = A*u_n over and over. The computational hack here is that A, being a diagonalizable matrix, doesn't actually need to be multiplied many, many times. Once you get your answer, take the limit of your solution so that the difference equations converge to the original ODEs, and boom, you have your solution. It blew my mind a lil bit when I got the right answer lol 😅.
@sergiohuaman60844 жыл бұрын
@26:00 shouldn't it be det >=0 to account for the solutions with steady state? And I suppose yes, this lecture is clear if you have been previously exposed to at least basic differential equations. But hey! OCW offers that course for free too! ;)
@purusharthmalik93413 жыл бұрын
You are confusing stability with steady state. A system is said to be stable when equilibrium is achieved, i.e. the system goes to 0. For that to happen, the strict condition is that the real parts of all the eigenvalues MUST be negative. If one or more of the eigenvalues is 0, the system MAY OR MAY NOT be stable, depending on the multiplicity of the eigenvalues.
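The condition in the reply (all eigenvalue real parts negative) can be watched in action; a sketch (Python with numpy and scipy assumed, matrix and initial condition chosen for illustration):

```python
import numpy as np
from scipy.linalg import expm

# A has eigenvalues -1 and -2, so every solution u(t) = e^{At} u(0) decays to 0
A = np.array([[-1.0, 1.0],
              [0.0, -2.0]])
u0 = np.array([3.0, 5.0])

# confirm the stability condition: all real parts negative
assert all(ev.real < 0 for ev in np.linalg.eigvals(A))

u_late = expm(A * 20.0) @ u0          # state after a long time
print(np.linalg.norm(u_late))         # essentially zero
```

Swapping in a matrix with a zero eigenvalue (e.g. the lecture's A with eigenvalues 0 and -3) would instead settle to a nonzero steady state, which is exactly the distinction the reply draws.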
@lucilius94914 жыл бұрын
I think it's better to first watch the amazing TA Linan Chen video then come back and watch this lecture
@adamlevin63288 жыл бұрын
Seems like he had an extra cup of coffee this morning
@nimaabbasi19276 жыл бұрын
probably u were watching it on 1.5x , too LOL
@ww6606 жыл бұрын
@@nimaabbasi1927 i m on 2x speed. he is shivering lol.
@ZhanyeLI4 жыл бұрын
haha, that is true
@xy11123 жыл бұрын
If you are not familiar with the Laplace transform, especially coming from a signals and systems background, you do get lost here; it's not because of your poor understanding. But for EE, this is an essential and magical piece!
@bikespike4150 Жыл бұрын
This was a great lecture as usual. My only complaint would be that the audio is only present on the right channel. I am sure many others would also appreciate if that could possibly be fixed. I would not mind doing that myself if there is a contact that I could send it to. Cheers
@pablo_CFO4 жыл бұрын
For all of you who are lost in this lecture, don't worry: if you are just starting your degree and have not seen calculus or differential equations, this will be almost impossible to understand. I think this lecture seeks to show the potential of linear algebra in other fields, and I don't think it is necessary to fully understand it if you are just starting out. Don't get frustrated by not understanding it; the level is much higher than the previous ones. By the way, if you develop an interest in dynamical systems (chaos theory), this is definitely a lecture that you will remember in the future.
@iyalovecky10 жыл бұрын
ERROR: the stability regions are mixed up. The unit circle is for powers (difference equations), and the left half-plane is for exponentials (differential equations).
@TrevorKafka13 жыл бұрын
@hypnoticpoisons Because e^(6it) = cos(6t)+i sin(6t), which always has a modulus of 1.
@harshsaxena1999it6 жыл бұрын
Can I call this a "Kafkaesque explanation of |z|^2 = z(z̄)"?
@Weightlifeter14 жыл бұрын
@sbhdgr8 No. He's trying to write the second-order differential equation as a system of first-order linear equations; basically you substitute y' = some new variable, manipulating the expressions so the 2nd-order equation is "transformed" into equations of first order. The matrix of coefficients contains the coefficients of the resulting "transformed" system of 1st-order linear equations. It is the same idea as substitution in higher-power equations like ax^4 + bx^2 + c.
@santagill3566 Жыл бұрын
29:50 Why can he set u = Sv? Should S have n independent eigenvectors, i.e. 2 for the 2-by-2 A?
@yunfan70343 жыл бұрын
Can anyone explain at 32:18 why u(t) = S e^(Lambda t) S^(-1) u(0) = e^(At) u(0)? From 9:33, for example, u(0) = c1*x1 + c2*x2 (at t = 0, the factors e^(lambda1 t) and e^(lambda2 t) are 1).
@vivianhuang56472 жыл бұрын
My understanding: the earlier part of the lecture showed that u(t) is some form of exponential, u(t) = e^(At) u(0), where u(0) is just the initial condition positioning u(t)'s starting point.
The professor set u = Sv, where the columns of S are the eigenvectors of A. We have u' = (Sv)' = Au = ASv. S is a constant matrix, so take it out: (Sv)' = Sv' = ASv, i.e. S(dv/dt) = ASv, so dv/dt = S^(-1)ASv. We know that S^(-1)AS is the diagonal matrix of lambdas, so dv/dt = Lambda v, which shows that v is some form of exponential as well: v(t) = e^(Lambda t) v(0).
Given u = Sv, substitute u(0) = S v(0) (which gives v(0) = S^(-1) u(0)), so u(t) = S v(t) = S e^(Lambda t) S^(-1) u(0).
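The chain of identities above, u(t) = S e^(Lambda t) S^(-1) u(0) = e^(At) u(0), can be verified numerically (Python with numpy and scipy assumed; A is the lecture's example matrix, t is arbitrary):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[-1.0, 2.0],
              [1.0, -2.0]])
u0 = np.array([1.0, 0.0])
t = 1.5

lam, S = np.linalg.eig(A)            # columns of S are eigenvectors of A
exp_Lt = np.diag(np.exp(lam * t))    # e^{Lambda t} is diagonal: exponentiate each lambda*t
u_via_eig = S @ exp_Lt @ np.linalg.inv(S) @ u0
u_via_expm = expm(A * t) @ u0        # direct matrix exponential

print(u_via_eig, u_via_expm)         # identical up to rounding
```

This is exactly the identity e^(At) = S e^(Lambda t) S^(-1) that the lecture builds from the Taylor-series definition.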
@yunfan70342 жыл бұрын
@@vivianhuang5647 appreciate your help.
@andreagjoka1039 жыл бұрын
At 30:41 he writes "set u = Sv". He says that S is the eigenvector matrix, but he does not say anything about what this small "v" is, or why he did that trick... it is disheartening.
@LGLucid9 жыл бұрын
+Andrea Gjoka I understand how that could frustrate you. Here's what he's doing. He's decomposing "u" into a linear combination of the eigenvectors, which are arranged in the matrix "S". The elements of the vector "v" are the weights in that linear combination. He performed that "trick" so he could get rid of "A" (in which the variables of the system are coupled together), and replace it with capital Lambda, which is a diagonal matrix (uncoupled variables). The fact that capital Lambda is diagonal is important because the exponential of a diagonal matrix turns out to be trivial. So he's setting up an identity that relates the exponential of a non-diagonal matrix to the exponential of a diagonal one (multiplied as needed by the eigenvector matrix and its inverse). This is the foundation for the series-based definition of matrix exponentials he presents a little later in the lecture. Hope this helps.
@andreagjoka1039 жыл бұрын
+L.G. Lucid Thank you for the explanation.I have to see it again now. you seem to focus better than I do ..
@swapnils69026 жыл бұрын
He is merely trying to express u as a linear combination of the eigenvectors. Vector v is not composed of constants; S contains the constants. All components of v are time- (or t-) dependent. Hope it helped.
@TeejK4 жыл бұрын
In case anyone had the same question, here's how I figured it out: the columns of S span the space (they form a set of basis vectors), so you can reach any vector in the space with the right combination of those columns. It is similar to what was done in the last lecture, transforming u_0 into Sc.
@yaboimdaro2 жыл бұрын
For those who have struggled with where -b and -k come from: note that y'' = dy'/dt = -by' - ky.
@sudharshantr8757 Жыл бұрын
If the formula for (I - At)^(-1) at 38:49 is correct, then for At = 2I the RHS must blow up, but clearly (I - 2I)^(-1) exists and is -I. Am I missing something?
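A sketch of the situation in the question (Python with numpy assumed): the geometric-series formula for (I - At)^(-1) only converges when the eigenvalues of At lie inside the unit circle, but the inverse itself can exist outside that region, as the At = 2I example shows.

```python
import numpy as np

I = np.eye(2)
At = 2.0 * I                     # eigenvalues 2, outside the unit circle

inv = np.linalg.inv(I - At)      # (I - 2I)^{-1} = -I exists just fine
print(inv)

# ...but the series I + At + (At)^2 + ... diverges for this At:
partial = I.copy()
term = I.copy()
for _ in range(20):
    term = term @ At             # next power of At
    partial += term
print(np.linalg.norm(partial))   # growing without bound, nowhere near -I
```

The partial sums here are (2^(n+1) - 1) I, which blow up even though the closed-form inverse is perfectly well defined; the series and the inverse only agree inside the series' region of convergence.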
@theSpicyHam Жыл бұрын
these are my favorite lessons
@technoshrink9 жыл бұрын
Copyrighted in 2000, made in 2005... Has MIT discovered time travel??
@andreagjoka1039 жыл бұрын
uploaded in 2009 also
@mitocw9 жыл бұрын
+Andrea Gjoka The videos were recorded in 1999, published on MIT OpenCourseWare in 2005, and re-published at a higher resolution in 2009.
@andreagjoka1039 жыл бұрын
very satisfactory explanation! i wonder though if the curriculum has changed since 1999.
@shubhamtalks97185 жыл бұрын
@@mitocw Oh my GOD, these lectures are so old... I didn't realize it. They are still the best.
@shubhamtalks97185 жыл бұрын
@@mitocw Please fix this video's audio channel.
@mikechen31743 жыл бұрын
The comments are always much consolation to see...
@jaykane422 жыл бұрын
If anyone manages to see this: should I have done differential equations by now, or not? Also, should I do calculus 3 (multivariable calculus) before differential equations, or differential equations first? Thanks to anyone who responds, and I hope you have had a lovely day so far!
@initiald9759 ай бұрын
Wow, a video from 14 years ago is better than my math teacher today.
@ianyappy12 жыл бұрын
S contains a bunch of numbers independent of t (the eigenvectors), so d/dt and S would commute.
@Johnnymatics14 жыл бұрын
Great! What a gifted Professor, thank you Dr. Strang!
@enesozcan20927 жыл бұрын
Even the great Gilbert Strang cannot make me love this freaking algebra.
@tathagatanandi58136 жыл бұрын
Ahhh!!! Mind blown near the end!!!
@pathung20024 жыл бұрын
This lecture is amazing...
@clarencekoh69215 жыл бұрын
The corresponding lecture form 18.03 by MIT OCW that bridges this module is Lecture 25. kzbin.info/www/bejne/npalp4mfiM5srrM
@ignatiusjacquesreilly704 жыл бұрын
To be clear that I'm not missing something in the lecture, he assumes exp(lambda*t) is a solution to the matrix differential equation BEFORE he has explained what a matrix exponential is or shown how it might be derived, correct? I'm asking because I want to make sure I'm not missing some section of the video. I went back and watched parts of it multiple times and I think he is skipping around at some places in a way that makes it hard to follow, even if you're familiar with differential equations and the material up to this point. I generally like his broad and conceptual approach but in this video I believe there were too many jumps from one line of thought to another. Nonetheless, once I got past some of the excessive handwaviness, the overall approach to decoupling differential equations is excellent and I learned a lot.
@sunritroykarmakar44063 жыл бұрын
Yes
@kingplunger13 ай бұрын
36:45 why ?
@sanchayanbhowal44464 жыл бұрын
This is the best lecture series of Linear algebra ❤️
@pelemanov13 жыл бұрын
@hypnoticpoisons I think x is limited to numbers between 0 and 1
@dennisting52093 жыл бұрын
I just finished watching 3Blue1Brown’s first video on matrix exponentials, it’s so well made. Even though I feel like I haven’t fully understood every aspect of this lecture, I do believe that with the help of 3Blue1Brown, this lecture might turn into intuition for me someday.
@zee6397611 жыл бұрын
life saver for my final
@MsBean-sv8zo2 жыл бұрын
Can anyone explain, at 31:39, how do we know that v = e^lambda t * v(0)? Where did the v(0) come from?
@condafarti Жыл бұрын
It's the initial condition of the function v(t). Since at time = 0 the factor e^(λt) equals 1, it's just v(0). It's essentially the C (representing the initial condition) referred to in exponential models and differential equations.
@seamus98985 жыл бұрын
31:53 what is this "natural notation"? I wish he derived that form from u(t) = c1e^(λt)x1
@gaming4life255 жыл бұрын
I'm also struggling with that part. Please explain...
@박정범-o3k4 жыл бұрын
From the fact that dv(t)/dt = Λ * v(t), try to derive formula for v(t). You'll get v(t) = c1 * e^(λ1t) * [1; 0] + c2 * e^(λ2t) * [0; 1]. When you set t = 0, you'll get c1 = v1(0) and c2 = v2(0). So it becomes the "natural notation".
@abdelaziz27883 жыл бұрын
30:30 I guess this should have been at the beginning of the lecture, but it was a very nice lecture. Thank you for your efforts.
@Hotheaddragon4 жыл бұрын
At 49:30 can someone please explain how -b and -k appear in the matrix? I tried hard but couldn't understand it. Am I missing some sort of prerequisite here?
@mitocw4 жыл бұрын
Prerequisites: Multivariable Calculus (18.02). See ocw.mit.edu/18-06S05 for more info. Best wishes on your studies!
@thedailyepochs3384 жыл бұрын
He made y'' the subject of the formula, which took b and k to the other side of the equation; the first row of the matrix then gives that equation, and the second row gives the second (trivial) equation. Hope this helps.
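The construction described above, for y'' + by' + ky = 0 with companion matrix A = [[-b, -k], [1, 0]] acting on [y', y]^T, can be checked: the eigenvalues of A are exactly the roots of λ² + bλ + k = 0. A sketch (numpy assumed; b and k are sample values):

```python
import numpy as np

b, k = 3.0, 2.0                         # sample damping and stiffness
A = np.array([[-b, -k],
              [1.0, 0.0]])              # d/dt [y', y]^T = A [y', y]^T

eigs = np.sort(np.linalg.eigvals(A))
roots = np.sort(np.roots([1.0, b, k]))  # roots of lambda^2 + b*lambda + k = 0
print(eigs, roots)                      # both are [-2, -1] for these b, k
```

That match is why solving the first-order system with eigenvalues recovers the same exponents e^(λt) you would get from the classical characteristic-equation method for the second-order ODE.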
@Weightlifeter14 жыл бұрын
@sbhdgr8 You substitute z = x^2 so the higher-power "quadratic" equation can be written as an actual quadratic, az^2 + bz + c.
@jcjobin12 жыл бұрын
This is genius! Now I want to go to MIT, but I don't have enough money... guess I will stay at Dalhousie.
@middlevoids Жыл бұрын
Gilbert "The Wizard" Strang!
@suziiemusic8 жыл бұрын
Dear God, Thank you for Gilbert Strang
@kishkinay30423 жыл бұрын
If I need to understand this lecture to proceed with the rest of the course, I’d better just give up maths for life...
@MRIDDY279 жыл бұрын
Seeing this lecture... I say it's enough linear algebra... I am done with it... most of the points are unclear in this lecture... but before this, every lecture was extraordinary.
@Sandyy101017 жыл бұрын
yeah!! Everything is clear until this lecture... Maybe it's because he assumes we have taken some courses of 18.01-05 before watching this video
@DeadPool-jt1ci4 жыл бұрын
@@Sandyy10101 18.06 has 18.01 (Single Variable Calculus), 18.02 (Multivariable Calculus), and 18.03 (Differential Equations) as prerequisites. But my guess is, even if you don't understand the differential equations stuff, it's still OK; you can just skip any video on things like Fourier transforms or diff eqs. You should definitely watch the positive definiteness lectures.
@differentialeq7698 жыл бұрын
At 32:08, can anyone explain how he derives v(t) and u(t)?
@davidlovell7297 жыл бұрын
Start with dv/dt = Lambda v. This says the derivative of a (vector) variable is a constant times the variable. From differential equations, you know the solution is an exponential function (multiplied by the initial condition). This is just expressing it as a vector (i.e., solving several single-variable cases simultaneously). Thus v(t) = e^(Lambda t) v(0).
But v was the vector he got by taking u and finding what combination of the columns of S would produce u; v is the vector of those coefficients. Since there are n independent columns of S, they form a basis, so any vector u can be written as a combination of those columns. Thus u = Sv.
So the next step is u(t) = S v(t) = S e^(Lambda t) v(0). But since u(0) = S v(0), we have v(0) = S^(-1) u(0). Substitute that, and you're done: u(t) = S e^(Lambda t) S^(-1) u(0).
@bharathkamath13047 жыл бұрын
The notes for this lecture on ocw.mit.edu mixed up u(0) and v(0) in a bunch of places (like u(t) = S e^(Lambda t) S^(-1) v(0)). I spent hours trying to figure out what was going on before I saw this comment XD. Thanks!
@jerrychen43486 жыл бұрын
god~~ thank, man. you save my day.👍
@胯下蜈蚣長老4 жыл бұрын
thanks, you save my month lol
@TeejK4 жыл бұрын
@@davidlovell729 lifesaver
@syedsheheryarbokhari27804 жыл бұрын
Prof. Strang is amazing, but you should skip this lecture if you are NOT familiar with differential equations. It is too fast-paced and avoids repeated eigenvalues. I didn't find the book section very helpful either. The good thing is that skipping this "applications" lecture doesn't affect the series. I would recommend Edwards-Penney, Differential Equations, Chapter 5 if you really want to understand what is going on and where these equations come from (i.e. the link between eigenvectors, exponential matrices, and differential equations). You can look at OCW 18.03, section 4 as well, which explains this at a reasonable pace. Or you can just leave the thinking aside and commit the following to memory: a system of differential equations has a solution in terms of the eigenvectors and eigenvalues of A; the exponential of a matrix is a Taylor series with the scalar replaced by a matrix, and it follows the same derivative rule; for stability, the real parts of the eigenvalues have to be negative.
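The "Taylor series with the scalar replaced by a matrix" point above can be made concrete: partial sums of I + At + (At)²/2! + ... converge to e^(At). A sketch (Python with numpy and scipy assumed; A is the lecture's example matrix):

```python
import numpy as np
from scipy.linalg import expm
from math import factorial

A = np.array([[-1.0, 2.0],
              [1.0, -2.0]])
t = 1.0

# sum the first 20 terms of the series I + At + (At)^2/2! + ...
partial = np.zeros((2, 2))
power = np.eye(2)                # (At)^0
for n in range(20):
    partial += power / factorial(n)
    power = power @ (A * t)      # next power of At

print(np.max(np.abs(partial - expm(A * t))))  # tiny: the series has converged
```

Like the scalar exponential, the matrix series converges for every A and t (no radius-of-convergence restriction), which is why 20 terms already agree with scipy's expm to high precision here.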
@ricardocesargomes72748 жыл бұрын
very satisfactory explanation!
@eccesignumrex44827 жыл бұрын
Best one Yet ! !! !
@sansha26874 жыл бұрын
46:30, 46:45, 49:45
@thovinh53866 жыл бұрын
I'm so going to forget all the homework I've done in this course (because it teaches me how to compute rather than actual knowledge; I'll still remember the "knowledge" from the lectures, though).
Confusing parts:
0:00 What does that linear problem really mean? (Why is the solution only u(t) when there are 2 variables, u1 and u2? Shouldn't it be u1(t) and u2(t)? Maybe that's some calculus I don't know yet.)
9:55 (Au popping up like a wild Pokemon)
18:48 (What is that value, the sine and cosine thingy, and why? Maybe that's some calculus fact I should just accept for now.)
I've heard about Taylor series. Is (1-x)^(-1) = 1 + x + x^2 + ... true for every x? Oh, here's the answer: mathforum.org/library/drmath/view/61847.html
But then how can we judge whether At < I? monkaHmm
@programyourface8 жыл бұрын
Do I need to know calculus for this? ;( I will learn that first and then watch this again.
@antoniolewis10168 жыл бұрын
Yes, you do.
@davidlovell7297 жыл бұрын
You need to know first order linear differential equations. You might learn this somewhere towards the end of a single-variable integral calculus course, and of course you learn it in a first course on differential equations.
@andreagjoka1038 жыл бұрын
In the book there is a part I did not get at all, namely the one starting "to display a circle on the screen, replace y'' = -y' with a finite difference equation", from the end of page 315 to page 317. Has anyone understood that part?
@andyralph94953 жыл бұрын
e^At does not converge as t tends to infinity....why does he say that?
@sbhdgr814 жыл бұрын
I have a question if someone could answer it: right at the end of the lecture, a second-order diff. eqn. is converted into a 2x2 matrix called A. While defining the matrix A, shouldn't element A22 be -k/b? Rearranging the given equation and solving for y' gives y' = -(1/b) y'' - (k/b) y; since the coefficient of y is -k/b, shouldn't this coefficient correspond to element A22 of the 2x2 matrix A?
@SudeepJoshi222 жыл бұрын
No, because as the Professor explained in the last seconds of the lecture, in order to convert a 2nd-order system to first order we have to introduce a redundant equation; hence elements A21 and A22 must be 1 and 0 to complete the matrix.
@codenzar77723 жыл бұрын
I watched at 2x and understood everything, and now I feel like I need to watch it again at 0.25x.
@reginaldorodrigues35302 жыл бұрын
Great teacher.
@ashutoshsingla72694 жыл бұрын
For a moment I thought my left ear had given up.
@lee_land_y696 жыл бұрын
I think I'll leave this and the next lecture for some other day, because I don't have a solid enough background in diff eqs to understand these two lectures. For those who have completed the course: are these two lectures a must for understanding the rest of the course? Thanks.
@NisargJain6 жыл бұрын
Not quite. But I was doing this through OCW, so it came up in quiz 2. I don't think you'll need it later unless you want to solve higher-order differential equations. But this is as important as the applications of linear algebra get, so you should study it nevertheless.
@ognjenfilipovic68934 жыл бұрын
Great course! Thank you a lot!
@welcomethanks51922 жыл бұрын
Why is u(0) = [1, 0]^T? Is it a dummy vector to help with the calculation?
@thovinh53866 жыл бұрын
As a non-native-English-speaking university student, the first minute of the video was like the professor speaking an alien language to me :|
@hariprasadyalla7 жыл бұрын
Nice lecture. It's easy to see why the solutions are exponential in t, but I did not get why the solution is exponential in lambda*t.
@frazulabrar93982 жыл бұрын
He showed at 9:58
@himanchalsingh11356 жыл бұрын
Can anyone give me an example of a real symmetric matrix whose eigenvalues and singular values are not the same?
@andyralph94953 жыл бұрын
At the end, why does he say that the matrix converts a 5th-order differential equation to first order? There are higher orders on the LHS...
@pavlenikacevic49767 жыл бұрын
48:14 you change it by switching from Lagrangian mechanics to Hamiltonian :D
@ilkertalatcankutlucan32575 жыл бұрын
Hey folks, it went smoothly for me until minute 36, but I'm stuck there. I thought a Taylor series expansion approximates a function around a fixed point, and Mr. Strang simply uses t = 0 as the point for his expansion. As far as I know, such an expansion should only approximate in a limited neighbourhood of t = 0. Am I mistaken somewhere? How come this Taylor expansion is a general thing? How can it represent the whole of e^(At) for all t?
@francoisduhem25854 жыл бұрын
ilker Talat Can KUTLUCAN i think it’s because the radius of convergence of the exponential function is infinite.
@zolanhlangulela947 Жыл бұрын
Is zero always an eigenvalue for a 2x2 matrix?
@sahilnegi43263 жыл бұрын
But why is the trace the sum of the eigenvalues, and the determinant their product?
@thetheoreticalphysicist585211 ай бұрын
Beautiful
@bud38913 жыл бұрын
if i had this class........ i would fail it.
@Sandyy101017 жыл бұрын
same lol
@ecd42820036 жыл бұрын
Superb!
@gaming4life255 жыл бұрын
I lost at 31:53. Somebody help...
@chetankushwaha90524 жыл бұрын
Me 🙋 too plzz help
@MultiRNR10 ай бұрын
It's still hard to understand why we need to think of eigenvalues and eigenvectors in the first place.
@MultiRNR10 ай бұрын
Figured it out; all doubts are answered in the previous lecture's note: openlearninglibrary.mit.edu/assets/courseware/v1/3a9097c7342f1022aa9e3395196125eb/asset-v1:OCW+18.06SC+2T2019+type@asset+block/18.06_Unit_II_2.9_Lecsum.pdf The key is the difference equation that makes full use of the properties of eigenvectors and eigenvalues, which helps transform the problem into a closed exponential form. Simply brilliant 😮
@MultiRNR10 ай бұрын
Something similar applies to the differential equations here: we construct an e^(lambda*t) form because its derivative puts lambda in front, which leads to the eigenvalue equation.
@cherma114 жыл бұрын
34:13 is a beauty
@alijoueizadeh84775 жыл бұрын
Thank you.
@hypnoticpoisons13 жыл бұрын
36:21 how can sum(x^n) be 1/(1-x), that is obviously too small
@danieljulian46765 жыл бұрын
the series only works if the |x| < 1; there are derivations all over the place. This is so basic.
@ozzyfromspace4 жыл бұрын
Actually, if you’re doing Cesaro sums, the statement is valid for abs(x) > 1. It’s not that basic....