23. Differential Equations and exp(At)

367,831 views

MIT OpenCourseWare



Comments: 218
@TeejK
@TeejK 4 жыл бұрын
Okay, if anyone is like me, interested in Linear Algebra and not necessarily Differential Equations, but still wants to understand this lecture for the sake of completeness in this course, here is the list of all the differential equations theory you need for this lecture and where you can find content to learn it:
1. Separable differential equations - Khan Academy - this is for v(t) = exp(lambda t) v(0)
2. 2nd order linear homogeneous equations - Khan Academy - you need this if you want to understand one calculation in the next item on this list (quick to learn anyway)
3. Systems of differential equations - MIT OCW 18.03, lectures 24 and 25 - basically does the first example in this lecture, but actually goes through the steps of explaining what is happening; highly recommended to watch both videos
As someone who is unlikely to apply a whole lot of differential equations in the future, I still found this lecture useful to reinforce the properties of eigenstuff from the previous lecture. Also, the very last bit of the lecture is very cool.
@mitrus4
@mitrus4 Жыл бұрын
the comment I needed, thank you! xD
@lq_12
@lq_12 9 ай бұрын
Thanks
@mightyentity3494
@mightyentity3494 8 ай бұрын
Thank you, much appreciated
@AnkitAgarwal-gw6qw
@AnkitAgarwal-gw6qw 8 ай бұрын
Thanks a lot buddy! This was really helpful.
@fuchunhsieh7396
@fuchunhsieh7396 3 ай бұрын
Thanks for the really helpful resources!
@qbtc
@qbtc 5 жыл бұрын
This lecture and the one before are some of the hardest in the whole course. Stay the course, everyone.
@projetforme1441
@projetforme1441 4 жыл бұрын
@Sirin Kalapatuksice Probably people at MIT learn ODEs at the same time as this lecture, in their analysis course.
@snowy0110
@snowy0110 4 жыл бұрын
To whoever thinks it is complicated: you are missing the essentials from 3Blue1Brown. His explanations will give you a solid basis to watch MIT lectures with joy and pleasure.
@liuhsin-yu9593
@liuhsin-yu9593 4 жыл бұрын
@@snowy0110 Thank you so much, I'll check that out later. I did calculus years ago but forgot all of it, and I'm struggling with the course.
@SahilZen42
@SahilZen42 Жыл бұрын
But both seem interesting to me 👍
@seventyfive7597
@seventyfive7597 6 жыл бұрын
For those confused by this lecture, don't worry, you should be; it's not a failure of yours, it's a syllabus description mistake. I'm happy I can watch these videos as a review of things I ALREADY LEARNED years ago, because obviously if I were exposed to this lecture for the first time with only the prerequisites the course mentions, it wouldn't have been possible to understand it. While the course specifies 18.02 as a prereq, and then immediately dismisses it in its syllabus as not important, this lecture requires both 18.02 and 18.03, and the latter is not mentioned at all in the syllabus. Universities usually parallelize these kinds of courses with co-requisites such as 18.03, but OCW does not hint at that on their site or here. While for me it was nice, I recognize that if you don't know diff. eqs. this would not be possible for you to learn. The good news is that you can skip one or two lectures and you'd be right back on track, as this is just a nice demonstration of an application and does not block the Lin. Alg. learning.
@ashutoshtiwari4398
@ashutoshtiwari4398 5 жыл бұрын
Thanks for stating that man. Although I figured that out, reading this reassured me.
@gordonlim2322
@gordonlim2322 5 жыл бұрын
The textbook actually says: "This section is mostly linear algebra, but to read it you need one fact from calculus: the derivative of e^(lambda t) is lambda e^(lambda t)", which I learnt before in junior college. However, I have not yet learnt differential equations at the university level, and this lecture has been especially hard for me. The textbook also does not follow the same flow or use the same examples as the video. At the same time I'm hesitant to skip a whole section, for fear that I might actually already know the prerequisites and merely stumbled over how the lecture is conducted.
@RAJATTHEPAGAL
@RAJATTHEPAGAL 4 жыл бұрын
I think I did this in first year. I have seen this kind of e^t solution for multiple ODE constraints. Is this the same thing, only solved more elegantly using matrices? I remember we used to give the differentials some sort of variable, form a polynomial out of it, solve it, and then move on to building the solution. Is the matrix method doing the same thing but more elegantly? Just wanted to know... it's been years since I studied those.
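A sketch of why the two approaches agree (my summary, assuming the usual constant-coefficient setup from the end of the lecture): substituting y = e^(lambda t) into y'' + b y' + k y = 0 produces exactly the characteristic polynomial of the companion matrix, so the roots you used to find from the auxiliary polynomial are the eigenvalues here.

```latex
y = e^{\lambda t}\;\Rightarrow\; \lambda^2 + b\lambda + k = 0,
\qquad
A=\begin{pmatrix} -b & -k\\ 1 & 0 \end{pmatrix},
\qquad
\det(A-\lambda I)=\lambda^2 + b\lambda + k .
```

So the "turn the derivatives into a variable and solve a polynomial" method and the matrix method find the same exponents; the matrix version additionally hands you the eigenvectors with which to assemble the solution.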
@AdamOmidpanah
@AdamOmidpanah 3 жыл бұрын
Most unis have a whole course dedicated to diff eq. However, most engineering programs throw students into a statics course at the same time as they're taking diff eq. Having elementary exposure to diff eq prior to statics makes statics go much quicker, and students will be quicker to move toward numerical methods for non-linear diff eq.
@mississippijohnfahey7175
@mississippijohnfahey7175 2 жыл бұрын
@@gordonlim2322 unfortunately the textbook doesn't follow the lectures very well at all... Best to watch the lectures first (both 18.03 and 18.06), take notes, then use that to study the textbook. It's taken me about a year to tackle all that on my own, but I was taking a full time course load at my university as well. Now I'm reviewing select lectures for the second time and it's helping solidify everything nicely
@tachyon7777
@tachyon7777 5 жыл бұрын
The two dimensional sound channel matrix has only one independent vector. Bad one.
@puneetgarg4731
@puneetgarg4731 Жыл бұрын
You can use a projection matrix (mono sound) to solve the problem.
@danishji2172
@danishji2172 Жыл бұрын
@@puneetgarg4731 You seem like a smart man. Been following your comments around for a while here.
@puneetgarg4731
@puneetgarg4731 Жыл бұрын
@@danishji2172 Thank you
@yazhouhao7086
@yazhouhao7086 7 жыл бұрын
The greatest instructor I have ever met!! Thank you, Dr. Strang!
@JDBolt1958
@JDBolt1958 7 жыл бұрын
An awesome lecture indeed! I studied elementary differential equations at university 40 years ago; it took about two weeks of lectures to present the same material. Linear algebra clarifies the process very well. The information relayed here conveys the cumulative effort of my lifetime of study and concentration... as a hobby... yet some students will soak this up in minutes, work problems for a couple of weeks, and have it down... Awesome!!!!
@ozzyfromspace
@ozzyfromspace 4 жыл бұрын
This lecture was all about flexing the computational power of eigenstuffs. I'm definitely impressed!
@imagedezach
@imagedezach 5 жыл бұрын
My right ear learned a lot
@AmolDeometaphys
@AmolDeometaphys 5 жыл бұрын
Turn on mono sound in Ease of Access if you use Windows.
@aryajane6424
@aryajane6424 4 жыл бұрын
@@AmolDeometaphys thanks for your advice :)
@mskiptr
@mskiptr 4 жыл бұрын
You can also do the same on Android
@ze2411
@ze2411 4 жыл бұрын
hahahahahahha awesome comment
@ChangAi-jp9iv
@ChangAi-jp9iv Ай бұрын
Thanks for your comment, I can give my left ear a chance to study!😎
@molraah9046
@molraah9046 7 жыл бұрын
I watched this lecture maybe three times, and I'm still screaming. Thank you, Prof. Gilbert Strang.
@starriet
@starriet 2 жыл бұрын
(Note for myself)
23:46 - if the two eigenvalues are complex numbers, they'll be conjugates of each other (so the sum becomes a real number).
31:47, 33:26 - look up what a 'matrix exponential' is. It is *NOT* just notation for an element-wise exponential.
* Also note that this lecture does *not* give the entire rigorous mathematical proofs, like the other lectures of Prof. Strang (at least in this Linear Algebra series so far).
* The solution written with the "matrix exponential" notation (31:55) can be understood once you see 44:30, since the Taylor series of the matrix is the _definition_ of the matrix exponential notation.
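A quick numerical sketch of that point (assuming NumPy/SciPy are available; the 2x2 matrix is the lecture's first example as I recall it, but any diagonalizable A works): the matrix exponential is not the entry-wise exponential, and it does agree with S e^(Lambda t) S^(-1).

```python
import numpy as np
from scipy.linalg import expm

# du/dt = A u, with the 2x2 matrix from the lecture's first example
A = np.array([[-1.0,  2.0],
              [ 1.0, -2.0]])
t = 1.0

E = expm(A * t)              # the real matrix exponential e^{At}
elementwise = np.exp(A * t)  # NOT the same thing: entry-by-entry exponential

# Same result via diagonalization: e^{At} = S e^{Lambda t} S^{-1}
lam, S = np.linalg.eig(A)
E_diag = S @ np.diag(np.exp(lam * t)) @ np.linalg.inv(S)

print(np.allclose(E, E_diag))       # True
print(np.allclose(E, elementwise))  # False
```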
@damnit258
@damnit258 5 жыл бұрын
He just keeps pushing the boundaries!! It's not that I don't understand, but I never thought you could use these tools this way!
@dunsacc
@dunsacc 9 жыл бұрын
This one lecture covers most of my math and control engineering module. The guy's a genius.
@computername
@computername 9 жыл бұрын
+dunsacc Same here. You really know someone gets it when he can explain something that others call hard in genuinely simple words.
@jessstuart7495
@jessstuart7495 Жыл бұрын
Dr. Strang effortlessly incorporates little nuggets of wisdom in his lectures to keep the more advanced students interested, at the same time keeping most of the lecture simple enough so other students don't get lost. Bravo Sir!
@IntegralMoon
@IntegralMoon 9 жыл бұрын
This lecture is amazing! It's amazing how well he communicates these concepts.
@pengjiang5281
@pengjiang5281 7 жыл бұрын
Amazing lecture! It is better to learn this course together with the textbook, also by Strang. The videos provide the big picture and kind of whet your appetite for reading the book; the book gives detailed explanations and satisfies your curiosity.
@rongrongmiao4638
@rongrongmiao4638 7 жыл бұрын
OK guys, don't complain about the difficulty of this section. You're going to need some differential equations knowledge in order to fully understand this one and the next one (FFT), e.g. solving a system of differential equations. Prof. Strang did a fantastic job on that; I had never used a linear algebra point of view to understand systems of differential equations and their stability.
@waverly2468
@waverly2468 6 жыл бұрын
It helps to have Strang's textbook. But yes, you would need to have taken a course in Diff. equations. Knowing a little bit about control systems theory would help, too.
@joshkhorsandi4969
@joshkhorsandi4969 2 жыл бұрын
For anyone looking for some more visual intuition, 3blue1Brown has an amazing video on raising matrices to powers in his DE series.
@henryzhu7309
@henryzhu7309 4 жыл бұрын
Awesome lecture. Neat and consistent. The stability part is much better than what I learned in my linear systems class.
@georgesadler7830
@georgesadler7830 3 жыл бұрын
Although the audio is low in this video, Dr. Strang relays the information on differential equations and exp(At) very well. The matrix exponential is very important in linear systems.
@archishmanchaudhuri6311
@archishmanchaudhuri6311 3 ай бұрын
A note for those taking [-1 1]^T as the second eigenvector: then we get c1 = 1/3 and c2 = -1/3.
@TheDareDevil2510
@TheDareDevil2510 13 жыл бұрын
These lectures are awesome. Strang makes Linear Algebra seem so easy, the complete opposite of his "colleagues" at my university. I've stopped attending their lectures in favour of these. Thanks MIT (W. Gilbert Strang)!
@siddharthkailasam9170
@siddharthkailasam9170 6 жыл бұрын
high five broo (same situation even after 7 years lol)
@grantguo9399
@grantguo9399 Жыл бұрын
I think this lecture is probably the most difficult one in this series, but as usual it is well explained!
@Leonugent2012
@Leonugent2012 3 жыл бұрын
Professor Strang is the Columbo of Linear Algebra
@kstahmer
@kstahmer 10 жыл бұрын
@hypnoticpoisons Use Euler's formula with x = 6t:
e^(ix) = cos x + i sin x
|e^(ix)| = |cos x + i sin x| = sqrt((cos x + i sin x)(cos x - i sin x)) = sqrt(cos^2 x + sin^2 x) = sqrt(1) = 1
@mind-blowing_tumbleweed
@mind-blowing_tumbleweed Жыл бұрын
Thank you, sir. It was very valuable for those of us who aren't too comfortable with imaginary numbers yet.
@anmol_dot_ninja
@anmol_dot_ninja 4 жыл бұрын
I have never been so lost in my entire life...
@projetforme1441
@projetforme1441 4 жыл бұрын
Keep learning, this is worth it
@thovinh5386
@thovinh5386 6 жыл бұрын
Took me almost 3 hours to get close to "fully understand" the lecture.
@ashutoshtiwari4398
@ashutoshtiwari4398 5 жыл бұрын
Same.
@danieljulian4676
@danieljulian4676 5 жыл бұрын
Yeah, I watched the lecture twice and it all gels for me. There's a beautiful thing lurking in here, I think, which is the situation where only one eigenvalue is relevant.
@muyuanliu3175
@muyuanliu3175 4 ай бұрын
Me too, but on the third try.
@ozzyfromspace
@ozzyfromspace 4 жыл бұрын
For that first example, I don't think assuming the solution is a linear combination of exponentials was the right way to go. Of course it's correct, but it's not very instructive. I solved this example problem on my own, and this is what I did: I broke the system of ODEs into a system of difference equations (finite-difference approximations of the ODEs). Then I solved for the vector one node away in terms of the first node by reapplying something like u_(n+1) = A*u_n over and over. The computational hack here is that since A is diagonalizable, it doesn't actually need to be multiplied many, many times. Once you get your answer, take the limit of your solution so that the difference equations converge to the original ODEs, and boom, you have your solution. It blew my mind a lil bit when I got the right answer lol 😅.
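If I'm reading that approach right, it's the usual forward-Euler / compound-interest limit; here is a minimal numerical sketch (assuming NumPy/SciPy and the lecture's 2x2 example matrix) of (I + At/n)^n converging to e^(At):

```python
import numpy as np
from scipy.linalg import expm

# Step du/dt = A u with Euler steps u_{n+1} = (I + A*dt) u_n and let dt -> 0:
# n steps of size t/n give (I + A t/n)^n, which approaches e^{At}.
A = np.array([[-1.0,  2.0],
              [ 1.0, -2.0]])
t = 1.0

for n in (10, 100, 10_000):
    step = np.eye(2) + A * (t / n)
    approx = np.linalg.matrix_power(step, n)
    print(n, np.max(np.abs(approx - expm(A * t))))  # error shrinks as n grows
```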
@sergiohuaman6084
@sergiohuaman6084 4 жыл бұрын
@26:00 shouldn't it be det >=0 to account for the solutions with steady state? And I suppose yes, this lecture is clear if you have been previously exposed to at least basic differential equations. But hey! OCW offers that course for free too! ;)
@purusharthmalik9341
@purusharthmalik9341 3 жыл бұрын
You are confusing stability with steady state. A system is said to be stable when equilibrium is reached, i.e. the system goes to 0. For that to happen, the strict condition is that the real parts of all the eigenvalues MUST be negative. If one or more of the eigenvalues are 0, the system MAY OR MAY NOT be stable, depending on the multiplicity of those eigenvalues.
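A compact way to see where the condition comes from (a sketch of the standard argument, using the lecture's notation and assuming n independent eigenvectors):

```latex
u(t) = c_1 e^{\lambda_1 t} x_1 + \cdots + c_n e^{\lambda_n t} x_n,
\qquad
\bigl| e^{\lambda t} \bigr| = e^{\operatorname{Re}(\lambda)\, t},
```

so every term decays to 0 exactly when every eigenvalue has a negative real part; one zero eigenvalue with the rest negative gives a nonzero steady state instead of decay to 0.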
@lucilius9491
@lucilius9491 4 жыл бұрын
I think it's better to first watch the amazing video by TA Linan Chen, then come back and watch this lecture.
@adamlevin6328
@adamlevin6328 8 жыл бұрын
Seems like he had an extra cup of coffee this morning
@nimaabbasi1927
@nimaabbasi1927 6 жыл бұрын
probably u were watching it on 1.5x , too LOL
@ww660
@ww660 6 жыл бұрын
@@nimaabbasi1927 I'm on 2x speed. He is shivering lol.
@ZhanyeLI
@ZhanyeLI 4 жыл бұрын
haha, that is true
@xy1112
@xy1112 3 жыл бұрын
If you are not familiar with the Laplace transform, especially coming from a signals and systems background, you do get lost in here; it's not because of your poor understanding. But for EE, this is an essential and magical piece!
@bikespike4150
@bikespike4150 Жыл бұрын
This was a great lecture as usual. My only complaint would be that the audio is only present on the right channel. I am sure many others would also appreciate it if that could possibly be fixed. I would not mind doing it myself if there is a contact that I could send it to. Cheers.
@pablo_CFO
@pablo_CFO 4 жыл бұрын
For all of you who are lost in this lecture, don't worry. If you are just starting your degree and have not seen calculus or differential equations, this will be almost impossible to understand. I think this lecture seeks to show the potential of linear algebra in other fields, and I don't think it is necessary to fully understand it if you are just starting out, so don't get frustrated by not understanding it; the level is much higher than the previous ones. By the way, if you develop an interest in dynamical systems (chaos theory), this is definitely a lecture that you will remember in the future.
@iyalovecky
@iyalovecky 10 жыл бұрын
ERROR: the stability regions are messed up. The unit circle is for powers (difference equations), and the left half-plane is for exponentials.
@TrevorKafka
@TrevorKafka 13 жыл бұрын
@hypnoticpoisons Because e^(6it) = cos(6t)+i sin(6t), which always has a modulus of 1.
@harshsaxena1999it
@harshsaxena1999it 6 жыл бұрын
Can I call this statement a "Kafkaesque explanation of |z|^2 = z(~z)"?
@Weightlifeter
@Weightlifeter 14 жыл бұрын
@sbhdgr8 No. He's trying to write the second-order differential equation as a system of first-order linear equations; basically you are just substituting y' = some variable, just manipulating the expressions so the 2nd-order equation is "transformed" into equations of first order. The matrix of coefficients represents the coefficients of the "transformed" resulting system of 1st-order linear equations. It is the same idea as substitution in quadratic-type equations like ax^4 + bx^2 + c, and
@santagill3566
@santagill3566 Жыл бұрын
29:50 Why can he set u = Sv? Should S have n independent eigenvectors for the 2-by-2 A?
@yunfan7034
@yunfan7034 3 жыл бұрын
Can anyone explain, at 32:18, why u(t) = S e^(Lambda t) S^(-1) u(0) = e^(At) u(0)? From 9:33, for example, u(0) = c1*x1 + c2*x2 (at t = 0 the exponents lambda*t are 0, so the exponential factors are 1).
@vivianhuang5647
@vivianhuang5647 2 жыл бұрын
My understanding: the earlier part of the lecture showed that u(t) is some form of exponential, u(t) = e^(At) u(0), where u(0) is just the initial condition that positions u's starting point. The professor set u = Sv, where the columns of S are eigenvectors of A. Then u' = (Sv)' = Au = ASv; S is a constant matrix, so it can be taken outside the derivative: S(dv/dt) = ASv, which gives dv/dt = S^(-1)ASv. We know that S^(-1)AS is the diagonal matrix of lambdas, so dv/dt = Lambda*v, and this shows that v is some form of exponential as well, so we can write v(t) = e^(Lambda*t) v(0). Given u = Sv, we have u(0) = S v(0) (which gives v(0) = S^(-1) u(0)), so u(t) = S v(t) = S e^(Lambda*t) S^(-1) u(0).
@yunfan7034
@yunfan7034 2 жыл бұрын
@@vivianhuang5647 appreciate your help.
@andreagjoka103
@andreagjoka103 9 жыл бұрын
At 30:41 he writes "set u = Sv". He says that S is the eigenvector matrix, but he doesn't say anything about WHAT THIS small "v" IS, or why he did that trick... it is disheartening.
@LGLucid
@LGLucid 9 жыл бұрын
+Andrea Gjoka I understand how that could frustrate you. Here's what he's doing. He's decomposing "u" into a linear combination of the eigenvectors, which are arranged in the matrix "S". The elements of the vector "v" are the weights in that linear combination. He performed that "trick" so he could get rid of "A" (in which the variables of the system are coupled together), and replace it with capital Lambda, which is a diagonal matrix (uncoupled variables). The fact that capital Lambda is diagonal is important because the exponential of a diagonal matrix turns out to be trivial. So he's setting up an identity that relates the exponential of a non-diagonal matrix to the exponential of a diagonal one (multiplied as needed by the eigenvector matrix and its inverse). This is the foundation for the series-based definition of matrix exponentials he presents a little later in the lecture. Hope this helps.
@andreagjoka103
@andreagjoka103 9 жыл бұрын
+L.G. Lucid Thank you for the explanation. I have to see it again now. You seem to focus better than I do...
@swapnils6902
@swapnils6902 6 жыл бұрын
He is merely trying to express u as a linear combination of eigenvectors. The vector v is not composed of constants; S is what contains the constants. All components of v are time (or t) dependent. Hope it helped.
@TeejK
@TeejK 4 жыл бұрын
In case anyone had the same question, here's how I figured it out: the columns of S span the space (they are a set of basis vectors), so you can reach any vector in the space with the right combination of those columns. It is similar to what was done in the last lecture, writing u_0 as Sc.
@yaboimdaro
@yaboimdaro 2 жыл бұрын
For those who have struggled with where the -k and -b come from: note that y'' = d(y')/dt = -b y' - k y.
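Written out (the same construction Prof. Strang uses at the end of the lecture, with u = (y', y)):

```latex
u = \begin{pmatrix} y' \\ y \end{pmatrix},
\qquad
\frac{du}{dt} = \begin{pmatrix} y'' \\ y' \end{pmatrix}
= \begin{pmatrix} -b & -k \\ 1 & 0 \end{pmatrix}
  \begin{pmatrix} y' \\ y \end{pmatrix},
```

where the first row is the original equation y'' = -b y' - k y and the second row is just the trivial identity y' = y'.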
@sudharshantr8757
@sudharshantr8757 Жыл бұрын
If the formula for (I - At)^(-1) at 38:49 is correct, then for At = 2I the RHS must blow up, but clearly (I - 2I)^(-1) does exist and is -I. Am I missing something?
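For what it's worth (my note, not from the lecture), the scalar analogue shows what's going on: the series represents the inverse only where it converges.

```latex
\frac{1}{1-x} = 1 + x + x^2 + \cdots \quad \text{holds only for } |x| < 1;
\quad \text{at } x = 2 \text{ the left side is } -1 \text{ while the series diverges.}
```

The same applies to matrices: (I - At)^(-1) = I + At + (At)^2 + ... requires every eigenvalue of At to satisfy |lambda| < 1. Outside that region the inverse can still exist; the series just no longer computes it.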
@theSpicyHam
@theSpicyHam Жыл бұрын
these are my favorite lessons
@technoshrink
@technoshrink 9 жыл бұрын
Copyrighted in 2000, made in 2005... Has MIT discovered time travel??
@andreagjoka103
@andreagjoka103 9 жыл бұрын
uploaded in 2009 also
@mitocw
@mitocw 9 жыл бұрын
+Andrea Gjoka The videos were recorded in 1999, published on MIT OpenCourseWare in 2005, and re-published at a higher resolution in 2009.
@andreagjoka103
@andreagjoka103 9 жыл бұрын
Very satisfactory explanation! I wonder though if the curriculum has changed since 1999.
@shubhamtalks9718
@shubhamtalks9718 5 жыл бұрын
@@mitocw Oh my GOD, these lectures are so old... I didn't realize it. They are still the best.
@shubhamtalks9718
@shubhamtalks9718 5 жыл бұрын
@@mitocw Please fix this video's audio channel.
@mikechen3174
@mikechen3174 3 жыл бұрын
The comments are always a great consolation to see...
@jaykane42
@jaykane42 2 жыл бұрын
If anyone manages to see this, should I have done differential equations by now; or not? Also, before differential equations, should I do calculus 3/multivariate calculus; or differential equations first? Thanks to anyone who responds and I hope you have had a lovely day so far!
@initiald975
@initiald975 9 ай бұрын
Wow, a video from 14 years ago is better than my math teacher today.
@ianyappy
@ianyappy 12 жыл бұрын
S contains a bunch of numbers independent of t (the eigenvectors), so d/dt and S would commute.
@Johnnymatics
@Johnnymatics 14 жыл бұрын
Great! What a gifted Professor, thank you Dr. Strang!
@enesozcan2092
@enesozcan2092 7 жыл бұрын
Even the beautiful teaching of Gilbert Strang cannot make me love this freaking algebra.
@tathagatanandi5813
@tathagatanandi5813 6 жыл бұрын
Ahhh!!! Mind blown near the end!!!
@pathung2002
@pathung2002 4 жыл бұрын
This lecture is amazing...
@clarencekoh6921
@clarencekoh6921 5 жыл бұрын
The corresponding lecture from 18.03 by MIT OCW that bridges this module is Lecture 25. kzbin.info/www/bejne/npalp4mfiM5srrM
@ignatiusjacquesreilly70
@ignatiusjacquesreilly70 4 жыл бұрын
To be clear that I'm not missing something in the lecture, he assumes exp(lambda*t) is a solution to the matrix differential equation BEFORE he has explained what a matrix exponential is or shown how it might be derived, correct? I'm asking because I want to make sure I'm not missing some section of the video. I went back and watched parts of it multiple times and I think he is skipping around at some places in a way that makes it hard to follow, even if you're familiar with differential equations and the material up to this point. I generally like his broad and conceptual approach but in this video I believe there were too many jumps from one line of thought to another. Nonetheless, once I got past some of the excessive handwaviness, the overall approach to decoupling differential equations is excellent and I learned a lot.
@sunritroykarmakar4406
@sunritroykarmakar4406 3 жыл бұрын
Yes
@kingplunger1
@kingplunger1 3 ай бұрын
36:45 why ?
@sanchayanbhowal4446
@sanchayanbhowal4446 4 жыл бұрын
This is the best lecture series of Linear algebra ❤️
@pelemanov
@pelemanov 13 жыл бұрын
@hypnoticpoisons I think x is limited to numbers between 0 and 1
@dennisting5209
@dennisting5209 3 жыл бұрын
I just finished watching 3Blue1Brown’s first video on matrix exponentials, it’s so well made. Even though I feel like I haven’t fully understood every aspect of this lecture, I do believe that with the help of 3Blue1Brown, this lecture might turn into intuition for me someday.
@zee63976
@zee63976 11 жыл бұрын
life saver for my final
@MsBean-sv8zo
@MsBean-sv8zo 2 жыл бұрын
Can anyone explain, at 31:39, how we know that v = e^(Lambda t) * v(0)? Where did the v(0) come from?
@condafarti
@condafarti Жыл бұрын
It's the initial condition of the function v(t). Since at time 0 the factor e^(λt) equals 1, what's left is just v(0). It's essentially the C (which represents the initial condition) that appears in exponential models and differential equations.
@seamus9898
@seamus9898 5 жыл бұрын
31:53 what is this "natural notation"? I wish he derived that form from u(t) = c1e^(λt)x1
@gaming4life25
@gaming4life25 5 жыл бұрын
I'm also struggling with that part. Please explain...
@박정범-o3k
@박정범-o3k 4 жыл бұрын
From the fact that dv(t)/dt = Λ * v(t), try to derive formula for v(t). You'll get v(t) = c1 * e^(λ1t) * [1; 0] + c2 * e^(λ2t) * [0; 1]. When you set t = 0, you'll get c1 = v1(0) and c2 = v2(0). So it becomes the "natural notation".
@abdelaziz2788
@abdelaziz2788 3 жыл бұрын
30:30 I guess this should have come at the beginning of the lecture, but it was a very nice lecture. Thank you for your efforts.
@Hotheaddragon
@Hotheaddragon 4 жыл бұрын
At 49:30, can someone please explain to me how the -b and -k end up in the matrix? I tried hard but couldn't understand it. Or am I missing some sort of prerequisite here?
@mitocw
@mitocw 4 жыл бұрын
Prerequisites: Multivariable Calculus (18.02). See ocw.mit.edu/18-06S05 for more info. Best wishes on your studies!
@thedailyepochs338
@thedailyepochs338 4 жыл бұрын
He made y'' the subject of the formula, which took b and k to the other side of the equation with minus signs; the first row of the matrix gives that equation, and the second row gives the second, trivial equation. Hope this helps.
@Weightlifeter
@Weightlifeter 14 жыл бұрын
@sbhdgr8 You substitute z = x^2 so that the higher-power "quadratic" equation can be written as an actual quadratic, az^2 + bz + c.
@jcjobin
@jcjobin 12 жыл бұрын
This is genius! Now I want to go to MIT, but I don't have enough money... guess I will stay at Dalhousie.
@middlevoids
@middlevoids Жыл бұрын
Gilbert "The Wizard" Strang!
@suziiemusic
@suziiemusic 8 жыл бұрын
Dear God, Thank you for Gilbert Strang
@kishkinay3042
@kishkinay3042 3 жыл бұрын
If I need to understand this lecture to proceed with the rest of the course, I’d better just give up maths for life...
@MRIDDY27
@MRIDDY27 9 жыл бұрын
Seeing this lecture... I say that's enough linear algebra for me... I am done with it... most of the points in this lecture are unclear... but before this, every lecture was extraordinary...
@Sandyy10101
@Sandyy10101 7 жыл бұрын
yeah!! Everything is clear until this lecture... Maybe it's because he assumes we have taken some courses of 18.01-05 before watching this video
@DeadPool-jt1ci
@DeadPool-jt1ci 4 жыл бұрын
@@Sandyy10101 18.06 has 18.01 (Single Variable Calculus), 18.02 (Multivariable Calculus) and 18.03 (Differential Equations) as prerequisites. But my guess is, even if you don't understand the differential equations stuff, it's still OK; you can just skip any video with things like "Fourier transforms, diff eqs" and so on. You should definitely watch the positive definiteness lectures.
@differentialeq769
@differentialeq769 8 жыл бұрын
At 32:08, can anyone explain how he derives v(t) and u(t)?
@davidlovell729
@davidlovell729 7 жыл бұрын
Start with dv/dt = Lambda v. This says the derivative of a (vector) variable is a constant times the variable. From differential equations, you know the solution is an exponential function (multiplied by the initial condition). This is just expressing it as a vector (i.e., solving several single-variable cases simultaneously). Thus, v(t) = e^(Lambda t) v(0). But v was the vector he got by taking u and finding what combination of the columns of S would be necessary to produce u; v is the vector of those coefficients. Since there are n independent columns of S, they form a basis, so any vector u can be written as a combination of those columns. Thus, u = Sv. So the next step is u(t) = S v(t) = S e^(Lambda t) v(0). But since u(0) = S v(0), we have v(0) = S^(-1) u(0). Substitute that, and you're done: u(t) = S e^(Lambda t) S^(-1) u(0).
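The same derivation in compact form (just a restatement of the comment above):

```latex
u = Sv \;\Rightarrow\; S\frac{dv}{dt} = ASv
\;\Rightarrow\; \frac{dv}{dt} = S^{-1}ASv = \Lambda v
\;\Rightarrow\; v(t) = e^{\Lambda t} v(0)
\;\Rightarrow\; u(t) = S e^{\Lambda t} S^{-1} u(0) = e^{At} u(0).
```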
@bharathkamath1304
@bharathkamath1304 7 жыл бұрын
The notes for this lecture on ocw.mit.edu mixed up u(0) and v(0) in a bunch of places (like "u(t) = S e^Lambda S^-1 V(0)"). I spent hours trying to figure out what's going on before I saw this comment XD. Thanks!
@jerrychen4348
@jerrychen4348 6 жыл бұрын
God~~ thanks, man. You saved my day. 👍
@胯下蜈蚣長老
@胯下蜈蚣長老 4 жыл бұрын
thanks, you save my month lol
@TeejK
@TeejK 4 жыл бұрын
@@davidlovell729 lifesaver
@syedsheheryarbokhari2780
@syedsheheryarbokhari2780 4 жыл бұрын
Prof. Strang is amazing, but you should skip this lecture if you are NOT familiar with differential equations. It is too fast-paced, and it avoids repeated (non-distinct) eigenvalues. I didn't find the book section very helpful either. The good thing is that skipping this "applications" lecture doesn't affect the rest of the series. I would recommend Edwards-Penney, Differential Equations, Chapter 5, if you really want to understand what is going on and where these equations come from (i.e. the link between eigenvectors, matrix exponentials and differential equations). You can look at OCW 18.03, section 4, as well, which explains this at a reasonable pace. Or you can just set the reasoning aside and commit the following to memory: a system of differential equations has its solution built from the eigenvectors and eigenvalues of A; the exponential of a matrix is the Taylor series of e^x with the scalar replaced by a matrix, and it obeys the same derivative rule; for stability, the real parts of the eigenvalues have to be negative.
@ricardocesargomes7274
@ricardocesargomes7274 8 жыл бұрын
very satisfactory explanation!
@eccesignumrex4482
@eccesignumrex4482 7 жыл бұрын
Best one yet!!!
@sansha2687
@sansha2687 4 жыл бұрын
46:30, 46:45, 49:45
@thovinh5386
@thovinh5386 6 жыл бұрын
I'm so gonna forget all the homework I've done in this course (because it teaches me how to compute rather than actual knowledge; I'll still remember the "knowledge" in the lecture though).
Confusing parts:
0:00 What does that linear problem really mean? (Why is the solution only u(t) when there are 2 variables u1 and u2? Shouldn't it be u1(t) and u2(t)? Maybe that's some calculus I don't know yet.)
9:55 (Au popping off like a wild Pokemon)
18:48 (What is that value, the sine and cosine thingy, and why? Maybe that's some calculus I should just accept for now.)
I've heard about Taylor series. Is (1-x)^-1 = 1 + x + x^2 + ... true for every x? Oh, here's the answer: mathforum.org/library/drmath/view/61847.html But then how can we judge whether At < I? monkaHmm
@programyourface
@programyourface 8 жыл бұрын
Do I need to know calculus for this? ;( I will learn that and then watch this again.
@antoniolewis1016
@antoniolewis1016 8 жыл бұрын
Yes, you do.
@davidlovell729
@davidlovell729 7 жыл бұрын
You need to know first order linear differential equations. You might learn this somewhere towards the end of a single-variable integral calculus course, and of course you learn it in a first course on differential equations.
@andreagjoka103
@andreagjoka103 8 жыл бұрын
In the book there is a part I did not get at all, namely the one starting "to display a circle on the screen, replace y'' = -y by a finite difference equation", from the end of page 315 to page 317. Has anyone understood that part?
@andyralph9495
@andyralph9495 3 жыл бұрын
e^At does not converge as t tends to infinity....why does he say that?
@sbhdgr8
@sbhdgr8 14 жыл бұрын
I have a question, if someone could answer it: right at the end of the lecture, where a second-order diff. eqn is converted into a 2x2 matrix called A, shouldn't element A22 be -k/b? Rearranging the given eqn and solving for y' gives y' = -(1/b) y'' - (k/b) y, so since the coefficient of y is -k/b, shouldn't this coefficient correspond to element A22 of the 2x2 matrix A?
@SudeepJoshi22
@SudeepJoshi22 2 жыл бұрын
No, because as the professor explained in the last seconds of the lecture, in order to convert a 2nd-order equation to a first-order system we have to introduce a redundant (trivial) equation; hence the elements A21 and A22 must be 1 and 0 to complete the matrix.
@codenzar7772
@codenzar7772 3 жыл бұрын
I watched at 2x and understood everything, and now I feel like I need to watch it again at 0.25x.
@reginaldorodrigues3530
@reginaldorodrigues3530 2 жыл бұрын
Great teacher.
@ashutoshsingla7269
@ashutoshsingla7269 4 жыл бұрын
For a moment I thought my left ear had given up.
@lee_land_y69
@lee_land_y69 6 жыл бұрын
I think I'll leave this and the next lecture for some other day, because I don't have a solid enough background in differential equations to understand these two lectures. Those who have completed the course: are these two lectures a must for understanding the rest of the course? Thanks.
@NisargJain
@NisargJain 6 жыл бұрын
Not quite. But I was doing this through OCW, so it came up in quiz 2. I don't think you'll need it later unless you want to solve higher-order differential equations. But this is about as important as applications of linear algebra get, so you should study it nevertheless.
@ognjenfilipovic6893
@ognjenfilipovic6893 4 жыл бұрын
Great course! Thank you a lot!
@welcomethanks5192
@welcomethanks5192 2 жыл бұрын
Why is u(0) = [1, 0]^T? Is it a dummy vector that helps with the calculation?
@thovinh5386
@thovinh5386 6 жыл бұрын
As a university student who is not a native English speaker, the first minute of the video is like the professor speaking an alien language to me :|
@hariprasadyalla
@hariprasadyalla 7 жыл бұрын
Nice lecture. It's easy to see why the solutions are exponential in t, but I did not get why the solution is exponential in lambda*t.
@frazulabrar9398
@frazulabrar9398 2 жыл бұрын
He showed it at 9:58.
@himanchalsingh1135
@himanchalsingh1135 6 жыл бұрын
Can anyone give me an example of a real symmetric matrix whose eigenvalues and singular values are not the same...
@andyralph9495
@andyralph9495 3 жыл бұрын
At the end, why does he say that the matrix converts a 5th-order differential equation into a 1st-order differential equation? There are higher orders on the LHS...
@pavlenikacevic4976
@pavlenikacevic4976 7 жыл бұрын
48:14 you change it by switching from Lagrangian mechanics to Hamiltonian :D
@ilkertalatcankutlucan3257
@ilkertalatcankutlucan3257 5 жыл бұрын
Hey folks, it went smoothly for me until minute 36, but I am stuck there. I was thinking a Taylor series expansion is a local approximation around a fixed point, and Prof. Strang simply uses t = 0 as the fixed point for his expansion. As far as I know, this expansion should only approximate within a limited neighbourhood of t = 0. Am I mistaken somewhere? How come this Taylor expansion is a general thing? How can it represent the whole of e^(At) for all t?
@francoisduhem2585
@francoisduhem2585 4 жыл бұрын
ilker Talat Can KUTLUCAN I think it's because the radius of convergence of the exponential series is infinite.
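In symbols (a standard ratio-test sketch, not from the lecture): for any fixed matrix A and any t,

```latex
e^{At} = \sum_{n=0}^{\infty} \frac{(At)^n}{n!},
\qquad
\frac{\left\| (At)^{n+1}/(n+1)! \right\|}{\left\| (At)^{n}/n! \right\|}
\;\le\; \frac{\| At \|}{n+1} \;\to\; 0,
```

so the series converges for every t, not just in a small neighbourhood of t = 0; the factorial in the denominator eventually beats any power of ||At||.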
@zolanhlangulela947
@zolanhlangulela947 Жыл бұрын
Is zero always an eigenvalue of a 2x2 matrix?
@sahilnegi4326
@sahilnegi4326 3 жыл бұрын
But why is the trace the sum of the eigenvalues, and the determinant their product?
@thetheoreticalphysicist5852
@thetheoreticalphysicist5852 11 ай бұрын
Beautiful
@bud389
@bud389 13 жыл бұрын
if i had this class........ i would fail it.
@Sandyy10101
@Sandyy10101 7 жыл бұрын
same lol
@ecd4282003
@ecd4282003 6 жыл бұрын
Superb!
@gaming4life25
@gaming4life25 5 жыл бұрын
I got lost at 31:53. Somebody help...
@chetankushwaha9052
@chetankushwaha9052 4 жыл бұрын
Me 🙋 too plzz help
@MultiRNR
@MultiRNR 10 ай бұрын
Still hard to understand why we need to think of eigenvalues and eigenvectors in the first place.
@MultiRNR
@MultiRNR 10 ай бұрын
Figured it out; all doubts are answered in the previous lecture's notes: openlearninglibrary.mit.edu/assets/courseware/v1/3a9097c7342f1022aa9e3395196125eb/asset-v1:OCW+18.06SC+2T2019+type@asset+block/18.06_Unit_II_2.9_Lecsum.pdf The key is the difference equation, which makes full use of the properties of eigenvectors and eigenvalues to transform the problem into a closed form with exponentials. Simply brilliant 😮
@MultiRNR
@MultiRNR 10 ай бұрын
Something similar applies to the differential equations here, which use an e^(lambda*t) form because its derivative puts a lambda out in front, which leads to the eigenvalue.
@cherma11
@cherma11 4 жыл бұрын
34:13 is a beauty
@alijoueizadeh8477
@alijoueizadeh8477 5 жыл бұрын
Thank you.
@hypnoticpoisons
@hypnoticpoisons 13 жыл бұрын
36:21 How can sum(x^n) be 1/(1-x)? That seems obviously too small.
@danieljulian4676
@danieljulian4676 5 жыл бұрын
The series only works if |x| < 1; there are derivations all over the place. This is so basic.
@ozzyfromspace
@ozzyfromspace 4 жыл бұрын
Actually, if you’re doing Cesaro sums, the statement is valid for abs(x) > 1. It’s not that basic....