Can't lie, being able to pause the video and ponder the ideas is so nice to have. Goes to show how much work those students had to put in.
@PhucLe-qs7nx3 жыл бұрын
00:00 Error from last lecture: rows dependent.
04:28 The 4 fundamental subspaces.
08:30 Where are those spaces?
11:45 Dimension of those spaces.
21:20 Basis for those spaces.
30:00 N(A^T), the "left nullspace".
42:10 A new "matrix" space.
@lokahit694011 ай бұрын
I am asking you because yours is the most recent comment. 1) At 9:15, how is the column space in R^m? For an m x n (m rows x n columns) matrix there are n columns, so there are n column vectors, so it's supposed to be R^n, right?
@aarongreenberg1598 ай бұрын
@@lokahit6940 Because each vector in the column space has m components. Yes, there are n vectors, but the number of components of a vector determines the space it lives in. This is different once you get to a basis, where the number of vectors gives the dimension, but even then it's a subspace of R^(number of components). So a two-vector basis where each vector has 5 components spans a 2D subspace of R^5.
@deveshbhatt40635 ай бұрын
@@aarongreenberg159 Thanks for the clarification.
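A quick numerical way to see the point above, as a sketch in NumPy (my own code, not from the lecture; the 3 x 4 matrix is the example quoted elsewhere in these comments):

```python
# For an m x n matrix, each of the n columns has m components,
# so the column space sits inside R^m, not R^n.
import numpy as np

A = np.array([[1, 2, 3, 1],
              [1, 1, 2, 1],
              [1, 2, 3, 1]])   # the 3 x 4 example quoted in other comments

m, n = A.shape                  # m = 3 rows, n = 4 columns
print(n, "column vectors, each with", m, "components")   # 4 columns, each in R^3

r = np.linalg.matrix_rank(A)    # dimension of the column space
print("C(A) is an r-dimensional subspace of R^m:", r, "<=", m)   # 2 <= 3
```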
@GavinoFelix10 жыл бұрын
"But, after class - TO MY SORROW - a student tells me, 'Wait a minute that [third vector] is not independent...'" I love it. What other professor brings this kind of passion to linear algebra? This is what makes real in the flesh lectures worthwhile.
@xoppa096 жыл бұрын
Give that brave student a medal.
@fanzhang37466 жыл бұрын
xoppa09 I think it's the professor who's honorable here. He elaborated on his mistake, which was reasonably embarrassing for him, and used it to clarify important concepts. I think most others would just correct it, apologize, and move on. You can see his embarrassment when he used words like 'bury', and in his reaction when he accidentally uncovered the board again later.
@andersony49703 жыл бұрын
@@fanzhang3746 I don't think he is much embarrassed. He talked about doing math in class in the first video of this series, if you've watched that. He said that making mistakes is probably inevitable, and that it's great to go through the whole process with the students, including making errors and correcting them.
@sahil00942 жыл бұрын
What's so passionate about accepting and correcting one's own mistake?
@juansepardo2020 Жыл бұрын
I am a 4th-year, double-engineering student re-learning linear algebra so I can have a stronger basis for ML, DL and AI. Never in my college classes, or in independent studying, have I been so amazed by the way a concept is introduced as I was when Prof. Strang got to the computation of the left null space. The way this man teaches is just astonishing, thank you very much.
@reganmian9 ай бұрын
Have you checked out his newest book "Linear Algebra and Learning from Data"?. That plus "Introduction to Statistical Learning" given a foundation in programming, probability, and statistical inference is a killer combo. I'm a statistics graduate student wanting to specialize in ML. I've been watching these on 2x speed as a review
@itsnotthattough75885 ай бұрын
OMG I'm literally the same. I jumped on ML and AI early in my 2nd year, but could not understand any concepts thoroughly. Now I really feel the need to relearn the basics and prof. Strang is like the savior for me.
@corey333p7 жыл бұрын
"No mathematics went on there; we just got some vectors that were lying down to stand up."
@corey333p7 жыл бұрын
Gotta know the bases for the spaces.
@why64474 жыл бұрын
AHAHHAHHAHAHHAHAH
@delta_sleepy11 ай бұрын
😂
@DanielCoutoF9 жыл бұрын
I am so fascinated by the way Professor G. Strang gives his lectures; he does it in such a great way that even a 5-year-old boy could understand. On the other side, teachers from my university make the subject so complicated that even students highly above the average struggle to understand the concepts properly.
@JadedForAlways9 жыл бұрын
+Daniel Couto Fonseca What about a 5 year old girl?
@DanielCoutoF9 жыл бұрын
Only 5 years old WHITE BOYS I would say
@JadedForAlways9 жыл бұрын
Are you joking? I can't tell
@DanielCoutoF9 жыл бұрын
I guess it's more funny if you dont
@Bm23CC8 жыл бұрын
+Daniel Couto Fonseca I challenge you to teach a 5-year-old linear algebra. Good luck with that.
@davidwilliam1524 жыл бұрын
What a perfect thing, to be a great mathematician and a great teacher at the same time! And being a great teacher, especially, is priceless!
@PyMoondra5 жыл бұрын
The end portion really showed how matrix algebra theory can be applied to computer vision; really glad he added that in.
@duqueng14 жыл бұрын
The best teacher ever. I really admire what MIT has done. As the phrase on its website says: "Unlocking Knowledge, Empowering Minds."
@jonathanoneill34648 жыл бұрын
These lectures are saving my bachelors in Engineering. Thanks MIT!
@rohanmalik8956 жыл бұрын
Woah, your icon image shows very precisely that you survived engineering after all... wish me luck
@YufanZhouАй бұрын
This lecture about the four subspaces is the most beautiful Linear Algebra lecture I have ever had.
@Upgradezz3 жыл бұрын
It's my honor to have met you even virtually, sir!
@bobmike8285 жыл бұрын
Correct me if I'm wrong, but Strang was introducing abstract algebra at the end. Once you have all of these linear transformations transforming other linear transformations, you have an even greater transformation of space. Absolutely love this man
@usozertr4 жыл бұрын
Bob Mike yes, and in an earlier lecture he was talking about how n x n permutation matrices form a group
@pubgplayer1720 Жыл бұрын
Yes, abstract vector spaces are quite important in linear algebra
@xiaohanwang38859 жыл бұрын
For the first time I envy students at MIT, because they have such genius lectures to attend.
@NostraDavid2 Жыл бұрын
I don't. I've got it better. No time pressure to watch the lectures, I don't NEED to make the exercises, nor the exams. It's great! 😁
@swatejreddy216 Жыл бұрын
@@NostraDavid2 and no hefty fees either. So yeah.
@jingyiwang5113 Жыл бұрын
I am really grateful for your wonderful explanation about the four fundamental subspaces. My mathematics exam is tomorrow. It is a wonderful source for me to learn and refresh my memory. Thank you so much!
@trevandrea890910 ай бұрын
Thank you so much!! Your explanation is soo amazing! Now I finally get why the column space of A and R are different, and why the row space of A and R is the same!! Btw, I'm saving 24:00 for the explanation of the subspaces of A and R
@yanshudu93702 жыл бұрын
Conclusion: the four fundamental subspaces of an m*n matrix A:
1. The column space C(A), spanned by the column vectors, lives in R^m and has dimension r.
2. The nullspace N(A), spanned by the special solutions coming from the free variables, lives in R^n and has dimension n-r.
3. The row space C(A'), spanned by the row vectors, lives in R^n and has dimension r.
4. The left nullspace N(A'), the nullspace of A', lives in R^m and has dimension m-r.
Other conclusions: dim(C(A')) + dim(N(A)) = n, and dim(C(A)) + dim(N(A')) = m.
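A small sanity check of those dimension counts, sketched in SymPy (my own code; the 3 x 4 matrix is the lecture example quoted in other comments here):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3, 1],
            [1, 1, 2, 1],
            [1, 2, 3, 1]])
m, n = A.shape
r = A.rank()

dim_col   = len(A.columnspace())      # C(A)   in R^m, dimension r
dim_null  = len(A.nullspace())        # N(A)   in R^n, dimension n - r
dim_row   = len(A.T.columnspace())    # C(A^T) in R^n, dimension r
dim_lnull = len(A.T.nullspace())      # N(A^T) in R^m, dimension m - r

assert dim_row + dim_null == n        # r + (n - r) = n
assert dim_col + dim_lnull == m       # r + (m - r) = m
print(r, dim_col, dim_null, dim_row, dim_lnull)   # 2 2 2 2 1
```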
@bfl90753 жыл бұрын
I was totally astonished by the idea of computing the left nullspace! Thank you, Dr. Gilbert Strang.
@DeLuini985 Жыл бұрын
Thank God for dr.Strang. I am understanding concepts that have eluded me for over a decade.
@stefanfarier7384 Жыл бұрын
I really like how he talks. He sounds so friendly in his explanations.
@All_Kraft9 ай бұрын
That was a great performance! Thank you MIT.
@antoniolewis10168 жыл бұрын
This man has dedication! Also, that girl in the beginning must have been a sharp genius.
@ispeakforthebeans5 жыл бұрын
Bruh its MIT they got Gods in there you talk about sharp
@akmalsultanov98014 жыл бұрын
Well, when you have an intuition for just the row space and column space and the connection between them, it's quite obvious, and you don't have to be a genius to recognize the dependence of those row vectors. In fact, the first half of linear algebra is relatively simple.
@sreenjaysen9274 жыл бұрын
I think the professor just made that up and got it wrong intentionally in the previous lecture, just to introduce the row space. The professor planned it like in "Money Heist".
@leophysics2 жыл бұрын
@@sreenjaysen927 I agree
@gavinresch11444 жыл бұрын
It is amazing how he can do these lectures in front of no students and still be so engaging. In a way he is a great actor.
@Lets_MakeItSimple3 жыл бұрын
There are students in back rows
@Cyraxsify8 жыл бұрын
At t = 38:00, Strang shows a way that expedites finding L: find E, then reduce [E | I] to get E inverse, which equals L. Now we can quickly decompose A into LU if we do Gaussian elimination only, not Gauss-Jordan elimination, from the beginning. At t = 43:00, he defines a vector space out of 3x3 matrices, call it M_33. At t = 47:00, he covers the dimensions of subspaces of M.
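For anyone who wants to see that E in action, here is a rough SymPy sketch (my own code, using the lecture's 3 x 4 example): reduce the augmented block [A | I], allowing pivots only in A's columns, and the right half accumulates the row operations, giving E with EA = R. The row of E sitting next to the zero row of R is the left-nullspace basis read off in the lecture; doing the same bookkeeping with Gaussian elimination only (stopping at an echelon U instead of R, and assuming no row exchanges are needed) gives the E whose inverse is L.

```python
from sympy import Matrix, eye

# the lecture's 3 x 4 example (quoted in other comments here)
A = Matrix([[1, 2, 3, 1],
            [1, 1, 2, 1],
            [1, 2, 3, 1]])
m, n = A.shape

M = A.row_join(eye(m))            # augmented block [A | I]
row = 0
for col in range(n):              # allow pivots only in A's columns
    pivot_row = next((r for r in range(row, m) if M[r, col] != 0), None)
    if pivot_row is None:
        continue                  # free column, no pivot here
    M.row_swap(row, pivot_row)
    M[row, :] = M[row, :] / M[row, col]                   # scale pivot to 1
    for r in range(m):
        if r != row:
            M[r, :] = M[r, :] - M[r, col] * M[row, :]     # clear the column
    row += 1

R, E = M[:, :n], M[:, n:]
assert E * A == R                 # E records every row operation we did
print(R)                          # the rref of A
print(E)                          # its last row spans the left nullspace N(A^T)
```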
@Afnimation11 жыл бұрын
It's interesting that he constantly remarks on the fact that he presents things without proving them, but in fact I think he explains things so clearly and understandably that he doesn't need to prove them, because we can see them in an almost axiomatic way.
@robertcarhart4168 Жыл бұрын
Strang proves things without you even realizing that you've just experienced a 'proof.' He makes it very conversational and intuitive.
@navs86035 жыл бұрын
Thank you MIT for enabling us to enjoy these treats. And Prof. Strang is just pure genius.
@KaveriChatra5 жыл бұрын
"I see that this fourth space is getting second class citizen treatment..it doesn't deserve it"
@NG-we8uu5 жыл бұрын
Kaveri Chatra by coincidence I read this exactly when he said it
@alenjose39034 жыл бұрын
@@NG-we8uu me too, i just read this while i was listening to it 😂
@MrGameWWE4 жыл бұрын
Me too 😂😂
@georgesadler78303 жыл бұрын
Incorporating MATLAB commands in the lecture is a great way for students to learn about matrices and linear algebra in context. The overall lecture is another classic by DR. Gilbert Strang.
@archilzhvania62426 жыл бұрын
He makes everything look so clear.
@ispeakforthebeans5 жыл бұрын
"Poor misbegotten fourth subspace" -Gilbert Strang, 1999 Remember when Elizabeth Sobeck decided to give GAIA feelings? These guys gave math feelings. And I love him for that. I didn't even know that was possible.
@serg30313 жыл бұрын
I want to write on that chalkboard with that chalk.
@vabez004 жыл бұрын
It seems quite satisfying indeed
@Lets_MakeItSimple3 жыл бұрын
the chalk looked like a big stone
@yolobot29202 ай бұрын
Nah man u got sm severe autism
@LAnonHubbard12 жыл бұрын
Loved the bit at the end where he showed that upper triangular or symmetric or diagonal matrices form a subspace.
@easterPole7 жыл бұрын
I'm into the fifth minute and wondering whether he made that mistake in last lecture knowingly
@sachidanandprajapati94464 жыл бұрын
Man, exactly. Due to this error, I came to know that if a matrix is non-invertible, the columns are linearly dependent.
@eduardoschiavon56524 жыл бұрын
40:54 There's no one in the class...
@ManishKumar-xx7ny4 жыл бұрын
Same thought and maybe he did. Great chance
@matthieugrosrenaud17773 жыл бұрын
@@eduardoschiavon5652 nah, it's because they reduced the rows of the class; what we see are the rows of zeros.
@GiovannaIwishyou3 жыл бұрын
I'm actually pretty sure he did this on purpose to trick the audience. Since the first two rows are identical, it's too obvious once you know that a matrix must have the same number of linearly independent columns and rows (and it's a GREAT introduction to the lecture).
@kaiding3322 Жыл бұрын
I believe Prof. Strang deliberately made the mistake at the end of Lecture 9, in order to shift the focus from the column space to the row space. The transition was too smooth for this to be an accident. It's also a great show of humility that he didn't mind being perceived as making a mistake!
@maoqiutong6 жыл бұрын
This is the second time I've seen nobody in the classroom. The cameraman is really happy to be a VIP student, I believe.
@phil97n Жыл бұрын
How can you tell? He seemed to be talking to audience
@klartraum84954 ай бұрын
@@phil97n The cameraman avoids pointing at the chairs, and at the end you don't hear the usual chatter, just silence.
@lokeshkumar-ub9bb9 жыл бұрын
At 3:15 - 3:20: instead of looking at the row picture to see the dependence, we can also notice that 2*(column 2) - (column 1) gives (column 3) :)
@jacobm70266 жыл бұрын
This is correct, but his mistake actually illuminates the importance of understanding independence from both the row space and the column space. Most matrices won't make it this easy to spot dependence among the columns, so conceptualizing both of those spaces will give you a deeper, richer understanding of vector spaces in general.
@dhruvg5505 жыл бұрын
He explains in the first three minutes why you didn't even have to look at the columns. The girl who pointed this out was quick!
@京城五5 жыл бұрын
@@dhruvg550 I think the girl was Gilbert Strang himself
@DerekWoolverton3 жыл бұрын
I was nodding my head, keeping up just swimmingly, it all made perfect sense. He wrapped up the diagram and it seemed like we were done. Then he stepped over to the far board and replaced vectors with matrices and just turned everything upside down. Didn't see that coming.
@yourroyalhighness76628 ай бұрын
My, I feel so... dense. What a sense of humor this brilliant man must have to have penned a book entitled "Linear Algebra for Everyone". Sir, I can't even subtract!
@chuckhei4 жыл бұрын
I really don't know what to say..... Satisfying? Grateful? OMG I just love it!!!!
@dariopl8664 Жыл бұрын
min 18:50 If it's helpful for anybody: the dimension of the null space is the number of basis vectors that span it. Just as the dimension of the column space (the rank) is the number of linearly independent columns, the dimension of the null space is the number of special solutions, one for each free variable, i.e. the number of basis vectors that span the null space (n - r).
@ozzyfromspace4 жыл бұрын
Worth mentioning: if row-reduction of the matrix generates the most natural row space basis without much effort, we can also generate the most natural basis of the column space of said matrix by doing row-reduction on the transpose of the matrix. This is all so incredibly fascinating!
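A quick check of that idea with SymPy (my own sketch; same 3 x 4 lecture example that other comments quote): the nonzero rows of rref(A^T), turned back into columns, span C(A).

```python
from sympy import Matrix

A = Matrix([[1, 2, 3, 1],
            [1, 1, 2, 1],
            [1, 2, 3, 1]])

Rt, piv = A.T.rref()                                 # row-reduce the transpose
col_basis = [Rt[i, :].T for i in range(len(piv))]    # nonzero rows, back as columns
print(col_basis)                                     # a basis for C(A)

# it spans the same space as the pivot columns taken from A itself
pivot_cols = [A[:, j] for j in A.rref()[1]]
assert Matrix.hstack(*col_basis, *pivot_cols).rank() == len(piv)
```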
@yufanzhou99484 жыл бұрын
The mistake professor Strang made turned into a great connection to the new topic. That's why he is a genius
@fanggladys99862 жыл бұрын
He is lecturing to an empty classroom if you look at time 40'53'' !! Even more wonders!
@williamss4277Ай бұрын
Beginning from the natural numbers N, then the integers Z, then the real numbers, then the complex numbers (which are just 2-dimensional numbers), we get to vectors, which are n-dimensional numbers. A vector space of vectors is like the range of N, or Z, or the real numbers; with vector spaces, a confident definition of vectors can be given. In terms of calculation, linear algebra uses the computer. The key is to find the algorithm by studying examples of low-dimensional vectors and matrices; with the algorithm, the calculation scales to vectors and matrices of high dimension. Obviously a vector is a much more expressive number than any kind of number that came before, so vectors, and linear algebra, become very powerful mathematical tools in many applications.
@Mike-mu3og5 жыл бұрын
45:26 transform an exclamation mark into an M. Brilliant!
@shivamkasat63345 жыл бұрын
A mathematician with a great sense of humour, Mr. Strang!
@georgipopov27542 жыл бұрын
Brilliant. This lecture connects the pieces of a complex puzzle.
@gavilanch15 жыл бұрын
So? This can mean a lot of things, and one of them is that they couldn't tape this class, so Strang had to repeat it in front of the cameras, and they didn't pay people to just sit there so that people like you would stop commenting on that fact. Great classes. I am not a native English speaker, but this is certainly awesome and I really appreciate it. So much thanks to MIT and Professor Strang!!
@p.z.83555 ай бұрын
Why is he such a good lecturer? My prof used to just read from the textbook.
@karthik36853 жыл бұрын
There is a problem with Dr. Strang's lectures. The problem is, he makes it so intuitive that I'm literally nodding in agreement the entire lecture. I've now watched the lectures once, read the book chapters, and watched the lectures a second time. And while I have a good grasp of everything discussed so far, they all sort of blend in. I couldn't list the things I learnt one by one for these 10 lectures. :D (well, I sort of can.)
@johnk81742 жыл бұрын
ya gotta do the problems. That pulls it together in your head.
@maximliankremsner6335 жыл бұрын
Thank you so much for this lecture series. This helps a lot! Great professor with great and easy to understand explanations.
@MAGonzzManifesto11 жыл бұрын
Thank you Dr. Strang and MIT. These videos are amazing and keeping me afloat in my class.
@ozzyfromspace4 жыл бұрын
It's not that she found a numerical error, it was the power of her reasoning for it. I'm shook, whoever that girl is, she's clearly brilliant.
@webstime13 жыл бұрын
He made that story up to drive a point
@shavuklia77317 жыл бұрын
Oh cool. I've never computed the nullspace of the row space before. Initially, I thought of computing the nullspace of the column space of the transpose, but the method he provides - calculating E - is so easy once you've already done all the work computing the other subspaces.
@ucrclxl14 жыл бұрын
What fascinates me are some stats you can find below this video. Maybe it's a bug, but YouTube tells us that this video is most popular among: 1) men 45-54, 2) men 35-44, 3) men 25-34. Which I find really strange, because I thought that most of the viewers would be actual students. The popularity by region is also an interesting stat.
@agarwaengrc8 жыл бұрын
where exactly can you find these stats? When I click on statistics I just get a viewcount graph
@serenakillion70085 жыл бұрын
Thank you MIT and Professor Strang!
@marcuschiu86154 жыл бұрын
this is mind-blowing i don't fully understand it but i know it's mind-blowing
@aymensekhri21334 жыл бұрын
Thank you Prof. Strang
@markymark4438 жыл бұрын
lol funny I'm just first watching this today and it was posted exactly 7 years ago xD thanks for the video, really helpful! I was struggling with this concept for my current linear algebra 2 course since I took the non-specialist version of linear algebra 1 which didn't really test us on proofs at all. I think I have a better understanding of the four fundamental subspaces now! :)
@alsah-him15714 жыл бұрын
9:45 Professor Strang subtly integrates class consciousness into his lecture of the Four Fundamental Subspaces. Truly a genius.
@bokumo70633 жыл бұрын
Last hired First fired?
@JohnPaul-di3ph3 жыл бұрын
My mind got blown when I realized you could get the basis for the left null space from row transformation. I mean, it seems completely obvious after he points it out but I never thought much of it until then.
@christophercrawford2883 Жыл бұрын
Nice lecture. Would like to have seen that N(A) and C(A^T) are independent (or even orthogonal!)
@abdelaziz27883 жыл бұрын
40:50 is the best plot twist awesomee
@onatgirit47983 жыл бұрын
If all YouTube content were deleted today, the most upsetting thing for me would probably be losing this series of lessons.
@brogcooper2513 жыл бұрын
He is not only a master lecturer, he is a master of writing on a chalkboard. I swear, it looks like he is using a paint pen.
@ozzyfromspace4 жыл бұрын
*Question:* what is the relationship between rank(A) and rank(A^T)? Does rank(A) = rank(A^T) in general? The professor seems to be hinting at this, but rref(A) only preserves the column space, so it doesn't seem so trivial to me. Any insight is highly appreciated. Edit: I found the answer. rank(A) = rank(A^T) by virtue of the fact that the number of linearly independent columns equals the number of linearly independent rows, even for non-square matrices. I proved this for myself this evening. The main idea of the proof (at least how I did it) is that if you have two linearly dependent rows, one above the other say, row reduction kills the lower one (reducing the number of possibly independent rows). Killing off that row (making it all zeros) also means it can't hold a pivot, so we've reduced the number of potential pivot columns by one. That's the relationship in a nutshell. The math is only slightly more involved.
@ostrodmit2 жыл бұрын
rref(A) does not preserve the column space, only the null and row spaces. It does preserve the dim(Row(A)) however, which suffices to prove that the row and column ranks are equal.
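Not a proof, but a brute-force sanity check of the equal-ranks fact is easy to run (my own NumPy sketch with random small integer matrices, nothing from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    m, n = rng.integers(1, 7, size=2)                      # random shape, 1..6
    A = rng.integers(-3, 4, size=(m, n)).astype(float)     # random small entries
    assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)
print("row rank == column rank held for every random test matrix")
```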
@marverickbin6 жыл бұрын
vector spaces of matrices! mindblow!
@mohammedtarek95444 жыл бұрын
I'm just gonna write this for people like me. If you are like me (haven't finished high school yet, first year), you will have to put some effort into studying this: you'll have to search for basics you may not have been taught yet, and it will take a lot of time and a lot of searching. But just try, and take the reason you are studying this so early as motivation, because some stuff will be frustrating and hard to understand while you haven't mastered the basics yet. I know for sure it's worth it. For me, I'm studying this to learn machine learning, and I also want to build physics engines and things like that to help people, and I found that everything I want to do is related to linear algebra. Once I get it and go deep into it, I can do whatever I want with just some basic research. Trust me, it's hard but worth it ❤
@souravghosh75585 жыл бұрын
Somehow, after some thought, I figured out why the Prof says all upper triangular matrices form a subspace of the 3x3 matrices: if we add any two upper triangular matrices we are still in that space, and if we multiply any upper triangular matrix by a scalar we are still in that space. Same with the symmetric and diagonal matrices. The Prof assumes that all students will work this out and doesn't spell it out. Otherwise, it's a privilege to hear from a genius!
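That closure test (closed under addition and under scalar multiplication) is also easy to spot-check numerically; a minimal NumPy sketch with random 3 x 3 matrices of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)
U1, U2 = np.triu(rng.standard_normal((3, 3))), np.triu(rng.standard_normal((3, 3)))
S1, S2 = [(M + M.T) / 2 for M in rng.standard_normal((2, 3, 3))]   # symmetric pair
c = 2.5

assert np.allclose(np.tril(U1 + U2, k=-1), 0)     # sum is still upper triangular
assert np.allclose(np.tril(c * U1,  k=-1), 0)     # scalar multiple too
assert np.allclose((S1 + S2) - (S1 + S2).T, 0)    # sum is still symmetric
assert np.allclose((c * S1) - (c * S1).T, 0)      # scalar multiple too
print("closure under addition and scalar multiplication checked")
```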
@sauravparajuli49884 жыл бұрын
The twist at the end was better than that of GOT's.
@ermomusic4 жыл бұрын
You could also argue that it isn't a basis because -1 times the first vector plus 2 times the second vector gives us the third vector... You really dropped the ball there, Professor G. Hahahaha, just kidding, this man is the best thing that ever happened to linear algebra right after Gauss.
@AlexanderList10 жыл бұрын
Class is crowded these days, no worries. Don't know why no one was attending back in 2005!
@omega73777 жыл бұрын
It was actually recorded in 2000, but uploaded to the web in Spring 2005. The dates written in the video titles are dates of upload, not dates of recording.
@anikislamdu13 жыл бұрын
Great lecture. I am so grateful to Prof. Gilbert Strang.
@arteks20013 жыл бұрын
Correction of error from previous lecture 0:43
Introduction to the four fundamental subspaces (column space, null space, row space, left null space) 4:20
Basis and dimension of each fundamental subspace 11:44
Basis and dimension of the column space 12:50
Dimension of the row space (it is the rank) 14:41
Basis and dimension of the null space 17:05
Dimension of the left null space (m - rank) 19:41
Basis of the row space (nonzero rows in the rref) 21:08
Basis of the left null space 29:48
Review of the four fundamental subspaces 42:09
A new vector space of all 3 by 3 matrices 42:32
@ramkrishna32564 жыл бұрын
For finding a basis for N(A), why can't we use the same approach as for the left nullspace? 1) reduce trans(A) to RREF, 2) E' x trans(A) = RREF, 3) read a basis off E'.
@MultiRNR9 ай бұрын
Yes, I have the same question, and this way sounds more mechanical (programmable) than the earlier way.
@davidmurphy56311 ай бұрын
3:30 Funny, I was looking at the column space and noticed that -1 * C1 + 2 * C2 = C3, completely missing the far more obvious fact that R1 = R2. Hey ho. Target fixation. It's what sunk the Kaga and Akagi.
@BVaibhav-mt8jx3 жыл бұрын
He is so damn good at explaining! I love him!!!!!!!!!!!
@ChandanKumar-ct7du6 жыл бұрын
Thank you Prof. Strang...
@ghsjgsjg53chjdkhjydhdkhfmh744 жыл бұрын
😖😖 He's the best professor I know, and yet my brain doesn't get it all at once 😂
@nonconsensualopinion3 жыл бұрын
That's fine. All at once doesn't matter. What matters is "forever and always". Do what you must to understand it deeply so that you will know it the rest of your life. It may take watching the video many times and will probably require writing down some matrices and doing them yourself. Math is a subject which is hard to learn by observation; it really depends on participation. Remember, the students in the audience were MIT students, so they had proven they were quite talented. Those students saw what you saw in the video. Those students had the ability to talk to this professor after class. Those students had homework practice. Still, when the quiz was administered, I guarantee the average score was below 100%. Even after all that help, some students didn't quite get it all. They didn't get it "all at once". How can you expect yourself to do better than that, especially if you demand it happen "all at once"?
@SandeepSingh-hc3no2 жыл бұрын
It's like an enlightenment moment when he says, "she said, it's got two identical rows"
@apaksoy6 жыл бұрын
I watched this video hoping to learn what the row space and left null space are good for, and learned nothing new. This lecture recounts only the definitions of the four fundamental subspaces and their dimensions and works through an example of finding their respective bases. As in his book "Introduction to Linear Algebra", Dr. Strang takes so many shortcuts and skips over the precise definitions of many concepts, proofs, and important theorems that I find his lectures (and the book as well) useful only to a limited extent. I appreciate the effort, but I believe there should be a better way of teaching linear algebra.
@syedsaad69296 жыл бұрын
I felt that too, both his lectures and his book lack in rigor and depth. Do you know some other resource?
@apaksoy6 жыл бұрын
@@syedsaad6929 It has significant shortcomings as well but I found Dr. van de Geijn's course on edX, "Linear Algebra: Foundations to Frontiers", to suit better to my taste, having a better balance of theory vs applications. The fall class has just ended but they have a new one starting 16 January 2019 (totally free if you like) and the professors themselves answer the questions during the course! The course also provides downloadable pdf notes along with the class. Having made the above criticism against Dr. Strang's way of teaching linear algebra, I have to acknowledge that nearly all of the (worked) examples I have studied and quite a few of the exercises in his book "Introduction to Linear Algebra" were excellent. Though I have reservations about his approach to teaching linear algebra, I still recommend studying his book but not as the only source.
@apaksoy6 жыл бұрын
Also check out these resources which I found helpful at times: 1) Linear Algebra Done Right ( www.linear.axler.net ), 2) immersive linear algebra ( immersivemath.com/ila/index.html ), 3) A First Course in Linear Algebra ( linear.ups.edu/html/fcla.html ). The first one is the site for Dr. Sheldon Axler's book which refers to videos based on his book. Videos only provide a summary of his book but still helpful like the abridged version of the book. Unfortunately, you may need his unabridged book to make the most out of his teachings but it is not (legally) freely available. Axler's approach is totally different from than that of Strang and more suitable for math majors than other science and engineering majors but it is so clean and fundamental. It is a good source whenever you want to understand some basic linear algebra concept deeply.
@mitocw5 жыл бұрын
@@apaksoy There are also a number of other courses and resources available for linear algebra on MIT OpenCourseWare. We recommend you check out Herb Gross' "Calculus Revisited: Complex Variables, Differential Equations, and Linear Algebra" (ocw.mit.edu/RES18-008 and/or KZbin playlist: kzbin.info/aero/PLD971E94905A70448 ) To see the complete listing of courses related to linear algebra, visit our Course Finder: ocw.mit.edu/courses/find-by-topic/#cat=mathematics&subcat=linearalgebra. Best wishes on your studies!
@syedsaad69295 жыл бұрын
@@apaksoy I agree, I do follow the book. Its good for applications of linear algebra which is what I need, but not what satisfies me.
@timelordyunt76965 жыл бұрын
Take another look at the list... it's the first time I've felt glad that so many are left unwatched.
@yojansh4 жыл бұрын
Just when I thought he'd run out of blackboard to write on, he moves to the right and, lo and behold, there's more of it.
@guptaji_uvach15 жыл бұрын
Thanks Dr. Strang
@redthunder6183 Жыл бұрын
God, dude, my school combined multivariable calc and linear algebra into one class, so this entire lecture was only a quarter of my most recent lecture.
@xiemins4 жыл бұрын
May I say that the rows of R span the same space as the rows of A because you can reverse the ROW operations and reconstruct the rows of A from R? It can't be true for the column space, because after row operations you most likely can't reverse things and reconstruct the original column vectors from R through COLUMN combinations.
@miladaghajohari23083 жыл бұрын
I love these lectures
@s4mwize5 жыл бұрын
Here's a paper by prof. Strang related to this lecture. web.mit.edu/18.06/www/Essays/newpaper_ver3.pdf
@habenbelai74204 жыл бұрын
36:24 ...SPORTS. IT'S IN THE GAME!
@magdaamiridi70906 жыл бұрын
Hello! Does anybody know any other lecturers like Dr. Strang, with such passion, in fields like convex optimization, detection and estimation, or probability theory?
@q44444q5 жыл бұрын
Look up lectures by Stephen Boyd. "Stanford Engineering Everywhere" is like Stanford's version of OCW and has some great courses in convex optimization: EE263 and EE364A. They aren't quite as good as Strang's lectures, but he's hard to beat!
@nonconsensualopinion4 жыл бұрын
John N. Tsitsiklis has great probability lectures on MIT open courseware here on KZbin. Highly recommended.
@jrkirby9313 жыл бұрын
great that we can E=AR this lecture.
@OneZombieTrain7 жыл бұрын
Guess you could say the students who understood that were all E=ARs
@encheng11368 жыл бұрын
There are no students sitting there, but the lecture is still so good.
@flyLeonardofly8 жыл бұрын
I think he breaks with the usual convention of an m x n matrix being m rows, n columns... which confused me, but great lecture anyway... edit: I was wrong, 9:12... I misunderstood what he said there; of course the column space has m (row-many) components, because columns go m (row-many) components down... thanks Robert Smits
@TheRsmits8 жыл бұрын
He keeps the convention. His example A = [[1 2 3 1], [1 1 2 1], [1 2 3 1]] has 3 rows and 4 columns. The 1st row, [1 2 3 1], has 4 components and is thus an element of R^4.
@Neme1127 жыл бұрын
He doesn't but it is definitely a convention that confuses me and I have to think twice about the coordinates every time. Usually in mathematics (and in programming) the dimensions are X (meaning horizontal offset) and then Y - here it's reversed. If somebody tells me coordinates [100, 1] I expect it to be far to the right, not way down.
@kristiantorres10805 жыл бұрын
I was confused too! So I scrolled down in the comments looking for a lost soul like me xD Thank you for the explanation Robert!
@user-wm8xr4bz3b5 жыл бұрын
thanks Luis .. you cleared my doubt !! ^^
@AkshayGundeti11 жыл бұрын
Thanx a lot Mr.Strang and MIT
@fuahuahuatime519610 жыл бұрын
25:06 So performing row eliminations doesn't change the row space but changes the column space? So to get the basis for the column space, would you have to do column elimination for matrix [A]? Or could you take the transpose, do row elimination, and just use that row basis for [A] transpose as the column basis for [A]?
@readap4278 жыл бұрын
+Pablo P That's what I was thinking as I watched that part of the video. It seems that approach would work. Before this lecture, it's the approach I probably would have used, but now that I see the tie-in to pseudo-Gauss-Jordan, I think I prefer pseudo-Gauss-Jordan.
@ozzyfromspace4 жыл бұрын
At 14:00, a caution: the pivot columns of rref(A) are *not* themselves a basis for C(A); you have to take the corresponding columns of A itself. Doing rref(A) means taking linear combinations of the rows of A, so there's no reason the column space should be preserved, and in general C(A) is not equal to C(rref(A)). Let A^T stand for the transpose of A. Then the following is always true: *C(A) = C( ( rref(A^T) )^T )*. He's kind of implied this already in previous lectures without using this notation; I'm just connecting the dots. Edit: a simple example is A = [1,2; 2,4]. rref(A) = [1,2; 0,0]. The column space of rref(A) is the first coordinate axis (the x-axis), whereas the column space of A is the line y = 2x; aside from the zero vector they have no vectors in common. Further, rref(A) has one pivot column, so dim C(A) should be 1, and indeed 2*[1;2] = [2;4] (the columns are dependent, so we really only have *1* direction to scale). Hence dim C(A) = r = 1, as predicted. Hopefully this helps someone. Professor Strang is one of the best educators out there; I see why people admire MIT! This playlist is the perfect way to prepare for the vectorized implementations of machine learning (if you actually want to understand what you're doing, that is). Best wishes to you all. 👨🏽🏫
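That 2 x 2 example is quick to check directly with SymPy (my own sketch):

```python
from sympy import Matrix

A = Matrix([[1, 2],
            [2, 4]])
R, piv = A.rref()                 # R = [[1, 2], [0, 0]], pivot column index (0,)

print(A.columnspace())            # [Matrix([[1], [2]])]  -> the line y = 2x
print(R.columnspace())            # [Matrix([[1], [0]])]  -> the first coordinate axis
print([A[:, j] for j in piv])     # pivot columns taken from A itself: a basis of C(A)
```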
@Afnimation11 жыл бұрын
I think that to figure out what a subspace looks like, you should remember that a subspace is a space sitting inside another space. Imagine that instead of living in a three-dimensional space, we are in a 3-dimensional subspace inside some bigger-dimensional space; the nullspace is then the set of all vectors in that bigger space which, when multiplied against the matrix built from 3 basis vectors of our "subspace R^3", give 0 (they nullify our space).
@RomiiLeeh10 жыл бұрын
Thank you for sharing this video prof Strang!!! Very helpful! :D