10. The Four Fundamental Subspaces

624,273 views

MIT OpenCourseWare

15 years ago

MIT 18.06 Linear Algebra, Spring 2005
Instructor: Gilbert Strang
View the complete course: ocw.mit.edu/18-06S05
YouTube Playlist: • MIT 18.06 Linear Algeb...
10. The Four Fundamental Subspaces
License: Creative Commons BY-NC-SA
More information at ocw.mit.edu/terms
More courses at ocw.mit.edu

Comments: 318
@corey333p 7 years ago
"No mathematics went on there; we just got some vectors that were lying down to stand up."
@corey333p 7 years ago
Gotta know the bases for the spaces.
@why6447 4 years ago
AHAHHAHHAHAHHAHAH
@delta_sleepy 5 months ago
😂
@GavinoFelix 10 years ago
"But, after class - TO MY SORROW - a student tells me, 'Wait a minute, that [third vector] is not independent...'" I love it. What other professor brings this kind of passion to linear algebra? This is what makes real, in-the-flesh lectures worthwhile.
@xoppa09 6 years ago
Give that brave student a medal.
@fanzhang3746 5 years ago
xoppa09 I think here it is the professor that's honorable. He elaborated on his mistake, which is reasonably embarrassing for him, and made important concepts clear. I think most others would just correct it, apologize, and move on. You can see his embarrassment when he used words like 'bury', and his reaction when he accidentally uncovered the board again later.
@andersony4970 3 years ago
@@fanzhang3746 I don't think he is much embarrassed. He talked about doing math in class in the first video of this series, if you've watched that. He said that mistakes are probably inevitable, and that it's great to go through the whole process with the students, including making errors and correcting them.
@NazriB 2 years ago
Lies again? FAS FUS Sheng Siong
@sahil0094 2 years ago
What's so passionate about accepting & correcting your own mistakes?
10 years ago
Thank you MIT, thank you Prof Strang.
@PhucLe-qs7nx 2 years ago
00:00 Error from last lecture, rows dependent. 04:28 The 4 fundamental subspaces. 08:30 Where are those spaces? 11:45 Dimensions of those spaces. 21:20 Bases for those spaces. 30:00 N(A^T), the "left nullspace". 42:10 A new "matrix" space.
@lokahit6940 5 months ago
I am asking you because yours is the most recent comment: 1) At 9:15, how is the column space in R^m? For an mxn (m rows x n columns) matrix there are n columns, so there are n column vectors, so it's supposed to be R^n, right?
@aarongreenberg159 2 months ago
@@lokahit6940 Because each vector in the column space has m components. Yes, there are n vectors, but the number of components of a vector determines the dimension of the space it lives in. This is different once you get to a basis, where the number of vectors gives the dimension of the subspace, but even that is a subspace of R^(# of components). So a two-vector basis where each vector has 5 components spans a 2D subspace of R^5.
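For anyone who wants to see the m-versus-n point concretely, here is a minimal Python sketch (assuming NumPy is installed; the 3x2 matrix is an invented example, not one from the lecture):

    import numpy as np

    # An invented 3x2 matrix: m = 3 rows, n = 2 columns.
    A = np.array([[1.0, 2.0],
                  [0.0, 1.0],
                  [4.0, 6.0]])

    m, n = A.shape
    print(A[:, 0].shape)   # (3,) -- each column is a vector with m = 3 components
    print(A[0, :].shape)   # (2,) -- each row is a vector with n = 2 components

    # So the n columns live in R^m (here R^3): the column space C(A) is a
    # subspace of R^3, even though it is spanned by only n = 2 vectors.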
@juansepardo2020 11 months ago
I am a 4th-year, double-engineering student re-learning linear algebra so I can have a stronger basis for ML, DL and AI. Never in my college classes, or in independent studying, have I been so amazed by the way a concept is introduced as I was when Prof. Strang got to computing the left nullspace. The way this man teaches is just astonishing, thank you very much.
@reganmian 4 months ago
Have you checked out his newest book, "Linear Algebra and Learning from Data"? That plus "Introduction to Statistical Learning", given a foundation in programming, probability, and statistical inference, is a killer combo. I'm a statistics graduate student wanting to specialize in ML. I've been watching these at 2x speed as a review.
@DanielCoutoF 9 years ago
I am so fascinated by the way that Professor G. Strang gives his lectures. He does it in such a great way that even a 5-year-old boy could understand. On the other hand, teachers at my university make the subject so complicated that even well-above-average students struggle to understand the concepts properly.
@andydidyouhear 9 years ago
Daniel Couto Fonseca A 5-year-old is a bit extreme :)
@JadedForAlways 8 years ago
+Daniel Couto Fonseca What about a 5-year-old girl?
@DanielCoutoF 8 years ago
Only 5-year-old WHITE BOYS, I would say
@JadedForAlways 8 years ago
Are you joking? I can't tell
@DanielCoutoF 8 years ago
I guess it's funnier if you don't
@maoqiutong 5 years ago
The second time we see nobody in the classroom. The cameraman is really happy to be a VIP student, I believe.
@phil97n a year ago
How can you tell? He seemed to be talking to an audience
@KaveriChatra 5 years ago
"I see that this fourth space is getting second-class citizen treatment... it doesn't deserve it"
@NG-we8uu 4 years ago
Kaveri Chatra by coincidence I read this exactly when he said it
@alenjose3903 3 years ago
@@NG-we8uu Me too, I just read this while I was listening to it 😂
@MrGameWWE 3 years ago
Me too 😂😂
@serg303 13 years ago
I want to write on that chalkboard with that chalk.
@vabez00 4 years ago
It seems quite satisfying indeed
@Lets_MakeItSimple 2 years ago
The chalk looked like a big stone
@matthewsarsam8920 5 months ago
Can't lie, being able to pause the video and ponder the ideas is so nice to have. Goes to show how much work those students had to put in.
@yanshudu9370 2 years ago
Conclusion: the four fundamental subspaces of an mxn matrix A are: 1. The column space C(A), spanned by the columns of A, which lives in R^m and has dimension r. 2. The nullspace N(A), spanned by the special solutions (one per free variable), which lives in R^n and has dimension n-r. 3. The row space C(A^T), spanned by the rows of A, which lives in R^n and has dimension r. 4. The left nullspace N(A^T), the nullspace of A^T, which lives in R^m and has dimension m-r. Other conclusions: dim(C(A^T)) + dim(N(A)) = n, and dim(C(A)) + dim(N(A^T)) = m.
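The dimension counts in this summary can be checked directly. Below is a minimal sketch, assuming SymPy is available; the 3x4 matrix is an invented example with one repeated row, not necessarily the one from the lecture:

    from sympy import Matrix

    # Invented 3x4 matrix (m = 3, n = 4) with rank 2.
    A = Matrix([[1, 2, 3, 1],
                [1, 1, 2, 1],
                [1, 2, 3, 1]])
    m, n = A.shape
    r = A.rank()

    dim_col       = len(A.columnspace())   # dim C(A)   = r
    dim_null      = len(A.nullspace())     # dim N(A)   = n - r
    dim_row       = len(A.rowspace())      # dim C(A^T) = r
    dim_left_null = len(A.T.nullspace())   # dim N(A^T) = m - r

    print(r, dim_col, dim_row)             # 2 2 2
    print(dim_row + dim_null == n)         # True: r + (n - r) = n
    print(dim_col + dim_left_null == m)    # True: r + (m - r) = m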
@xiaohanwang3885 8 years ago
For the first time I envy students at MIT, because they have such brilliant lectures to attend.
@NostraDavid2 a year ago
I don't. I've got it better: no time pressure to watch the lectures, and I don't NEED to do the exercises, nor the exams. It's great! 😁
@swatejreddy216 7 months ago
@@NostraDavid2 And no hefty tuition either. So yeah.
@jonathanoneill3464 7 years ago
These lectures are saving my bachelor's in Engineering. Thanks MIT!
@rohanmalik895 5 years ago
Whoa, your profile picture shows very precisely that you survived engineering after all... wish me luck
@easterPole 6 years ago
I'm into the fifth minute and wondering whether he made that mistake in the last lecture knowingly
@sachidanandprajapati9446 4 years ago
Man, exactly. Thanks to this error, I learned that if a matrix is non-invertible, the columns must be linearly dependent
@eduardoschiavon5652 3 years ago
40:54 There's no one in the class...
@ManishKumar-xx7ny 3 years ago
Same thought, and maybe he did. Great chance
@matthieugrosrenaud1777 3 years ago
@@eduardoschiavon5652 Nah, it's because they row-reduced the class; what we see are the rows of zeros.
@GiovannaIwishyou 2 years ago
I'm actually pretty sure he did this on purpose to trick the audience. Since the first two rows are identical, it's too obvious once you learn that a matrix must have the same number of linearly independent columns and rows (and it's a GREAT introduction to the lecture).
@antoniolewis1016 7 years ago
This man has dedication! Also, that girl in the beginning must have been a sharp genius.
@ispeakforthebeans 5 years ago
Bruh, it's MIT, they've got gods in there and you talk about sharp
@akmalsultanov9801 4 years ago
Well, when you have an intuition for just the row space and column space and the connection between them, it's quite obvious, and you don't have to be a genius to recognize the dependency of those row vectors. In fact, the first half of linear algebra is relatively simple.
@sreenjaysen927 4 years ago
I think the professor just made that up and intentionally got it wrong in the previous lecture just to introduce the row space. He planned it like in "Money Heist".
@leophysics 2 years ago
@@sreenjaysen927 I agree
@Q.Mechanic 3 years ago
It's my honor to have met you, even virtually, sir!
@ispeakforthebeans 5 years ago
"Poor misbegotten fourth subspace" - Gilbert Strang, 1999. Remember when Elizabeth Sobeck decided to give GAIA feelings? These guys gave math feelings. And I love him for that. I didn't even know that was possible.
@PyMoondra 4 years ago
The end portion really showed how matrix algebra theory can be applied to computer vision; really glad he added that in.
@davidwilliam152 4 years ago
What a perfect thing, to be a great mathematician and a great teacher at the same time! Being a great teacher, especially, is priceless!
@duqueng 14 years ago
The best teacher ever. I really admire what MIT has done. As the phrase on its website says: "Unlocking Knowledge, Empowering Minds."
@lokeshkumar-ub9bb 8 years ago
At 3:15-3:20: instead of looking at the row picture to spot the dependence, we may also see that 2*(column 2) - (column 1) gives (column 3) :)
@jacobm7026 5 years ago
This is correct, but his mistake actually illuminates the importance of understanding independence from both the row space and the column space. Most matrices won't make column dependence this easy to spot, so conceptualizing both of those spaces will give you a deeper, richer understanding of vector spaces in general.
@dhruvg550 5 years ago
He explains in the first three minutes why you didn't even have to look at the columns. The girl who pointed this out was quick!
@user-qq2gl9ep5d 4 years ago
@@dhruvg550 I think the girl was Gilbert Strang himself
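The kind of column dependence mentioned a few comments up (2*(column 2) - (column 1) = (column 3)) is easy to check numerically. A minimal sketch, assuming NumPy; the matrix is an illustrative stand-in built to have that relation, not necessarily the one from lecture 9:

    import numpy as np

    # Stand-in matrix whose third column was built as 2*(column 2) - (column 1).
    A = np.array([[1.0, 2.0,  3.0],
                  [2.0, 3.0,  4.0],
                  [3.0, 1.0, -1.0]])

    c1, c2, c3 = A[:, 0], A[:, 1], A[:, 2]
    print(np.allclose(2 * c2 - c1, c3))   # True: the columns are dependent
    print(np.linalg.matrix_rank(A))       # 2, not 3 -- so the rows are dependent too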
@trevandrea8909 4 months ago
Thank you so much!! Your explanation is so amazing! Now I finally get why the column spaces of A and R are different, and why the row spaces of A and R are the same!! Btw, I'm saving 24:00 for the explanation of the subspaces of A and R.
@bfl9075 2 years ago
I was totally astonished by the idea of computing the left nullspace! Thank you, Dr. Strang.
@Cyraxsify 7 years ago
At t = 38:00, Strang shows a way that expedites finding L: find E, then row-reduce [E | I] to get E inverse, which = L. Now we can quickly decompose A into LU if we do Gaussian elimination only--not Gauss-Jordan elimination--from the beginning. At t = 43:00, he defines a vector space out of 3x3 matrices, call it M_33. At t = 47:00, he covers the dimensions of subspaces of M.
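The bookkeeping behind finding E is short enough to sketch in code. This is a minimal SymPy sketch under the assumption that no row exchanges are needed; the 3x4 matrix with a repeated row is an invented example:

    from sympy import Matrix, eye

    # Invented 3x4 matrix with a repeated row, so R will have a zero row.
    A = Matrix([[1, 2, 3, 1],
                [1, 1, 2, 1],
                [1, 2, 3, 1]])
    m = A.rows

    # Row-reduce the augmented matrix [A | I]; the same row operations that
    # turn A into R turn I into E, so the result is [R | E] with E*A = R.
    aug, _ = Matrix.hstack(A, eye(m)).rref()
    R = aug[:, :A.cols]
    E = aug[:, A.cols:]

    print(E * A == R)   # True
    print(R)
    print(E)
    # The rows of E sitting beside the zero rows of R are the combinations of
    # the rows of A that give zero: a basis for the left nullspace N(A^T).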
@navs8603 5 years ago
Thank you MIT for enabling us to enjoy these treats. And Prof. Strang is just a pure genius.
@jingyiwang5113 a year ago
I am really grateful for your wonderful explanation of the four fundamental subspaces. My mathematics exam is tomorrow. It is a wonderful source for me to learn from and refresh my memory. Thank you so much!
@pianosdeaf 3 years ago
16:35 how I want to feel after the exam when I screw up
@LAnonHubbard 11 years ago
Loved the bit at the end where he showed that upper triangular or symmetric or diagonal matrices form a subspace.
@yufanzhou9948 4 years ago
The mistake Professor Strang made turned into a great connection to the new topic. That's why he is a genius.
@MAGonzzManifesto 11 years ago
Thank you Dr. Strang and MIT. These videos are amazing and keeping me afloat in my class.
@maximliankremsner633 4 years ago
Thank you so much for this lecture series. This helps a lot! A great professor with great, easy-to-understand explanations.
@All_Kraft 3 months ago
That was a great performance! Thank you MIT.
@georgesadler7830 2 years ago
Incorporating MATLAB commands in the lecture is a great way for students to learn about matrices and linear algebra in context. The overall lecture is another classic by Dr. Gilbert Strang.
@DeLuini985 6 months ago
Thank God for Dr. Strang. I am understanding concepts that have eluded me for over a decade.
@archilzhvania6242 6 years ago
He makes everything look so clear.
@gavinresch1144 3 years ago
It is amazing how he can do these lectures in front of no students and still be so engaging. In a way he is a great actor.
@Lets_MakeItSimple 2 years ago
There are students in the back rows
@shivamkasat6334 4 years ago
A mathematician with a great sense of humour, Mr. Strang!
@bobmike828 4 years ago
Correct me if I'm wrong, but Strang was introducing abstract algebra at the end. Once you have all of these linear transformations transforming other linear transformations, you have an even greater transformation of space. Absolutely love this man.
@usozertr 3 years ago
Bob Mike yes, and in an earlier lecture he was talking about how n x n permutation matrices form a group
@pubgplayer1720 a year ago
Yes, abstract vector spaces are quite important in linear algebra
@stefanfarier7384 a year ago
I really like how he talks. He sounds so friendly in his explanations.
@georgipopov2754 2 years ago
Brilliant. This lecture connects the pieces of a complex puzzle.
@Mike-mu3og 5 years ago
45:26 Transforming an exclamation mark into an M. Brilliant!
@DerekWoolverton 3 years ago
I was nodding my head, keeping up just swimmingly, it all made perfect sense. He wrapped up the diagram and it seemed like we were done. Then he stepped over to the far board and replaced vectors with matrices and just turned everything upside down. Didn't see that coming.
@shavuklia7731 7 years ago
Oh cool, I've never computed the left nullspace before. Initially, I thought of computing the nullspace of the transpose directly, but the method he provides - calculating E - is so easy once you've already done all the work computing the other subspaces.
@anikislamdu 12 years ago
Great lecture. I am so grateful to Prof. Gilbert Strang.
@serenakillion7008 4 years ago
Thank you MIT and Professor Strang!
@kaiding3322 a year ago
I believe Prof. Strang deliberately made the mistake at the end of Lecture 9 in order to shift the focus from the column space to the row space. The transition was too smooth for this to be an accident. It is also a great show of humility that he didn't mind being perceived as making a mistake!
@ozzyfromspace 4 years ago
Worth mentioning: if row-reduction of the matrix generates the most natural row space basis without much effort, we can also generate the most natural basis of the column space of said matrix by doing row-reduction on the transpose of the matrix. This is all so incredibly fascinating!
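That trick is easy to try out. A minimal sketch assuming SymPy, with an invented 3x3 matrix: row-reducing A^T and keeping its nonzero rows gives a clean basis for the column space of A.

    from sympy import Matrix

    # Invented example: the second row is twice the first, so rank(A) = 2.
    A = Matrix([[1, 2, 3],
                [2, 4, 6],
                [1, 1, 1]])

    R_of_AT, pivots = A.T.rref()
    # The nonzero rows of rref(A^T), written as column vectors, span C(A).
    col_basis = [R_of_AT.row(i).T for i in range(len(pivots))]
    print(col_basis)

    # Sanity check: these vectors span a space of the same dimension as C(A).
    print(Matrix.hstack(*col_basis).rank() == A.rank())   # True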
@onatgirit4798 3 years ago
If all YouTube content were deleted today, the most upsetting thing for me would probably be losing this series of lessons.
@Afnimation 11 years ago
It's interesting that he constantly remarks on the fact that he presents things without proving them, but in fact I think he explains things so clearly and understandably that he doesn't need to prove them, because we can see them almost axiomatically.
@robertcarhart4168 11 months ago
Strang proves things without you even realizing that you've just experienced a 'proof.' He makes it very conversational and intuitive.
@yourroyalhighness7662 2 months ago
My, I feel so... dense. What a sense of humor this brilliant man must have to have penned a book entitled "Linear Algebra for Everyone". Sir, I can't even subtract!
@aymensekhri2133 4 years ago
Thank you Prof. Strang
@marverickbin 5 years ago
Vector spaces of matrices! Mind blown!
@gavilanch 15 years ago
So? This can mean a lot of things, and one of them is that they couldn't tape this class and Strang had to repeat it in front of the cameras, and they didn't pay some people to just sit right there so that people like you would stop commenting on that fact. Great classes. I do not speak English as a native language, but this is certainly awesome, and I really appreciate it. So many thanks to MIT and Professor Strang!!
@chuckhei 3 years ago
I really don't know what to say..... Satisfying? Grateful? OMG I just love it!!!!
@jenniferlai8752 11 years ago
Great lectures on linear algebra!
@LAnonHubbard 11 years ago
Great video. Thanks Prof. Strang.
@markymark443 8 years ago
Lol, funny, I'm first watching this today and it was posted exactly 7 years ago xD. Thanks for the video, really helpful! I was struggling with this concept for my current Linear Algebra 2 course since I took the non-specialist version of Linear Algebra 1, which didn't really test us on proofs at all. I think I have a better understanding of the four fundamental subspaces now! :)
@Ritam_404 9 months ago
It's 7 years now!!
@guptaji_uvach 14 years ago
Thanks Dr. Strang
@ChandanKumar-ct7du 5 years ago
Thank you, Prof. Strang...
@JohnPaul-di3ph 3 years ago
My mind got blown when I realized you could get the basis for the left null space from row transformation. I mean, it seems completely obvious after he points it out but I never thought much of it until then.
@AkshayGundeti 11 years ago
Thanks a lot, Mr. Strang and MIT
@dariopl8664 11 months ago
Min 18:50, if it's helpful for anybody: the dimension of the nullspace is the same as the number of basis vectors that span the nullspace. Just as the dimension of the column space (the rank) is the number of linearly independent columns (i.e. vectors within the matrix), the dimension of the nullspace is the number of linearly independent vectors that span it, i.e. the number of basis vectors, one special solution per free variable.
@xiemins 4 years ago
May I say that the rows of R span the same space as the rows of A after row operations because you can do the reverse ROW operations and reconstruct the rows of A from R? The same can't be true for the column space, because after row operations you generally can't reverse things and reconstruct the original columns of A from R through COLUMN combinations.
@fuahuahuatime5196 10 years ago
25:06 So performing row eliminations doesn't change the row space but changes the column space? So to get a basis for the column space, would you have to do column elimination on matrix [A]? Or could you take the transpose, do row elimination, and just use that row basis for [A] transpose as the column basis for [A]?
@readap427 8 years ago
+Pablo P That's what I was thinking as I watched that part of the video. It seems that approach would work. Before this lecture, it's the approach I probably would have used, but now that I see the tie-in to pseudo-Gauss-Jordan, I think I prefer pseudo-Gauss-Jordan.
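The two halves of that question can be checked side by side. A minimal SymPy sketch with an invented matrix: elimination leaves the row space alone but does change the column space.

    from sympy import Matrix

    # Invented matrix whose last row is the sum of the first two,
    # so elimination produces a zero row in R.
    A = Matrix([[1, 2, 1],
                [0, 1, 1],
                [1, 3, 2]])
    R, _ = A.rref()

    # Row space unchanged: stacking A on top of R adds no new row directions.
    print(Matrix.vstack(A, R).rank() == A.rank())                   # True

    # Column space changed: (1, 1, 2)^T is a column of A, but every column
    # of R ends in 0, so that vector cannot lie in C(R).
    print(Matrix.hstack(R, Matrix([1, 1, 2])).rank() == R.rank())   # False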
@miladaghajohari2308 3 years ago
I love these lectures
@Saket-op2xp 6 months ago
26:15 Here, can we also take the first r rows of A as a basis, if our elimination doesn't involve any row exchanges?
@brogcooper25 12 years ago
He is not only a master lecturer, he is a master of writing on a chalkboard. I swear, it looks like he is using a paint pen.
@thejasonchu 8 years ago
Thanks, Prof and MIT
@fanggladys9986 a year ago
He is lecturing to an empty classroom, if you look at 40:53!! Even more of a wonder!
@yiyu9519 3 years ago
love this lecture
@christophercrawford2883 6 months ago
Nice lecture. I would like to have seen that N(A) and C(A^T) are independent (or even orthogonal!)
@marcuschiu8615 4 years ago
This is mind-blowing. I don't fully understand it, but I know it's mind-blowing.
@middlevoids 10 months ago
Just beautiful
@himanchalsingh1135 5 years ago
Can anyone explain why "the length of a linearly independent list ≤ the length of a spanning list"? TY in advance.
@durgeshmishra9449 8 months ago
@ 29:32 the prof said that the bases are the same, but that is not correct, right? The row spaces are the same, but A and R have different sets of basis vectors?
@flowewritharoma 12 years ago
Great lecture
@BVaibhav-mt8jx 3 years ago
He is so damn good at explaining! I love him!!!!!!!!!!!
@RomiiLeeh 10 years ago
Thank you for sharing this video, Prof. Strang!!! Very helpful! :D
@m1994m1 10 years ago
Thank you so much, Prof. Greetings from Jordan ^_^
@ozzyfromspace 4 years ago
*Question:* what is the relationship between rank(A) and rank(A^T)? Does rank(A) = rank(A^T) in general? The professor seems to be hinting at this, but rref(A) only preserves the column space, so it doesn’t seem so trivial to me. Any insight is highly appreciated. Edit: I found the answer. rank(A) = rank(A^T) by virtue of the fact that linear independence of the columns implies linear independence of the rows, even for non-square matrices. I proved this for myself this evening. The main idea for the proof (at least how I did it) is that if you have two linearly dependent rows, one above the other say, row reduction kills the lower one (reduces number of possibly independent rows). Killing off the row (making the row all zeros) also makes it so that the given row can’t have a pivot. Thus, we’ve reduced the number of potential pivot columns by one. That’s the relationship in a nutshell. The math is only slightly more involved
@ostrodmit 2 years ago
rref(A) does not preserve the column space, only the null and row spaces. It does preserve the dim(Row(A)) however, which suffices to prove that the row and column ranks are equal.
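The equality rank(A) = rank(A^T) is also easy to spot-check numerically (a spot-check, not a proof). A minimal sketch assuming NumPy:

    import numpy as np

    rng = np.random.default_rng(0)

    # Try a few random rectangular matrices, with a dependent row forced in
    # so the rank is not always full.
    for _ in range(5):
        m, n = rng.integers(2, 7, size=2)
        A = rng.integers(-3, 4, size=(m, n)).astype(float)
        A[-1] = A[0] + A[1]
        assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)

    print("rank(A) == rank(A^T) held for all samples")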
@magdaamiridi7090 6 years ago
Hello! Does anybody know any other lecturers like Dr. Strang with such passion in fields like convex optimization, detection and estimation, or probability theory?
@q44444q 4 years ago
Look up lectures by Stephen Boyd. "Stanford Engineering Everywhere" is like Stanford's version of OCW and has some great courses in convex optimization: EE263 and EE364A. They aren't quite as good as Strang's lectures, but he's hard to beat!
@nonconsensualopinion 3 years ago
John N. Tsitsiklis has great probability lectures from MIT OpenCourseWare here on YouTube. Highly recommended.
@Afnimation 11 years ago
I don't know if I made my point, but if you see the other lectures you will understand better... in fact I just realized I made a grammar mistake at the end... next time I'll review before posting anything.
@saadsaad77869 a year ago
How is the intersection of the upper triangular matrices and the symmetric matrices equal to the diagonal matrices?
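A short answer: if S is both symmetric and upper triangular, every entry below the diagonal is zero (upper triangular), and by symmetry every entry above the diagonal equals one of those zero entries, so only the diagonal can survive. A minimal SymPy sketch of the same argument, using an invented symbolic 3x3 example:

    from sympy import Matrix, symbols, solve

    # A general symmetric 3x3 matrix...
    a, b, c, d, e, f = symbols('a b c d e f')
    S = Matrix([[a, b, c],
                [b, d, e],
                [c, e, f]])

    # ...forced to also be upper triangular: entries below the diagonal vanish.
    below_diagonal = [S[i, j] for i in range(3) for j in range(3) if i > j]
    sol = solve(below_diagonal, [b, c, e], dict=True)
    print(sol)              # [{b: 0, c: 0, e: 0}]
    print(S.subs(sol[0]))   # diagonal matrix diag(a, d, f)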
@phil97n a year ago
Great lecture, thank you.
@gustav87 14 years ago
This is so helpful, thanks a lot!
@alsah-him1571 4 years ago
9:45 Professor Strang subtly integrates class consciousness into his lecture on the Four Fundamental Subspaces. Truly a genius.
@bokumo7063 2 years ago
Last hired, first fired?
@bitstsunami9520 4 years ago
Suppose I'm in 3D: if the nullspace is a plane, can we not simply write the nullspace as the equation of that plane, with every (x, y, z) on it being a possible value that gives b of zeroes? Similarly if the column space is a plane, and likewise for the row space and the nullspace of A^T? P.S. I do understand we can't write any of the four subspaces as a line equation in 3D, because there is no single equation of a line in 3D; there is only the equation of a plane.
@timelordyunt7696 5 years ago
Take another look at the playlist... for the first time I feel glad at how many are left unwatched.
@yojansh 4 years ago
Just when I thought he had run out of blackboard to write on, he moves to the right and, lo and behold, there are more of them.
@encheng1136 8 years ago
There are no students sitting there, but the lecture is still so good.
@scoringwolf a year ago
35:55 The size of the identity matrix should be n x n so that it's conformable, shouldn't it?
@imegatrone 12 years ago
I really like the video "The Four Fundamental Subspaces" from your...
@slowpoke7785 2 months ago
Prof. Strang said that C(A) != C(R). I'm wondering if this is true, because the basis for C(A) consists of the pivot columns, which we found via row operations...
@ermomusic 4 years ago
You could also argue that it isn't a basis because -1 times the first vector plus 2 times the second vector gives us the third vector... You really dropped the ball there, Professor G. Hahahaha, just kidding, this man is the best thing that ever happened to Linear Algebra right after Gauss.
@carlosraventosprieto2065 11 months ago
Thank you!
@AlexanderList 10 years ago
Class is crowded these days, no worries. Don't know why no one was attending back in 2005!
@omega7377 6 years ago
It was actually recorded in 2000, but uploaded to the web in Spring 2005. The dates written in the video titles are dates of upload, not dates of recording.
@abdelaziz2788 2 years ago
40:50 is the best plot twist, awesome
@sauravparajuli4988 4 years ago
The twist at the end was better than GOT's.
@kevintoner6068 3 years ago
How many blackboards do you want? Dr Strang: Yes
@ozzyfromspace 4 years ago
It's not just that she found a numerical error, it was the power of her reasoning about it. I'm shook; whoever that girl is, she's clearly brilliant.
@webstime1 3 years ago
He made that story up to drive home a point