I don't know what he's doing differently, but everything is easier when I listen to this guy. It might be the way he repeats certain phrases or makes everything less abstract. I'm currently struggling in my Linear Algebra class, so seeing this video and realizing that it isn't as hard as my current professor makes it seem is a huge relief. This guy is great. Thanks MIT.
@aibattileubaiuly78304 жыл бұрын
Did you pass your class with a good mark?
@noddlexxx91614 жыл бұрын
@@aibattileubaiuly7830 A+ boiiiiii
@youngjim79874 жыл бұрын
Damn, I share the exact same feeling, bro. My linear algebra prof thought the outside was way too noisy, so she always closed the classroom door. It got super stuffy in there, I felt like I was running out of O2, and that made me sleepy. Everything I actually learned, I learned from Mr. Gilbert.
@flashg32923 жыл бұрын
The difference, I think, is that this guy actually teaches linear algebra as part of a bigger picture. Most math profs just teach this stuff as a series of theorems that don't really have anything to do with each other, except that you need theorem t to answer questions a), b), c), etc. He's trying to explain linear algebra as a holistic system.
@vsshappy3 жыл бұрын
The only difference is his passion and love for the subject and his drive to impart the same to his students. Without that passion, love, and drive, it would just be lecturing the text with articulation, which might bring clarity but does not bind the audience.
@ozzyfromspace4 жыл бұрын
Fun fact, not mentioned in the video: If the matrix A is singular, then the process is losing information because two linearly independent vectors in the input space could be mapped to the same output vector, making them indistinguishable. When the professor discussed the matrix for ordinary differentiation of a polynomial function, notice that it was singular. This corresponds to the arbitrary constant of integration i.e. information will be lost, so you save it ahead of time via initial conditions so you can reconstruct the input vector. Linear algebra is amazing for this kind of insight.
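A minimal numpy sketch of that observation (the 2x3 matrix below is just the differentiation matrix for input basis 1, x, x² and output basis 1, x; the two polynomials are made up for illustration):

```python
import numpy as np

# Derivative map T(p) = p' with input basis (1, x, x^2) and output basis (1, x).
# Each column holds the output-basis coordinates of T applied to one input basis vector:
# T(1) = 0, T(x) = 1, T(x^2) = 2x.
D = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])

p = np.array([5.0, 3.0, 1.0])   # 5 + 3x + x^2
q = np.array([7.0, 3.0, 1.0])   # 7 + 3x + x^2, differs only by a constant

print(D @ p)                       # [3. 2.]  ->  3 + 2x
print(D @ q)                       # [3. 2.]  ->  same output: the constant term is lost
print(np.linalg.matrix_rank(D))    # 2 < 3, so the map cannot be inverted
```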
@frazulabrar93982 жыл бұрын
Cool observation!
@f_0_n11 жыл бұрын
Gilbert Strang, the name that all college students around the world know. He's a genius and a great teacher
@os_car.s27685 жыл бұрын
Don't talk bullshit, dude
@adrianho71654 жыл бұрын
Oscar Hernandez ??
@hongvi71704 жыл бұрын
Now, 06/02/2021, this is true. Hello, Afonso
@rosadovelascojosuedavid18943 жыл бұрын
@@os_car.s2768 Haha, do you dislike Gilbert Strang?
@abdulmukit44204 жыл бұрын
These lectures make me so regret that I was not fortunate enough to go to MIT and sit in this legend's classes.
@jorgemartinez989212 жыл бұрын
I'm doing a master's in Germany and we were supposed to go through all the lectures as an introduction to a course called Advanced Mathematics for Engineers. At the beginning I was quite reluctant to do that, but after the first lecture I couldn’t wait for the next one. Professor Strang is indeed a great teacher. Thank you MIT for sharing these splendid lectures!
@rosadovelascojosuedavid18943 жыл бұрын
Sorry, what was your master's in? I'm really eager to know :)
@alfredomulleretxeberria42392 жыл бұрын
@@rosadovelascojosuedavid1894 Master of baiting.
@victormbebe37973 жыл бұрын
"if you cannot explain what you learnt to a little child in the way that she/he can understand easily, you definitily didint understood what you a trying to explain". This Professor Understood deepily lenear algebra. Congrates and thanks
@webstime13 жыл бұрын
Are you saying you can show this video to a little child and they will understand it easily?
@Robocat7542 жыл бұрын
If you think you fully understood this lecture the first time you watched it, it probably means either you knew the subject very well beforehand or you don't understand it as deeply as you think. Please watch it again and think about why everything he said is true. Then you will have even more unanswered questions in your mind after watching this lecture.
@kingplunger14 ай бұрын
Well, that's a nice saying, but in the end it's not true for some topics.
@fakwater12 жыл бұрын
Wow what an amazing professor. If only every math professor was as clear as him! THANK YOU MIT!!
@omrastogi52584 жыл бұрын
Prof. Strang, you may not be a true physicist, but you are a true magician.
@fallensach3 жыл бұрын
He explains it so clearly, it's amazing. Instead of just throwing around definitions, he actually explains and gives examples of why and how it works, in a simplified manner.
@anonym4982 жыл бұрын
Is multivariable calculus a prerequisite to this course?
@benjaminnemelka64842 жыл бұрын
@@anonym498 no
@zionen0115 жыл бұрын
This guy is something else. I go to my Linear Algebra class and usually come out feeling more confused than when I went in; then I come watch these videos and things make sense in minutes. It's not just what he knows, it's that he knows how to explain it. Simply brilliant.
@neurolife775 жыл бұрын
I've been watching all the preceding videos, understanding most of what happens, but I always felt there was no link between what I saw and what I already knew. I could not integrate the new set of concepts into my current knowledge. It was quite puzzling. Now, with the first 18 minutes of this lecture, everything just clicked. It's like I was floating in space, having a hard time navigating, though I could still move from one point to another. Now I've got gravity. This is quite impressive.
@nielsota633 жыл бұрын
sounds like you found a basis
@_RajRanjanOjha6 ай бұрын
Gilbert Strang is one of the best teachers in the world. The keywords and key sentences he uses in his lectures make it simple to understand. Loved this. ❤❤
@divinusnobilite15 жыл бұрын
At BYU right now. Chem E. Studying this material for the past few days now to get ahead. This lecture is an excellent tool for taking all these concepts and placing them in a clear and simple context. I have greatly benefited from listening to this lecture. Thank you.
@carlosrayasanchez78842 жыл бұрын
12 years later, how is it going?
@CarneysDrums2 жыл бұрын
Pretty incredible to be sitting in my living room while learning from a world class professor. Thank you MIT for sharing these!
@evanmthw10 жыл бұрын
This professor has always been able to clear my confusion on these higher level math courses.
@astrophilip12 жыл бұрын
I switched from engineering to math/physics during college, and so was taught linear algebra with and without coordinates. One course, entirely with matrices. Another course almost entirely without matrices. What a beautiful topic.
@adamjahani44943 ай бұрын
This is how you teach. I wish professors would learn how to teach like this man. Gawd it makes sense now.
@sarvasvarora4 жыл бұрын
Feels like 3b1b's video was inspired by this lecture!
@georgesadler78303 жыл бұрын
These are very nice explanations of Linear Transformations and Their Matrices; thanks once again to the godfather of linear algebra, Dr. Gilbert Strang. I took introductory linear algebra at the University of Maryland Baltimore County in the late 1980s, and my professor was nothing close to what Dr. Strang is. This MIT legend is a rare find.
@billz28112 жыл бұрын
This guy is so great. My linear algebra teacher is horrible. Strang actually makes it kind of interesting!
@АлександрСницаренко-р4д3 жыл бұрын
If I had tattoos, I would make a tattoo with Gilbert Strang's name on it:) Amazing guy, amazing lecture, amazing MIT
@MoonLight110238 жыл бұрын
Thank you! I'm learning good stuff for free.
@freeeagle60742 жыл бұрын
I think the reason Professor Strang's courses are so popular lies in his personality. He is very considerate, so he tries every means to explain linear algebra concepts in a way that is easy to understand. It's the same in the workplace: nice people who take other people's feelings into account tend to provide popular products and services. Some professors offer hard-to-understand, unpopular courses because they don't care about how their students feel at all; they mainly care about whether they are promoted or receive more funding.
@konobel87057 жыл бұрын
This lecture is the greatest one ever
@elcarle9510 жыл бұрын
this guy is amazing, thank you from Spain!!
@dahl12115 жыл бұрын
I'm taking this subject at the moment and didn't understand anything about it (low self-confidence), but watching this lecture, I understand everything again.
@jovanjovancevic91511 жыл бұрын
A great video! I have to say, the T(0)=0 "proof" is a bit funny :)
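(For reference, the standard one-line argument: by linearity with the scalar c = 0, T(0) = T(0·v) = 0·T(v) = 0 for any vector v.)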
6 жыл бұрын
wtf this fella just said see ya next monday after thanksgiving and it is actually this weekend for me o.O
@salomiranimerugu96693 жыл бұрын
Very nice explanation. From now on, I am a big fan of his.
@ehorganv15 жыл бұрын
That is correct, it's not a linear transformation. For me, it helps to think of this in terms of similar triangles: T(2v) takes two times the original vector but only adds one times v0 to it. For the resultant vector to be proportional (i.e., =2T(v) ), one would have to add 2v0 to T(2v).
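A quick numerical check of that point, with an arbitrary v0 and v chosen for illustration: the shift map T(v) = v + v0 fails the scaling test.

```python
import numpy as np

v0 = np.array([1.0, 2.0])     # fixed shift (arbitrary choice)
T = lambda v: v + v0          # the shift map discussed above

v = np.array([3.0, 4.0])
print(T(2 * v))               # [ 7. 10.]  -> v0 is added once
print(2 * T(v))               # [ 8. 12.]  -> v0 is added twice; not equal, so T is not linear
```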
@666mafiamongers12 жыл бұрын
Thanks to MIT for explaining that one. I can think of an explanation for the origin of that phrase. You see, the Summer gets "tricked" into thinking it's gonna last. But alas! The Winter inevitably comes and the few days of its warm reign are over. And just like this short-lived summer period, the Indians got tricked by their conquerors too. Thus the term, "Indian Summer". And Thank You MIT for these wonderful videos. Keep up the great work!! Great Profs teaching in a great way!
@arrangement13 жыл бұрын
@valtih1978 I'm not exactly sure what portion of the lecture you're referring to, but if you're talking about the later 3/4, I think he means that A imparts transforms upon the vectors that are expressed with the basis {v1, v2,...,vn} and transforms them into a new basis {w1, w2,..wn} where T(v1), T(v2), etc. are all functions used to express the original bases {v1, v2, etc.} "in terms of" {w1, w2, etc.} So the system of equations is just a generalization of that concept.
@SahilZen42 Жыл бұрын
The choice of basis is important for a linear transformation. It actually becomes quite easy when we use a convenient basis, just as we can represent things in polar as well as Cartesian coordinates. We humans are breaking things and making things.
@dennisyangji15 жыл бұрын
worth spending time to see this wonderful video
@mehmetaliozer24034 жыл бұрын
this lecture made my night before the midterm
@Tyokok Жыл бұрын
There are essential key points behind the example at 36:59 that connect to eigenvalues/eigenvectors, but they are not well explained; I wish the professor had expanded on them right there.
@vishakp8913 жыл бұрын
Professor, really respect you ! Thank you and God bless :)
@Enlightenchannel3 жыл бұрын
Wow, yea this guy is a great teacher!
@zhangjerry27314 жыл бұрын
He gives a lot of examples, very helpful, and still useful now
@juhxmn214 жыл бұрын
Professor Strang rules!
@MirageScience13 жыл бұрын
first thought: that is some good ass chalk.
@aryankumarprasad15743 жыл бұрын
Same
@avnishpanwar95023 жыл бұрын
The famous MIT chalk
@arbiter7713 жыл бұрын
finally, Zodiark EX Prog
@volkerblock5 жыл бұрын
Ahh! Suddenly everything is easy to understand, there is great enlightenment. In the previous lessons I sometimes had difficulties and now the dark forest is thinning out.
@blakehouldsworth99184 жыл бұрын
I go to Texas Tech University, and amid COVID-19 I have been having a hard time with linear algebra, so this was a nice relief. My question: I completely get transforming the basis of numbers, and I want to know if this applies to unit conversion with physical properties, like doing stoichiometry to convert between units (same plane), or the relationships we get from integration and differentiation, like mass and force. Thank you for this resource, it was incredibly helpful.
@didyoustealmyfood87293 жыл бұрын
this guy is a legend
@rasikajayathilaka35164 жыл бұрын
Thank you! Everything makes sense now!
@Ghostly12345678911 жыл бұрын
who is this magnificent instructor?! He puts all of my math professors to shame!
@5231019976 жыл бұрын
The fact that the derivative is linear blew my mind
@hibbibibbi2 жыл бұрын
bro helped me beat Zodiark. thanks man
@imbsalstha12 жыл бұрын
a teacher u really like 2 have
@김유진-m3o3h4 жыл бұрын
44:50: Why is a11, a21, ..., am1 the first column of matrix A? Isn't this equation right? >> T(v1) = A v1 = a11 w1 + a21 w2 + ... + am1 wm
@ozzyfromspace4 жыл бұрын
In general, A = [column_1 column_2 ... column_n] and the input coordinate vector is v = [v1 v2 ... vn]^T. So the output vector *w = A * v = (v1 * column_1) + (v2 * column_2) + ... + (vn * column_n)*, i.e. an n-term linear combination of m-dimensional vectors. Now imagine v1 = 1 and v2 through vn being zero. Then column 1 is the vector you get when you pass in a unit vector according to your input basis. The same works for v2 = 1 while everything else is zero, and so on. Intuition: say you have a 2x3 rectangular matrix A and a 3x1 input vector v. The shape of the output vector w is 2x1. Looking back at A, it is composed of 3 columns, and each column is 2x1, which matches the 2x1 output. What's going on? The output vector is a linear combination of the columns of A. This is always true. If you have any other questions or want me to clarify something, let me know ------------ Fun fact, not mentioned in the video: If the matrix A is singular, then the process loses information, because two linearly independent vectors in the input space can be mapped to the same output vector, making them indistinguishable. When the professor discussed the matrix for ordinary differentiation of a polynomial function, notice that it was singular. This corresponds to the arbitrary constant of integration, i.e. information will be lost, so you save it ahead of time via initial conditions so you can reconstruct the input vector. Linear algebra is amazing for this kind of insight. Best wishes, friend
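A minimal numpy sketch of the two facts above, using an arbitrary 2x3 matrix (the numbers are made up for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])      # arbitrary 2x3 example
v = np.array([7.0, 8.0, 9.0])

# A @ v equals v1*column_1 + v2*column_2 + v3*column_3
combo = v[0] * A[:, 0] + v[1] * A[:, 1] + v[2] * A[:, 2]
print(np.allclose(A @ v, combo))     # True

# A "unit" coordinate vector picks out the corresponding column
e2 = np.array([0.0, 1.0, 0.0])
print(A @ e2)                        # [2. 5.]  -> the second column of A
```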
@benjaminlin79184 жыл бұрын
@@ozzyfromspace But what happens when the input basis isn't one element equal to 1 and the others 0 (the standard basis)?
@vidhankashyap7663 Жыл бұрын
@@benjaminlin7918 As in the comment above, A = [column_1 column_2 ... column_n]. Take for example (v1 v2 ... vn) as my standard basis vectors ([1 0 0 ...]^T, [0 1 0 0 ...]^T, ...). Any coordinate vector (c1, c2, ... cn) represents the corresponding vector v = c1*v1 + c2*v2 + ... Taking the c's as my coordinates in the input basis, I want to change them to the new coordinate system (the output basis). I am choosing eigenvectors as the output basis (w1, w2, w3 ... wm) because that is usually the basis of interest, but it can be anything (even exactly the input basis). So I want my new coordinates (d1, d2, d3, ... dm), which is the same as writing the vector as a linear combination of the eigenvectors with the d's as the corresponding coefficients: d = A * c = (c1 * column_1) + (c2 * column_2) + ... + (cn * column_n). Since we are doing a linear transformation, we must know what the new coordinates look like if we transform just v1, just v2, and so on. This is equivalent to saying we know what (1,0,0,0,...), (0,1,0,0,...), and so on become under the linear transformation, expressed in the output basis. Thus column_1 = the coordinates obtained when transforming v1, i.e. (1,0,0,...), into the output basis; we can write this transformed vector as a linear combination of the eigenvectors (my output basis) with coefficients given by column_1 (my new coordinates). Similarly, we know what the remaining columns of A represent, so we know the matrix completely. If you are with me up to this point, then for any arbitrary c's we can get the new d's in the output basis, as A*c = d. Your question: what happens when the input basis isn't one element equal to 1 and the others 0 (the standard basis)? If I know my input and output bases and what (1,0,0,...), (0,1,0,...), and so on look like in the new coordinates (output basis), I can just write those coordinates as the columns of my matrix A, and I can then find the transformation for any arbitrary c's. The whole point of a linear transformation is to find A such that we get exactly what you asked for. This should answer your question.
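A short numpy sketch of this construction, with a made-up example: T is projection onto the 45-degree line in R², the input basis is the standard basis, and the output basis is the projection's eigenvectors; each column of A holds the output-basis coordinates of T applied to one input basis vector.

```python
import numpy as np

# T = projection onto the 45-degree line in R^2 (made-up example)
P = np.array([[0.5, 0.5],
              [0.5, 0.5]])

# Input basis: the standard basis e1, e2.
# Output basis: the eigenvectors of P (the "basis of interest" mentioned above).
W = np.column_stack([np.array([1.0, 1.0]) / np.sqrt(2),    # eigenvalue 1 direction
                     np.array([1.0, -1.0]) / np.sqrt(2)])  # eigenvalue 0 direction

# Column i of A = output-basis coordinates of T(e_i), i.e. solve W @ A = P for A.
A = np.linalg.solve(W, P)
print(np.round(A, 3))     # approximately [[0.707, 0.707], [0, 0]]

# For any input coordinates c, d = A @ c are the output-basis coordinates of T(c):
c = np.array([3.0, 1.0])
d = A @ c
print(np.allclose(P @ c, W @ d))   # True
```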
@michaelashcroft602812 жыл бұрын
An 'Indian summer' is used in the UK to describe a summer that lasts later than usual.
@hektor67665 жыл бұрын
In the states, it's the warm spell after the first frost.
@chummyigbo88449 жыл бұрын
Thank You Professor!!!
@ДанилЕлфимов-ш8г4 жыл бұрын
I hope that'll help me pass my linear algebra exam
@89HouseMusic13 жыл бұрын
Damn it, why aren't there teachers who try to explain mathematics in a simple way... Why do you think there are so many failing students in Mexico? This Dr. Strang truly knows a lot! Thanks for making these videos!! Greetings from Mexico!
@rosadovelascojosuedavid18943 жыл бұрын
:0 That's right. Gilbert Strang is making an impact in Mexico even 9 years after your comment. And this truly helps me a lot.
@3andhalfpac12 жыл бұрын
It's because he leaves all the written-out proofs to the book and gets to the point, explaining what he's doing, whereas most teachers explain something by writing down a proof that no one really understands.
@vamshikrishnam92064 жыл бұрын
Thanks, sir, you improved my math knowledge
@coal27107 жыл бұрын
32:24 HE WORKS
@yakunli91118 жыл бұрын
I am not clear about Prof Strang's explanation at 37:22. He said we choose the eigenvector basis [0 1]^T and [1 0]^T, but this is the standard basis in R^2, right? Does he mean that if the input and output basis are the eigenvector basis, then the transformation matrix A is just Λ? Thanks.
@davidlovell7297 жыл бұрын
That's exactly what it means. If x_i is an eigenvector, and lambda_i the associated eigenvalue, then Ax_i = lambda_i * x_i. Notice the thing on the right only includes x_i; it doesn't need any contribution from any of the other eigenvectors (basis vectors). Thus, when you build the matrix A by putting into its columns what you want it to do to the basis vectors, when the basis vectors are independent eigenvectors, you will get the eigenvalue matrix as a result.
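A quick numpy sketch of that statement, using an arbitrary symmetric matrix as the example: expressing the transformation in its own eigenvector basis gives the diagonal eigenvalue matrix.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # arbitrary example; eigenvalues 3 and 1

eigvals, S = np.linalg.eig(A)         # columns of S are the eigenvectors

# The same transformation expressed in the eigenvector basis (input and output)
# is S^{-1} A S, which comes out diagonal with the eigenvalues on the diagonal.
Lambda = np.linalg.inv(S) @ A @ S
print(np.round(Lambda, 10))           # diagonal matrix
print(eigvals)                        # matches the diagonal of Lambda
```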
@rodolfonavarro36456 жыл бұрын
This helped so much!
@the_basement_files10 жыл бұрын
What an amazing man
@alijoueizadeh84776 жыл бұрын
Thank you.
@pablo_CFO4 жыл бұрын
This gives an easy geometric interpretation of why singular matrices have no inverse. These matrices take n-dimensional vectors as input and the output is effectively a vector in a space of smaller dimension, and different input vectors can produce the same output, so you can't have an inverse transformation because you don't know which of the original vectors corresponds to the vector in the smaller-dimensional space. More importantly, that's why you need an initial condition with integrals to recover the original function: the derivative performs this same process, and from the inverse (the integral) alone you can't tell which was the original function.
@elyepes193 жыл бұрын
Sorry, but I beg to differ: in a linear transformation an input coordinate can have an output of smaller or larger dimension, as long as the latter is not infinite, and I'd say that's the geometric interpretation of a matrix not being invertible: when the output coordinates happen to be infinite.
@raghav9o92 жыл бұрын
"Every linear transformation is associated with a matrix":- Gilbert Strang.
@aku75983 жыл бұрын
I think of it as just plotting a vector on new axes with reference to the given axes; the new axes are not necessarily orthogonal to each other.
@UberMarauder14 жыл бұрын
@RingWarrior12 maybe its for you, it helped me a lot!
@patrikisaksson782311 жыл бұрын
wohooo, all makes sense now :D
@Peter_198610 жыл бұрын
I love these kinds of playlists, now I can learn Linear Algebra from KZbin videos and chill. xD
@ArabicLearning-MahmoudGa3far3 жыл бұрын
God bless you!
@janitarjanitar14 жыл бұрын
Remember on Looney Tunes when they would run in place before taking off? That's how this guy is. He just seems so excited his mouth wants to blurt it out, but his mind is a fraction behind... Whose idea was it to include the stutters in the subtitles?
@АлександрСницаренко-р4д3 жыл бұрын
Sorry for my humble input, but this lecture on linear transformation should have been at the beginning of the course.
@АлександрСницаренко-р4д3 жыл бұрын
Once I finished watching the lecture, I understood that it is right where it must be! Brilliant
@НикитаЮрченко-э3ь5 жыл бұрын
It's easier to understand than in my native language :)
@hmodywakid15 жыл бұрын
Well, I studied this at the Technion Israel Institute of Technology; it's an easy subject.
@valtih197814 жыл бұрын
A brilliant series of lectures. However, this seems to be an unclear one. I failed to understand the derivation of A. What does the system of equations do? Can you explain what Gilbert wanted to say? Wikipedia, meanwhile, does it trivially: A = [T(v1) T(v2) ... T(vn)].
@Robocat7542 жыл бұрын
The input basis v1 to vn is the same as the output basis w1 to wm. Professor Gilbert Strang didn't mention this when talking about how to find matrix A. You can't construct a transformation matrix A without a basis. The transformation matrix A transforms the basis v1 ... vn to T(v1) ... T(vn), and each T(vi) corresponds to column i of matrix A. Since the output basis is the same as the input basis, each T(vi) can be written as a linear combination of the basis w1 to wm, where the coefficients are the entries of column i of A. Gilbert Strang's course alone cannot help you fully understand the subject; checking other sources like Khan Academy might help. This lecture and the next one on change of basis may leave you with more unanswered questions after you watch them.
@zack_1207 ай бұрын
37:11- what's perpendicular here?
@padhai_karle_bhai2 жыл бұрын
beautifully taught!
@rishavdhariwal4782 Жыл бұрын
I have a point of confusion, maybe someone can resolve it: how can the professor say with conviction, at around 42:14, that the 1st column of A tells us what happens to the first basis vector?
@MRsegolego14 жыл бұрын
Still don't understand; however, I feel that if I do some examples I'll get there. Great lecture regardless.
@srinikethvelivela98774 жыл бұрын
Do you understand now ?
@energyboat46824 жыл бұрын
@@srinikethvelivela9877 I'd hope so, 10 years later
@billwong20395 жыл бұрын
40:01 how did he get the projection matrix? what is the a??
@balajikalva1885 жыл бұрын
a is a vector along the line through the origin at 45 degrees. Since it's 45 degrees, both the x and y coordinates have to be the same, giving the vector a = [1,1] (transposed). He took it as a unit vector with each entry equal to 1/sqrt(2). So performing the operation a*aᵀ gives you the projection matrix. I hope that helps :)
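A short numpy check of that computation: build P = a·aᵀ from the unit vector a = [1, 1]/sqrt(2) and verify that it leaves vectors on the line unchanged and kills the perpendicular direction.

```python
import numpy as np

a = np.array([1.0, 1.0]) / np.sqrt(2)   # unit vector along the 45-degree line
P = np.outer(a, a)                       # projection matrix a a^T
print(P)                                 # [[0.5, 0.5], [0.5, 0.5]]

print(P @ np.array([1.0, 1.0]))    # [1. 1.]  -> vectors on the line are unchanged
print(P @ np.array([1.0, -1.0]))   # [0. 0.]  -> the perpendicular basis vector is "killed"
```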
@aviralrai68034 ай бұрын
He is a legend
@UberMarauder14 жыл бұрын
amazing!!!
@jamesdiangson956810 жыл бұрын
God bless you.
@peter_castle11 жыл бұрын
His name is Gilbert Strang, mathematician.
@plazzmator4 жыл бұрын
magician
@Abss09669 жыл бұрын
35:00 Why does it kill the basis vector v2? Is it because it transforms it?
@BongboBongbong9 жыл бұрын
+Ahmed Jan If you project the basis vector v_2 onto the line (which goes through the origin), you will get the zero vector. In other words: the length of this vector v_2 in the direction of the line is zero.
@yryj10008 жыл бұрын
v2 is perpendicular to the line it is projected onto, so the projection kills v2, which has no component along the line.
@Abss09668 жыл бұрын
Thanks guys
@landsgevaer13 жыл бұрын
This lecture could just as well have appeared in the beginning, as the very first one, to introduce the meaning of a matrix (except for the part about using eigenvectors as a basis, of course).
@volkerblock5 жыл бұрын
I just noticed that as well
@geverayala8978 Жыл бұрын
What a great video!!
@mayanksinghshishodia Жыл бұрын
Godly explanation
@BlahBlooBlee42058 жыл бұрын
wow it actually makes sense now :D
@renesax61038 жыл бұрын
Why do physicists avoid using/thinking in terms of bases? Why are they unwilling to bring them in?
@elyepes193 жыл бұрын
Because in relativity the point is that the laws of physics are the same, independent of any choice of coordinate system.
@kingplunger14 ай бұрын
You lose generality and introduce artifacts that have nothing to do with what you describe, but are just there based on your choice of coordinates. look at tensors for a general approach
@icono__71363 жыл бұрын
I'm struggling so much with the paradigm of linear transformations. I can work with matrices just fine, even "see" the concepts, but throw in the T(x) whateverthefuck and I can feel myself melting into tears. Hoping this will be the click for me!
@alullabyofpain8 жыл бұрын
Is w1 = T(v1), w2 = T(v2), ...? Because the transformation's output doesn't always have to be a basis for the space; it only happens to be one when the transformation is an isomorphism between the two spaces.
@davidlovell7297 жыл бұрын
The answer to your first question is no. You can choose the basis for the input space (v1,...), the basis for the output space (w1,...), and the particular transformation T independently. What is required to form the columns of the matrix A is to ask: if input basis vector v1 were expressed in its own coordinates, what would its image under T look like, expressed in coordinates of the output basis? That is column 1 of A. Repeat for v2, ... If the input basis and the output basis span the same space, then you can do something similar to what you are describing. Construct a matrix that has, as its columns, the w basis vectors, but written in their v basis coordinates. This matrix is the change-of-basis transformation from the w basis to the v basis. When applied to a vector of coordinates in the w basis, it gives you back the same vector, except expressed in coordinates of the v basis. This matrix is invertible, and its inverse is the change-of-basis transformation in the other direction. When the input and output bases span different spaces, then you can still construct this matrix, but it is not square, and hence not invertible.
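A small numpy sketch of the change-of-basis construction described above, with made-up numbers: the columns of M are the w basis vectors written in v coordinates, so M converts w-coordinates to v-coordinates and its inverse goes back.

```python
import numpy as np

# Let the v basis be the standard basis of R^2 and pick an arbitrary w basis.
w1 = np.array([1.0, 1.0])
w2 = np.array([1.0, -1.0])
M = np.column_stack([w1, w2])     # w basis vectors written in v coordinates

cw = np.array([2.0, 3.0])         # a vector given as 2*w1 + 3*w2
cv = M @ cw                       # the same vector in v (standard) coordinates
print(cv)                         # [ 5. -1.]

print(np.linalg.inv(M) @ cv)      # [2. 3.]  -> back to w coordinates
```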
@roronoa_d_law10757 жыл бұрын
Matrices of linear transformations: isn't this part supposed to be in the first lectures?
@ILOVEMINTBREEZE13 жыл бұрын
aww he is so cute
@Abss09669 жыл бұрын
48:06 How does he know that the transformation matrix A must be a 2x3 matrix?
@BongboBongbong9 жыл бұрын
+Ahmed Jan The linear transformation he is describing goes from vector space V to vector space W. A basis for V is the set (1, x, x²) and a basis for W is the set (1, x). In order to transform elements of V into elements of W, you are going to need a 2x3 matrix (it could be helpful to have a look at the rules for matrix multiplication).
@Abss09669 жыл бұрын
thanks for the help !
@leonardosoto56696 жыл бұрын
At 43:15, how does he know that the constants are a11, etc.? Can someone explain it to me? I know this works, but I can't understand why it works.
@pa15i6 жыл бұрын
I also have doubts about this example.
@elyepes193 жыл бұрын
Those are the entries of the matrix A, written in row-column format: a1j, a2j, ..., amj are, by definition, the coefficients you get when you expand T(vj) in the output basis w1, ..., wm, and they form column j of A.
@quirkyquester4 жыл бұрын
thank youuuuu!!!
@azaz868azaz511 ай бұрын
29:10 I don't know why, but this idea strikes me as genius.
@coding992 жыл бұрын
43:00 Climax
@shivanshkhare87562 жыл бұрын
At 39:45, can anyone please explain how we got that matrix?
@mind-blowing_tumbleweed Жыл бұрын
Check out the lectures about orthogonality, Gram Schmidt, Projections
@VangelisPs15 жыл бұрын
I saw example 2, which shifts (moves?) the whole plane. Does this mean that translation is not a linear transformation? If not, what is it? I thought it was a linear one...
@davidlovell7297 жыл бұрын
It is affine, not linear.
@rathnam78182 жыл бұрын
It is a non-linear transformation: the zero vector as input gives a non-zero vector after shifting, which violates the linearity property, since we need the zero vector as output.