I have so much respect for you for doing this, Chris, especially after I learned that this is your side hobby. I finished the Beginners course twice (the second time at 1.5x speed :)) and now I'm binge-watching this. Your level of empathy towards the listeners/students is exceptional. I feel very grateful that you never assume things are obvious/trivial, or that everybody just remembers everything you've said so far. This makes you a great teacher, please keep going!
@jaeimp 6 years ago
Outstanding material. Probably the most accessible and complete lectures on the topic. The colored LaTeX is priceless, and helps a lot in understanding at a glance what would otherwise be unpalatable chain rules.
@sterlingveil 3 years ago
Seriously, it's incredible how the simple introduction of color makes the algebra so much more digestible.
@nikosalexandrakis9489 4 years ago
Among the most brilliant introductions I have ever followed during my teaching career.
@brunojacobs3884 6 years ago
Crystal clear explanation of how basis vectors transform oppositely to their components (contravariantly). The worked-out example consolidates the knowledge very well. Thank you very much! At the end of the video, the motivation for wanting to get rid of the position vector R is very intuitive.
@auvski5903 4 years ago
I came up with an analogy for covariant/contravariant that I like. For Cartesian basis vectors, mentally substitute the dictionary of the English language, and for polar basis vectors, substitute the dictionary of some other language (say, Russian). A vector is a thought, which is invariant under choice of language, but gets expressed as a sentence in a given language (i.e. choice of coordinate system). Now, the forward transformation corresponds to building Russian words out of English words, and the backward transformation corresponds to building English words out of Russian words. Notice that if you want to translate an English sentence into Russian, you actually need the backward transformation (i.e. "look up each English word in a Russian dictionary"), and if you want to translate a Russian sentence into English, you actually need the forward transformation ("look up each Russian word in an English dictionary"). So English vector components are contravariant and Russian vector components are covariant.
@eubiw3666 4 months ago
I like your idea but I got confused trying to follow the analogy. Perhaps by "looking up an English dictionary" you mean a Russian person looking up English word definitions written in Russian? But my brain still wants to associate that as the forward direction.
@dr.ambiguous4913 6 years ago
Best videos on tensor calculus on youtube by far. Great job! These are very helpful.
@yayamostafa 5 years ago
Thank you very much for this. It's not just that you have a creative, intuitive way of explaining the subject; you also put time and effort into preparing the material in very pleasant graphical panels. Thank you.
@abnereliberganzahernandez6337 1 year ago
It's incredible to see this math evolving as it does. Tensor calculus is the result of a very refined mathematical theory. Linear algebra and calculus seemed like separate tools, yet they converge and form a more powerful one. So it is that relativity and quantum mechanics are the result of this math, and our understanding of the universe changed when Einstein saw the point of coordinate systems; Newton, because of the time of his discoveries, could only think of a flat space.
@strippins 3 months ago
Absolute pleasure to watch these videos. I massively struggled through my physics degree 20 years ago; if only content like this had existed then.
@pan19682 1 year ago
I have never seen such a clear presentation, thanks for your help. I am looking forward to you presenting a course on topology.
@eigenchris 1 year ago
Thanks for the compliment. But I don't think I'm ever going to do a course on topology. It's one of my least favourite areas of math.
@ΝίκοςΓιαννόπουλος-λ5θ 2 years ago
Thanks for your amazing work! When I first encountered the notion of using differentiation operators as basis vectors, I found it useful to think of the directional derivative. In multivariable calculus we use unit vectors (dot them with del) to get/define the directional derivative. In tensor calculus we go the other way, using directional derivatives to define the basis vectors.
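To make this comment's connection concrete, here is a small symbolic check (my own sketch with an arbitrary function and vector, not taken from the videos) that the directional derivative along v is just v dotted with the gradient, so a vector can equally well be treated as a differentiation operator:

```python
# Sketch (hypothetical f and v, not from the video): the directional
# derivative of f along v equals v . grad(f), so the vector v can be
# identified with the operator that maps f to v . grad(f).
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y                      # an arbitrary scalar field
v = sp.Matrix([3, -1])            # an arbitrary vector

grad_f = sp.Matrix([f.diff(x), f.diff(y)])  # (2*x*y, x**2)
D_v_f = v.dot(grad_f)             # directional derivative along v
print(sp.expand(D_v_f))           # equals 6*x*y - x**2
```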
@reinerwilhelms-tricarico344 6 months ago
Thanks for this. I'm going through this circus to fix my lifelong confusion about contravariant vs covariant, probably for the 10th time in my life (I'm 70 ;-). Also, it's nice of you to remind people of the usually handy rule to write "contravariant things" in column-vector form.
@jankriz9199 3 years ago
I cannot thank you enough for the series you are putting your time into.
@SteamPunkLV 3 years ago
I'm upset at myself for not finding these brilliant videos much earlier
@devvratjoshi9122 5 years ago
You explain amazingly, making things clear so simply.
@gowrissshanker9109 2 years ago
Hello sir, are the forward and backward transform matrices different from the Jacobian and inverse Jacobian matrices? Thank you
@eigenchris 2 years ago
They are the same thing. I invented the terms "forward" and "backward" transform for my earlier "Tensors for Beginners" series, but they are exactly the Jacobian and inverse Jacobian matrices.
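As a sanity check of this correspondence (my own sketch using the usual Cartesian/polar coordinate change; not code from the series), the Jacobian and inverse Jacobian can be computed symbolically and verified to be matrix inverses:

```python
# Sketch (not from the video): the forward transform (Jacobian) and
# backward transform (inverse Jacobian) for Cartesian <-> polar
# coordinates are matrix inverses of each other.
import sympy as sp

r, theta = sp.symbols('r theta', positive=True)
# Cartesian coordinates written in terms of polar coordinates
x = r * sp.cos(theta)
y = r * sp.sin(theta)

# Forward transform F: Jacobian of (x, y) with respect to (r, theta)
F = sp.Matrix([[sp.diff(x, r), sp.diff(x, theta)],
               [sp.diff(y, r), sp.diff(y, theta)]])

B = F.inv()                 # backward transform: the inverse Jacobian
print(sp.simplify(F * B))   # the 2x2 identity matrix
```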
@samuelmcdonagh1590 1 year ago
I was just wondering about the slide at 4:52. When using multivariable chain rule to express the field in terms of a given basis and then applying the Jacobian to transform the vector field into the new basis, superscripts are used rather than subscripts for the elements of the Jacobian. However, subscripts were used for the elements of the Jacobian at 9:10 of "Tensor Calculus 3: The Jacobian". Should superscripts have strictly been used in that video, or are both of these correct?
@eigenchris 1 year ago
When using coordinate variables, you should always use superscripts. So if I used subscripts in TC#3, that's an error.
@Shirsh 3 years ago
Why aren't you monetising this? I can only imagine how much effort this must take you!
@eigenchris 3 years ago
I do put a lot of work into these. I have a tip jar on ko-fi that I've made a good amount of money with. At the same time, I have a good job and a pretty comfortable life, so don't feel the need to monetize these too much. You can "pay it forward" and donate to a charity if you like.
@TheMotole 6 years ago
I watched all your videos about tensors in two days; your explanations are really intuitive. You make content on YouTube as good as 3blue1brown's. The only thing that could be better is the audio. I'm assuming that you use a denoiser; I'm not sure, but I hear some artifacts. It's a bit annoying that the audio is not as good as the video and explanations. I'm sure you will soon have lots of views thanks to the content quality!
@eigenchris 6 years ago
I do use a de-noiser... and I'm honestly not sure how to get better audio quality. I bought a Blue Snowball microphone, and this seems to be the quality it gives me. If you have any suggestions for improvements, I'd be happy to hear them.
@TheMotole 6 years ago
It depends on your setup. I could advise you to use less noise reduction, even if a little bit of noise remains, and you can use a noise gate, which will do a better job. The most important thing is how and where you record. Don't worry too much about it; you have decent audio!
@jaeimp 6 years ago
I haven't noticed any problems with the audio.
@john_paul_qiang_chen 6 years ago
Do you mean that a denoiser and a noise gate are different processes: a denoiser will harm the original human voice, but a noise gate is only designed to cut low frequencies?
@nilsfollmann2170 6 years ago
Thank you so much for these videos! Defining e_x as dR/dx is absolutely clear and logical, but omitting the "R" really stopped me in my tracks. I also find this strange when looking at dimensions, because basis vectors shouldn't have any, should they?! But without the "R", the basis vectors have the dimension of m^-1. I'm looking forward to further explanations and to understanding this new definition.
@eigenchris 6 years ago
I see why you feel that way. I might make an optional "helper" video to explain why I did this in more detail. The short answer is that vectors are things that can be added and scaled, and since partial derivative operators can be added and scaled (even if the "R" is not there), they can be considered vectors. As I said, the main motivation is to get rid of our need for an origin point (to deal with "intrinsic" geometry), but I'll have to talk about that more later.
@JoseAntonio-ml8yg 4 months ago
I have learnt so much.... thank you!
@harrisonbennett7122 3 years ago
Fantastic video! Very clear and compact
@amitjagtiani7708 3 years ago
Amazing video - been confused about how and why one would want to redefine vectors as derivatives for a while and this certainly clears things up. One question - How do we define vectors of different length at the same point in space?
@eigenchris 3 years ago
So, we need a curve to define tangent vectors on. This curve will have a path parameter (I use λ). If we use a new path parameter κ = 0.5*λ (or λ = 2κ) for the same path, we will travel along the path "twice as fast". This will make all the tangent vectors twice as long. You can see this using the chain rule: dR/dκ = dR/dλ * dλ/dκ = dR/dλ * 2.
@amitjagtiani7708 3 years ago
@@eigenchris understood. Attempting to learn GR and it's impossible without your videos so thank you.
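The chain-rule argument in that reply can be verified symbolically. This is a sketch assuming a unit-circle path R(λ) = (cos λ, sin λ), which is my own example rather than one from the video:

```python
# Sketch (assumed unit-circle path, not from the video): reparameterize
# the path by kappa = lambda/2 and check the tangent vector doubles.
import sympy as sp

lam, kap = sp.symbols('lambda kappa')
R_lam = sp.Matrix([sp.cos(lam), sp.sin(lam)])  # position R(lambda)
R_kap = R_lam.subs(lam, 2 * kap)               # same path, "twice as fast"

tangent_lam = R_lam.diff(lam)                  # dR/dlambda
tangent_kap = R_kap.diff(kap)                  # dR/dkappa
# Chain rule: dR/dkappa = dR/dlambda * dlambda/dkappa = dR/dlambda * 2
assert sp.simplify(tangent_kap - 2 * tangent_lam.subs(lam, 2 * kap)).is_zero_matrix
```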
@lilyjames8948 1 year ago
Excellent as usual
@ilyboc 4 years ago
I wish I watched this series before taking physics class
@nahblue 11 months ago
Are F and B tensors, or are they just matrices? The question is, are they themselves said to transform (using themselves? Hmm, this doesn't make much sense, or does it)?
@eigenchris 11 months ago
There's a distinction between "active" and "passive" transformations. With an active transformation, you take a vector and move it to a new place. This is done with a linear map, which is a tensor. For a passive transformation, you're just rewriting things in a new coordinate system. This can be done with a set of coefficients that can be put in a matrix (F and B), but I don't consider them tensors because they don't represent specific geometric objects. They're just coefficients you use to change coordinates. This is my view, anyway. The trouble is, any matrix can be taken as either an active transformation or a passive transformation depending on the context, so it's confusing.
@hansdampf4265 2 years ago
Great video. I do understand that the partial derivative of a position vector R with respect to a coordinate k (dR/dk) gives the basis vector ek; however, this is NOT equal to the partial derivative operator d/dk! (10:54 in the video)
@eigenchris 2 years ago
You might want to watch "Tensor Calculus 5.1", where I try to justify this better.
@syedzaheerabbas4691 6 years ago
Well done. Could you please suggest some books on these topics?
@eigenchris 6 years ago
I've read parts of the general relativity book Gravitation by Misner, Thorne, and Wheeler, but I don't really have any textbook recommendations other than that.
@syedzaheerabbas4691 6 years ago
eigenchris, thanks. Would you like to introduce yourself? I'm an assistant professor at hu.edu.pk, currently pursuing a PhD at BIT.EDU.CD; my research area is geometry, and your lectures are really supporting me. Keep it up.
@eigenchris 6 years ago
I'm glad you're finding these helpful. I did my undergrad degree in physics and my master's degree in computer vision. Right now I work as a programmer, but I have an interest in math and physics as a hobby.
@Sharikkhursheed 6 years ago
Your videos are amazing; this really clears up all the concepts. I want you to keep going until we are able to understand the mathematics of general relativity. Moreover, when making videos on manifolds, cover topics like Christoffel symbols, the Riemann tensor, the Ricci tensor, and so on. Thanks very much!
@zhangjin7179 6 years ago
I notice that you can put a polar basis (e_r, e_θ) at the origin, am I right? What does that mean?
@jkot1304 4 years ago
@5:30 Why in the lower-left box are you using contravariant notation for the polar and Cartesian partials? In T/C 3 @ 2:26 these equations are just partial derivatives, and in that video you use covariant notation for the c and p @ 8:50 when referring to the forward and backward matrices. I also see that the definition of the B matrix is different in T/C 3 @ 9:10 and in this video @ 6:26. Your notation doesn't seem consistent across these two videos. Can you explain the change? Was the earlier video a mistake, or is this video the one with the mistake?
@eigenchris 4 years ago
The short answer is that my notation in TC3 is sometimes inconsistent. You should trust the newer videos. Coordinates like c1, c2, p1, p2 should always have an "upper" index. (The 8:50 timestamp in TC3 is a typo; it should be upper-index.) At 5:30 in this video, the cj and pj are upper-index for the letters, but they are in the denominator, so they are actually covariant (lower-index). When it comes to derivatives, the thing that determines whether they are covariant/contravariant is whether the symbol is in the numerator or the denominator. (This is the reason my notation in TC3 is inconsistent... I was focusing on numerators/denominators.) But the correct thing to do is to write coordinate variables with an upper index.
@jkot1304 4 years ago
@@eigenchris Thank you for your response. I believe your later talks on covariant vectors are also inconsistent. You show covariant vectors with a superscript, but covariant things, per your definition, have subscripts. In T/C 6, 7, & 8, you superscript the epsilon basis vectors and call them covectors. Now are they covectors or contravectors? Is the notation incorrect? Please clarify.
@rubixtheslime 3 years ago
@@eigenchris Aha! I'm glad I found this; I was wondering the same thing all the way through episode 9, so I'm glad I went back.
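The numerator/denominator rule discussed in this thread can be summarized compactly (my own summary notation, not a verbatim slide from the videos): coordinates carry upper indices, and an upper index appearing in a denominator counts as a lower index on the whole expression.

```latex
% Coordinates carry upper indices: old coordinates c^i,
% new (tilde) coordinates \tilde{c}^i.  The transform coefficients are
F^{i}{}_{j} \;=\; \frac{\partial \tilde{c}^{\,i}}{\partial c^{\,j}}
% contravariant (upper) in i, the numerator index;
% covariant (lower) in j, because the denominator's upper index
% counts as a lower index on the whole derivative.
```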
@anamikashukla1810 4 years ago
Question: it's at our discretion which coordinate system (Cartesian or polar) we consider as the tilde (or "new") basis, and the Jacobian/inverse Jacobian will change accordingly?
@eigenchris 4 years ago
Yeah. The words "new"/"old" and "tilde"/"no tilde" are just labels. If you swap labels, the jacobian becomes the inverse jacobian and vice-versa.
@michaellewis7861 4 years ago
Isn't your idea that the partial derivatives are equivalent to the unit basis kind of trivial? Of course the change in a direction of a vector over the change of the vector will always yield the unit vector in that direction. But the derivative, in the classically useful sense, would be of a non-constant function. Is there something I'm not seeing?
@eigenchris 4 years ago
It depends on the coordinates you're using. You don't get unit vectors in polar coordinates. The theta/angular basis gets longer as you go farther from the origin, because the "speed" at which you travel around the circles for a given theta-value increases as you go farther from the origin.
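This can be checked directly from the definition e_θ = ∂R/∂θ (a check I added using the standard polar position vector; not code from the video):

```python
# Sketch (standard polar coordinates; my own check, not from the video):
# e_theta = dR/dtheta has length r, so it grows with distance from the
# origin, as the reply above describes.
import sympy as sp

r, theta = sp.symbols('r theta', positive=True)
R = sp.Matrix([r * sp.cos(theta), r * sp.sin(theta)])  # position vector

e_theta = R.diff(theta)          # (-r*sin(theta), r*cos(theta))
length = sp.sqrt(e_theta.dot(e_theta)).simplify()
print(length)                    # r, growing linearly with distance
```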
@zhangjin7179 6 years ago
My understanding of taking a derivative (tangent) is to put a free-floating vector at the point on the curve... the root of the vector is on the curve, and the basis of the vector is with respect to that point... the component is also with respect to the basis origin (on the curve). With Cartesian, it is easy to understand... with polar, it is a bit weird: do you put a polar coordinate system at every point on the curve?
@eigenchris 6 years ago
Yes. Every point on the curve has its own basis.
@melmel3230 2 years ago
Sir Eigenchris, I would like to buy your book about this theme. I am Andrea
@90kiranbatool 1 year ago
Chris, if this is your side hobby, then what's your job?
@eigenchris 1 year ago
I'm a software developer.
@90kiranbatool 1 year ago
@@eigenchris God bless you
@90kiranbatool 1 year ago
I learned a lot from your playlist and referred my students to it as well
@williambatman4522 3 years ago
Honestly I'm just a high school student trying to see what I will be going up against in half a decade
@Vercongent 5 years ago
"Transform" is a bit misleading; F and B are the coefficients of an expansion
@Vercongent 5 years ago
and a contraction, respectively
@TmyLV 5 years ago
FABULOUS , eigenchris! Thx!
@marat61 4 years ago
The Jacobian is pretty much the same as just the forward transform, but for the case when space 2 is not just simple and linear as 1?
@eigenchris 4 years ago
I'm having some trouble understanding. What do you mean by "space 2" and "linear as 1"?
@marat61 4 years ago
@@eigenchris We cannot just use the simple forward transform to transfer a contravector from Cartesian to polar, because polar is obviously not linear. So, as I understood it, the Jacobian is a more complicated version of the forward transform for a more general case?
@eigenchris 4 years ago
@@marat61 The Jacobian matrix has variables in it, so it can change from point to point. At any given specific point the Jacobian Matrix is a linear map, even if globally across all points it is not a linear map.
@marat61 4 years ago
@@eigenchris So the Jacobian is a more general analog of the forward transform, right? It can transform from one non-linear space to another, right?
@eigenchris 4 years ago
I would phrase it the same way as above. Every individual point has a linear map, even if globally the transformation is not linear. I'm not sure what "non-linear space" would mean in this case. We're just changing the basis vectors, not the space itself.
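Numerically, "a linear map at every individual point" means the Jacobian evaluated at a point sends small coordinate displacements to small Cartesian displacements to first order. This is a sketch with numbers I picked myself (not from the video):

```python
# Sketch (assumed numbers, not from the video): at a single point the
# polar->Cartesian Jacobian is a constant matrix that linearly
# approximates the (globally non-linear) coordinate change.
import numpy as np

def to_cartesian(p):
    r, theta = p
    return np.array([r * np.cos(theta), r * np.sin(theta)])

def jacobian_at(p):
    r, theta = p
    return np.array([[np.cos(theta), -r * np.sin(theta)],
                     [np.sin(theta),  r * np.cos(theta)]])

p0 = np.array([2.0, np.pi / 6])     # a specific point (r, theta)
dp = np.array([1e-6, -2e-6])        # a tiny coordinate displacement

exact = to_cartesian(p0 + dp) - to_cartesian(p0)
linear = jacobian_at(p0) @ dp       # the pointwise linear map
print(np.allclose(exact, linear))   # agrees to first order
```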
@tootyytootyy3806 4 years ago
Hello, we're going to study tensors this term. Can anyone tell me what basic subjects I have to review before going into tensors?
@eigenchris 4 years ago
You should review the basics of vectors, matrices, and matrix multiplication. If you want a jump-start into Tensors, My channel has a playlist called "Tensors for Beginners" that covers change of basis, contravariance and covariance, and the metric tensor in the first 9 videos (about 90 minutes total). It would be good to review all of these if you want to stay ahead.
@tootyytootyy3806 4 years ago
@@eigenchris I know, I saved your playlist in my library... I'm thankful to you for it. I just asked my question because I thought the videos went into tensors directly.