Hell yesss. Clearest and most logical exposition on KZbin. Reasonable definitions, etc. This is a gold mine. Thank you!
@TensorCalculusRobertDavie 6 years ago
Thanks for your comment. Much appreciated!
@abcdef-ys1sb 6 years ago
I was looking for this kind of explanation for a long time.
@TensorCalculusRobertDavie 6 years ago
Thank you for your comment.
@logansimon6653 4 years ago
Honestly, a very competent run-through. Thanks!
@TensorCalculusRobertDavie 4 years ago
Hello Logan and thank you for your comment. Much appreciated!
@logansimon6653 4 years ago
@@TensorCalculusRobertDavie Hi, you are very welcome. I browsed through some of your other titles just now, and I am excited to see a rich source of mathematics of my favourite type. Would you mind if I cite you as a source in the text I am writing on general relativity (with rigour in tensor calculus and differential/Riemannian geometry), especially for any instances where I am inspired to add to my work because of your content? If you would like, this is a hyperlink to my document: drive.google.com/open?id=1-MU7daeZ0Q8TefNOwzImcGD2uZIhktvZl3FO13R5UkQ Thank you for the content!
@TensorCalculusRobertDavie 4 years ago
You are welcome to cite my material, and good luck with your efforts.
@marinajacobo3550 5 years ago
Thank you Robert! I really enjoyed this video.
@TensorCalculusRobertDavie 5 years ago
Hello Marina and thank you for your comment. Much appreciated.
@marinajacobo3550 5 years ago
Thank you! I really enjoyed this explanation :)
@g3452sgp 7 years ago
The images at 1:59 and at 3:20 are good. They are well organized and help us to get the whole picture of the underlying concept. Excellent! Thanks a lot.
@TensorCalculusRobertDavie 7 years ago
Thank you again!
@davidprice1875 7 years ago
Very clear and precise summary.
@TensorCalculusRobertDavie 7 years ago
Thank you David.
@TensorCalculusRobertDavie 7 years ago
Hello Sjaak, the content covered here does assume some prior knowledge of vector calculus. The main point of the video is the two forms of basis vectors that can be formed, so could I suggest that a good starting point would be to focus on the meaning of the diagrams before moving on to the notation and what it is trying to express. Hope that helps?
@theboombody 3 years ago
I like the ad placements on these videos. "Are you struggling with calculus?" If you're watching a video on curvature and differential geometry, then no, you're not struggling with calculus. You're struggling with something far beyond.
@TensorCalculusRobertDavie 3 years ago
Yes, a bit ironic. I hope there aren't too many ads?
@theboombody 3 years ago
@@TensorCalculusRobertDavie No, it's not too bad. That's the price of posting stuff on YouTube. They can put ads in your stuff and there's nothing you can do about it except not post videos. But I think it's a small price to pay for the freedom of being able to post mathematical content. I'm pretty grateful for YouTube both as a viewer and as a poster.
@dansaunders6957 4 years ago
What happens to the position vector when working with a manifold? How does one typically define a basis without a position vector?
@TensorCalculusRobertDavie 4 years ago
Please have a look at the first few minutes of this video.
@한두혁 3 years ago
Is linear algebra needed (I mean in a rigorous way, starting from defining vector spaces, dual spaces, and so on) to fully understand tensors and general relativity? I ask because some textbooks were pretty hard to read, since they start from a very abstract point of view without even mentioning differentials or the chain rule from calculus. I really enjoyed the video, by the way, and I really appreciate it. Thank you!
@TensorCalculusRobertDavie 3 years ago
Hello and thank you for your comment. The answer is no, because this video provides you with a basic introduction to basis vectors and one-forms (the objects with raised indices). However, the more you learn the better, so do continue to study linear algebra if you can. Thank you for the feedback and good luck with your studies.
@한두혁 3 years ago
@@TensorCalculusRobertDavie Thank you!
@hariacharya5533 6 years ago
Good presentation. You explain nicely.
@TensorCalculusRobertDavie 6 years ago
Thank you.
@benedekjotu266 5 years ago
Excellent presentation. In general, what is the punch line for working with both covariant and contravariant coordinates? They both represent the same objects, and the metric tensor is usually at hand anyway. At first it seems an unnecessary complication on the way to general relativity. How come they didn't just go with one or the other, and leave the other as a fun-fact side note? Thanks
@TensorCalculusRobertDavie 5 years ago
Hello Benedek and thank you for your question. Wikipedia discusses this issue in the quote below and further in the link below that. "The vector is called covariant or contravariant depending on how the transformation of the vector's components is related to the transformation of coordinates. Contravariant vectors are "regular vectors" with units of distance (such as a displacement) or distance times some other unit (such as velocity or acceleration). For example, in changing units from meters to millimeters, a displacement of 1 m becomes 1000 mm. Covariant vectors, on the other hand, have units of one-over-distance (typically such as gradient). For example, in changing again from meters to millimeters, a gradient of 1 K/m becomes 0.001 K/mm." www.wikiwand.com/en/Covariance_and_contravariance_of_vectors
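To make that concrete, here is a small numerical sketch of the units example from the quote; the values are made up purely for illustration:

```python
# Sketch of the m -> mm units change from the quote above.
# Illustrative values only.
scale = 1000.0              # 1 m = 1000 mm

displacement_m = 1.0        # a contravariant component (units of distance)
gradient_K_per_m = 1.0      # a covariant component (units of 1/distance)

# Contravariant components scale WITH the coordinates...
displacement_mm = displacement_m * scale       # 1000.0 mm
# ...while covariant components scale AGAINST them.
gradient_K_per_mm = gradient_K_per_m / scale   # 0.001 K/mm

print(displacement_mm, gradient_K_per_mm)
```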
@dsaun777 5 years ago
@@TensorCalculusRobertDavie So it doesn't matter whether you use contravariant or covariant components; you just use whichever is most convenient for the transformation at hand?
@parthvarasani495 5 months ago
12:24, u · v = g_ij u^i v^j, not the square root of it, I think (in the numerator).
@TensorCalculusRobertDavie 5 months ago
You are right. Thank you for spotting that.
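For anyone following along, here is a quick numerical check of the corrected formula; the metric and vectors are arbitrary illustrative choices:

```python
import numpy as np

# Corrected formula at 12:24 (square root only in the norms, not the numerator):
#   cos(theta) = g_ij u^i v^j / (|u| |v|),  with  |u| = sqrt(g_ij u^i u^j)
g = np.array([[2.0, 0.5],
              [0.5, 1.0]])   # an arbitrary symmetric, positive-definite metric
u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

dot = u @ g @ v                  # g_ij u^i v^j
norm_u = np.sqrt(u @ g @ u)      # |u|
norm_v = np.sqrt(v @ g @ v)      # |v|
print(dot / (norm_u * norm_v))   # cos(theta), guaranteed to lie in [-1, 1]
```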
@parthvarasani495 5 months ago
@@TensorCalculusRobertDavie Thank you for all your efforts, highly appreciated 👍👏👏
@garytzehaylau9432 5 years ago
Excuse me, what is ∇_i u at 12:53? This notation is not clear to me. Why does g^ij ∇_j u e_j = n? Could you explain it to me? Thanks for your great videos; I would recommend them to other people.
@TensorCalculusRobertDavie 4 years ago
Hello Gary and thank you for your question. The inverse metric is the g^ij part and the nabla u is the derivative giving us the maximum direction of increase in the scalar u in each of the directions j. The inverse metric raises the j index on the resultant of nabla u so that we obey the Einstein summation convention and don't end up with two j's down below. We CANNOT have (nabla u)j e_j but we can and must have (nabla u)^j e_j. Hope that helps?
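If it helps, here is a small sketch of that index-raising step in polar coordinates; the scalar field is my own illustrative choice, not one from the video:

```python
import numpy as np

# (grad u)^i = g^{ij} (partial_j u) in polar coordinates (r, theta),
# where the metric is g = diag(1, r^2) and its inverse is diag(1, 1/r^2).
# Example scalar field: u = r^2 sin(theta)  (an arbitrary choice).
r, theta = 2.0, np.pi / 4

du_lower = np.array([2 * r * np.sin(theta),   # partial u / partial r
                     r**2 * np.cos(theta)])   # partial u / partial theta

g_inv = np.diag([1.0, 1.0 / r**2])            # inverse metric g^{ij}

grad_u_upper = g_inv @ du_lower               # components (grad u)^i, paired with e_i
print(grad_u_upper)
```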
@vicentematricardi3596 6 years ago
Your videos are very good!!!!!
@TensorCalculusRobertDavie 6 years ago
Vicente Matricardi Many thanks.
@vicentematricardi3596 6 years ago
Thank you for producing and sharing such good-quality information. I am writing to you in Spanish because I'd like you to know that many people are interested in these topics. Best regards!!!!
@TensorCalculusRobertDavie 6 years ago
Vicente Matricardi Thanks Vicente.
@vicentematricardi3596 6 years ago
Thanks, Robert Davie
@nicolecui3214 4 years ago
Hi, thanks for the video, but why is every vector written as covariant components with a contravariant basis, and vice versa? Intuitively, I thought the components and the basis would be consistent?
@TensorCalculusRobertDavie 4 years ago
The two bases are distinct, hence the upper and lower indices, and they behave in different ways, unlike in Euclidean space where they really are just the same thing, hence no reason to raise or lower indices. Sorry for the short answer. Have a look at this article: en.wikipedia.org/wiki/Covariance_and_contravariance_of_vectors and this video: kzbin.info/www/bejne/eZ3MiGqhiN2rjbc In General Relativity we use a metric to raise and lower these indices that is not the same as the Euclidean metric.
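As a minimal sketch of that last point, here is index lowering and raising with the Minkowski metric rather than the Euclidean one (my own example):

```python
import numpy as np

# Lowering and raising an index with the Minkowski metric
# eta = diag(-1, 1, 1, 1); covariant and contravariant components
# genuinely differ here, unlike in the Euclidean case.
eta = np.diag([-1.0, 1.0, 1.0, 1.0])
eta_inv = np.linalg.inv(eta)     # equals eta here, but kept general

v_upper = np.array([1.0, 2.0, 0.0, 0.0])   # contravariant components v^mu
v_lower = eta @ v_upper                    # v_mu = eta_{mu nu} v^nu  -> [-1, 2, 0, 0]
v_again = eta_inv @ v_lower                # raising the index recovers v^mu

print(v_lower, v_again)
```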
@nicolecui3214 4 years ago
@@TensorCalculusRobertDavie Thank you for the reply, will take a look! :)
@박용석-n8y 4 years ago
5:12 Thank you so much
@TensorCalculusRobertDavie 4 years ago
You're welcome!
@rontoolsie 7 years ago
At 11:45, line 3 should end up as u(covariant)v(contravariant). Otherwise this is an excellent presentation.
@TensorCalculusRobertDavie 7 years ago
Hello Ron, thank you for your comment, and you are correct. However, in this case we have u(covariant)v(contravariant) = u(contravariant)v(covariant), which was the point I was trying to show across lines 3 and 4. The point here is that there are four different-looking ways to get the same result. At the time I ummed and ahhed about whether I should write it in the form you have pointed out, but my goal took precedence in the end.
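Here is a quick numerical illustration of those four equivalent contractions; the metric and vectors are arbitrary choices:

```python
import numpy as np

# All four ways of contracting u with v give the same scalar:
#   g_ij u^i v^j = u_i v^i = u^i v_i = g^ij u_i v_j
g = np.array([[2.0, 0.5],
              [0.5, 1.0]])              # arbitrary symmetric metric
g_inv = np.linalg.inv(g)

u_up = np.array([1.0, 2.0])             # u^i
v_up = np.array([3.0, 1.0])             # v^i
u_dn = g @ u_up                         # u_i (index lowered)
v_dn = g @ v_up                         # v_i

print(u_up @ g @ v_up,                  # g_ij u^i v^j
      u_dn @ v_up,                      # u_i v^i
      u_up @ v_dn,                      # u^i v_i
      u_dn @ g_inv @ v_dn)              # g^ij u_i v_j  -- all print 11.5
```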
@abhishekrai1204 5 years ago
Thanks sir
@TensorCalculusRobertDavie 5 years ago
You're welcome.
@zoltankurti 6 years ago
At the beginning of the video, you have to assume that the coordinate transformation and its inverse are also differentiable.
@TensorCalculusRobertDavie 6 years ago
Thank you Zoltan, that is a good point about differentiability; I should have mentioned it at the beginning.
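In symbols, the standard assumption Zoltan is pointing at (stated here as a sketch, not as it appears in the video) is that both the transformation and its inverse are differentiable, with an invertible Jacobian:

```latex
% Change of coordinates x^i \to x'^i: both maps differentiable, and
\frac{\partial x'^i}{\partial x^j}\,\frac{\partial x^j}{\partial x'^k}
  = \delta^i_k ,
\qquad
\det\!\left(\frac{\partial x'^i}{\partial x^j}\right) \neq 0 .
```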