Best explanation I could find online! Thanks a lot.
@johnholme783 2 years ago
One word: Brilliant! I’m currently studying general relativity and this video has been very helpful. First port of call for anybody wanting to grasp the mathematics of GR!
@edwardmacnab354 9 months ago
This is the first time I have seen a video that explains this and CLEARLY explains it. Congratulations, and may you have further success!
@ArturoCastan 3 years ago
Thanks a lot, a very clear demonstration!
@RD2564 2 years ago
You nailed it Fizică, well done.
@singkreality3041 3 years ago
Covariant components are calculated in a perpendicular manner; how do we relate this to the physical meaning? Why are covariant components not measured in parallel?
@pratikshadash6487 5 months ago
Consider a source of light falling on the vector in a direction perpendicular to one of the axes.
@thevegg3275 1 month ago
It could be, if the definition changed. By the same argument, we could start calling cats dogs, and dogs cats. It’s just nomenclature tied to a method.
@hjpark2177 2 months ago
At 7:47, I think you are wrong. When using covariant components, the basis vectors should be transformed to the contravariant (dual) basis vectors.
@afreenaa6984 11 months ago
Good videos and nice presentation. Please continue with more areas of physics.
@deniz.7200 8 months ago
Any numerical examples for both transformations?
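[Editor's note: a minimal numerical sketch of both transformation laws, not taken from the video. It assumes NumPy, a 2D orthonormal starting basis, and then doubles the first basis vector; the numbers are illustrative only.]

```python
import numpy as np

# Hypothetical example: start from the standard orthonormal basis in R^2,
# then double the first basis vector.
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
E = np.column_stack([e1, e2])          # old basis vectors as columns
E_new = np.column_stack([2 * e1, e2])  # new basis: e1 doubled

v = np.array([4.0, 3.0])               # a fixed vector, independent of any basis

# Contravariant components: solve v = E @ c, i.e. c = E^{-1} v.
c_old = np.linalg.solve(E, v)          # [4, 3]
c_new = np.linalg.solve(E_new, v)      # [2, 3]: first component HALVES

# Covariant components: a_i = v . e_i (perpendicular projection onto each axis).
a_old = E.T @ v                        # [4, 3]
a_new = E_new.T @ v                    # [8, 3]: first component DOUBLES

# The vector itself is invariant: the new components with the new basis
# rebuild exactly the same v.
assert np.allclose(E_new @ c_new, v)
print(c_old, c_new, a_old, a_new)
```

Doubling one basis vector halves the corresponding contravariant component but doubles the covariant one, which is the naming convention in a nutshell.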
@thevegg3275 1 year ago
On the graphs for covariant, I see that the basis vector doubled (i to 2i), but I don’t see any component that doubled. I can’t understand your graph.
@lucasgroves137 7 months ago
Is that around 7:30? The parallel and perpendicular components seem to both change contravariantly. I think the graphs in this clip are very poorly done, in particular the representation of unit vectors. But in the stupid era in which we live (Emperor's New Clothes), people wet their pants over mediocre content and call it excellent.
@bimalbahuguna215 3 years ago
Nicely explained. Thanks
@AdrianBoyko 2 months ago
Video says vectors/tensors are invariant. Then at 3:30 it proposes a definition for “contravariant vector” and lists a few. This sort of careless language is one of the reasons why people find this subject difficult. The COMPONENTS can be covariant or contravariant, not the vector/tensor.
@Dominic94St 3 years ago
Thank you for sharing!
@guriyakum2971 1 year ago
Best explanation ever ❤
@thevegg3275 7 months ago
At 7:50 you state that the vector is a_1 times e_1 plus a_2 times e_2, but you are ignoring the dual basis vectors, which give the covariant expansion as a_1 times e^1 plus a_2 times e^2. I’m wondering whether what you say is true only if you consider the basis vectors e_1 and e_2 and do not consider the dual basis vectors e^1 and e^2.
@victoriarisko 7 months ago
Very nicely done!
@ripz7562 2 years ago
Brilliant ❤️
@vayunandanakishore6652 11 months ago
Very beautiful explanation!
@bablurana457 3 years ago
Everyone just writes the formula but doesn't explain it, except you. Thanks!
@DargiShameer 3 years ago
Good explanation
@nightshine751 3 years ago
Very good method of explanation. But the video quality is blurry.
@vivekcome98 2 years ago
Woaahhh...that was really helpful
@mukeshkumar-dt4mb 3 years ago
Please upload complete series
@thevegg3275 1 year ago
You seem to say that gradients are necessarily covariant and velocity vectors are necessarily contravariant. How so? Don't we have the choice of covariant OR contravariant components to describe ANY gradient or vector?
@AdrianBoyko 1 month ago
You would think so, if the story about parallel vs perpendicular projection told in this video is true.
@mdnkhan1964 1 year ago
Very good. Thanks.
@thevegg3275 1 year ago
For contravariant, you talk about doubling a basis vector and the component being halved. But when you go to covariant, you don't talk about doubling the basis vector and the component doubling; for some reason you go into gradients, which is like comparing apples and oranges. Do you agree? Why don't we discuss the gradient with contravariant as well?
@edwardmacnab354 9 months ago
Because the gradient is NOT a contravariant tensor?
@thevegg3275 9 months ago
@@edwardmacnab354 Not sure you understand my question. I can show both contravariant and covariant components visually in a sketch. What makes the perpendicular projection of a vector onto dual bases related to gradients, while the contravariant (parallel projection) case is not? And when used in a tensor, I don't see how indices can be lowered unless you actually change the actual numbers (on a graph). I.e., if contravariant indices relate to the contravariant components from parallel projection, and one lowers them to covariant, then it seems the numbers also have to change to the covariant components from perpendicular projection.
@edwardmacnab354 9 months ago
@@thevegg3275 The covariant tensor and the contravariant tensor are two completely different types of tensor, to my understanding, and so the components act differently when the basis is doubled or halved. But I must add that I too am considerably confused by the whole concept. I can see the power of it when doing calculations, as it greatly simplifies very complex manipulations, so I will persist in my struggle to understand. Good luck!
@thevegg3275 9 months ago
@@edwardmacnab354 The story goes that if you increase a contravariant basis vector (in order to keep the product of basis vector and component equal to the parallel projection of the vector on that axis), the contravariant component varies inversely by decreasing, aka contra-variant.

The problem comes when considering the covariant situation from perpendicular projection. Intuitively one thinks that no matter the length of a covariant basis vector, surely the covariant component has to vary inversely. So it is confusing to read "if you increase a covariant basis vector then the component will increase, aka co-variant."

I believe the above is true because a covariant (dual) basis vector is actually calculated as the inverse of a contravariant basis vector (times the cosine of theta): as you increase a contravariant basis vector, you necessarily decrease the covariant basis vector, which means the covariant component increases.

If I'm correct, perhaps it should be taught as follows:

A) As you increase or decrease a contravariant basis vector, you decrease or increase a contravariant component, respectively.

B) As you increase or decrease a contravariant basis vector, you increase or decrease a contravariant component, respectively.

Thoughts?
@nikopojani2133 6 months ago
At the end of point B) you mean (I believe): "...covariant component, respectively."
@SteveRayDarrell 7 months ago
Let's say you are representing a vector with covariant components in a given basis (e1, e2). You want to change to a new basis (e'1, e'2). Let's say the matrix of the change of basis is A. If the components of the vector are (a,b)^t in the first basis, to get the components in the new basis you have to multiply by A transpose on the left of (a,b)^t, so (a',b')^t = A^t (a,b)^t. So the components change with the transpose of A and not with A itself. Is this correct?
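[Editor's note: a quick NumPy check of this claim, under the assumptions that the original basis is the standard orthonormal one and that the columns of A are the new basis vectors written in the old basis. Matrix A and vector v are made-up numbers.]

```python
import numpy as np

# Assumed setup: columns of A are the new basis vectors e'_1, e'_2
# expressed in the old (standard orthonormal) basis.
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 4.0])               # the vector being represented

# In an orthonormal basis, the covariant components equal the Cartesian ones.
a_old = v

# Covariant components in the new basis: a'_i = v . e'_i, i.e. a' = A^T a.
a_new = np.array([v @ A[:, 0], v @ A[:, 1]])
assert np.allclose(a_new, A.T @ a_old)  # covariant components: transpose of A

# Contravariant components, by contrast, transform with A^{-1}: v = A c'.
c_new = np.linalg.solve(A, v)
assert np.allclose(A @ c_new, v)
print(a_new, c_new)
```

So yes, under those assumptions the covariant components pick up A transpose, while the contravariant components pick up A inverse.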
@thevegg3275 7 months ago
Hello. At minute 5:28 you said that doubling the X axis doubles the Y axis, but I’ve seen no evidence of that on your graph. It seems like you doubled the X axis, but the Y axis is as in the original.
@perdehurcu 21 days ago
Hello. Sir, if I define the size of the basis vectors in Planck units, will I still need the concepts of covariant and contravariant? Because I won't have to worry about the size or number of basis vectors. Sir, could you please explain how you established a connection between displacement, velocity, acceleration, and contravariance? The terms and concepts you use are very wrong. Thank you.
@darkknight3305 11 months ago
Thank you very much
@jn3917 3 years ago
Thanks a lot sir
@physicsbhakt7571 2 years ago
Easy. To the point. Adequate.
@nomachinesinthisroom 1 year ago
Thanks man! Wondering why you chose the Romanian name for physics for the channel 👯♀
@nosirovabdurahmon5964 2 years ago
Thanks a lot, man.
@aniruddhabehere9836 10 months ago
Excellent
@thevegg3275 1 year ago
Echoing a previous comment, comparing parallel-projection contravariant components to the gradient is rather like comparing eggs to elephants, since it ignores the converse of parallel components, aka the perpendicular components of the covariant case. I think we need a graphic with parallel projection finding contravariant components on the left side and perpendicular (dual basis vector) components on the right side. No need to talk about gradients.

Furthermore, it would help if you could somehow connect this graph to the tensor components in a very specific example. For instance, in a skewed coordinate system, if the contravariant components are x = 5 and y = 3 and the covariant components are x = -12 and y = 7, how are these numbers placed into the tensor components? This type of graphic would quickly define both contravariant and covariant components, as well as how the tensor is defined based on those components. If you could make a separate video with only that, it would be awesome. Love your work! Thanks in advance.
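[Editor's note: a small sketch in the spirit of this request. It uses a hypothetical skewed basis (60 degrees between unit axes, not the commenter's numbers or the video's) and shows how contravariant components are converted to covariant ones by lowering the index with the metric.]

```python
import numpy as np

# Hypothetical skewed basis: unit vectors with 60 degrees between them.
theta = np.pi / 3
e1 = np.array([1.0, 0.0])
e2 = np.array([np.cos(theta), np.sin(theta)])
E = np.column_stack([e1, e2])

g = E.T @ E                 # metric: g_ij = e_i . e_j = [[1, 0.5], [0.5, 1]]
c = np.array([5.0, 3.0])    # contravariant components v^i (made-up numbers)

a = g @ c                   # covariant components v_i = g_ij v^j
v = E @ c                   # the actual vector, in Cartesian coordinates

# Cross-check: each covariant component is the perpendicular projection
# v . e_i of the vector onto that basis direction.
assert np.allclose(a, [v @ e1, v @ e2])
print(a)                    # approximately [6.5, 5.5]
```

So in one skewed system the same vector has contravariant components (5, 3) and covariant components (6.5, 5.5); the metric g is exactly the bookkeeping that connects the two pictures.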
@inquiringhuman2582 3 years ago
Dil maange more (the heart wants more)! Can you share more on tensor calculus, please?
@mrigankasandilya4388 3 years ago
If lecture notes were provided, it would be much better.
@nipunaherath347 3 years ago
thanks
@thevegg3275 1 year ago
I think I’m asking: what does using the gradient to define covariant components have to do with using dual basis vectors to find covariant components? I see no connection whatsoever. This is a question I’ve been asking people for years, and no one has ever been able to answer it.
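[Editor's note: a small numerical sketch of the connection being asked about, with a made-up scalar field. The point is that gradient components respond to a basis rescaling the same way perpendicular-projection (covariant) components do, while velocity components respond the opposite way.]

```python
import numpy as np

# Double the x basis vector; then the x coordinate of every point halves,
# i.e. x' = x / 2.
def f(x, y):                 # a sample scalar field (hypothetical)
    return x**2 + y

# Numerical partial derivative of f along x at the point (3, 1).
h = 1e-6
df_dx = (f(3 + h, 1) - f(3 - h, 1)) / (2 * h)   # ~6 in the old coordinates

# In the new coordinate x' = x/2, the chain rule gives df/dx' = 2 * df/dx:
df_dxp = 2 * df_dx                               # ~12: component DOUBLES (co-variant)

# A velocity, by contrast: if dx/dt = 6, then dx'/dt = 3: HALVES (contra-variant).
dx_dt = 6.0
dxp_dt = dx_dt / 2

print(df_dx, df_dxp, dxp_dt)
```

That shared "doubles when the basis doubles" behavior is why the gradient and the dual-basis (perpendicular-projection) components are both called covariant.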
@e4eduspace281 3 years ago
Nice class
@floend4810 1 year ago
Why is there no example of using the transformation laws? For example: you have a curve in the Cartesian 2D system and a tangent to this curve at one point. Then there is another coordinate system E with one axis rotated and the other not. Which of the two transformation laws will give you the correct components of the tangent vector in the new system E? Obviously you would have to show that it IS the correct representation, but parallel and perpendicular projections should do that!
@brianyeh2695 11 months ago
thank you!!!!!!!
@puremaths9679 3 years ago
Nice
@brunoaugustorodriguesalves4080 2 years ago
Same here.
@thevegg3275 1 year ago
Ok, after months of thinking on this, I have an explanation of covariant vs contravariant that no one has ever uttered, as far as I know. Here goes. Hold my beer! --- You combine contravariant components times their regular basis vectors tip to tail to reach the tip of the vector you're defining. But with covariant, you combine covariant components times their dual basis vectors tip to tail to get to the tip of the vector you're defining. Why does no one explain it like this? But my question is: how do covariant components on dual basis vectors relate to the dot product? Please correct me if I'm wrong on the following...

DOT PRODUCT: A (vector) dot B (vector) = a scalar quantity.

CONTRAVARIANT: described by the combination of contravariant components times regular basis vectors, added tip to tail, for A (vector) dot B (vector).

COVARIANT: described by the combination of covariant components times dual basis vectors, added tip to tail, for A prime (vector) dot B prime (vector).

QUESTION: If we dot product A prime (vector) with B prime (vector), does that scalar quantity equal A sub 1 prime times e super 1 prime PLUS A sub 2 prime times e super 2 prime? If so, aren't we then saying that a scalar is equal to a vector?
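[Editor's note: a numerical sketch bearing on this question, with a made-up skewed basis and vectors. It shows that the dot product pairs the covariant components of one vector with the contravariant components of the other, and that summing covariant components against the dual basis rebuilds the vector itself rather than producing a scalar.]

```python
import numpy as np

# Hypothetical skewed 2D basis.
e1 = np.array([1.0, 0.0])
e2 = np.array([1.0, 1.0])
E = np.column_stack([e1, e2])
g = E.T @ E                        # metric g_ij = e_i . e_j

A = np.array([3.0, 2.0])           # two vectors, in Cartesian coordinates
B = np.array([1.0, 4.0])

cA = np.linalg.solve(E, A)         # contravariant components A^i
cB = np.linalg.solve(E, B)         # contravariant components B^i
aA = g @ cA                        # covariant components A_i

# The scalar A . B equals A_i B^i (one index down, one index up):
assert np.isclose(A @ B, aA @ cB)

# But a_i e^i is NOT a scalar: summed against the dual basis vectors,
# it rebuilds the vector A itself.
E_dual = np.linalg.inv(E).T        # dual basis vectors e^i as columns
assert np.allclose(E_dual @ aA, A)
print(A @ B)
```

So A_1 e^1 + A_2 e^2 is the vector A again, not a number; the scalar only appears when a covariant component multiplies a contravariant component, not a basis vector.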
@perdehurcu 29 days ago
Hello. Sir, I think what you are telling us is complete nonsense. There is so much uncertainty that you don't understand the issue yourself. You've wasted your time with these videos. Now I'm asking, sir: what are covariant and contravariant? No one can understand anything when you say this. You have given an explanation full of contradictions. Good luck.
@rashahameed4937 3 years ago
Please translate to Arabic
@jameshopkins3541 1 year ago
YOU CAN NOT EXPLAIN IT!!!!!!
@shreyaskumarm3534 2 years ago
Thank you very much sir