3/3 Contravariant and Covariant tensor

36,991 views

Fizică

A day ago

Comments: 60
@renmas00 3 years ago
Best explanation I could find available online! Thanks a lot.
@johnholme783 2 years ago
One word: brilliant! I'm currently studying general relativity and this video has been very helpful. First port of call for anybody wanting to grasp the mathematics of GR!
@edwardmacnab354 9 months ago
This is the first time I have seen a video that explains this and CLEARLY explains it. Congratulations, and may you have further success!
@ArturoCastan 3 years ago
Thanks a lot, a very clear demonstration!
@RD2564 2 years ago
You nailed it, Fizică, well done.
@singkreality3041 3 years ago
Covariant components are calculated in a perpendicular manner; how do we relate this to the physical meaning? Why are covariant components not measured in parallel?
@pratikshadash6487 5 months ago
Consider a source of light falling on the vector in a direction perpendicular to one of the axes.
@thevegg3275 A month ago
It could be, if the definition changed. By the same argument, we could start calling cats dogs, and dogs cats. It's just nomenclature used to define a method.
@hjpark2177 2 months ago
At 7:47, I think you are wrong. When using covariant components, the basis vectors should be transformed to contravariant basis vectors.
@afreenaa6984 11 months ago
Good videos and nice presentation. Please continue with more physics areas.
@deniz.7200 8 months ago
Any numerical examples for both transformations?
@thevegg3275 A year ago
On the graphs for covariant, I see that the basis vector i doubled (i to 2i), but I don't see any component that doubled. I can't understand your graph.
@lucasgroves137 7 months ago
Is that around 7:30? The parallel and perpendicular components seem to both change contravariantly. I think the graphs in this clip are very poorly done, in particular the representation of unit vectors. But in the stupid era in which we live (Emperor's New Clothes), people wet their pants over mediocre content and call it excellent.
@bimalbahuguna215 3 years ago
Nicely explained. Thanks.
@AdrianBoyko 2 months ago
The video says vectors/tensors are invariant. Then at 3:30 it proposes a definition for "contravariant vector" and lists a few. This sort of careless language is one of the reasons people find this subject difficult. The COMPONENTS can be covariant or contravariant, not the vector/tensor.
@Dominic94St 3 years ago
Thank you for sharing!
@guriyakum2971 A year ago
Best explanation ever ❤
@thevegg3275 7 months ago
At 7:50 you state that the vector is a_1 times e_1 plus a_2 times e_2, but you are ignoring the dual basis vectors, which give the covariant representation as a_1 times e^1 plus a_2 times e^2. I'm wondering: is what you say true only if you consider the basis vectors e_1 and e_2 and do not consider the dual basis vectors e^1 and e^2?
@victoriarisko 7 months ago
Very nicely done!
@ripz7562 2 years ago
Brilliant ❤️
@vayunandanakishore6652 11 months ago
Very beautiful explanation.
@bablurana457 3 years ago
Everyone just writes the formula but doesn't explain it, except you. Thanks.
@DargiShameer 3 years ago
Good explanation
@nightshine751 3 years ago
Very good method of explanation, but the quality of the video is blurry.
@vivekcome98 2 years ago
Woaahhh... that was really helpful.
@mukeshkumar-dt4mb 3 years ago
Please upload the complete series.
@thevegg3275 A year ago
You seem to say that gradients are necessarily covariant and velocity vectors are necessarily contravariant. How so? Do we not have the choice of covariant OR contravariant to define ANY gradient or vector?
@AdrianBoyko A month ago
You would think so, if the story about parallel vs perpendicular projection told in this video is true.
@mdnkhan1964 A year ago
Very good. Thanks.
@thevegg3275 A year ago
For contravariant, you talk about doubling a basis vector and the component being halved. But when you go to covariant, you don't talk about doubling the basis vector and the component being doubled; for some reason you go into gradients, which is like comparing apples and oranges. Do you agree? Why don't we discuss the gradient with contravariant as well?
@edwardmacnab354 9 months ago
Because the gradient is NOT a contravariant tensor?
@thevegg3275 9 months ago
@@edwardmacnab354 Not sure you understand my question. I can show both contravariant and covariant components visually in a sketch. What makes the perpendicular projection of a vector onto dual bases related to gradients, while the contravariant (parallel projection) is not related to gradients? And when used in a tensor, I don't see how indices can be lowered unless you actually change the actual numbers (on a graph). I.e., if contravariant indices relate to the contravariant components from parallel projection, and one lowers them to covariant, then it seems the numbers also have to change to the covariant components from perpendicular projection.
@edwardmacnab354 9 months ago
@@thevegg3275 The covariant tensor and contravariant tensor are two completely different types of tensor, to my understanding, and so the components act differently when the basis is doubled or halved. But I must add, I am also considerably confused by the whole concept. I can see the power of it when doing calculations, as it greatly simplifies very complex manipulations, and so I will persist in my struggle to understand. Good luck!
@thevegg3275 9 months ago
@@edwardmacnab354 The story goes that if you increase a contravariant basis vector (in order to keep the numerical combination of the basis vector times the component equal to the parallel projection of the vector on that axis), the contravariant component varies inversely by decreasing, aka contra-variant. The problem comes when considering the covariant situation from perpendicular projection. Intuitively one thinks that no matter the length of a covariant basis vector, surely the covariant component has to vary inversely. So it is confusing to read "if you increase a covariant basis vector, then the component will increase, aka co-variant." I believe the above is true because, given that a covariant basis vector is actually calculated as the inverse of a contravariant basis vector (times the cosine of theta), as you increase a contravariant basis vector you necessarily decrease the covariant basis vector, which means the covariant component increases. If I'm correct, perhaps it should be taught as follows: A) As you increase or decrease a (contravariant basis vector), you decrease or increase a contravariant component, respectively. B) As you increase or decrease a (contravariant basis vector), you increase or decrease a contravariant component, respectively. Thoughts?
@nikopojani2133 6 months ago
At the end of point B) you mean (I believe): ...covariant component, respectively.
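A minimal numerical sketch of the scaling behaviour debated in this thread. The basis, the vector, and the factor of 2 are arbitrary illustrative choices, not taken from the video; "contravariant" components are computed by parallel decomposition and "covariant" components by dotting with the basis vectors.

```python
import numpy as np

# Original basis: the standard orthonormal basis in 2D.
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
v = np.array([3.0, 2.0])            # the vector itself never changes

def components(v, b1, b2):
    """Contravariant components: solve v = a^1*b1 + a^2*b2 (parallel decomposition).
       Covariant components: a_i = v . b_i (perpendicular projection onto each basis vector)."""
    B = np.column_stack([b1, b2])
    contra = np.linalg.solve(B, v)
    cov = np.array([v @ b1, v @ b2])
    return contra, cov

print(components(v, e1, e2))        # contravariant (3, 2), covariant (3, 2)
print(components(v, 2 * e1, e2))    # after doubling e1: a^1 halves to 1.5, a_1 doubles to 6
```

On this picture the parallel-projection component varies against the basis change and the dot-product component varies with it, which is exactly the naming convention the thread is wrestling with.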
@SteveRayDarrell 7 months ago
Let's say you are representing a vector with covariant components in a given basis (e1, e2). You want to change to a new basis (e'1, e'2). Let's say the matrix of the change of basis is A. If the components of the vector are (a,b)^t in the first basis, then to get the components in the new basis you have to multiply A transpose on the left of (a,b)^t, so (a',b')^t = A^t * (a,b)^t. So the components change with the transpose of A and not with A itself. Is this correct?
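Under the convention that the columns of A express the new basis vectors in terms of the old ones (an assumption; the comment does not fix a convention), the claim checks out numerically. A small sketch with arbitrarily chosen vectors:

```python
import numpy as np

# Old basis (need not be orthogonal); columns of E are the basis vectors.
e1, e2 = np.array([1.0, 0.0]), np.array([0.5, 1.0])
v = np.array([2.0, 3.0])                     # the vector, in Cartesian coordinates

# Change of basis: e'_j = sum_i A[i, j] * e_i, i.e. the columns of A give the
# new basis vectors expressed in the old basis.
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])
E = np.column_stack([e1, e2])
Ep = E @ A                                   # new basis vectors as columns

cov_old = E.T @ v                            # a_i  = v . e_i
cov_new = Ep.T @ v                           # a'_j = v . e'_j

print(np.allclose(cov_new, A.T @ cov_old))   # True: covariant components go with A^T
print(np.allclose(np.linalg.solve(Ep, v),    # contravariant components go with A^{-1}
                  np.linalg.inv(A) @ np.linalg.solve(E, v)))
```

Contravariant components, by contrast, pick up the inverse of A, which is what the second check confirms.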
@thevegg3275 7 months ago
Hello. At minute 5:28 you said that doubling the X axis doubles the Y axis, but I've seen no evidence of that on your graph. It seems like you doubled the X axis, but the Y axis is as in the original.
@perdehurcu 21 days ago
Hello. Sir, if I define the size of the basis vectors in Planck units, will I need the concepts of covariant and contravariant? Because I won't have to worry about the size or number of basis vectors. Sir, could you please explain how you established a connection between displacement, velocity, acceleration and contravariance? The terms and concepts you use are very wrong. Thank you.
@darkknight3305 11 months ago
Thank you very much
@jn3917 3 years ago
Thanks a lot, sir
@physicsbhakt7571 2 years ago
Easy. To the point. Adequate.
@nomachinesinthisroom A year ago
Thanks man! Wondering why you chose the Romanian name for physics for the channel 👯‍♀
@nosirovabdurahmon5964 2 years ago
Thank you a lot, man.
@aniruddhabehere9836 10 months ago
Excellent
@thevegg3275 A year ago
Echoing a previous comment: it is rather like comparing eggs to elephants to compare parallel-projection contravariant components to the gradient, while ignoring the converse of parallel components, i.e. the perpendicular components of covariance. I think we need a graphic with parallel projection finding the contravariant components on the left side and perpendicular (dual basis vector) components on the right side. No need for talking about gradients. Furthermore, it would help if you could somehow connect this graph to the tensor components in a very specific example. For instance, in a skewed coordinate system, if the contravariant components are x=5 and y=3 and the covariant components are x=-12 and y=7, how are these numbers placed into the tensor components? This type of graphic would quickly define both contravariant and covariant components, as well as how the tensor is defined based on those components. If you could make a separate video with only that, that would be awesome. Love your work! Thanks in advance.
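A small numerical sketch along the lines requested, with an arbitrary skewed basis and vector (the numbers 5, 3, -12, 7 in the comment are hypothetical and are not reused here): parallel projection gives the contravariant components, dotting with the basis vectors gives the covariant ones, and the metric g_ij = e_i . e_j converts one set into the other, which is what "lowering an index" does numerically.

```python
import numpy as np

# A skewed 2D basis, chosen arbitrarily for illustration.
e1 = np.array([1.0, 0.0])
e2 = np.array([1.0, 1.0])            # not orthogonal to e1
E = np.column_stack([e1, e2])

v = np.array([3.0, 2.0])             # the vector, in Cartesian coordinates

contra = np.linalg.solve(E, v)       # v^i : parallel projection, v = v^1 e1 + v^2 e2
cov = E.T @ v                        # v_i : v . e_i (perpendicular-projection numbers)

g = E.T @ E                          # metric g_ij = e_i . e_j
print(np.allclose(cov, g @ contra))  # True: lowering the index, v_i = g_ij v^j
print(contra, cov)                   # here: [1. 2.] and [3. 5.]
```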
@inquiringhuman2582 3 years ago
The heart asks for more; can you share more on tensor calculus, please?
@mrigankasandilya4388 3 years ago
If lecture notes were provided, it would be much better.
@nipunaherath347 3 years ago
Thanks
@thevegg3275 A year ago
I think I'm asking: what does needing the gradient to define covariant components have to do with using dual basis vectors to find covariant components? I see no connection whatsoever. This is a question I've been asking people for years, and no one has ever been able to answer it.
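One way to see the connection, sketched numerically (the scalar field, basis, and point below are arbitrary illustrative choices): if skewed coordinates y^i are defined by r = y^1 e_1 + y^2 e_2, the chain rule gives df/dy^i = grad(f) . e_i, so the partial derivatives of f with respect to the coordinates are exactly the dot products with the basis vectors, i.e. the same numbers the dual-basis / perpendicular-projection picture calls the covariant components.

```python
import numpy as np

# Skewed basis; coordinates y^i are defined by r = y1*e1 + y2*e2.
e1, e2 = np.array([1.0, 0.0]), np.array([1.0, 1.0])
E = np.column_stack([e1, e2])

def f(r):                         # an arbitrary scalar field f(x, y)
    return r[0]**2 + 3.0 * r[0] * r[1]

r0 = np.array([1.0, 2.0])         # Cartesian point at which we compare
h = 1e-6

def num_grad(g, p):               # central-difference gradient of g at point p
    return np.array([(g(p + h * d) - g(p - h * d)) / (2 * h) for d in np.eye(2)])

grad_f = num_grad(f, r0)                         # Cartesian gradient of f
y0 = np.linalg.solve(E, r0)                      # skewed coordinates of r0
df_dy = num_grad(lambda y: f(E @ y), y0)         # partials w.r.t. the skewed coordinates

print(np.allclose(df_dy, E.T @ grad_f, atol=1e-5))   # True: df/dy^i = grad(f) . e_i
```

That is why gradients are the standard covariant example, while displacements and velocities, rebuilt from parallel components, are the standard contravariant one.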
@e4eduspace281 3 years ago
Nice class
@floend4810 A year ago
Why is there no example of using the transformation laws? For example: you have a curve in the Cartesian 2D system and a tangent to this curve at one point. Then there is another coordinate system E with one axis rotated and the other not. Which of the two transformation laws will give you the correct components of the tangent vector in the new system E? Obviously you would have to show that it IS the correct representation, but parallel and perpendicular projections should do that!
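A sketch of roughly that scenario, with an arbitrarily chosen curve and a 30-degree rotation of one axis (both are illustrative assumptions, not taken from the video): the contravariant (parallel-projection) law is the one that reproduces the tangent vector in the new basis.

```python
import numpy as np

# Tangent (velocity) vector of the curve r(t) = (t, t**2) at t = 1, in Cartesian coords.
tangent = np.array([1.0, 2.0])                   # dr/dt = (1, 2t) at t = 1

# New system E: first axis rotated by 30 degrees, second axis left as the Cartesian y-axis.
th = np.radians(30.0)
e1p = np.array([np.cos(th), np.sin(th)])
e2p = np.array([0.0, 1.0])
Ep = np.column_stack([e1p, e2p])

# Contravariant law: components transform with the inverse of the basis matrix.
contra = np.linalg.inv(Ep) @ tangent

# These components rebuild the same arrow by parallel (tip-to-tail) addition:
print(np.allclose(contra[0] * e1p + contra[1] * e2p, tangent))              # True

# The perpendicular-projection numbers (tangent . e_i) do not rebuild the arrow
# with these basis vectors, because the new basis vectors are not orthogonal:
print(np.allclose((tangent @ e1p) * e1p + (tangent @ e2p) * e2p, tangent))  # False
```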
@brianyeh2695 11 months ago
Thank you!!!!!!!
@puremaths9679 3 years ago
Nice
@brunoaugustorodriguesalves4080 2 years ago
Same here.
@thevegg3275 A year ago
OK, after months of thinking about this, I have an explanation of covariant vs contravariant that no one has ever uttered, as far as I know. Here goes. Hold my beer! --- You combine contravariant (components times their regular basis vectors) tip to tail to reach the tip of the vector you're defining. But with covariant, you combine (covariant components times their dual basis vectors) tip to tail to get to the tip of the vector you're defining. Why does no one explain it like this? But my question is: how do covariant components with dual basis vectors relate to the dot product? Please correct me if I'm wrong on the following... DOT PRODUCT: A (vector) dot B (vector) = a scalar quantity. CONTRAVARIANT: described by the combination of contravariant (components times regular basis vectors) added tip to tail for A (vector) dot B (vector). COVARIANT: described by the combination of covariant (components times dual basis vectors) added tip to tail for A prime (vector) dot B prime (vector). QUESTION: If we dot A prime (vector) with B prime (vector), does that scalar quantity equal A lower 1 prime times e upper 1 prime PLUS A lower 2 prime times e upper 2 prime? If so, aren't we then saying that a scalar is equal to a vector?
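A numerical sketch of the question, with an arbitrary skewed basis and two arbitrary vectors A and B (illustrative choices): the covariant components times the dual basis vectors rebuild the vector itself (a vector), while the dot product is the single number sum_i A_i B^i, so no scalar is ever being equated to a vector.

```python
import numpy as np

# Skewed basis and its dual basis (e^i defined by e^i . e_j = delta^i_j).
e1, e2 = np.array([1.0, 0.0]), np.array([1.0, 1.0])
E = np.column_stack([e1, e2])
Edual = np.linalg.inv(E).T             # columns are the dual basis vectors e^1, e^2

A = np.array([3.0, 2.0])
B = np.array([-1.0, 4.0])

A_contra = np.linalg.solve(E, A)       # A^i  (components on the regular basis)
A_cov = E.T @ A                        # A_i = A . e_i
B_contra = np.linalg.solve(E, B)

# 1) Covariant components times the DUAL basis vectors rebuild the vector (tip to tail):
print(np.allclose(A_cov[0] * Edual[:, 0] + A_cov[1] * Edual[:, 1], A))   # True

# 2) The dot product is the scalar sum_i A_i B^i, a number, not a vector:
print(np.isclose(A @ B, A_cov @ B_contra))                               # True
```

The expression A_1 e^1 + A_2 e^2 in the question is the vector A itself, not the scalar A dot B.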
@perdehurcu 29 days ago
Hello. Sir, I think what you are telling me is complete nonsense. There is so much uncertainty that you don't understand the issue. You've wasted your time with these videos. Now I'm asking, sir: what are covariant and contravariant? No one can understand anything when you say this. You have made an explanation full of contradictions. Good luck.
@rashahameed4937 3 years ago
Please translate to Arabic
@jameshopkins3541 A year ago
YOU CANNOT EXPLAIN IT!!!
@shreyaskumarm3534 2 years ago
Thank you very much, sir