
What is a Tensor 5: Tensor Products

55,809 views

XylyXylyX

What is a Tensor 5: Tensor Products
Errata: At 22:00 I write down "T_00 e^0 @ e^1" and the correct expression is "T_00 e^0 @ e^0"

Comments: 171
@haashirashraf7289 6 years ago
I studied Physics in University for four years and I thought I understood what a tensor product was, but I didn’t until seeing this. Anybody still struggling, just make sure you’re clear on the concept of the dual space and covectors/linear functionals.
@integrosko 4 years ago
This is literally the most understandable explanation of a tensor. I'm really grateful for these lectures.
@XylyXylyX 4 years ago
Thank you for your kind comment!
@jordantistetube 7 years ago
3:35 "We have delivery of four ordered vector pairs"
@dimitrisnatsios8409 4 years ago
We have the same doorbell and I was watching the video at 4 am. You are awesome. Thank you very much for all the knowledge you share. Greetings from Greece.
@XylyXylyX 4 years ago
Thank you for your kind comment
@manodura8132 2 years ago
Wow. Videos 1 through 5 are the BEST introduction to tensors I have seen on all of YouTube. Many thanks, sir!!
@XylyXylyX 2 years ago
Thank you for your kind comment. Good luck with your studies!
@manodura8132 2 years ago
@XylyXylyX Haha, I am learning this out of personal curiosity! Let's see how I progress from here to your lectures on General Relativity 🙂
@PunmasterSTP 2 years ago
The process of building up this machinery just strikes me as very "meta", but in any case, it's been fascinating to watch!
@yamansanghavi 6 years ago
Thank you so much, sir. I have no words to describe how good your lectures are.
@DrUBashir 6 years ago
Very unconventional and direct teaching style. Very refreshing!
@1PKFilms 7 years ago
Man, thank you so much! This series is awesome. I am a middle school student writing a paper on Clifford algebra (I am taking part in a competition), and while I am very good at grasping complex concepts, I am obviously missing some of the basics since I am still pretty early in my education (we just learned what an integral is in math). There was this one super awesome paper on the subject and I got along fine for the most part, but then the author wrote that the tensor product was definitely a requirement from then on and that the reader should revise it. That's when I looked it up on YouTube, and MAN, YOUR SERIES IS AWESOME. Not only did I totally get it right from the start, but some things I didn't understand before are now clear: while I knew what a vector space was (I read the Wikipedia article), I had only a rough idea of what maps were from what I gathered myself, and now all these papers are totally clear. Thanks!!!
@XylyXylyX 7 years ago
1PKFilms Thank you for your kind comments and good luck with your studies. I'm glad you are not relying entirely on YouTube lectures. The value of lessons like this is that they help make a good text less intimidating and more fun to read. It is always fun to read about material you feel you already know :) at least a little bit.
@loganthrashercollins 6 years ago
What competition? Was it associated with Broadcom Masters? I didn't do that one, but I did ISEF three times.
@EugeneKhutoryansky 7 years ago
You may want to add the correction for T_00 to the video description, in addition to having a YouTube annotation. YouTube annotations never appear for people watching on a cell phone, and they sometimes don't appear even for people watching on a PC.
@XylyXylyX 7 years ago
Losing annotations was a big hit for me. My style of presentation can be a touch extemporaneous and therefore a little error-prone, and I need the ability to make corrections. I was aware of the problem for mobile viewers. I may do an "errata" video to capture all the little errors that I discovered. Thanks!
@massimoacerbis8138 6 years ago
Let me add that it was quite obvious it was an erratum. Whoever does not get it might better start the lecture from the beginning.
@revooshnoj4078 5 years ago
I would just like to share how I think of tensors and tensor products for anyone else interested. Essentially, tensors are multilinear maps acting on multiple vectors; this is a very natural extension of linear maps in linear algebra. Suppose we have a tensor T acting on 3 vectors: T(A_i e_i, B_j e_j, C_k e_k). By multilinearity this becomes A_i B_j C_k T(e_i, e_j, e_k). Notice that T(e_i, e_j, e_k) does not depend on which vectors you choose but only on the basis you chose, so you can define this as T_ijk. Now notice that A_i B_j C_k is a real number, depends on your choice of vectors, and is multilinear, and hence can be described by a tensor. But which tensor exactly? Just take the ith component of the first vector (using the e_i tensor, which acts on only one vector), the jth of the second (the e_j tensor), and the kth of the third (the e_k tensor), and multiply them together. This is where the tensor product comes from: it lets us describe/create new tensors from the basic tensors we have (the basis covectors and basis vectors). Essentially we defined how the tensor product acts on these basis elements, and if we require the normal properties such as linearity to hold for the tensor product as well, this uniquely defines how it acts in the more general cases, essentially turning it into an algebra. This is how I think about the wedge product as well.
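The construction in this comment can be checked numerically. Below is a minimal sketch (my own illustration, not from the video), with made-up components on R^2:

```python
import numpy as np

# A (0,3) tensor on R^2 stored by its components T_ijk = T(e_i, e_j, e_k),
# evaluated by multilinearity, as described in the comment above.
rng = np.random.default_rng(0)
T = rng.standard_normal((2, 2, 2))     # components on the chosen basis
A, B, C = rng.standard_normal((3, 2))  # three arbitrary vectors

# Multilinearity: T(A, B, C) = A_i B_j C_k T_ijk (summation over i, j, k)
value = np.einsum('i,j,k,ijk->', A, B, C, T)

# Linearity in the first slot: T(2A + B, B, C) = 2 T(A, B, C) + T(B, B, C)
lhs = np.einsum('i,j,k,ijk->', 2*A + B, B, C, T)
rhs = 2*value + np.einsum('i,j,k,ijk->', B, B, C, T)
assert np.isclose(lhs, rhs)
```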
@ikechukwumichael1383 1 year ago
I'm really grateful for these lectures.
@mms-hl1sh 7 years ago
9:50 "What was that?" "I don't know, something in perfectly linear fashion?"
@TheDetonadoBR 4 years ago
fecking charged particles going through touch pad in a linear fashion
@chins85 3 years ago
Magic
@hedwigvdmoortel 7 years ago
Brilliant explanation / video series, because of the simplicity. Thanks.
@debendragurung3033 6 years ago
I loved this part so far. Thanks for the effort you put in to explain.
@akashpremrajan9285 15 days ago
I think there is a very tiny small detail missing here, that got me confused. The collection of all Tensor Products of any two co-vectors does NOT give you a Vector Space by itself. You need all possible linear combinations of those Tensor Products, for that to be the case. This is because the sum of two Tensor Products need not be factorable into a Tensor Product itself. It is a minor nitpick, but I got very confused, since the number of independent choices available was not matching between the 2 collections.
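The point in this comment about sums not factoring can be seen concretely: in components, a single tensor product β @ γ is the rank-1 matrix β_m γ_n, while a general element of the tensor product space can have higher matrix rank. A small sketch (my own, using numpy; the specific numbers are arbitrary):

```python
import numpy as np

# (0,2) tensors on R^2, stored as 2x2 component matrices T_mn = beta_m * gamma_n.
beta, gamma = np.array([1.0, 2.0]), np.array([3.0, -1.0])
pure = np.outer(beta, gamma)              # a single tensor product beta @ gamma
assert np.linalg.matrix_rank(pure) == 1   # any nonzero outer product has rank 1

# The sum e^0 @ e^0 + e^1 @ e^1 has the identity as its component matrix,
# which has rank 2, so it cannot be written as a single outer product.
# You need linear combinations of tensor products to fill out the whole space.
mixed = np.outer([1.0, 0.0], [1.0, 0.0]) + np.outer([0.0, 1.0], [0.0, 1.0])
assert np.linalg.matrix_rank(mixed) == 2
```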
@chymoney1 5 years ago
I don't know why I love thinking about coordinate systems and mappings. It's so basic but cool.
@jewbaby9143 3 years ago
Hello again! Love this series. I keep coming back to it and getting more from it every time. I am happy to say that I finally see the light and have a basic understanding of what these things are and how they work :) I do have a question though: You mentioned that the only way for tensor products to remain linear is if they reduce to a scalar product of each covector-vector pair at the end of the day (when we feed them a full list of arguments). Why is that? How do we show that that's the only way it will work? Thanks :)
@jeffspahn8923 7 years ago
Thank you for your wonderful videos. I find them immensely helpful. I am a little confused however. What is the difference between a "Tensor" and a "Tensor Product" ? It feels like you used the terms interchangeably.
@XylyXylyX 7 years ago
Jeff Spahn A "tensor" is an element of a "tensor product space". A tensor product space contains elements which are "tensor products" of vectors and covectors. So a "tensor product" is a multilinear map built as I describe in this lecture, using the tensor product operator, and a tensor is the tensor product of some vectors and covectors. But keep in mind that a tensor of rank (1,0) or (0,1) is not really "multi"-linear, it is just "linear", yet we still call it a tensor; so V and V* are tensor product spaces, just small ones! And rank (0,0) tensors are just real numbers!
@robertwilsoniii2048 7 years ago
Dude, your videos, believe it or not, are absolutely world class. Your material mirrors (and ultimately is equivalent to) the second course in the freshman honors sequence at Stanford for math majors (Math 52h/Math 62cm) by Yakov Eliashberg. Check it out if you want. You might find some stuff in the free online lecture notes/textbook to make your video series even better, but it's already amazingly good.
@XylyXylyX 7 years ago
Robert Wilson III Thank you for the kind endorsement. I draw from many sources and I will be happy to check out the course you cite. I am currently struggling with a presentation of p-forms and p-vectors (around lecture 23 or so), so I know I have a LOT of room for improvement. Maybe I will find a good angle in those notes?
@robertwilsoniii2048 7 years ago
XylyXylyX It does. Big time. One of the main focuses of Math 52h is to cover the theory of differential forms, exterior algebra, and the generalized Stokes' theorem. Here's a link to the file: math.stanford.edu/~eliash/Public/177-2015/52htext-2015.pdf Specifically, check out the section on exterior products. There's an extremely clear and informative approach taken using skew-symmetric tensor products. The book uses "k-form" instead of "p-form" and has some other notational differences, but you should be able to translate back and forth using whatever notation you like because it's totally trivial.
@robertwilsoniii2048 7 years ago
Here's the explanation given in the book: "For our purposes skew-symmetric functions will be more important. Thus we will concentrate on studying operations on them. Skew-symmetric k-linear functions are also called exterior k-forms. Let φ be an exterior k-form and ψ an exterior l-form. We define an exterior (k + l)-form φ ∧ ψ, the exterior product of φ and ψ, as φ ∧ ψ := (1/k!l!)(φ ⊗ ψ)asym. In other words, φ ∧ ψ(X1,...,Xk,Xk+1,...,Xk+l) = (1/k!l!) Σ(i1
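For the simplest case k = l = 1, the definition quoted above reduces to (φ ∧ ψ)(X, Y) = φ(X)ψ(Y) − φ(Y)ψ(X), which can be checked numerically. A small sketch (my own, not from the book; the components are arbitrary):

```python
import numpy as np

# Two 1-forms on R^3, by their components.
phi, psi = np.array([1.0, 2.0, 0.0]), np.array([0.0, 1.0, 3.0])

# Components of the exterior product: (phi ^ psi)_ij = phi_i psi_j - psi_i phi_j
wedge = np.outer(phi, psi) - np.outer(psi, phi)

# Check the defining property on two arbitrary vectors X and Y.
X, Y = np.array([1.0, 0.0, 2.0]), np.array([2.0, 1.0, 1.0])
lhs = X @ wedge @ Y                                   # (phi ^ psi)(X, Y)
rhs = (phi @ X) * (psi @ Y) - (phi @ Y) * (psi @ X)   # phi(X)psi(Y) - phi(Y)psi(X)
assert np.isclose(lhs, rhs)
assert np.allclose(wedge, -wedge.T)   # skew-symmetry of the component matrix
```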
@johnstroughair2816 6 years ago
Nice explanation. You do have a typo at about 22:21 when you start writing out the components of T.
@Sinistersonny 8 years ago
Excellent job (India welcomes you) 😊
@XylyXylyX 8 years ago
amrit pratap Wow! You speak for all of India! An honor indeed!
@lonnybulldozer8426 2 years ago
I'm curious what textbook you are using for these?
@XylyXylyX 2 years ago
I don’t follow a specific book, but “Introduction to Vectors and Tensors” by Bowen and Yang (Dover) is a book that contains everything we cover in these lectures.
@davidhand9721 1 year ago
In vector calculus, suppose we take the derivative of a real scalar field. The value at any given point is expressed with the same indices as the field in which it lives, but it isn't a vector in that space itself, is it? The derivative is not equal to any sum of the original basis vectors; does this make the derivative a covector? It's sort of a map in that you can recreate the field value by summing the integrals of the derivative's scalar coefficients and get a scalar. Am I on the right track here toward tensor calculus?
@alanbilsland1356 5 years ago
Hi, me again. Sorry. I think I'm getting it after some playing around in MATLAB, at least from the perspective you advised me of earlier, of tensors as tools. Just to check that I am on the right track: at the moment I understand the tensor product as defined here to be essentially the Kronecker product (that's as far as my head can go at the moment). If I take this over all combinations of the standard basis in Cartesian coordinates and sum under vector addition, then in 4d I get either the 16x1 vector of ones or its matrix equivalent. The T_mn are then just the components. Is that basically it?
@XylyXylyX 5 years ago
Well... it is a little difficult to know exactly what you are doing, but generally speaking the Kronecker product of two transformations does represent the tensor product, and in the case you describe there would be 16 basis vectors. So it sounds correct to me...
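A sketch of the correspondence being discussed (my own illustration; the component values are arbitrary): np.kron of the two component arrays gives the 16 components T_mn = β_m γ_n in lexicographic order.

```python
import numpy as np

# Two covectors on a 4-dimensional space, by their components.
beta  = np.array([1.0, 2.0, 0.0, -1.0])
gamma = np.array([3.0, 1.0, 2.0,  0.5])

# np.kron of the component arrays gives the 16 components T_mn = beta_m gamma_n
# of beta @ gamma, flattened in lexicographic order of (m, n).
flat = np.kron(beta, gamma)   # shape (16,)
T = flat.reshape(4, 4)        # T[m, n] = beta[m] * gamma[n]
assert np.allclose(T, np.outer(beta, gamma))

# Evaluating the tensor on a pair of vectors (A, B) gives <beta, A><gamma, B>.
A = np.array([1.0, 0.0, 2.0, 1.0])
B = np.array([0.0, 1.0, 1.0, 3.0])
assert np.isclose(A @ T @ B, (beta @ A) * (gamma @ B))
```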
@stinkfoot1 7 years ago
Hi, thanks for the great videos. Enjoying them a lot. Regarding the definition of the tensor product as a@b(u,v) := a(u)*b(v): would a@b(u,v) := a(u)+b(v) also be a valid definition of a linear map from (u,v) -> R? This is similar to David Gillooly's and Wayne VanWeerthuizen's questions. Basically: how do we know that a@b(u,v) := a(u)*b(v) is the ONLY map that maintains linearity (as mentioned in the video)?
@XylyXylyX 7 years ago
stinkfoot1 It must be *multi*-linear. If you multiply either u or v by a constant "c" then the result should be a factor of c larger. Your plan does not satisfy that. The proof is by contradiction: assume there is another multilinear map and you will conclude that it is the same as the one we defined!
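The failure of the addition-based definition is easy to verify numerically. A small sketch (my own, with arbitrary components):

```python
import numpy as np

# Components of two covectors a, b and two vectors u, v on R^2.
a, b = np.array([1.0, 2.0]), np.array([3.0, 1.0])
u, v = np.array([1.0, 1.0]), np.array([2.0, 0.0])

product = lambda u, v: (a @ u) * (b @ v)   # the definition used in the video
added   = lambda u, v: (a @ u) + (b @ v)   # the proposed alternative

c, d = 2.0, 5.0
# The product is bilinear: scaling either argument scales the result.
assert np.isclose(product(c*u, d*v), c*d*product(u, v))
# The sum is not: scaling u by c does not scale the output by c.
assert not np.isclose(added(c*u, v), c*added(u, v))
```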
@stinkfoot1 7 years ago
XylyXylyX Ah... that was the bit I was missing. Thanks a lot.
@yachen6562 2 years ago
Amazing series! Thank you so much~
@klam77 4 years ago
Excellent lectures.
@johnlie8586 4 years ago
Hi sir, I really enjoy your videos. My question: a tensor is a map, like the dot product. Can we say that ordinary addition and multiplication on the real numbers are tensors? Many thanks.
@XylyXylyX 4 years ago
The real numbers are indeed a one-dimensional (real) vector space with standard addition as the vector addition property. A vector space does not automatically have a “product” however. BUT in the case of real numbers, the “scalars” are the *field* of real numbers which DOES have a multiplication property, so in this particular case (that of a one dimensional real vector space) the required “scalar multiplication” is also an inner product!
@johnlie8586 4 years ago
@XylyXylyX That is exactly what I mean. Anyway, your teaching style is superb. And thank you for replying to me. You are one of the superstars in the universe. Many thanks.
@johnlie8586 4 years ago
@XylyXylyX Can we say the dual space of the real numbers as a vector space is itself?
@mihaisirghe 1 year ago
Hello. First of all, great series, but I need to address something here. You say that a tensor is a map FROM a Cartesian product of vector spaces TO the reals, and then you use the notion of the tensor product, which is... something in the realm of the dual vector space. This definition is circular, and I had a lot of trouble understanding it. I would argue that this should be detailed a bit. I like the fact that you use the construction approach... this approach should have been used in explaining what the tensor product is.
@XylyXylyX 1 year ago
Thank you for your comment. I will probably redo this lecture eventually and this comment will help me make it better.
@mihaisirghe 1 year ago
@XylyXylyX If you decide to redo this segment, that would be great. I really do appreciate the effort you put into presenting such abstract concepts in a quite "intuitive" manner. I would also like to point out the pacing of your speech, which I find very well adapted.
@georgeorourke7156 8 years ago
Quick question: at 21:45 you had the equality β ⊗ γ = T_μν e^μ ⊗ e^ν. Could we not explicitly express T_μν as a product of the coefficients of β and γ, i.e. β = B^i e_i and γ = g^j e_j?
@bcthoburn 6 years ago
How does the definition of a tensor here relate to the universal property? (I've seen your next video.) This seems quite unconnected to an article I read called "How to Conquer Tensorphobia" that discusses it. In one, tensors are being mapped, and in the other they ARE maps. Does the universal property force you into this definition, and if so, how? Can Cartesian products also be maps? Any insight is welcomed!
@DavidGillooly 7 years ago
Around 6:32 min., how do I know "..this product is the only product that will maintain linearity..."?
@roccoduquennoy5184 6 years ago
First of all, thank you for your explanations of such a useful and important topic! Your videos are really helping me understand these notions. I also have a question for you: is it correct to say that, considering a tensor T = A@B where A and B are elements of V*, the real numbers T_00, T_01, ..., T_33 are the real products A_0*B_0, A_0*B_1, ..., A_3*B_3? Thank you very much :) (sorry if my English is not quite good, I'm writing to you from Italy)
@XylyXylyX 6 years ago
The components of A@B as you have defined it will be T_00 = A_0 B_0 ... T_ab = A_a B_b ... so yes, you have it correct.
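In components this is just an outer product, and the components agree with the tensor evaluated on pairs of basis vectors. A small sketch (my own; the values are arbitrary):

```python
import numpy as np

# Covectors A and B on a 4-dimensional space, by components; T = A @ B.
A = np.array([1.0, -2.0, 0.5, 3.0])
B = np.array([2.0,  1.0, 0.0, 4.0])
T = np.outer(A, B)   # T_ab = A_a * B_b

# The components are exactly the tensor evaluated on basis vectors:
# T_ab = [A @ B](e_a, e_b) = <A, e_a><B, e_b>.
E = np.eye(4)        # E[a] is the basis vector e_a
for i in range(4):
    for j in range(4):
        assert np.isclose(T[i, j], (A @ E[i]) * (B @ E[j]))
```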
@roccoduquennoy5184 6 years ago
thank you :D
@ThomasImpelluso 7 years ago
I am sorry to come way back here to ask this question; I do not know where else to put it. Much later in this wonderful series, we learn what a second-rank tensor is. We see it operates on two vectors. So I imagine the STRESS tensor. As I now understand it, it is a (0,2) tensor. When it operates on one vector (a (1,0) tensor) we get the TRACTION on a surface perpendicular to that vector (another (1,0) tensor). But we never continue and multiply the stress by a second vector to get a scalar. So does this mean that we do not have to contract all, say, second-rank tensors into scalars? And that this tensor math is useful, but WE decide how to use it?
@XylyXylyX 7 years ago
Thomas Impelluso As you are applying it, the stress tensor is a vector-valued function that takes a vector argument. That is, the answer IS a vector. You could take that vector and do a lot with it: insert a form and get some sort of energy, for example.
@michaelten-pow5467 3 years ago
Really well done. Thank you.
@ILoveeemeee 2 years ago
I'm a bit confused by the following: around 17:57 you've written arbitrary vectors as linear combinations from V and V*, and you previously said that you would use Cartesian product V x V vectors (v, w) where v is from V and w is from V, not V x V*. I understand the arbitrary vector written as A_i e_i to be from V, and B^j e^j to be from V*. What am I missing?
@XylyXylyX 2 years ago
It should be: "A^i e_i" is an arbitrary vector in V and B_j e^j is an arbitrary covector from V*.
@zolokur6702 7 years ago
Thank you so much, this is really helpful.
@abhishekgy38 4 years ago
At 6:57, why should the results (the real numbers) obtained after the co-vectors individually operate on the two vectors be multiplied? Why can't they be added? Or is it just how it is defined?
@XylyXylyX 4 years ago
It is how it is defined, but that is the ONLY way of defining the product so that it is bilinear. Try your addition definition and you will see B@A(aV, bW) ≠ ab B@A(V, W).
@jasonbroadway1608 4 years ago
Notwithstanding, I enjoyed the video!
@stefanopalmieri9201 8 years ago
Is there a mistake around 22:14 when we are expanding the Einstein summation and have T_00 e^0 X e^1? Should it be T_00 e^0 X e^0?
@XylyXylyX 8 years ago
Nice catch! Yes, I simply misspoke and mis-wrote. I have made an annotation. Thank you very much :)
@TheBigBangggggg 8 years ago
I'm confused. Take the example after 17:45: [e^1 @ e^2](A^i e_i, B^j e_j). Why don't you write it in this form right away? They are both maps, aren't they?
@XylyXylyX 8 years ago
If I understand your question, you may be confusing the brackets, parentheses, and angled brackets. The object in the brackets is a tensor, which is a map. The parentheses are there to identify the two vectors which will be fed to the map.
@grgr279 4 years ago
Hey Xyly, thanks for this video! I wish I had known about it in school! Kind of a silly question, but could we just think of a two-tensor space as being equivalent to a 16-dimensional vector space? I know for convenience it is better to call it a two-tensor, but technically they are the same, correct? Thanks
@grgr279 4 years ago
I'm sorry, I meant the two-tensor space of the dual space of dimension 4.
@XylyXylyX 4 years ago
Gregory Krulin Well, if you mean a (0,2) tensor space where the underlying vector space, V, is dimension 4, then the tensor product space V* @ V* is a 16-dimensional vector space. Now remember that all finite-dimensional vector spaces with the same dimension are isomorphic. So the answer to your question is "yes"!
@robertbrandywine 2 years ago
I think at least once you said a tensor product is a tensor. Isn't it more correct to say that when a tensor product is used as a map to the reals on Cartesian product pairs of a vector space, it is a tensor?
@XylyXylyX 2 years ago
Yes. The way you said it is more precise and correct.
@eastquack3342 3 years ago
At ~6:30 he introduces the notation for the tensor product; shouldn't a coherent notation involve ⟨β @ α, (v, w)⟩, where @ is the tensor product, since tensor products reside in V*, instead of the more involved [β @ α](v, w) notation?
@robertbrandywine 2 years ago
Didn't he say he put those square brackets around the tensor product just to show that it was one object?
@ElaaxV 7 years ago
At 19:12 you wrote the arbitrary vectors as A^u e_u, B^n e_n, but in previous videos (and this one as well) you made the distinction B_n e^n for an arbitrary vector of V*. Is this a mistake? And if it is, does this change the notation for the delta and the result A^1·B^2? Thank you.
@XylyXylyX 7 years ago
Alex Ksr It is not a mistake, but I did use the symbol B for two different things. Earlier I used it as a vector, and then when I needed to write down an arbitrary covector I just grabbed the letter B again. I should have grabbed the letter "C" instead, I guess!
@ElaaxV 7 years ago
Oh yes, sorry. I should pay more attention. Thanks for answering.
@sdmartens22 8 years ago
Love your videos, keep up the good work!
@XylyXylyX 8 years ago
Thank you Shannon, I am glad someone is watching these videos!
@DrIlyas-sq7pz 7 years ago
Thank you. Some of the concepts became clear to me. But I found one problem in this video (maybe I am wrong): we set out to define what the tensor product is, but you already took two covectors, tensored them, and called it a tensor product. This way it is not clear what exactly is meant by 'beta tensor alpha' or how it would look. We first need to know this tensor symbol; only then can we say that beta tensor alpha gives a real number when applied to a product of vectors. And what is meant by e^0 (tensor symbol) e^0? A tensor is a map from vectors to real numbers, but this map is itself a tensor of two vectors. Looks like circular reasoning, doesn't it?
@XylyXylyX 7 years ago
m ilyas e^0 @ e^0 is a tensor. A tensor is a map. In this case it is a map that takes any two vectors A^i e_i and B^j e_j to the real number given by ⟨e^0, A^i e_i⟩⟨e^0, B^j e_j⟩ = A^i B^j δ^0_i δ^0_j = A^0 B^0. When we name a vector "A" or a covector "α" you can always replace that name, if you wish, by A^i e_i or α_j e^j. That may help you understand better how to combine vectors and covectors into tensors.
@DrIlyas-sq7pz 7 years ago
Thank you, dear. To me, just saying that it is a map does not always make clear what the map is, or what the inner workings of the machine (map) we put our inputs into are. But things were made a little clearer in the next video with more examples. Thank you for all your comments on the other videos too. I am going to watch all of these videos.
@XylyXylyX 7 years ago
m ilyas Great! Remember: every map can be completely described if you know how the map acts on the basis vectors.
@Arlesterc 6 years ago
Howdy. At 21:59 you have alpha tensor product gamma equal to T with two indices (which you say is a set of real numbers) times the tensor products of the basis vectors. So what is the 'tensor'? The alpha-gamma tensor product? The T? The tensor product of the basis vectors? Or is it that the alpha-gamma tensor product is a tensor, and all the stuff on the right, considered as a single entity, is also a tensor because it equals the alpha-gamma tensor product? So two different ways of expressing a tensor? If these are two ways, is there a name for each of them? Your attention to this is appreciated in advance.
@XylyXylyX 6 years ago
Think of it this way: each basis vector of a tensor product space is a tensor. The product of a tensor with a real number is a tensor. The sum of any two tensors is a tensor. Therefore T_ab e^a @ e^b is a tensor.
@Arlesterc 6 years ago
Thanks. When you say the basis vectors of a tensor product space are tensors, do you mean the tensor product of the basis vectors is a tensor, or that each basis vector is a tensor all by itself? Also, what is alpha tensor product gamma on the left-hand side: a tensor? So if I see any tensor product like alpha tensor product gamma, I can just call it a tensor, and whatever its worked-out right-hand equivalent is, that totality is also a tensor? And the indices of the T represent a bunch of real numbers that are the coefficients of the tensor? And after all the dust settles, the right-hand side ends up being a single number?
@XylyXylyX 6 years ago
Each basis vector of the tensor product space, e^i @ e^j, is a tensor all by itself. It is an object that takes two vectors (in this example) and produces a real number. β is a *covector* and γ is a *covector*, so their tensor product is a (0,2)-rank tensor. If you substitute β = B_i e^i, and likewise for γ, you may be able to see this better. Your last three sentences are correct except for the very last statement. After all the dust settles you have a *map*, not a real number. That map takes two vectors and produces a real number. The real number doesn't appear until you feed the (0,2)-rank tensor two vectors!
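The last point, that a number appears only after feeding the map two vectors, can be sketched as follows (my own illustration, with arbitrary components T_ab):

```python
import numpy as np

# A generic (0,2) tensor on R^2: a linear combination of the four basis
# tensors e^a @ e^b, with coefficients T_ab.
T = np.array([[1.0, 2.0],
              [0.0, -1.0]])   # T_ab

def apply(T, v, w):
    """Feed the map two vectors; only then does a real number appear."""
    return np.einsum('ab,a,b->', T, v, w)   # T_ab v^a w^b

v, w = np.array([1.0, 2.0]), np.array([3.0, 1.0])
result = apply(T, v, w)   # a real number: 3 + 2 + 0 - 2 = 3.0
assert np.isclose(result, sum(T[a, b] * v[a] * w[b]
                              for a in range(2) for b in range(2)))
```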
@blacksunprison13 8 years ago
How can all tensors be vectors if vectors are all tensors of rank 1? Does that mean rank-2 tensors, as well as scalars, are vectors, since they are also tensors? Could you clarify slightly? Great videos, by the way.
@XylyXylyX 8 years ago
The answer lies in the fact that the word "vector" is overloaded. The most fundamental meaning of "vector" is "an element of a vector space." Every tensor, regardless of rank, is a member of a tensor product space and every tensor product space is a vector space, hence all tensors are vectors in the literal sense. On the other hand, the word "vector" is often used to mean "tensor of rank 1". In that usage not all tensors are vectors. I like to point this out as sort of an exercise to force a solid understanding of literal vector spaces.
@blacksunprison13 8 years ago
Thanks!
@evanjoshychittilappilly7959 6 years ago
What is the difference between e super 0 (tensor product) e super 1 and e super 1 (tensor product) e super 0? You have specified them...16:45
@XylyXylyX 6 years ago
They resolve into two different numbers: e^1 @ e^0 (A, B) = ⟨e^1, A⟩⟨e^0, B⟩ and e^0 @ e^1 (A, B) = ⟨e^0, A⟩⟨e^1, B⟩.
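A quick numerical check (my own, with arbitrary vectors A and B) that the two basis tensors are different maps:

```python
import numpy as np

e = np.eye(2)   # e[0], e[1]: the dual basis covectors, by components
A = np.array([2.0, 3.0])
B = np.array([5.0, 7.0])

# e^1 @ e^0 picks out A^1 * B^0, while e^0 @ e^1 picks out A^0 * B^1.
t10 = (e[1] @ A) * (e[0] @ B)   # 3 * 5 = 15
t01 = (e[0] @ A) * (e[1] @ B)   # 2 * 7 = 14
assert t10 != t01               # the two basis tensors are different maps
```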
@bhavyasingh355 4 years ago
Can you explain interferometry?
@waynemv 7 years ago
At 5:58 when you say "the ONLY real number in sight..." and at 6:27 when you say "the ONLY way to preserve...", do you mean excluding symmetries? I am wondering if v and w could have been swapped (as in: ⟨β, w⟩⟨α, v⟩) and it would still meet all the criteria you set out.
@XylyXylyX 7 years ago
Wayne VanWeerthuizen No, it can not be swapped. The mapping is defined for a *ordered* pair, so you can not arbitrarily replace b @ a (v,w) with b @ a (w , v) and the product you wrote down in your question is in general a *different* real number. It is important to understand this because there is not supposed to be any choice in the matter. All of this is forced upon us.
@waynemv 7 years ago
But doesn't the swapped expression also "keep things linear"? Doesn't it also "only use stuff in the dual space"? Besides those criteria that you mentioned around this point in the video, did I miss any further criteria required of the desired expression? If both expressions equally well satisfy all the specified criteria, how can one of them be forced here rather than the other? What exactly determines which of the two expressions is the one that's forced? Or is going with the first expression instead of the second merely a matter of convention? Is there anything else important here I've missed? At this point I'm unable to tell whether I've misunderstood your video or your answer, or whether your answer misunderstood my question. Anyway, I'll watch the video again and see if I can catch anything I missed the first time that would clarify this point; there are other details I need to review again anyway. And I do really appreciate you making these videos and putting them online. I found a few different series about tensors on YouTube, and after giving them a quick look over, yours stuck out as the most promising for how I learn.
@XylyXylyX 7 years ago
Wayne VanWeerthuizen The tensor b @ a takes an element of V x V and returns a real number. An element of V x V is an *ordered pair* (v, w). The mapping you propose is simply a *different* tensor, a @ b, than the one I propose. Alternatively, it is the same tensor but with the opposite argument (w, v). This is very strict. The tensor you propose is indeed linear, as you say. The tensor mapping is defined to be b @ a (v, w) = ⟨b, v⟩⟨a, w⟩, with the first covector of the tensor paired with the first member of the ordered pair. What you are missing is how the mapping is defined.
@XylyXylyX 7 years ago
Wayne VanWeerthuizen And thank you for the compliment. I do suggest that you use the lessons as a way to help you with a real text about differential geometry. My intent is to take the intimidating edge off of learning this material. I don't think these lessons can fully replace a text, but they can really help to get started.
@waynemv 7 years ago
Okay, although I'm still not fully satisfied with the answer (how can the expression I proposed be different, given that the one you presented is supposed to be the only one possible?), at this point I assume my disagreement with that point in the video is just a matter of semantics, so I don't wish to take up any more of your time with it. But I am also wondering if I have all the prerequisites for this sort of course. I've taken a trimester of introductory linear algebra, a year of calculus, a trimester of introductory differential equations at a community college, and a semester of introductory vector analysis at university. And I've taken a first-year college course in physics with calculus. Are there any other important mathematics prerequisites I may have missed?
@Achilleeeez 7 years ago
Got a bunch of questions. First off, are beta and alpha tensors? Or do they become tensors only through the tensor product? You said in an earlier video that every tensor is a vector. But is it not the other way around, that every vector is a tensor of rank 1?
@XylyXylyX 7 years ago
Randomian Every vector is a rank (1,0) tensor. Every covector is a rank (0,1) tensor.
@Achilleeeez 7 years ago
But the definition of a tensor used beta and alpha, and they are themselves tensors? That seems circular. As always, thank you for the series and thank you for taking the time to answer my questions.
@XylyXylyX 7 years ago
Randomian (1,0), (0,1), and (0,0) tensors are defined to be vectors, covectors, and constants. All higher order tensors are defined via the inner product.
@Achilleeeez 7 years ago
Thanks for the answer!
@XylyXylyX 7 years ago
Oops..........I meant the "tensor product"
@aleksanderaksenov1363 5 years ago
Can you suggest good literature on tensor analysis?
@ThomasImpelluso 7 years ago
May I ask you to suggest a textbook MOST CLOSELY aligned with YOUR style of presenting this material?
@XylyXylyX 7 years ago
Thomas Impelluso Well, nothing is exactly on point, but I have been recommending "Introduction to Vectors and Tensors" by Bowen and Wang (Dover). For the upcoming material on p-forms I am working with "Gravitation" by Misner, Thorne, and Wheeler as well as a few others that I will mention in the lesson itself. Good luck!
@robertwilsoniii2048 7 years ago
You know, this is totally coincidental, but the lecture notes written by Yakov Eliashberg for Stanford Math 52h are almost identical to both the content and order of the material presented in this video series. Check it out: math.stanford.edu/~eliash/Public/177-2015/52htext-2015.pdf I'd say that these video lectures + these lecture notes = you having a better education than 99% of all math majors on multilinear algebra, tensors and exterior algebras, and by extension, differential forms, integrals and differential geometry. This is no understatement. This is also another decent text to look at too www.math.harvard.edu/~shlomo/docs/Advanced_Calculus.pdf by Loomis and Sternberg And of course: www.scribd.com/doc/88420665/34158129-Calculus-on-Manifolds-Spivak-M-PDF# by Michael Spivak And: www.scribd.com/book/271592208/Tensor-Analysis-on-Manifolds by Richard L. Bishop and Sammuel L. Goldberg And the awesome classic: www.scribd.com/book/271567923/Differential-Geometry by Erwin Kreyszig
@ThomasImpelluso 7 years ago
Oh I am aware of all of those. But this guy, whoever he is (and I wish he would just email me and say "hello") is SO MUCH more joyful. And that motivates me. Even with the errors. I WANT to listen to him.
@robertwilsoniii2048 7 years ago
Thomas Impelluso Well, that set of lecture notes by Eliashberg, if I was forced to pick one, would be hands down the perfect text for these lectures. But yes, his lectures are extremely good. Perfect for a second course in rigorous linear algebra and multivariable calc.
@XylyXylyX 7 years ago
Robert Wilson III Those are all excellent suggestions, I think. For a lean into general relativity I also suggest "Gravitation" by Misner, Thorne, and Wheeler, but only after one complete pass through the topic.
@jasonbroadway8027 4 years ago
No correspondence? 19:15?
@nickthepostpunk5766 4 years ago
XylyXylyX: please can I ask what whiteboard software you are using? It seems simple but very effective :-)
@XylyXylyX 4 years ago
Indeed! It is called "Vittle" and it is an app on iPad. apps.apple.com/us/app/vittle-pro-video-whiteboard/id629037418
@nickthepostpunk5766 4 years ago
@@XylyXylyX Thank you: I'll take a look! :-)
@codyfan7161 5 years ago
So a rank 2 tensor product is a bilinear form?
@XylyXylyX 5 years ago
As we are doing it, any inner product can be expressed as a (0,2)-rank tensor. A (0,2)-rank tensor is a bilinear form.
@NilodeRoock 3 years ago
04:35 "Since they are members of the dual space they are maps that take vectors to the real numbers." (...)
@kamilkonieczny3613 4 years ago
So we can also create the tensor product space V @ V by defining its basis to be e_i @ e_j. I assume that in physics it's more useful to have tensors from V* @ V*?
@XylyXylyX 4 years ago
Hmm....well....the mathematics is certainly symmetric. I think the way to say it is this: If the objects in your physical model are associated with vectors, then we use covectors to measure them. If the objects in your model are covectors, we use vectors to measure them. However, with a metric the vectorial objects can be converted to covector objects, so the mathematical symmetry persists into physical models. So no, I don't agree with your assumption as you stated it. We just have a little conventional favoritism for models with vector objects, and so we lean on the *covector*-based form of the metric (that is, the "covariant" metric tensor g_a_b). In any model some objects will necessarily be dual to each other, and converting them from vector-type to covector-type (lowering all up-indices and raising all down-indices) does not change the model at all. Does that help? For example, we can write the Einstein equation as G^a^b = 8\pi T^a^b just as easily as its more common form G_a_b = 8\pi T_a_b.
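[Editor's note] A small sketch of the index-lowering point in this reply (my own illustration; the (2,0)-tensor components are made up, and the Minkowski metric diag(-1,1,1,1) is assumed):

```python
# Lowering both indices of a (2,0) tensor with a metric g:
# T_ab = sum over c, d of g[a][c] * g[b][d] * T^cd
g = [[-1, 0, 0, 0],
     [0, 1, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1]]  # assumed Minkowski metric diag(-1, 1, 1, 1)

def lower_both(T, g):
    n = len(g)
    return [[sum(g[a][c] * g[b][d] * T[c][d]
                 for c in range(n) for d in range(n))
             for b in range(n)]
            for a in range(n)]

T_up = [[1, 2, 0, 0],
        [3, 4, 0, 0],
        [0, 0, 5, 0],
        [0, 0, 0, 6]]  # made-up T^ab components

T_down = lower_both(T_up, g)
# With a diagonal metric, mixed time-space components flip sign
# while purely spatial (and the 00) components are unchanged:
print(T_down[0][0], T_down[0][1], T_down[1][0], T_down[1][1])  # 1 -2 -3 4
```

The model is unchanged by this conversion: T_down carries exactly the same information as T_up, just expressed with lowered indices.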
@kamilkonieczny3613 4 years ago
@@XylyXylyX this helps, thank you
@shihabistiakesabbir3690 4 years ago
It would be really helpful if you named the books that you are following.
@XylyXylyX 4 years ago
I am not following any particular book! However, I learned much of this material from "Introduction to Vectors and Tensors" by Bowen and Wang (Dover).
@bhavyasingh355 4 years ago
At 23:00, how can the tensor indices include T32 when the maximum number of components is 16? Can you please explain?
@XylyXylyX 4 years ago
I think you are confusing "T_3_2" with "T_32".... the first has two indices, each going from 0 to 3. The second has a single index that I guess you think goes up to 32!
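[Editor's note] A trivial sketch of the counting behind this reply (my own illustration): two indices, each running 0..3, give 4 × 4 = 16 components, and T_3_2 names one of them.

```python
# Two indices a, b each run over 0..3, so a rank-2 tensor in
# 4 dimensions has 4 * 4 = 16 components T_a_b; T_3_2 is one of them.
index_pairs = [(a, b) for a in range(4) for b in range(4)]
print(len(index_pairs))       # 16
print((3, 2) in index_pairs)  # True
```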
@bhavyasingh355 4 years ago
Oh, now I understand.
@jasonbroadway1608 4 years ago
Everything you said makes sense, except for 'no correspondence between V and V*' at around 19:15... I don't agree there... otherwise I am good...
@jacquessmeets4427 4 years ago
There is no relation/correspondence at all between one (individual) vector in V and another (individual) vector in V*. The only thing we know is that each vector can map a covector to R and each covector can map a vector to R. There is no information at all (yet) about WHAT vector, acting on WHAT covector, results in WHAT number.
@jasonbroadway8027 4 years ago
@@jacquessmeets4427 OK... an isomorphism can be constructed, but one will not exist willy-nilly...
@no-one-in-particular 9 months ago
11:04 Don't you mean the set SPANNED by tensor products of 2 covectors?
@XylyXylyX 9 months ago
No, I meant it in the sense “tensor product of ANY two co-vectors”…. The “any” was missing!
@no-one-in-particular 9 months ago
@@XylyXylyX Even if you meant "any" it is still untrue. In general the sum of two tensor products is not a tensor product
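[Editor's note] A concrete check of this commenter's point (my own sketch, in 2-D): the component matrix of any single tensor product a ⊗ b has zero determinant, but a sum of two such products need not, so the sum is generally not itself a tensor product.

```python
def outer(a, b):
    # component matrix of the simple tensor a (x) b: M[i][j] = a_i * b_j
    return [[ai * bj for bj in b] for ai in a]

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

ab = outer([1, 0], [1, 0])  # components of e^0 (x) e^0
cd = outer([0, 1], [0, 1])  # components of e^1 (x) e^1
s = [[ab[i][j] + cd[i][j] for j in range(2)] for i in range(2)]

# every simple tensor has a rank-1 (determinant-0) component matrix;
# the sum here has determinant 1, so it cannot be any single a (x) b
print(det2(ab), det2(cd), det2(s))  # 0 0 1
```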
@XylyXylyX 9 months ago
@@no-one-in-particular Yes. Understood. I really don't think that section would be interpreted otherwise by anyone who was following the series, however. It is not my intent to be defensive, but I'm not too worried about this particular spot in this lecture. I do think I should redo the whole series a bit though... my style and technique have improved some over the years and this is a great topic.
@TheBigBangggggg 8 years ago
Confused again after 18:48: why is mu = 1 and nu = 2? Why can't mu and nu have the value 0 or 3, e.g.?
@XylyXylyX 8 years ago
I have simply chosen 1 and 2 as an example of a specific case for the mapping. The key is that the delta function forces mu to be 1 and nu to be 2 in this example because <e^1, e_\mu> = \delta^1_\mu, which is zero unless \mu = 1, say.
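[Editor's note] In components (my own sketch, using the standard bases in 4-D): pairing e^1 with e_mu gives the Kronecker delta, so the basis tensor e^1 ⊗ e^2 is nonzero only on the argument pair (e_1, e_2).

```python
def basis_covector(i, n=4):
    return [1 if k == i else 0 for k in range(n)]

basis_vector = basis_covector  # same components in the standard bases

def pair(cov, vec):
    # <e^i, e_mu> = delta^i_mu
    return sum(c * v for c, v in zip(cov, vec))

# (e^1 (x) e^2)(e_mu, e_nu) = delta^1_mu * delta^2_nu
nonzero = [(mu, nu)
           for mu in range(4) for nu in range(4)
           if pair(basis_covector(1), basis_vector(mu))
              * pair(basis_covector(2), basis_vector(nu)) != 0]
print(nonzero)  # [(1, 2)]
```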
@cnp4655 7 years ago
Nice video. But the emphasis on "All tensors are vectors" - I'd say, not quite.
@XylyXylyX 7 years ago
Chris Nwachioma Really? Why not? This idea is very important for my presentation.
@cnp4655 7 years ago
I'd use an analogy: all women are humans, but not all humans are women.
@XylyXylyX 7 years ago
Chris Nwachioma Help me understand your point. What tensor exists that is not a "vector" in the way I intend: a member of a vector space? There are plenty of vectors that are not tensors, for sure, but my statement is "all tensors are vectors."
@cnp4655 7 years ago
I understand that you called them vectors because they live in a vector space. But strictly speaking, because a vector has some properties which tensors of higher or lower rank do not, it will be confusing to say "all tensors are vectors". For instance, saying that a pseudotensor is a vector would lead one to misjudge its properties.
@XylyXylyX 7 years ago
A pseudotensor added to a pseudotensor is a pseudotensor. A pseudotensor multiplied by a scalar is a pseudotensor. There is a zero pseudotensor and every pseudotensor has its opposite. That is, pseudotensors are vectors! The objection you have only exists if you bring *extra* baggage to the meaning of the word "vector", and my pedagogical angle is to tear away all of the extra baggage we bring to the word "vector" and return it to its roots.
@jonalderson5571 1 year ago
Your voice sounds like Henry Creel
@18amarage 7 years ago
3:45. He eats ice cream.
@plmwd 6 years ago
I started watching these for curiosity, and I am so lost
@XylyXylyX 6 years ago
Paul Wood Sorry! These are not lectures for the curious! These videos are meant to supplement regular coursework and are meant as a *second* swing through the subject. I can't imagine learning this material for the first time using these lectures.
@codyfan7161 5 years ago
Come back when you have gone through an abstract linear algebra course! Also, there are very nice readings online about Einstein notation and all the Levi-Civita symbols, Kronecker deltas, and such.
@putin_navsegda6487 1 year ago
I don't like your explanation. Usually, YouTube tutors explain better than my local math professors, but here it's clearer for me to follow my prof. The video is too long and lacks clarity.
@nesaralititumir6153 6 years ago
I don't get some of the concepts here. At first you define the tensor product of alpha and beta by showing us how it works on (u,w). Then you define how the combination of two tensor products works on (u,w). What I don't get is: what is the proof that if you define the addition of two tensor products in that way, the combination is actually a member of V*@V*? We only know that V*@V* is just the set of (r@s). It is not clear how we can also include (r@s)+(p@q) in the set V*@V* just by defining how (r@s)+(p@q) works on (u,w).
@XylyXylyX 6 years ago
Because the sum is still a bilinear map that takes two vectors and returns a real number. V*@V* is the set of ALL such maps. Ergo the sum is in that set.
@nesaralititumir6153 6 years ago
Well, you did not define (V*@V*) as the set of all maps that take (u,w) to real numbers. You specifically defined (V*@V*) as the set of all tensor products (r@s) such that r and s are members of V*. You "defined" how (r@s) works on (u,w) and how linear combinations like (r@s)+(p@q) work on (u,w). But you did NOT define (V*@V*) as the set of all maps that take (u,w) to real numbers. You defined it to be the set of previously defined (r@s).
@XylyXylyX 6 years ago
Nesar Ali Titumir OK, good point. It is important to think of tensor product spaces as sets of multilinear mappings, so I am glad you took the time to get that straight.
@nesaralititumir6153 6 years ago
XylyXylyX Okay, so you are saying that instead of defining (V*@V*) as the set of (r@s), we should define it as the set of all bilinear mappings that take (u,w) to real numbers. Am I right?
@XylyXylyX 6 years ago
Nesar Ali Titumir You are correct, but it is both, with the caveat that it is the set of all possible linear combinations of objects like r@s. Each possible r@s is a mapping (u,w)->R, and all linear combinations of them are also mappings (u,w)->R.
@lowersaxon 5 years ago
My god, this is really boring! Poor Albert.
@XylyXylyX 5 years ago
The exact moment when students come to understand the underlying principles of tensors is also the exact moment when the boring simplicity of all this becomes apparent. It is similar to playing music: after struggling to learn a beautiful piece of music the moment you succeed is the same moment when the fascination with the piece fades. In a nutshell: “this is simpler and more boring than I thought” = “I understand, finally”
@jacobvandijk6525 5 years ago
@@XylyXylyX That's your positive interpretation ;-) Or is it justification?