
What is a tensor anyway?? (from a mathematician)

173,639 views

Michael Penn

A day ago

Suggest a problem: forms.gle/ea7P...
Please Subscribe: www.youtube.co...
Patreon: / michaelpennmath
Merch: teespring.com/...
Personal Website: www.michael-pen...
Randolph College Math: www.randolphcol...
Randolph College Math and Science on Facebook: / randolph.science
Research Gate profile: www.researchga...
Google Scholar profile: scholar.google...
If you are going to use an ad-blocker, consider using Brave and tipping me BAT!
brave.com/sdp793
Buy textbooks here and help me out: amzn.to/31Bj9ye
Buy an amazon gift card and help me out: amzn.to/2PComAf
Books I like:
Sacred Mathematics: Japanese Temple Geometry: amzn.to/2ZIadH9
Electricity and Magnetism for Mathematicians: amzn.to/2H8ePzL
Abstract Algebra:
Judson(online): abstract.ups.edu/
Judson(print): amzn.to/2Xg92wD
Dummit and Foote: amzn.to/2zYOrok
Gallian: amzn.to/2zg4YEo
Artin: amzn.to/2LQ8l7C
Differential Forms:
Bachman: amzn.to/2z9wljH
Number Theory:
Crisman(online): math.gordon.edu...
Strayer: amzn.to/3bXwLah
Andrews: amzn.to/2zWlOZ0
Analysis:
Abbott: amzn.to/3cwYtuF
How to think about Analysis: amzn.to/2AIhwVm
Calculus:
OpenStax(online): openstax.org/s...
OpenStax Vol 1: amzn.to/2zlreN8
OpenStax Vol 2: amzn.to/2TtwoxH
OpenStax Vol 3: amzn.to/3bPJ3Bn
My Filming Equipment:
Camera: amzn.to/3kx2JzE
Lens: amzn.to/2PFxPXA
Audio Recorder: amzn.to/2XLzkaZ
Microphones: amzn.to/3fJED0T
Lights: amzn.to/2XHxRT0
White Chalk: amzn.to/3ipu3Oh
Color Chalk: amzn.to/2XL6eIJ

Comments: 640
@qm_turtle 2 years ago
I always loved the definition for tensors in many physics textbooks: "a tensor is a mathematical object that transforms like a tensor".
@Erik_Caballero 2 years ago
Wait! You mean to say that the floor here is made of floor?!
@qm_turtle 2 years ago
@@Erik_Caballero precisely that
@theadamabrams 2 years ago
I mean, a "vector" in mathematics is "an element of a vector space", so that tensor definition seems perfect to me.
@iheartalgebra 2 years ago
@@theadamabrams And let's not forget that a "vector space" in turn is simply a "space consisting of vectors" xD
@jankriz9199 2 years ago
whenever my friend heard this definition, he would always say aloud "ok boomer" :D
@AndrewDotsonvideos 2 years ago
I'm gonna have to come back to this one for sure
@vinvic1578 2 years ago
Hey Andrew so cool to see you here! You should cover this in a drink and derive video ;)
@cardboardhero9950 2 years ago
ayyy smart people
@axelperezmachado3500 2 years ago
Wait. Wasn't a tensor just something that transforms like a tensor? We have been lied to!
@captainsnake8515 2 years ago
@@axelperezmachado3500 physicists have subtly different definitions for vectors and tensors than mathematicians do.
@rickdoesmath3945 2 years ago
By the way, I loved the video on motivation
@hensoramhunsiyar3431 2 years ago
Great exposition. A video on how this concept is related to the concept of tensors in physics and other fields would be great.
@k-theory8604 2 years ago
Hey if spamming videos in the comments isn't allowed, I'm happy to not do it in the future, but I just made a short video about this on my channel.
@poproporpo 2 years ago
@@k-theory8604 if it is related to the topic at hand, it’s not really shameless advertisement and we appreciate the reference. In this case it’s fine.
@athens9k 2 years ago
Tensors as defined in this video are objects defined over a vector space. Tensors used in physics are really tensor fields on a manifold M. For every point p on a manifold, we have a natural vector space given by the tangent space TpM. On each tangent space, we can define some sort of tensor. A physicist's tensor is a smoothly varying field of tensors.

A nice fact about manifolds is that they can be locally parameterized by Euclidean space. These parameterizations are usually called charts. We can use charts to parameterize all tangent spaces within a neighborhood of any given point. This allows us to write down the components of a smoothly varying field of tensors using the coordinates given by the chart.

Then comes the "tensor transformation law." If some point p sits in the intersection of two different charts, we have two sets of local coordinates coming from each chart. The tensor transformation law simply says that changing from one set of coordinates A to the other set of coordinates B should change the components of the tensor in such a way that the local description of the tensor in the B coordinates matches the transformed version of the tensor in the A coordinates. You can think about this compatibility condition as gluing the local descriptions of the tensor together to form a global tensor field on the whole manifold.

The concepts are related but not identical. To summarize: a mathematician's tensor is an object defined over a vector space; a physicist's tensor is a field of mathematician's tensors on a manifold.
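For concreteness, the transformation law described in the comment above can be written out; as a sketch for a rank (1,1) tensor field, with x denoting the A-coordinates and x' the B-coordinates:

```latex
% Components in the B chart, obtained from the components in the A chart
% (summation over the repeated indices i and j):
T'^{k}{}_{l} \;=\; \frac{\partial x'^{k}}{\partial x^{i}}\,
                   \frac{\partial x^{j}}{\partial x'^{l}}\; T^{i}{}_{j}
```

Upper indices pick up the Jacobian of the change of coordinates and lower indices its inverse, which is the usual statement found in GR texts.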
@mikhailmikhailov8781 2 years ago
It's the same thing; this is just the definition.
@mikhailmikhailov8781 2 years ago
It is related by being the same fucking concept.
@buxeessingh2571 2 years ago
I absolutely want to see a video comparing different types of tensors. I got confused in a graduate algebra class and a graduate geometry class because I was first exposed to tensors in mechanical engineering and then in modern physics.
@JM-us3fr 2 years ago
My understanding is that the tensors in physics are just these simple tensors that Michael is talking about, but using its matrix representation instead. But just like you can talk about a whole matrix (a_ij) by only considering a single general entry a_ij, physicists prefer to only talk about a single entry in their tensors, which may also require superscripts.
@TheElCogno 2 years ago
I think tensors in physics, at least in general relativity, are really tensor fields from differential geometry.
@rtg_onefourtwoeightfiveseven 2 years ago
​@@JM-us3fr AFAIK in physics there's an explicit restriction on how tensors transform under coordinate transformations. Is this restriction also implicit in the mathematical definition, or no?
@JM-us3fr 2 years ago
@@rtg_onefourtwoeightfiveseven I’m not sure if these are the same, but I know when you want to change the basis of your vector space, and you want to represent an alternating tensor with respect to the new basis, then you can just multiply the 1st representation by the determinant of the basis transformation matrix. But you’re probably referring to the reference frame transformation in general relativity. I’m not quite sure how this works.
@schmud68 2 years ago
@@rtg_onefourtwoeightfiveseven Yes, it is implicit in the mathematical definition. You start with a differentiable manifold, which has local coordinate charts, but you define tensor fields on the manifold independently of coordinate charts. So they don't depend on the choice of coordinates.

Let us work point-wise on the manifold. When you see the tensors in GR, what you are looking at is a choice of coordinate chart which determines a basis for the tangent/cotangent spaces (these are vector spaces associated to points on the manifold), which then allows you to express tensors as linear combinations of tensor products of these basis vectors. The coefficients of these linear combinations are the coefficients you see in Einstein notation in GR; these are, mathematically, NOT tensors.

Now, I said the tensors themselves are completely coordinate independent. What this means is that the basis vectors (determined by a coordinate transformation) will transform inversely to the coefficients so that they cancel out. This is where the Jacobian/inverse Jacobian come in for the coordinate transformations.

Finally, the upper and lower indices of tensors really refer to the tangent and cotangent spaces, where the cotangent space is the dual of the tangent space. A rank (1,1) tensor is built from a linear combination of tangent vectors tensor-producted with cotangent vectors. Now, if the manifold has a Riemannian metric, then it can be used to provide a canonical isomorphism between tangent and cotangent spaces (given by raising/lowering indices with the metric in GR notation).

Hopefully this gives some ideas. There is quite a lot going on and I have swept some things under the rug, but if you want to formalise this further you need: vector bundles, tensor products of vector bundles, sections of bundles, and the universal property of tensor products.
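The "coefficients transform inversely to the basis vectors" point admits a quick numeric sanity check (a sketch with made-up toy data, not anyone's library): for a *linear* change of coordinates the Jacobian is a constant matrix J, a (1,1)-tensor's components transform as T' = J T J⁻¹, and any full contraction such as the trace must come out coordinate independent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Components of a (1,1)-tensor in the x-coordinates (toy 3x3 example).
T = rng.standard_normal((3, 3))

# A linear change of coordinates x' = J x, so the Jacobian dx'/dx is the
# constant invertible matrix J.
J = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])

# Upper index transforms with J, lower index with the inverse Jacobian:
# T'^k_l = J^k_i  T^i_j  (J^{-1})^j_l
T_prime = J @ T @ np.linalg.inv(J)

# The contraction T^i_i is a scalar, so it is the same in both charts.
assert np.isclose(np.trace(T_prime), np.trace(T))
```

The same cancellation is what makes expressions like T^i_i meaningful in Einstein notation: the J and J⁻¹ factors meet on the contracted index pair and cancel.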
@cicciocareri85 2 years ago
I'm an engineer and I've used tensors and dyadic product in mathematical physics, electromagnetism and continuum mechanics. I'd like to see more of this!
@snnwstt 2 years ago
I second that. Maybe with a change of basis for the vectors included (like Cartesian to oblique, polar (2D), or curvilinear). Surely that would be even greater with derivatives (first and second order).
@mastershooter64 2 years ago
lol I thought the most advanced math engineers used was basic math like vector calc and linear algebra and some integral transforms
@cicciocareri85 2 years ago
@@mastershooter64 it depends where you study :-)
@mastershooter64 2 years ago
@@cicciocareri85 which field of engineering uses the most advanced math?
@snnwstt 2 years ago
@@mastershooter64 Finite element techniques often use simple "reference elements" which are then ... tortured. Depending on the desired "continuity", the numerical derivatives implied by the (partial) differential equation(s) borrow heavily from tensor theory.
@TIMS3O 2 years ago
Really nice video. One thing worth mentioning is that the relations introduced here are exactly the relations needed to turn bilinear maps into linear maps. This essentially reduces the study of bilinear maps to the study of linear maps, which one already knows from linear algebra. This property of turning bilinear maps into linear maps is called the universal property of the tensor product.
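A small numeric sketch of that universal property (toy sizes, names made up for illustration): a bilinear map B on R^3 x R^2 factors as an ordinary linear map L on the 6-dimensional tensor product, with `np.kron` standing in for the tensor product of coordinate vectors.

```python
import numpy as np

rng = np.random.default_rng(1)

# A bilinear map B(v, w) = v^T M w on R^3 x R^2, encoded by a 3x2 matrix M.
M = rng.standard_normal((3, 2))
B = lambda v, w: v @ M @ w

# The induced *linear* map on R^3 (x) R^2 = R^6 is just the dot product
# with the (row-major) flattened M.
L = lambda t: M.flatten() @ t

v = rng.standard_normal(3)
w = rng.standard_normal(2)

# np.kron(v, w) lists the products v_i * w_j in the same (i, j) order,
# so B factors through the tensor product: B(v, w) = L(v (x) w).
assert np.isclose(B(v, w), L(np.kron(v, w)))
```

The point is that every bilinear map gives one such flattened-M vector, and conversely, which is exactly the bijection the universal property asserts.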
@schmud68 2 years ago
Yeah, I thought this was very important too
@alonamaloh 2 years ago
This is exactly how the tensor product was introduced to me in my first-year linear algebra class in college, but I think your exposition is clearer than what I remember getting at school. At the time I couldn't understand what this was about at all. The definition through the universal property regarding bilinear maps made a lot more sense to me. You may want to mention it if you make further videos on this subject.
@Handelsbilanzdefizit 2 years ago
MachineLearning-Student: "Hey, look at my tensors!" Physics-Student: "No, better not. But look at my tensors!" Maths-Student: "No, better not."
@thenateman27 2 years ago
Don't fear my Levi-Civita operator and my Electromagnetic tensor
@mikhailmikhailov8781 2 years ago
@@thenateman27 Electromagnetic 2-form*
@minimo3631 9 days ago
​​@@mikhailmikhailov8781 quotient rings of antisymmetric tensors, exterior algebra of the cotangent space, potato potahto...
@sslelgamal5206 2 years ago
Nice, thank you! All the guides on the internet only look at it through a physicist's POV; an abstract mathematical construction was missing. I hope this series continues 👍👌
@renerpho 2 years ago
Yes, more on this, please! Particularly on tensors in physics and how they do (or don't) relate to your formal definition.
@MasterHigure 2 years ago
The short version is, the "tensors" (letters with upper and lower indices) that appear in physics (at least in general relativity; I know little about, say, material stresses) are the coefficients of the linear combinations we see here. The basis vectors and the tensor product symbol are elided.
@k-theory8604 2 years ago
​@@MasterHigure No, the tensors are not the coefficients. In physics, we are often concerned with vector fields. Essentially, we imagine these as some sort of vector sitting at each point in space, varying continuously. However, in order to formally define such a thing, one first needs to define the tangent space to a (manifold) space at every point. Then, one thinks about all of these tangent spaces together as forming one large geometric object, called the tangent bundle. Even more generally, one takes the tensor products of each tangent space with itself a number of times, and then considers that as one big collection of objects. Then, tensor fields in physics are just maps from the underlying space to the (generalized) tangent bundle. TL;DR: They are related, but the physics tensor is much more complicated. One needs the definition in this video to define the version from physics.
@schmud68 2 years ago
Yeah exactly, then you also need the cotangent bundle and the notion of the universal property of tensor products. Then, after specifying a coordinate chart, you can evaluate everything using the basis provided by the chart and get the usual physics Einstein notation. Quite a journey lol
@dhaka_mathematical_school 2 years ago
This lecture is insanely beautiful. Please prepare/make a couple more videos on tensor products: (1) Tensors as multilinear operators, (2) tensor component transformation (covariant and contravariant), and (3) tensors as invariant/geometric objects. Thanks so much.
@user-ys3ev5sh3w A year ago
@rachmondhoward2125 Insanely beautiful construction example. Let's (A)(D)simplicyan be positional natural A-ary SUM(D)-digit number system. where A,D -some natural number sequences arbitrary, but equal length. Then: d-vertex simplex is a (2)(d)simplicyan. square is a (3)(2)simplicyan. cube is a (3)(3)simplicyan. d-dimensional (n-1)*2 -gon is a (n)(d)simplicuan pentagon is a (11)(1)simplicyan. d-dimensional (n-1)/2 -gon is a (n)(d-1)simplicuan Maya/Egypt pyramid is a (2,3)(1,2)simplicyan, let's enumerate it: 100 is a vertex above square. 022 (=NOT 100) is opposite edge 020 002 is are 2vertices between wich lies edge 022=002+020. 120 (=100+020) is edge between verices 100 and 020. 102 (=100+002) is edge between verices 100 and 002. 122 (100+020+002) is infinity(content) located between 3 vertices 100 020 002 and 3 edges 120 102 022. from that moment we have 2 variants: a)vertex 001 connected to vertex 020.(i choose it) b)vertex 001 connected to vertex 002. 001 (=100+100) is vertex connected to vertex 100. 021 (=001+020) is edge between vertices 001 and 020. 010 (=001+001) is vertex connected to vertex 001. 011 (=001+010) is edge between vertices 001 and 010. 012 (=002+010) is edge between vertices 002 and 010. 101 (=001+100) is edge between vertices 001 and 100. 110 (=100+010) is edge between vertices 100 and 010. 111 (=100+010+001) is triangle between vertices 100 010 001. 121 (=100+020+001) is triangle between vertices 100 020 001. 112 (=100+010+002) is triangle between vertices 100 010 002. 000 is zero is externity of pyramid . As you see: 1. infinity(content) is like triangle. 2. square doesn't exists, sguare + volume above it upto vertex 100 is Zero.
@Mr.Not_Sure A year ago
@@user-ys3ev5sh3w Your construction has numerous mistakes and inconsistencies: 1. You count interior of a k-simplex, but you don't count interiors of a square and cube. 2. Compared with (n-1)/2-gon, your (n-1)*2-gon has 4 times more "gon"s (whatever they are), but its corresponding simplician has n times more indices. 3. Equation "001=100+100" is false. 4. And what about dodecahedron? It's 3-dimensional 12-hedron with 30 edges, but doesn't correspond neither to (7)(3) nor (16)(3) simplician.
@user-ys3ev5sh3w A year ago
@@Mr.Not_Sure you are right. In equation "001=100+100" i mean that carry rotate. Preface. Positional natural a-ary (az^m) d-digit number systems can represent some kind of d-dimensional polytopes in geometry. If a=z^m then a-ary d-digit number system equals z-ary (m*d)-digit number system, because z^m^d=z^(m*d). For example: binary d-digit number system is a d-vertex simplex.(vertices is a numbers with digital root=1, edges is a numbers with digital root=2 and so on). 3-ary d-digit number system is a d-cube. Let d=2: if 1-chain (2 vertices + 1 edge) shift 1 times we receive 2-cube with 3^2=9 faces, i.e. square. 6-ary d-digit number system is a d-thorus. Let d=2: If 3-ring (1D triangle) shift in 3D 3 times and "press"(glue) first and last 1D triangles we recieve 2-thorus with 6^2=36 faces = 3*3 square + (3+3)*3 edges + 3*3 vertices. Conclusion. All positional natural a-ary (az^m) d-digit number systems with a^d numbers are represented by 3 types of d-polytopes (simplex (binary), cube (odd-ary) , thorus (even-ary) ) with a^d faces . For simplex d means amount of vertex or hyperplanes, for any other polytopes dimension. For me as a programmer, it's curious to know that difference in faces between consequent such polytopes is hexagonal numbers. To enumerate more complex polytope may be used positional natural A-ary D-digit number system , (A)(D)simplician. Where A,D - some natural number sequence arbitrary but equal length. Then 2D Maya/Egypt pyramid is are positional natural (2,3^2)-ary (1,1)-digit number system or (2,3^2)(1,1)simplician. digital root of (1,1)=2 is dimension of pyramid. 10 is a vertex above square. 08 (=NOT 10) is one of opposite edges. 06 02 is are 2 vertices between which lies edge 08=02+06. 16 (=10+06) is edge between vertices 10 and 06. 12 (=10+02) is edge between vertices 10 and 02. 18 (10+06+02) is triangle, infinity(content) located between 3 vertices 10 06 02 and 3 edges 16 12 08. 01 (=10+10) is vertex connected to vertex 10. 
07 (=01+06) is edge between vertices 01 and 06. 03 (=01+opposite vertex 02) is vertex connected to vertex 01 and 02 instead of absent edge between 01 and 02.(exeption from rule?). 04 (=01+03) is edge between vertices 01 and 03. 05 (=02+03) is edge between vertices 02 and 03. 11 (=01+10) is edge between vertices 01 and 10. 13 (=10+03) is edge between vertices 10 and 03. 14 (=10+03+01) is triangle between vertices 10 03 01. 17 (=10+06+01) is triangle between vertices 10 06 01. 15 (=10+03+02) is triangle between vertices 10 03 02. 00 is square(zero). 18 faces=5 vertices+8 edges+4 triangles+1 square=3^2*2^1 numbers. 2D dodecahedron has 12 pentagons+30 edges+20 vertices=62 faces= 31^1*2^1 numbers of (2,31)(1,1)simplician (even thorus-like 2-polytope). 3D dodecahedron has 62+I=63 faces =7*3*3 numbers of (7,3)(1,2)simplician (odd cube-like 3-polytope). Simplex differs from any other polytope. Simplex is only "not preesed" polytope. Outside and Inside parts of this polytope is considered by me to be faces. As result simplex has 2^d faces (no 2^d-1). If to increment dimension they became from potentional to real faces. Outside and Inside parts both are counted. In other polytopes 1 or 1/2 of dimension is "pressed"("curved") in variouse ways. For example in even(thorus-like) 2-dimensional in 3D (no 3-dimensional) M aya/Egypt pyramid Outside part of "pressed" dimension is square, Inside part is triangle, so both Outside and Inside parts (interior and exterior) are not counted. In odd (cube-like, in A odd numbers only) polytope only Outside part (zero) is pressed in one of hyperplanes, so Outside part is not counted, only interior is counted, . And so on. "No distinction between numbers and shape. Numbers could not exist without shape." Pythagoras (reincarnation of Euphorbos).
@user-ys3ev5sh3w A year ago
@@Mr.Not_Sure Lets enumerate 3D dodecahedron ,wich is (7,3)(1,2)simplician with 7^1*3^2=63 faces. It's evident: 1. (m+n,..)(1,..)=(m,..)(1,..)+(n,..)(1,..) because of (m+n)^1=m^1+n^1. 2. (m,..)(i+j,..)=(m,..)(i,..)*(m,..)(j,..) because of m^(i+j)=m^i*m^j. 3. (m,m,..)(i,j,..)=(m,..)(i+j,..) because of m^i*m^j=m^(i+j). 4. (m^n,..)(i,..)=(m,..)(n*i,..) because of m^n^i=m^(n*i). 5. (m,n,..)(i,i,..)=(m*n,..)(i,..) because of m^i*n^i=(m*n)^i. 6. ( m ,1,..)(i,j,..)=(m..)(i,..) because of m^i*1^j=m^i.. Use 1..4: (7,3)(1,2)=(4+3,3)(1,2)=(4,3)(1,2)+(3,3)(1,2)=(2,3)(2,2)+(3)(3)=(6)(2)+(3)(3); So 3D dodecahedron equals 3D cube (3)(3) + 2D thor (6)(2). Lets enumerate 1D chain (3)(1). It's easy. Let edge be 2 because 2 is maximal number(infinity) Let rest two vertices be 0 1. Lets enumerate 2D square (3)(2)=(3)(1)*(3)(1). It's easy. Let 2D chain in center be located in 2 of first digit. Let rest two 1D chains be located in 0 1 of first digit. Lets enumerate 3D cube (3)(3)=(3)(1)*(3)(2). It's easy. Let 3D chain in center be located in 2 of first digit. Let rest two 2D chains be located in 0 1 of first digit. Lets enumerate 1D ring(triangle) (6)(1)=(3)(1)+(3)(1). It's easy. Let 1D chain be located in 0..2 of digit Let rest inversed 1D chain located in 3..5 of digit. Lets enumerate 2D thor (6)(2)=(6)(1)*(6)(1). It's easy. Let 3 1D rings be located in 0..2 of first digit Let 3 2D rings be located in 3..5 of first digit. Lets enumerate 3D dodecahedron (7,3)(1,2)=(6)(2)+(3)(3). It's easy. Let 3D cube (3)(3) be located in 0..2 of first digit unmodified as above. Let 2D thor (6)(2) be located in 3..6 of first digit in such way: (6)(2)=(4,3)(1,2)=(2,3)(1,2)+(2,3)(1,2)= 2D maya/egypt pyramid +2D maya/egypt pyramid. Let 2D maya/egypt pyramid be located in 3..4 of first digit Let 2D maya/egypt pyramid be located in 5..6 of first digit At last we must "press" 6 vertex of cube to triangles. As result 6 vertex of cube became interiors of 6 hedrons. 
6 rings of thor are added to close this 6 new interiors. We receive 12 hedrons=6 old + 6 new. 30 edges=12 edges of cube + 3*3 edges + 3*3 square of thor 20 vertices=(8-6) vertices of cube + 3*3 vertices of 3 1D rings +3*3 edges of 3 2D rings of tohr.
@Fetrovsky 2 years ago
It all makes perfect sense, but it would have helped to define the star operation between vectors, and then to say what a tensor is and why that name was chosen, right around the time it was defined.
@kenzou776 2 years ago
What is the star operation between vectors??
@mr.es1857 2 years ago
What is the star operation between vectors?? Please elaborate
@philippg6023 2 years ago
The small star between two vectors is a 'dummy' operation. So * maps v and w to v * w, which is an element of V * W. Therefore the star is just notation. Then he says all these star products are defined to be linearly independent, hence all the elements v * w form a basis of V * W. (This immediately yields, for example, that V * W is infinite dimensional.) I hope this helps you. Edit: an example: in the Cartesian product, (a, b) + (c, d) = (a+c, b+d); however, a*b + c*d is not equal to (a+c)*(b+d).
@Fetrovsky 2 years ago
@@philippg6023 So, if I get you correctly, there is no "star" operation per se but it's a representation of any linear operation that has certain characteristics. Is that correct?
@philippg6023 2 years ago
@@Fetrovsky Maybe you mean the correct thing, but you say it a little bit wrong. v * w is just a symbol, which has no meaning at all. Hence the star product is called a formal product. If you have a good understanding of bases, then you might like this explanation: V * W is a vector space with the Cartesian product as a basis.
@pablostraub 2 years ago
I would have liked the critical concept of "span" to be explained. Also, at the beginning there was a comment that "obviously" the dimension of the star product would be infinite (I was guessing 5 or 6, but not infinite), so I got lost.
@TheElCogno 2 years ago
The way I understand it, the span of a set is the set of all finite linear combinations of elements of that set, and the formal product is infinite dimensional because any v*w is a basis vector, since it cannot be expressed as a linear combination of other elements of that type.
@user-jc2lz6jb2e 2 years ago
The dimension is the maximum number of independent vectors you can have. R^3 has at most 3: for example, [1,0,-9], [0,4,0], [1,2,3]. You put any more vectors, and you get relations between them. For the star product, every v*w is independent from every r*s (where either component is different). You have infinitely many options for the first component, then infinitely many options for the second, so the dimension is infinite.
@brooksbryant2478 2 years ago
Is the formal product essentially just the Cartesian product or is there a difference I’m missing?
@user-jc2lz6jb2e 2 years ago
@@brooksbryant2478 The basis of the star product is just the Cartesian product. But the span is much bigger. You're taking every (finite) linear combination from this basis to make the star product space.
@pbroks13 2 years ago
Formal product is basically a "dummy" operation: you just stick the elements together, but you are not allowed to do anything to them. So, for example, if * is a formal product, you could NOT say that 2*3 = 3*2 or anything like that. 2*3 and 3*2 are their own separate things.
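A minimal sketch of this "dummy" operation in code (the representation is made up purely for illustration): elements of the free vector space V * W are finite formal sums, i.e. maps from pairs to coefficients, and nothing identifies (v1+v2)*w with v1*w + v2*w until you quotient.

```python
from collections import Counter

def star(v, w):
    """One formal generator v * w: a single basis symbol keyed by the pair."""
    return Counter({(v, w): 1.0})

def add(a, b):
    """Formal sum of two elements of the free vector space."""
    out = Counter(a)
    out.update(b)  # adds coefficients on matching generators
    return out

v1, v2, w = (1, 0), (0, 1), (3, 4)

# (v1 + v2) * w is a *single* generator ...
lhs = star((v1[0] + v2[0], v1[1] + v2[1]), w)
# ... while v1*w + v2*w is a sum of two different generators,
# so in the free vector space they are NOT equal.
rhs = add(star(v1, w), star(v2, w))
assert lhs != rhs
```

The tensor product quotients by exactly these differences: lhs - rhs is one of the generators of the subspace I, so it becomes zero after the quotient.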
@lexinwonderland5741 2 years ago
PLEASE post more!! Especially loved how you went through the formal product and coset (although I would like some more background/elaboration there), and I would love to see the other proofs and the isomorphism to R6 you teased at the end!
@Julian-ot8cs 2 years ago
I've been dying to get a good introduction to tensors from a mathematical perspective, and this video really does it for me! This was very fun to watch, and I'd love to see more!!!!
@muenstercheese 2 years ago
I've only heard about tensors from machine learning, and this is blowing my mind. I love this, thanks for the awesome production
@alonamaloh 2 years ago
I think the word "tensor" in machine learning is closer to the programming notion of "array": The objects described in this video have more structure, in some sense.
@AdrianBoyko 2 years ago
“More structured” or “more constrained”?
@alonamaloh 2 years ago
@@AdrianBoyko I think I meant what I said. Tensors in math are associated with the notion of multilinear functions, but in a neural network that does image recognition, the input can be a "tensor" with dimensions width x height x 3(rgb) x minibatch_size. However, as far as I can tell, that's just an array, without any further structure.
@drdca8263 2 years ago
@@alonamaloh many of the operations done with them are multilinear maps. Specifically, the parts with the parameters that are trained are often multilinear maps with these, with non-linear parts between such multilinear maps. This seems to me somewhat tensor-y ? Like, yeah, it is all done with a basis chosen for each vector space, but, well, you *can* reason about it in a basis-free way ? Like, for an image with 3 color channels, the space of such images can be regarded as the tensor product of the space of greyscale images with the space of colors, or as the tensor product of greyscale 1D horizontal images with greyscale 1D vertical images with colors. The space of translation-invariant linear maps from greyscale images to greyscale images (of the same size) (so, convolutions) is a vector space, and I think it is the tensor product of the analogous spaces for the spaces of the horizontal and vertical 1D images. This all seems like tensors to me?
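A small sketch of that view (shapes and weights chosen for illustration; the greyscale weights are the common luma coefficients): contracting just the color index of a batch-of-images array with `np.einsum` is precisely a linear map applied to one tensor factor, leaving the other factors alone.

```python
import numpy as np

# A machine-learning "tensor": a minibatch of RGB images, shape (N, H, W, C).
batch = np.random.default_rng(2).random((4, 8, 8, 3))

# A linear map on the color factor alone (RGB -> greyscale weights).
rgb_to_grey = np.array([0.299, 0.587, 0.114])

# Contracting only the color index c acts on the tensor-product structure
# (batch) x (height) x (width) x (colors) one factor at a time.
grey = np.einsum('nhwc,c->nhw', batch, rgb_to_grey)
assert grey.shape == (4, 8, 8)
```

So even when the array is "just data", the operations applied to it respect the tensor-product factorization, which is the basis-dependent face of the structure in this video.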
@FT029 2 years ago
Good video! It's really funny how, to get the desired properties (scalar multiplication, distributivity), you just quotient out by a bunch of stuff! My abstract algebra professor constantly referred to the tensor product as a "universal, bilinear map", and all our homework problems only used that definition-- nothing concrete. So it's great to see some examples! I would be curious to see more on the topic.
A year ago
I think getting desired properties via a quotient is a fairly general technique, isn't it? That's also how you can get complex numbers out of pairs of real numbers. Or how you can get interesting groups out of free groups etc.
@somgesomgedus9313 2 years ago
If this construction scares you, don't worry. It's very technical and made just so that the tensor product has the nice properties we want. What you need when you work with the tensor product is the universal property, which gives you real control over the thing. Like many objects constructed for a universal property, the construction is very technical, but once you know that it exists you can almost forget the explicit construction and just concentrate on the universal property! Great video btw, I love it when I see stuff like that on YouTube!
@myt-mat-mil-mit-met-com-trol 2 years ago
I barely remember a similar approach from my introductory notes on multilinear algebra, where category-theory-style proofs came before the formal definition of the tensor product. Despite getting a really good mark in the subject, I never really grasped the idea, or how it relates to the way tensors are approached in other disciplines, say General Relativity. At first I promised myself to study the well-known Spivak book (Calculus on Manifolds), but I couldn't keep the promise, as I got busy with obligations which, say, never required me to fulfill it, nor asked me to use tensors. So now, more than twenty years later, barely remembering, books still on the shelves, obligations not really changed, I hopefully watch enough YouTube videos to keep my reflections alive. So you have in me a keen viewer.
@Evan490BC 2 years ago
Read Spivak's book even now. It is the most intuitive (to me at least), abstract exposition of tensors, chains, differential forms, etc I could find.
@glenm99 2 years ago
This is incidental to the main topic, but I love the way that's worded at 15:36 ... "But now we can extract something from I, at no cost, because something from I is deemed as being equal to zero in the quotient." When I took algebra (so many years ago), despite knowing the statements of and being able to prove the theorems, I struggled for a long time with the practical application of this idea. And then, when you put it like that... it's so simple!
@rickdoesmath3945 2 years ago
10:50 this always happens in mathematics. Analysis: Ok, so I have integrals now! I can finally define this seminorm on the set of integrable functions. I hope it's a norm. What the f? The seminorm of this function is 0 but the function is not identically 0, even though it SHOULD BE. Wait, not all is lost: I can quotient my set by the equal-almost-everywhere relation, f yeah! This is literally L^1. Algebra: Look at this cute integral domain! Let's build its fraction field! I can use ordered pairs. But wait, something is weird. Look: 2/3 and 4/6 SHOULD BE equal but they aren't. Wait, not all is lost: I can quotient using the equivalence relation a/b equivalent to c/d when ad = bc. This is literally the smallest extension of an integral domain that is also a field.
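The fraction-field quotient in the comment above can be sketched directly (the helper name is made up for illustration); Python's `Fraction` type effectively works with canonical representatives of the equivalence classes.

```python
from fractions import Fraction

def equivalent(p, q):
    """a/b ~ c/d iff a*d == b*c, for pairs (a, b) and (c, d) with b, d != 0."""
    (a, b), (c, d) = p, q
    return a * d == b * c

# (2, 3) and (4, 6) are distinct pairs but the same element of the quotient:
assert (2, 3) != (4, 6)
assert equivalent((2, 3), (4, 6))

# Fraction normalizes each pair to a canonical representative of its class,
# so equality of Fractions is exactly equality in the quotient.
assert Fraction(2, 3) == Fraction(4, 6)
```

The tensor product plays the same game: build a too-big space first (free vector space there, ordered pairs here), then quotient by the relations you want to hold.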
@azeds 2 years ago
Content. Video quality. Voice. All of this deserves 100/100
@cheeseburger118 2 years ago
I've always been really curious what tensors were (never went deeper into math than multivariable calc). This was a little abstract for me, so I'd love to see how they are actually applied!
@jacobadamczyk3353 2 years ago
"If you're in I, in the quotient space you're equal to zero" never thought of cosets/quotients like this but very helpful!
@josephmellor7641 2 years ago
I wanted to find this video, but I forgot your name for a second, so I searched for "tensors good place to stop." This video was the first result.
@samuelbam3748
@samuelbam3748 2 years ago
Another element which shows the difference between V*W and the tensor product of V and W really well is the vector 0*0, which is just another independent generator but NOT equal to the zero vector of V*W
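A quick way to see this claim: model the free vector space span_R(VxW) as dicts mapping symbols (v, w) to coefficients (a toy of my own; `formal`, `add`, and `scale` are made-up names). The zero vector is the empty formal sum, which is a different thing from the generator 0*0.

```python
def formal(v, w):
    # the generator "v * w": coefficient 1 on the single symbol (v, w)
    return {(v, w): 1}

def add(x, y):
    out = dict(x)
    for sym, c in y.items():
        out[sym] = out.get(sym, 0) + c
    return {s: c for s, c in out.items() if c != 0}

def scale(a, x):
    return {s: a * c for s, c in x.items() if a * c != 0}

zero = {}  # the zero vector of span_R(V x W): the empty formal sum

# 0*0 is an honest generator, NOT the zero vector of the big space
assert formal((0, 0), (0, 0, 0)) != zero
# and before quotienting, (2v)*w is not the same formal sum as 2(v*w)
assert formal((2, 0), (1, 2, 3)) != scale(2, formal((1, 0), (1, 2, 3)))
```

Only after quotienting by I do these "should-be-equal" elements actually become equal.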
@Racnive
@Racnive 2 years ago
Thank you! This helped demystify tensors for me. I appreciate starting from the generic span of infinite possibilities, specifying the basic necessary properties for this object, and showing what emerges from those conditions.
@Walczyk
@Walczyk 2 years ago
Hell yeah! This will be useful for all of us physicists needing a better understanding of tensor algebra
@caldersheagren
@caldersheagren 2 years ago
Love this, straight out of my abstract algebra class - although we did tensor modules instead of vector spaces
@Impatient_Ape
@Impatient_Ape 1 month ago
Excellent! Your efforts to emphasize the difference between the formal products and the cosets built from the quotient really help clear up a lot of what gets frequently omitted in many textbooks -- especially in multiparticle quantum mechanics. Really good job sir!
@jamma246
@jamma246 2 years ago
23:11 the v is missing a subscript j. Other comment: when introducing the formal product, I think it's a bit odd to call this a "span". I'd call it the "vector space generated by the set VxW" or the "free vector space product of V and W", or something like that. I tend to think of "spans" as finding the smallest vector subspace containing a subset of some larger vector space. What's important here is that the vector space sum and scalar product on VxW are completely irrelevant to the definition of V*W (that only comes back in when defining the subspace I, and the quotient tensor product). I think highlighting this a bit more at this stage (or taking more time on what you mean by the "span") might have made that easier to follow.
@rainerausdemspring3584
@rainerausdemspring3584 1 year ago
Exactly. I got my master's in mathematics ages ago, but as far as I remember I was introduced to tensor products in Homological Algebra/Algebraic Topology and as a universal object.
@user_2793
@user_2793 2 years ago
Exactly what I need before looking at multiparticle states in QM! Thank you.
@AppliedMathematician
@AppliedMathematician 2 years ago
Well, to use the span you need to define the scalar multiplication on the set of symbols (v*w) first. This is a simple additional formal construction, but as a student I would have complained that the span is not well defined on some fresh algebraic datatype construction. Well, as an applied mathematician I would suggest to learn Haskell or ML first. Especially learn what algebraic data types are. Then express the concept explained in this video in either of these languages. It all will be clear and much less confusing. In Haskell you can almost literally do the discussed construction. The constructor of the formal product would be ":*" or ":*:" instead of just a formal "*". If you approach learning that way you will have learned a real-world programming language that is usable for formal verification, and you will have understood some pure mathematics as well. Further, in physics a tensor is usually what mathematicians call a tensor field ...
@OMGclueless
@OMGclueless 2 years ago
By this do you mean you first need to define axioms such that a(v*w) + b(v*w) = (a+b)(v*w) and v_1*w_1 + 0(v_2*w_2) = v_1*w_1 and the like are true? If so I agree, that was confusing why he was allowed to do the things he was doing but I just chalked it up to Michael assuming that by introducing "span_R{something algebraic}" he's introducing all the natural axioms of such a thing in some well-understood way.
@AppliedMathematician
@AppliedMathematician 2 years ago
@@OMGclueless : Yes, it's actually trivial, but only if you know it, and in terms of actual implementation it's not uniquely defined - Algebra is usually about formal structure that can appear in many types of instantiations. If you are learning math and trying to achieve mathematical maturity, a hint about the constructions is helpful - I think. Further, once you learn functional programming, a lot of functional analysis constructions will become quite natural - but that's a different topic.
@jamma246
@jamma246 2 years ago
I was a bit perturbed at first as to what he meant by the "span of ..." to define V*W. As a pure mathematician, I tend to use "span A" to mean the smallest subspace which contains the subset A of some bigger vector space V (so to take a span of something, that thing already needs to be a subset of some vector space). But he doesn't really mean "span", he means "the vector space generated by ...". This is the vector space which has these elements as a basis. You're wrong then to say you need to know how to define scalar multiplication on symbols v*w; by definition this "span" is just all formal (finite) linear combinations of elements v*w. It seems like you misunderstood, but I can appreciate that given the use of the word "span" here. Also, as a pure mathematician, I love Haskell, especially because of its links to Category Theory. But to say that you'd "suggest to learn Haskell or ML first", before learning about tensor products, is completely absurd. I completely agree it's a beneficial exercise, but I work in a Mathematics department and most people don't code in Haskell or ML, and many don't code at all. _"The constructor of the formal product would be ":*" "_ What the heck are you talking about? This isn't a typical type constructor in the Prelude. Looking it up on Hoogle, it seems to have different implementations, but I'd highly doubt there's one which takes two sets and constructs a vector space from them. Again, I think you've misunderstood the construction.
@OMGclueless
@OMGclueless 2 years ago
@@jamma246 The creation of a vector space from a set is not *just* a formal construction though, is it? It's defined in terms of scalar multiplication by the reals, which is why there's a subscript "R" in "span_R".
@jamma246
@jamma246 2 years ago
@@OMGclueless So? All of that is wrapped up in what he means by the "span" of a set. He didn't need to add the subscript R, in the sense that it was clear from context (he said that all vector spaces were considered to be over R). What's really going on here is that he took two functors: the lazy functor L (which takes a vector space and discards its sum and scalar product, remembering only the underlying set) and the functor G which generates a vector space (sure, over R) from any set S, more precisely the vector space of finite linear sums of elements of S (so a generic element looks like lambda_1 s_1 + ... + lambda_n s_n, where each lambda_i is a real number and each s_i is an element from S). So he defined V*W := (G o L)(VxW), where VxW is the usual Cartesian product of vector spaces and G, L are as above. The confusing thing about this is that the vector space structures on V and W (i.e., addition and scalar multiplication) are _completely irrelevant_ to V*W; all that we remember is what it is _as a set_. The extra structure only comes back in when you define the subspace I, and the tensor product. Indeed, intuitively, the game is to reintroduce rules that "should" be there from the original vector spaces. I do think he should have drawn more attention to all of this, and been more careful in his language, i.e., talk about the "vector space generated by" a set S, rather than just calling it "the span".
@mattc160
@mattc160 2 years ago
Please do more of this!!!!
@Stobber1981
@Stobber1981 11 months ago
I know it's a year later, but this construction of tensors really helped to back-fill the hackneyed explanations given by engineers and physicists all over YT. I especially appreciated the definition of the R2*R3 basis at the end and its comparison to the matrix space basis. I'd LOVE more takes on tensors from your perspective, please!
@ed.puckett
@ed.puckett 2 years ago
Thank you, your videos are so good. This is my vote for more on this subject.
@lewisbulled6764
@lewisbulled6764 2 years ago
I’ve only ever used tensors in continuum mechanics (most notably the stress tensor, but also strain tensors and deformation-gradient tensors), so it was interesting to see a completely different perspective on them!
@andreyv3609
@andreyv3609 2 years ago
What a pleasure to finally have a clear, pure (actual) math exposition on the subject. Good job!
@nelprincipe
@nelprincipe 2 years ago
Indeed, life is better with more tensors in it. Give us moar!
@bilmoorhead7389
@bilmoorhead7389 2 years ago
Yes, please, more on this topic.
@maximgoncharov9027
@maximgoncharov9027 1 year ago
Nice explanation, friendly for beginners. The best way to define the tensor product, from my point of view (there are at least 4 different definitions). BTW, as a confusing example of non-equal elements of V*W, I would suggest the elements 0*w, v*0, 0*0 and 0 (these vectors are all different in V*W).
@TheMauror22
@TheMauror22 2 years ago
Please keep these series going! It was very cool!
@WindsorMason
@WindsorMason 2 years ago
"...And that's a good place--" Cliffhanger! We never actually stopped, so the video must have a sequel.
@noahtaul
@noahtaul 2 years ago
I think this could be a good series. If you’re thinking about continuing in a physics sense, or an algebraic geometry sense, I think that would be useful, but I think it would make a good series to put together a set of videos on homological algebra. Exact sequences, hom-tensor adjoint, cohomology/homology of chain complexes, etc.
@RichardTalcott
@RichardTalcott 2 years ago
EXCELLENT! Please MORE videos on this & related topics.
@MasterHigure
@MasterHigure 2 years ago
To be strict, if you want your isomorphism between tensor product and matrix space to be "more natural" than the isomorphism with R^6, the 3-vectors should be row vectors. That's the safest way to keep your sanity through any base changes you might encounter, as well as keeping track of how they naturally act with other vectors they might come across. If you keep them both as columns, you should pair the tensor product with 6x1 matrices. For those of us who are used to the index notation from differential geometry, and particularly general relativity, this is like not properly keeping track of which indices are upper and which are lower. That's a recipe for disaster.
@markkennedy9767
@markkennedy9767 2 years ago
I would definitely look forward to more stuff on tensors. Your exposition is really good.
@kennethvanallen4492
@kennethvanallen4492 1 year ago
I much prefer the mathematician’s approach to tensors. Thank you for this explanation - it will help me communicate the concept!
@peterg76yt
@peterg76yt 2 years ago
This really needs a definition of 'span', because it seems to mean something different from the last time I encountered it.
@iabervon
@iabervon 2 years ago
It's still "all the sums of multiples of elements of the space", but normally, you've got a space where these sums and multiples will simplify, so everything beyond a few basis vectors doesn't matter. Here, nothing simplifies, so it blows up.
@HilbertXVI
@HilbertXVI 2 years ago
Span here really means he's taking the free vector space over VxW: the space of all formal linear combinations of members of VxW. He's basically considering the set of all tuples (v,w) as a basis for this new, enormous vector space.
@descheleschilder401
@descheleschilder401 1 year ago
Hi there! I love your way of making things clear! Very clear and in outspoken English! You make each step easy to follow, as if it all comes about very naturally! You're a great teacher! So keep up the good work!
@chrstfer2452
@chrstfer2452 2 years ago
Edit: I finally got it! Thanks Professor Penn!! Original comment: Is there a good video on quotient spaces? That's where I got lost, when you introduced V(x)W = V*W/I. Edited to add: a few months and a lot of studying (though surprisingly not much on linear algebra or group theory) and I think I finally understand every part of this video, and I'm so excited about it. That quotient space construction to derive the necessary structure out of the formal product space is absolutely beautiful. If anyone wants any tips, I found that getting some "physics side" intuition for tensors and a light brush-up on group theory to better understand the Wikipedia page on quotient spaces together made a lot of difference. Also a bunch of general math videos, especially a ton of Professor Penn's here, to grease your math muscles if you're rusty.
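One concrete way to sketch the quotient step this comment asks about, for V = R^2 and W = R^3 (the helper `coset` is my own invention, not from the video): extend the outer product linearly over formal sums. Two formal sums lie in the same coset of I exactly when they produce the same 2x3 coordinate matrix, matching the basis count at the end of the video.

```python
def outer(v, w):
    # coordinates of the pure tensor v (x) w in the basis e_i (x) f_j
    return [[vi * wj for wj in w] for vi in v]

def coset(terms):
    """Image in the quotient V*W/I of a formal sum given as
    (coeff, v, w) triples, for v in R^2 and w in R^3."""
    out = [[0] * 3 for _ in range(2)]
    for c, v, w in terms:
        m = outer(v, w)
        out = [[o + c * x for o, x in zip(ro, rm)] for ro, rm in zip(out, m)]
    return out

v, w = (1, 2), (3, 4, 5)
# (2v)*w, v*(2w), and 2(v*w) are three different formal sums,
# but they differ by elements of I, so they share a coset:
assert coset([(1, (2, 4), w)]) == coset([(1, v, (6, 8, 10))]) == coset([(2, v, w)])
```

Being "in I" means mapping to the zero matrix, which is exactly the "equal to zero in the quotient" idea from the video.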
@user-ys3ev5sh3w
@user-ys3ev5sh3w 1 year ago
It's my "physics side" intuition for number systems. Let's (A)(D)simplicyan be positional natural A-ary SUM(D)-digit number system. where A,D -some natural number sequences arbitrary, but equal length. Then: d-vertex simplex is a (2)(d)simplicyan. electric monopole is a (2)(1)simplicyan magnetic dipole is a (2)(2)simplicyan mass 3-pole is a (2)(3)simplicyan gravitational 4-pole is a (2)(4)simplicyan square is a (3)(2)simplicyan. cube is a (3)(3)simplicyan. d-dimensional (n-1)*2 -gon is a (n)(d)simplicuan pentagon is a (11)(1)simplicyan. d-dimensional (n-1)/2 -gon is a (n)(d-1)simplicuan Maya/Egypt pyramid is a (2,3)(1,2)simplicyan, let's enumerate it: 100 is a vertex above square. 022 (=NOT 100) is opposite edge 020 002 is are 2vertices between wich lies edge 022=002+020. 120 (=100+020) is edge between verices 100 and 020. 102 (=100+002) is edge between verices 100 and 002. 122 (100+020+002) is infinity(content) located between 3 vertices 100 020 002 and 3 edges 120 102 022. from that moment we have 2 variants: a)vertex 001 connected to vertex 020.(i choose it) b)vertex 001 connected to vertex 002. 001 (=100+100) is vertex connected to vertex 100. 021 (=001+020) is edge between vertices 001 and 020. 010 (=001+001) is vertex connected to vertex 001. 011 (=001+010) is edge between vertices 001 and 010. 012 (=002+010) is edge between vertices 002 and 010. 101 (=001+100) is edge between vertices 001 and 100. 110 (=100+010) is edge between vertices 100 and 010. 111 (=100+010+001) is triangle between vertices 100 010 001. 121 (=100+020+001) is triangle between vertices 100 020 001. 112 (=100+010+002) is triangle between vertices 100 010 002. 000 is zero is externity of pyramid . As you see: 1. infinity(content) is like triangle. 2. square doesn't exists, sguare + volume above it upto vertex 100 is Zero.
@yf-n7710
@yf-n7710 1 year ago
The only reason I understood the quotient spaces part was because I learned about them in a class I took two months ago. I was very pleased to see them pop up again.
@chrstfer2452
@chrstfer2452 1 year ago
Coming back to this again and its all felt pretty intuitive. Learning is neat.
@vorfreu
@vorfreu 2 years ago
It would be great if you could make a series on tensors like the one on number theory. There are not many pure mathematics perspective on this subject. Also great video
@burkhardstackelberg1203
@burkhardstackelberg1203 2 years ago
Another perspective on tensors that I like is the one from multilinear maps. And this video is a good starting point. To every vector space, there exists a dual space of mappings of those vectors to real numbers (or whichever field you choose). A tensor product of two such mapping spaces allows you to map a pair of vectors to your field, giving you the ability to create a metric on your vector space. A tensor product of a vector space and its dual space creates very much of what we know as square matrices - mappings of that vector space to itself. But without the need of spelling out components.
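The metric idea from this comment can be sketched like so (assuming finite dimensions and the standard basis of R^2; `bilinear` is a name I made up): a coefficient matrix G stands for an element of V* (x) V*, consumed as a bilinear map.

```python
def bilinear(G):
    """View a coefficient matrix G as a tensor in V* (x) V*:
    the bilinear map B(v, w) = sum_ij G[i][j] * v_i * w_j."""
    def B(v, w):
        return sum(G[i][j] * v[i] * w[j]
                   for i in range(len(v)) for j in range(len(w)))
    return B

dot = bilinear([[1, 0], [0, 1]])  # identity coefficients: the Euclidean metric
assert dot((3, 4), (3, 4)) == 25  # |v|^2 = 9 + 16
assert dot((1, 0), (0, 1)) == 0   # basis vectors are orthogonal in this metric
```

Changing G changes the geometry while the underlying vector space stays the same, which is the "create a metric" point above.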
@dominikstepien2000
@dominikstepien2000 2 years ago
Great video, I had similar introduction in Differential Forms course, but your explanation was far better than my professor's one. I would love to see more about this. I don't know much about different perspectives on tensor products, but I have heard about usage of them in physics and machine learning and I have no clue how it could be related.
@JimMcCann500
@JimMcCann500 2 years ago
If you could do more on Tensors, Covariant Derivatives, Yang Mills (maybe), d'alembertian operators, that would be great!
@cpiantes
@cpiantes 2 years ago
Having to switch from treating 2(a,b) = (2a,2b) as a distributive operation in a vector space to having 2(a,b) /= (2a,2b), with the pairs acting essentially as an infinite index set, was mind-blowing.
@infinityinf1
@infinityinf1 2 years ago
I thought this was a numberphile video from the thumbnail. I love that brown paper!
@stabbysmurf
@stabbysmurf 2 years ago
Thank you for this video. I've tried to learn tensors from mathematical physics texts, and I've had an awful time. It helps a great deal to see a mathematician providing a proper structural definition with a level of precision that I can understand.
@LorenzoChiaveriniDBTF
@LorenzoChiaveriniDBTF 2 years ago
Great video. I have been studying tensors for continuum mechanics and I am definitely interested. It would be great if you could show the relationship between this more general definition and the other ones I saw (linear mappings, and multilinear mappings).
@bonsairobo
@bonsairobo 1 year ago
Wow something clicked for me with quotients when you defined the subspace of vectors that you essentially want to equal zero. I had never thought of doing this, but it makes a lot of sense with the fact that elements of the normal subgroup act like the identity when applied to a coset. Very cool.
@scollyer.tuition
@scollyer.tuition 2 years ago
Very nice explanation. I'd be interested in seeing more along these lines. Maybe you could talk about how tensor products convert bilinear maps into linear maps.
@evankalis
@evankalis 2 years ago
I know id watch more videos on this topic for sure!
@KAKABOTINI
@KAKABOTINI 2 years ago
finally a formal explanation on tensors that doesn't rely on that confusing example involving a block and shearing forces
@sageofsixpack226
@sageofsixpack226 2 years ago
Hi. I'm so glad to see this topic on your channel! Btw, have you ever seen a derivation of the Riemann tensor expression involving covariant derivatives and Lie brackets from its definition as a tool for measuring holonomy, one that doesn't assume the Lie bracket of the vector fields is 0? I've been looking for it for a while, read a lot of books, tried to prove it on my own - but no result whatsoever 😔. So maybe you know any resources or (even WAY better) can make a video on this 😏
@Hank-ry9bz
@Hank-ry9bz 5 months ago
Bookmarks: 4:50 examples in R2*R3, 10:55 quotient space, 18:15 basis, 24:00 basis example
@jongraham7362
@jongraham7362 2 years ago
Great video... I have been trying to understand tensors for many years... I've seen it from the Physics viewpoint and the Abstract Algebra derivation. This helps, thanks. More would be good.
@yf-n7710
@yf-n7710 1 year ago
Six months ago I would have been lost here, but I actually understood everything in this video!
@elramon7038
@elramon7038 2 years ago
I just started studying physics and literally everybody is talking about tensors. I luckily stumbled across this video which happened to be really helpful for my understanding. Also if you could do videos on different uses of tensors in different fields, that would be awesome. Keep up the great work!
@jmafoko
@jmafoko 5 months ago
Dr. Penn always makes it look easy using simple examples. I would like to hear more about the quotient space; I think that is key to this structure. By the way, it will be nice to also tie it to differential forms, so that we are closer to applications.
@AlisterMachado
@AlisterMachado 2 years ago
This is such a nice view on the Tensor product! Definitely hoping to see more :)
@kapoioBCS
@kapoioBCS 2 years ago
I am looking forward to more videos on more advanced subjects! I just finished a course on representation theory for my master's.
@guyarbel2387
@guyarbel2387 2 years ago
What an amazing video ! Thank you so much :)
@Nickelicious7
@Nickelicious7 2 years ago
Love this, please do more
@mr.es1857
@mr.es1857 2 years ago
Thanks for your great lecture, and thank you everyone for your comments. Two questions here: What is spanning, or a span? And what is the meaning of a "coset"? BR, Tara
@nosy-cat
@nosy-cat 2 years ago
Beautifully done. Watched this today for the second time, and this time around I feel like I really understand what's going on. Would love to see more on this topic!
@wongbob4813
@wongbob4813 2 years ago
More please! Loved this gentle introduction to tensors :)
@geoffrygifari3377
@geoffrygifari3377 2 years ago
Hi Michael, I've had the chance to use tensor products in physics before, and honestly it baffled me. I think you can clear up these confusions: 1. What do you think are the properties of tensors, from a mathematics point of view, that make them useful in physics? 2. Do you happen to know why the idea of tensor "transformation" is important at all? 3. How can we see tensor-vector contraction from a mathematics point of view (summing the products of vector and tensor components with matching indices, to get another vector)? 4. Is it correct to say the number of tensor "indices" describes how many vector spaces are being combined by the tensor product? 5. Is that last part about the isomorphism between tensor products of vector spaces and matrices (which can also be built from row vectors) what "representation" is about? Thank you for your time.
@bandreon
@bandreon 2 years ago
Brilliant class! Another of those things you should know but did not really.
@mathboy8188
@mathboy8188 2 years ago
Hear "tensor", think "multi-linear". See "tensor", think "coefficients fluidly glide between the factors".
@a_llama
@a_llama 2 years ago
Just came from the differential forms playlist! Very useful for GR and differential geometry
@christopherlord3441
@christopherlord3441 1 year ago
This is really great stuff. Approaching tensors from a physics or engineering point of view is very confusing. As a pure mathematics concept it is much clearer. Keep up the good work.
@IustinThe_Human
@IustinThe_Human 2 years ago
this video connects so many dots for me, it was amazing
@AnyVideo999
@AnyVideo999 1 month ago
Let me clarify for everyone: tensors from physics are generally just multilinear maps which can have vector valued outputs. There are three objects in linear algebra: - scalars - vectors - dual vectors/covectors/linear functionals A physics tensor is a multilinear map which can be uniquely represented as a linear combination of vectors and functionals tensor producted together. The numbers with superscripts and subscripts are exactly the coefficients of these tensor product terms.
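A small sketch of this description for a (1,1)-tensor on R^2 (my own toy code; `tensor_eval` is an invented name): the numbers T[i][j] are exactly the coefficients of the tensor product terms, and evaluating on dual-basis covectors and standard basis vectors reads them off.

```python
def tensor_eval(T, alpha, v):
    """Evaluate T = sum_ij T[i][j] e_i (x) f^j on a covector alpha and a
    vector v; the T[i][j] are the super/subscripted numbers from physics."""
    return sum(T[i][j] * alpha[i] * v[j]
               for i in range(len(alpha)) for j in range(len(v)))

T = [[1, 2], [3, 4]]
# feeding in the dual-basis covector f^0 and basis vector e_1 picks out T[0][1]
assert tensor_eval(T, (1, 0), (0, 1)) == T[0][1] == 2
# and the map is linear in each slot separately
assert tensor_eval(T, (2, 0), (0, 1)) == 2 * tensor_eval(T, (1, 0), (0, 1))
```

This is the dictionary between the coordinate-free tensor and its index notation: one slot per index, one coefficient per basis combination.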
@tomhase7007
@tomhase7007 2 years ago
The last isomorphism R^2 \otimes R^3 -> M_2x3(R) becomes even more natural if you view the first R^2 as row vectors rather than column vectors; then it is just matrix multiplication. Abstractly this is due to the fact that for vector spaces we have an isomorphism V^* \otimes W \to Hom(V,W), where V^* is the dual space of V and Hom denotes linear maps.
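The Hom(V,W) side of this isomorphism can be sketched on pure tensors (a toy of mine, with everything in coordinates; `hom_from_tensor` is a made-up name): the functional phi tensored with w becomes the rank-one map u |-> phi(u) w.

```python
def hom_from_tensor(phi, w):
    """The isomorphism V* (x) W -> Hom(V, W) on a pure tensor phi (x) w:
    the rank-one linear map u |-> phi(u) * w, with phi given by coordinates."""
    def f(u):
        s = sum(p * x for p, x in zip(phi, u))  # the scalar phi(u)
        return tuple(s * wi for wi in w)
    return f

# phi = f^0 (first dual-basis covector of R^2), w = (3, 4, 5) in R^3
f = hom_from_tensor((1, 0), (3, 4, 5))
assert f((1, 0)) == (3, 4, 5)   # e_0 |-> w
assert f((0, 1)) == (0, 0, 0)   # e_1 |-> 0: the map has rank one
```

General tensors are sums of pure ones, matching the fact that every matrix is a sum of rank-one "column times row" products.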
@Nickelicious7
@Nickelicious7 2 years ago
Could you explain a little more as to why a tensor product space is more naturally isomorphic to the matrix vector space of that dimension than to F^n of that dimension?
@MichaelPennMath
@MichaelPennMath 2 years ago
It is because in the category of finite dimensional vector spaces the only thing that "counts" is dimension. That is, everything of the same dimension is isomorphic. If you broaden the setting, however, these spaces have different "fine structure".
@perappelgren948
@perappelgren948 2 years ago
@0:25: "...relation of OUR language of tensors (to) those of other fields of mathematics..." 🤣🤣🤣 This really makes me feel special! So many thanks, Prof. P, for undisguising and defusing tensors.
@him21016
@him21016 2 years ago
The formal star vector product was never defined. This makes sense, as it was formal and I assume a placeholder for [any operation of interest]. This suggests to me that the notion of "tensor product" by itself is ill defined, as v * w first needs to be defined. Is there a standard concept of v * w and thus a canonical tensor product? Or does the implementation of * vary wildly?
@98danielray
@98danielray 2 years ago
it is not a placeholder.
@98danielray
@98danielray 2 years ago
you can define v*w to be an ordered pair if you want, but you gotta be careful in the sense that it does not come equipped with the usual direct product operations, since you cannot group sums and multiplication by scalars
@petergregory7199
@petergregory7199 1 year ago
Merry Vector, Michael, and a Happy New Tensor!
@GordonKindlmann
@GordonKindlmann 2 years ago
Thanks for making this great video. I am very curious how you would present the idea of covariant vs contravariant indices within a tensor.
@krelly90277
@krelly90277 2 years ago
Thank you for this video. Please consider making more videos on tensor theory.
@deltalima6703
@deltalima6703 2 years ago
Not done watching, but here are some comments. The fact that you first said x is an element of the reals is amazing; everybody should do this. I like that you edit to avoid making me watch you scribble on the chalkboard. No chalkboard is better, like Mathologer, but this does not waste my time, so it's good. Content is not for dummies, and I appreciate that too. I might not know everything, but if I was completely incompetent I would be watching monster trucks and not this.
@djsmeguk
@djsmeguk 2 years ago
You can just about see how to get from here to physics world tensors, but I'd love to see your map of the path!
@jeffreycloete852
@jeffreycloete852 2 years ago
Thanks Prof Penn, that was nice! Tensors and Universal Objects/Properties would be nice. Keep well!!
@herogpi1
@herogpi1 2 years ago
Nice! I thought that tensors were objects that could accept only one vector space, and that, for example, for a vector space V in R^n, a tensor of order m would have n^m elements. But it's far more complex than that.
@Czeckie
@Czeckie 2 years ago
Since you asked, I personally would be more interested in tensors from the geometrical point of view. I know this algebraic definition and appreciate its usefulness in representation theory. I went pretty far with it, dabbled in some enriched categories. But these are just formal abstract constructions. On the other hand, in geometry tensors mean something more tangible. It's a very useful kind of map. Even though I can work out the connection between these notions, I was never very confident about it. Frankly, the only tensors I can work with at least a little bit are differential forms. Btw, what the hell is a stress tensor?
@conoroneill8067
@conoroneill8067 2 years ago
I'd love to hear how the notion of contravariant and covariant tensors from physics relate to this picture described here.
@duncanw9901
@duncanw9901 2 years ago
There's a differential geometry definition that's really nice: a tensor is a function, linear in each argument, from the n-fold Cartesian product of a vector space and the k-fold Cartesian product of its dual to the field underlying the vector space. Under this definition (which is nice for e.g. curvature tensor, which takes inputs in the space of vectors tangent to a manifold at a point and outputs a scalar), n is the number of covariant indices and k is the number of contravariant indices.
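Written out with the commenter's convention, that definition reads:

```latex
T \colon \underbrace{V \times \cdots \times V}_{n \text{ copies}}
     \times \underbrace{V^{*} \times \cdots \times V^{*}}_{k \text{ copies}}
     \longrightarrow \mathbb{R},
\qquad T \text{ linear in each argument separately,}
```

so that n counts the covariant (lower) indices and k the contravariant (upper) ones.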
@TD-ev7uj
@TD-ev7uj 1 month ago
Wish this guy was my abstract algebra professor. Thank you!
@WarhavenSC
@WarhavenSC 2 years ago
I was going to say "a floating disk," but never mind. Please continue.
@cesarjom
@cesarjom 2 years ago
This is awesome you are exploring this topic of Tensors. Please continue with plans to develop more topics on applications of Tensors, especially in physics and of course general relativity.
@MrsHarryStlyes
@MrsHarryStlyes 2 years ago
Very cool! I've never seen this sort of construction using quotient vector spaces in my mathematical physics studies. Love to see more on this viewpoint and/or others!
@MasterHigure
@MasterHigure 2 years ago
Abstract algebra is all about quotients. They are the dual of subobjects (subspaces, subgroups, whatever), and that's really saying something about their importance ^^
@schmud68
@schmud68 2 years ago
In my algebra course we defined the tensor product as the unique vector space satisfying a universal property (the universal property of tensor products). We didn't really cover much of the foundational proofs for this (like existence), but uniqueness follows quickly from the universal property assumption. Then we used quotients of tensor product spaces to create the symmetric product and exterior product.
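For reference, the universal property mentioned here can be stated as:

```latex
\varphi \colon V \times W \to U \ \text{bilinear}
\;\Longrightarrow\;
\exists !\ \tilde{\varphi} \colon V \otimes W \to U \ \text{linear with}\;
\varphi(v, w) = \tilde{\varphi}(v \otimes w) \ \text{for all } v, w,
```

and any two spaces with this property are isomorphic via a unique isomorphism, which is where the uniqueness the commenter mentions comes from.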