Tensor Calculus 4g: Index Juggling

37,699 views

MathTheBeautiful

1 day ago

This course will eventually continue on Patreon at bit.ly/PavelPat...
Textbook: bit.ly/ITCYTNew
Errata: bit.ly/ITAErrata
McConnell's classic: bit.ly/MCTensors
Table of Contents of bit.ly/ITCYTNew
Rules of the Game
Coordinate Systems and the Role of Tensor Calculus
Change of Coordinates
The Tensor Description of Euclidean Spaces
The Tensor Property
Elements of Linear Algebra in Tensor Notation
Covariant Differentiation
Determinants and the Levi-Civita Symbol
The Tensor Description of Embedded Surfaces
The Covariant Surface Derivative
Curvature
Embedded Curves
Integration and Gauss’s Theorem
The Foundations of the Calculus of Moving Surfaces
Extension to Arbitrary Tensors
Applications of the Calculus of Moving Surfaces
Index:
Absolute tensor
Affine coordinates
Arc length
Beltrami operator
Bianchi identities
Binormal of a curve
Cartesian coordinates
Christoffel symbol
Codazzi equation
Contraction theorem
Contravariant basis
Contravariant components
Contravariant metric tensor
Coordinate basis
Covariant basis
Covariant derivative
Metrinilic property
Covariant metric tensor
Covariant tensor
Curl
Curvature normal
Curvature tensor
Curvature of a curve
Cylindrical axis
Cylindrical coordinates
Delta systems
Differentiation of vector fields
Directional derivative
Dirichlet boundary condition
Divergence
Divergence theorem
Dummy index
Einstein summation convention
Einstein tensor
Equation of a geodesic
Euclidean space
Extrinsic curvature tensor
First groundform
Fluid film equations
Frenet formulas
Gauss’s theorem
Gauss’s Theorema Egregium
Gauss-Bonnet theorem
Gauss-Codazzi equation
Gaussian curvature
Genus of a closed surface
Geodesic
Gradient
Index juggling
Inner product matrix
Intrinsic derivative
Invariant
Invariant time derivative
Jolt of a particle
Kronecker symbol
Levi-Civita symbol
Mean curvature
Metric tensor
Metrics
Minimal surface
Normal derivative
Normal velocity
Orientation of a coordinate system
Orientation preserving coordinate change
Relative invariant
Relative tensor
Repeated index
Ricci tensor
Riemann space
Riemann-Christoffel tensor
Scalar
Scalar curvature
Second groundform
Shift tensor
Stokes’ theorem
Surface divergence
Surface Laplacian
Surge of a particle
Tangential coordinate velocity
Tensor property
Theorema Egregium
Third groundform
Thomas formula
Time evolution of integrals
Torsion of a curve
Total curvature
Parallelism along a curve
Permutation symbol
Polar coordinates
Position vector
Principal curvatures
Principal normal
Quotient theorem
Radius vector
Rayleigh quotient
Rectilinear coordinates
Variant
Vector
Vector curvature normal
Vector curvature tensor
Velocity of an interface
Volume element
Voss-Weyl formula
Weingarten’s formula
Applications: Differential Geometry, Relativity

Comments: 70
@user-pt-au-hg 6 years ago
It's so nice to have fundamental points actually expressed or spoken out and not assumed, as in: the metric tensor didn't disappear, it was absorbed (or implied) in the expression for the dot product U^i dotted with V_i, in showing the orthogonality between the two vectors (@11:10). Thank you so much, it's appreciated greatly. S.
@MathTheBeautiful 6 years ago
Thank you, I'm very glad you're enjoying these videos!
@klikkolee 9 years ago
Whenever I see the time lapse at the beginning of the video, I think YouTube broke and is fast-forwarding through the video at breakneck speed.
@user-yg4bk9mg5r 1 year ago
Thanks a lot! I was struggling with this inverse tensor stuff for several months; now you have cleared up my doubts.
@MathTheBeautiful 1 year ago
that's the idea tbh
@paulharten4894 6 years ago
The orthonormal relationship between the covariant basis vectors and the contravariant basis vectors is very convenient and relates the geometry to the calculus.
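The mutual orthonormality Z_i · Z^j = δ_i^j mentioned here is easy to check numerically. Below is a minimal sketch with a made-up 2D basis (the specific numbers are arbitrary, chosen only for illustration):

```python
import numpy as np

# A made-up covariant basis; the rows are Z_1 and Z_2.
Z_cov = np.array([[3.0, 0.0],
                  [1.0, 2.0]])

# The contravariant basis is determined by the mutual orthonormality
# condition Z_i . Z^j = delta_i^j; as a matrix, its rows come from
# the inverse transpose of the covariant basis matrix.
Z_con = np.linalg.inv(Z_cov).T

# All pairwise dot products Z_i . Z^j: the result is the identity,
# i.e. the Kronecker delta.
dots = Z_cov @ Z_con.T
print(dots)
```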
@joeboxter3635 1 year ago
It's nice to hear you say ... I remember it confused me when I first heard it. There is hope. Lol.
@orientaldagger6920 3 years ago
Nice explanation at 22:40. If you throw away the idea that delta_m_k (both lower indices) has anything to do with 1 when m=k and zero otherwise....
@MathTheBeautiful 3 years ago
Yeah, delta_mk does fit.
@isaackulp2885 3 years ago
How do you decide whether some object should have an upper or a lower index? Say I had a matrix equation Ax=b. Should I write it as a_ij x^i=b_j or a^ij x_i=b^j , or even throw caution to the wind and write a_ij x_i=b_j? What is the logic behind choosing whether an index should be upper or lower?
@MathTheBeautiful 3 years ago
That will crystalize for you over time. Ultimately, it's dictated by the manner in which the object transforms under a change of coordinates. The unbreakable rule that reduces your number of choices is that a repeated index needs to show up once as a superscript and once as a subscript. For Ax=b, the most natural placement of indices is A^i_j x^j = b^i. My new book (I could send you a draft) explains this in much greater detail.
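The index balance in the reply above can be illustrated with a small numpy sketch (the matrix and vector values below are made up). Writing the product with `einsum` makes the repeated index j, appearing once up and once down, an explicit contraction:

```python
import numpy as np

# Components A^i_j: the first (upper) index i labels rows,
# the second (lower) index j labels columns.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
x = np.array([1.0, 2.0])  # contravariant components x^j

# The repeated index j is summed; the free index i survives
# on both sides of A^i_j x^j = b^i.
b = np.einsum('ij,j->i', A, x)
print(b)  # identical to the ordinary matrix-vector product A @ x
```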
@isaackulp2885 3 years ago
I watched more of your videos and thought about it a bit more. I understand now that you could write the vectors in terms of either the contravariant or covariant basis, as either x = x^i Z_i = x_i Z^i. From there, in order for the indices to be balanced, the indices of A must take a certain form. I know how it works for vectors, but is there a rule for how matrices such as A can be represented with basis vectors? A draft of your book would be great! I think a lot of other people watching your channel would also benefit from that!
@sajidhaniff01 6 years ago
Thanks! This video really clarified the section from your book
@MathTheBeautiful 6 years ago
Yes, index juggling can be quite tricky! Indices make it almost invisible!
@MrFischvogel 6 years ago
Thank you so much for your great videos, Sir. They really deepen my understanding! I also really liked the question at the end about the purpose of two different kinds of basis. I am not a native speaker and I could not find something about the fluid theorem you mentioned in the end. I understand "Hulerian fluids" and "conservation of auticity". Could someone spell the name of the fluid theorem to me?
@MathTheBeautiful 6 years ago
Thank you, MrFischvogel. I said "Eulerian fluids", but I don't know what I meant by that. There are Eulerian and Lagrangian descriptions of fluid flows. In an Eulerian description, the metric tensor is generally static. In Lagrangian, it's always time-dependent. I should have said "in the Lagrangian description". Conservation of vorticity. Holds in the point-wise sense for inviscid fluids.
@MrFischvogel 6 years ago
Thank you so much for your immediate answer, Sir! I can't believe it :)
@Yoyimbo01 6 years ago
Dear professor, in your book we are asked in exercise 106 to show that for symmetric tensors we have: T_{*i}^j = T_j^{*i}. I am only able to show T_{*i}^j = T_i^{*j}. Is there a typo? I am forced to ask since I cannot access the errata!
@sashastrelnikoff4192 9 years ago
In which video do you define a "tensor"?
@MathTheBeautiful 9 years ago
In the next one! Some objects have the word "tensor" in their name so I call them that, but in the next video that name is justified.
@georgefrank7405 8 years ago
+Sasha Strelnikoff I was gonna ask the same question. It really confused me.
@mattmolewski7475 7 years ago
I don't understand why you don't use the same g for the metric tensor that everyone and their mom uses, but thanks for making the videos. =)
@MathTheBeautiful 6 years ago
I try to use the letter Z for most metrics. I make an exception for the position vector R and the Levi-Civita symbol epsilon.
@orientaldagger6920 3 years ago
Why would you use g, that is purely a physics thing.
@orientaldagger6920 3 years ago
@@MathTheBeautiful R with an arrow is the position vector but R by itself is the radius of the circle and a constant...sometimes very confusing. I have seen books that use P with an arrow for the position vector and it is easier on the eye the first time...especially if "r" is THEN introduced as a coordinate.
@MathTheBeautiful 3 years ago
@@orientaldagger6920 That's a very valid point. I spend a lot of time thinking about what to name functions.
@adammln1725 9 years ago
To Prof G: thank you very much for ALL of this. I have a question, if you please. When we see a lower index with an upper index using the same letter, we think that is a sum. But now, with index juggling, there is another perspective: when we see two different indices together, one upper and the other lower, do we also think of a sum, with the trick being index juggling? Is that correct?
@MathTheBeautiful 6 years ago
I'm not sure I understand the question. The same index appearing as a superscript and a subscript always implies summation.
@dennydaydreamer 5 years ago
Using delta to denote the contravariant metric tensor kind of goes against the notion of using one letter to denote the same object, no?
@Leon-qi5zx 3 years ago
Hi dear Professor Grinfeld, thank you for this brilliant series! I have a question which confuses me a lot as I study this video on index juggling. As you said, the inverse metric tensor raises indices, so for any vector v we have v = v^i e_i = (v_k g^(ki)) e_i = v_k (g^(ki) e_i) = v_k e^k = v_i e^i, where v^i are the vector components, e_i the basis vectors, e^i the covector basis, v_i the covector components, and g^(ki) the inverse metric tensor. However, this says that v = v^i e_i = v_i e^i, i.e., any vector can be expanded as a linear combination of either the vector basis or the covector basis. But v_i e^i is a covector! Doesn't this equality imply that a vector is actually a covector? We all know that vectors and covectors are obviously not the same thing, so isn't that a contradiction?
@MathTheBeautiful 3 years ago
Hi Leon, No, it's just the vector *v* itself. If you forget all of the covector stuff, you will find everything easier to understand. Pavel
@Leon-qi5zx 3 years ago
@@MathTheBeautiful Thank you so much for your reply! Can I understand it this way: in the above equation v = v^i e_i = v_i e^i, e^i is the contravariant basis vector, and it doesn't mean that it is a basis covector, but sometimes we write the basis covector as the same thing as the contravariant basis vector? Once again, thank you so much for your help, dear Professor Grinfeld!
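The point of this exchange, that v^i Z_i and v_i Z^i are the very same vector v, can be checked numerically. Here is a minimal sketch with an arbitrary example basis (all the numbers below are made up):

```python
import numpy as np

# Arbitrary example basis: rows are the covariant basis vectors Z_1, Z_2.
Z_cov = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
metric = Z_cov @ Z_cov.T               # Z_ij = Z_i . Z_j
Z_con = np.linalg.inv(metric) @ Z_cov  # contravariant basis Z^i = Z^ij Z_j

v = np.array([3.0, 5.0])               # the vector v in ambient coordinates
v_up = Z_con @ v                       # contravariant components v^i = v . Z^i
v_dn = Z_cov @ v                       # covariant components  v_i = v . Z_i

# Both expansions reconstruct the same vector v; there is only one object.
print(v_up @ Z_cov)                    # v^i Z_i
print(v_dn @ Z_con)                    # v_i Z^i
```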
@hssy2jrocker 8 years ago
I have been watching your videos for the past 10 hours. I am a bit confused about the difference between T(subscript i) and T(superscript i). What is it?
@hssy2jrocker 8 years ago
Also, I didn't quite pick up the part where you write Z(super k) as Z(super ki) Z(sub i). Which part should I refer to in order to understand that?
@MathTheBeautiful 8 years ago
+himangshu sarma Right at the beginning of this video. This notational system is so seamless, it can almost sneak by you.
@hssy2jrocker 8 years ago
I have a few doubts. Here, are we moving in the direction of defining tensors as multidimensional arrays instead of multilinear mappings? Or are both the same thing? I need it as part of Mathematical Physics, where I need Minkowski space, the Lorentz transformation as a coordinate transformation, and the EM tensor under a Lorentz transformation. Do you have them in this series? Also, excellent work. Very easily explained. :)
@Revan176 8 years ago
Why is a vector decomposed into covariant basis vectors and contravariant components (or the other way around)?
@planelsmederevo3705 3 years ago
Nice, reminds me of one of my math professors from when I was a student. 😀
@MathTheBeautiful 3 years ago
Were you a student at Drexel?
@planelsmederevo3705 3 years ago
I am an electrical engineer. I graduated in April 2000 from the Faculty of Electrical Engineering of the University of Belgrade, Serbia, but unfortunately I never attended a tensor course. Now I am trying to watch some lectures about GR, but I need to study this topic first. 😀🇷🇸 Sorry, I've never heard about Drexel, but if all the teachers there teach this way, it must be an excellent school.
@prabhatp654 4 years ago
You are considering these objects as numbers now again, which defeats the purpose of using tensors in the first place. So is it just laziness, or is the tensor really losing its geometrical sense now?
@MathTheBeautiful 4 years ago
The goal of Tensor Calculus is to empower analytical methods without giving up the geometric insight. Analytical methods work best with numbers so we are trying to get to numbers.
@prabhatp654 4 years ago
@@MathTheBeautiful This is not satisfying to me, because there is as much dependence of geometrical objects on coordinate systems as there is in linear algebra. Anyways, your constant replies to my comments surprise me. Thank you so much. I hope to interact with you in the future too as I continue through your lectures.
@MalcolmAkner 7 years ago
The question at the end left me unsatisfied with your answer. This is otherwise a fantastic series on tensors, but I've yet to see you once actually draw one of the possible ways to explain the difference between covariant and contravariant bases. I might be wrong here, so please correct me if I am, but I've thought the difference between them is in how you denote coordinates. Imagine you have a Cartesian basis, but one of the axes is at an angle ≠ 90 degrees; you'll run into an ambiguity. If you put a point in this space and try to draw the component lines to each of the axes, how do you do it? Tangential or normal to the axes? In a pure Cartesian basis this ambiguity melts away, since normal and tangential are separated by 90 degrees in that case and don't produce two different pictures. Is this the case? If it is, why have you left it out of this series? If it isn't, what was my maths professor trying to teach me? I appreciate the incredible pedagogic skill you have. Peace! :)
@MathTheBeautiful 7 years ago
You are asking several questions. I will answer a couple now and you'll let me know if that answers them all. A. For what the contravariant basis looks like in relation to the covariant basis, see Figure 5.3 in the textbook. The relationship is characterized by "mutual orthonormality". B. Even though this is a nifty insight, the beauty of algebra (and therefore tensor calculus) is that it can carry you through the analysis of a problem without continuous geometric insight. Geometric insight is important most of the time, but the ultimate tensor skill is to be able to move fluidly between the geometric and algebraic points of view. Index juggling is an example where temporarily letting go of the geometric picture and enjoying the algebra is a good idea. C. Geometrically, decomposition with respect to any basis works the same way: in 3D, you construct a parallelepiped grid that fills the entire space and see where the target vector falls on that grid. To do the decomposition one coordinate at a time, you draw a plane through the tip of the target vector parallel to the plane spanned by the other two basis vectors and see where it crosses the third basis vector. D. Algebraically, you accomplish the same task by dotting the target vector with the corresponding basis vector from the basis of the opposite "flavor".
@MalcolmAkner 7 years ago
A) I don't have the book and can't find it online so I can't see what figure 5.3 looks like. Any possibility of sending that specific figure? B) I thought it was a great insight when I studied this, since before that I had no clear idea of how to separate "covariant" from "contravariant" conceptually. It captures how it results in an invariant if you do what you speak of in D) and it's easily generalisable to any coordinate-system - from there on I can take the abstracted step and stop thinking about specific coordinate systems and their geometric insights, cause I know how to unpack it if I get unclear. I fully realize that the beauty of tensor calculus is to be able to manipulate these expression without consideration to a specific coordinate-system, which renders that specific geometric insight into an extremely special case. But how else can you think of the difference? You seem to speak about the difference as something more abstract. My question is specifically this one: How would you characterize the difference between covariant and contravariant? What are you asking your class to think of when these names are used? C) That was an interesting way to think of decomposition, I've never heard that one before! Thank you for an insightful answer, you write the way you speak! :)
@MathTheBeautiful 7 years ago
A: prntscr.com/er7ssv B. I have not thought about answers to these questions. They are primarily algebraic constructs. Their geometric interpretation does play a role in some areas, but not in this generic discussion. It's more important to NOT think about visualizing these vectors and instead focus on embracing algebra.
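Points C and D from this thread can be sketched numerically in 3D. The basis and target vector below are made-up examples: the geometric grid construction amounts to solving c^i Z_i = v, while the algebraic route dots v with the basis of the opposite flavor. Both give the same components:

```python
import numpy as np

# Made-up 3D covariant basis: rows are Z_1, Z_2, Z_3.
B = np.array([[1.0, 0.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])
v = np.array([2.0, 5.0, 7.0])  # arbitrary target vector

# Geometric route (C): the components c^i place v on the parallelepiped
# grid, i.e. they solve c^1 Z_1 + c^2 Z_2 + c^3 Z_3 = v.
c_grid = np.linalg.solve(B.T, v)

# Algebraic route (D): dot v with the contravariant (opposite-flavor)
# basis, whose rows satisfy Z^i . Z_j = delta^i_j.
B_con = np.linalg.inv(B).T
c_dot = B_con @ v

print(c_grid, c_dot)  # the two routes agree
```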
@ian-haggerty 4 years ago
Here are my notes for Lecture 4 drive.google.com/open?id=1sjWLRGvSY99HsvQ_gC8UgFuX8kW4MpKF Lot of content here, but I'm a forgetful soul - and I need to look at this stuff over and over again for it to sink in! Many thanks to Pavel
@MathTheBeautiful 4 years ago
Hi Ian, Outstanding work! Would you like me to send you a draft of the new version of the book (a complete rewrite). Pavel
@ian-haggerty 4 years ago
@@MathTheBeautiful Send it over! I'm no authority on the subject, but if you need a fresh perspective from a relative beginner, perhaps another pair of eyes could be of use. There are a few errors lurking in those notes; I'd personally like to flesh them out with some more applied examples. Putting the abstract in context makes things click for me.
@likeplayinghello04 2 years ago
@@ian-haggerty kby
@reinerwilhelms-tricarico344 9 years ago
dunno, but if you begin to juggle indices of the metric tensor itself it would be nice to use a different letter, like G_ij. Too many Z's for my taste. But otherwise it's very nicely presented. thanks.
@MathTheBeautiful 9 years ago
Reiner Wilhelms-Tricarico I'm sure you'll come around to our point of view!
@marxman1010 6 years ago
At 5:20, what is the real meaning of T dot (upper index j) (lower index k)? Shouldn't it be T (upper index j) dot (lower index k)?
@MathTheBeautiful 6 years ago
Are you referring to the third line? The index "i" came down, but remained the first index. So the dot is the placeholder for "i" in case it ever wants to come back up. The index "j" is the second index, so it belongs in its second slot.
@marxman1010 6 years ago
Yes, I mean the third line. But why, after contraction, is j placed after k? If T and Z are matrices, the multiplication result is T (upper index j) (lower index k) with j as the row index and k as the column index. I feel it is more acceptable to put j before k. For the multiplication in this case, T is required to be transposed to T_ji, so naturally j comes first.
@gautomdeka581 4 years ago
Does it have any proof?
@rkpetry 9 years ago
Try audio dub at 00:56 should be "T" not "I". (It seems important though replaced.) Less noticed is 02:44 should be "delta-I-K" not "...I-J". (It's corrected momentarily.) COMMENT: It seems curious that the tensor should never have complex numbers...
@MathTheBeautiful 9 years ago
Thanks for the catch. Can you dub post upload?
@rkpetry 9 years ago
Not that I know (I was thinking your video set was to become a DVD companion to the text), But, Annotations might suffice e.g. "He meant 'T' not 'I'..." (maybe like a running caption like translating from "Oopsinglish' to 'Ahhsenglish').
@isaackulp2885 3 years ago
There are so many Z's. It is hard to remember which one is the metric tensor and which ones are just arbitrary tensors.
@MathTheBeautiful 3 years ago
It's a valid point, but you'll change your mind soon.
@isaackulp2885 3 years ago
@@MathTheBeautiful I think I get it now. Z with one lower index is the covariant basis, Z with one upper index is the contravariant basis, Z with two lower indices is the covariant metric tensor, which is the dot product of Z_i and Z_j, and Z with two upper indices is the contravariant metric tensor, which is the dot product of Z^i and Z^j. Would there be any meaning to Z with both an upper and a lower index? Or would that just be delta?
@isreasontaboo 7 years ago
Wait, what? How is Zik = δik unless you're in Cartesian coordinates only? It's a different thing in polar or spherical coordinates, right?
@quententhomas7583 7 years ago
I was stumped by this when I first saw it. The point is that the 'identity' version of the Kronecker delta is d^i_j. (I'm denoting upper with the caret ^ and lower with underscore _.) When you juggle indices on the Kronecker delta, it's no longer 1 for i=j and 0 otherwise. The whole point of this video is that you can juggle indices using the metric tensor. You can raise an index by multiplying by the contravariant metric tensor Z^ij, or lower an index by multiplying by Z_ij. But then you can ask what happens if you use the metric tensor to juggle an index on the metric tensor itself. So you write Z^ij Z_jk. This expression gets contracted over j to give Z^i_k. But we also know that the covariant metric tensor and the contravariant metric tensor are inverses of each other, so multiplying them gives you the identity, i.e. the Kronecker delta (with one upper and one lower index). So we've got Z^i_k = d^i_k. But now look at what happens when we lower an index on d^i_j: d^i_j Z_ik = d_jk = Z_jk, so it seems Z = d.
@MathTheBeautiful 6 years ago
Quenten is exactly right. Z_ik = δ_ik in all coordinate systems, as an immediate consequence of index juggling, but δ_ik is no longer zeros and ones.
@debendragurung3033 6 years ago
OK, but does this work in an affine basis? I tried with an alternate affine basis; everything, including finding the components of a vector with respect to both the covariant and contravariant bases, worked. But the dot product didn't work as in the statement @11:08. Here's what I did. I defined an alternate basis e1 = (2,1) and e2 = (2,3). I worked out its dual basis, (3/4, -1/2) and (-1/4, 1/2). Then I introduced two invariant vectors in the standard basis, v = (2,2) and u = (3,4), whose dot product is 14. I worked out each of their components in the alternate basis and its dual basis. Then I tried to work out their dot product using the above formula, and I got two different answers, but not 14.
@MathTheBeautiful 6 years ago
Hi Debendra, I think you calculated the metric tensor incorrectly. (If A=[e1 e2] then M = AᵀA. I think you did AAᵀ instead.) Pavel
@debendragurung3033 6 years ago
I was trying to work out the relation u·v = u^i v^j Z_ij, using the basis e1 = (2,1) and e2 = (2,3). Considering them as the covariant basis, I calculated the corresponding dual basis (to obtain the contravariant basis pair), using the fact that if E = [e1 e2], then E^(-T) is a matrix whose column vectors are the dual basis. Then I formed the covariant metric tensor from the pairwise dot products of e1 and e2: Z_ij = [5 7; 7 13]. Then I introduced two vectors in standard-basis form, u = (3,4) and v = (2,2). Their components in the basis E, via u^i = E^(-1) u: for u, u^i = (1/4, 5/4), and for v, v^i = (1/2, 1/2). I even checked that the relation u^i = u·Z^i holds for both indices of u and v. But the dot product relation didn't work out: u·v = 14, but I got (u^i)(v^j) Z_ij = 12. Here I used the determinant |Z_ij|; I now see where I went wrong as I write this comment. Nevertheless, to finish: I worked out that u_i = (10, 18), using the method u_i = E^T u, and indeed u·v = u_i v^i = 14, and also u^i v_i = 14.
@MathTheBeautiful 6 years ago
I think you got it! ...but I summarized it for you anyway: drive.google.com/file/d/1ciKwT8UXJAbLO2fprvVmpA7YMaKthmWh/view?usp=sharing Just wanted to mention that you don't need the dual basis or the contravariant metric tensor here. Keep up the great work! Pavel
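The worked example from this thread can be reproduced in a few lines of numpy (same basis e1 = (2,1), e2 = (2,3) and vectors u = (3,4), v = (2,2) as above); all three index-juggled forms of the dot product return 14:

```python
import numpy as np

E = np.array([[2.0, 2.0],
              [1.0, 3.0]])       # columns are the basis vectors e1, e2
Z = E.T @ E                      # covariant metric Z_ij (A^T A, not A A^T)

u = np.array([3.0, 4.0])
v = np.array([2.0, 2.0])

u_up = np.linalg.solve(E, u)     # contravariant components u^i = (1/4, 5/4)
v_up = np.linalg.solve(E, v)     # v^i = (1/2, 1/2)
u_dn = Z @ u_up                  # covariant components u_i = Z_ij u^j
v_dn = Z @ v_up                  # v_i

print(np.einsum('i,j,ij->', u_up, v_up, Z))   # u^i v^j Z_ij = 14
print(u_up @ v_dn)                            # u^i v_i = 14
print(u_dn @ v_up)                            # u_i v^i = 14
```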