The unreasonable effectiveness of linear algebra.

170,110 views

Michael Penn

5 months ago

To apply for an open position with MatX, visit www.matx.com/jobs.
🌟Support the channel🌟
Patreon: / michaelpennmath
Channel Membership: / @michaelpennmath
Merch: teespring.com/stores/michael-...
My amazon shop: www.amazon.com/shop/michaelpenn
🟢 Discord: / discord
🌟my other channels🌟
mathmajor: / @mathmajor
pennpav podcast: / @thepennpavpodcast7878
🌟My Links🌟
Personal Website: www.michael-penn.net
Instagram: / melp2718
Twitter: / michaelpennmath
Randolph College Math: www.randolphcollege.edu/mathem...
Research Gate profile: www.researchgate.net/profile/...
Google Scholar profile: scholar.google.com/citations?...
🌟How I make Thumbnails🌟
Canva: partner.canva.com/c/3036853/6...
Color Palette: coolors.co/?ref=61d217df7d705...
🌟Suggest a problem🌟
forms.gle/ea7Pw7HcKePGB4my5

Comments: 235
@MichaelPennMath 5 months ago
To apply for an open position with MatX, visit www.matx.com/jobs.
@nripdave673 5 months ago
Isn't this website truly for math? And is there any age restriction for applying?
@oni8337 5 months ago
Truly a video a representation theorist would've made
@AutoDisheep 5 months ago
Thank you for this post. I didn't even know linear algebra could do so much. I knew it was great math, but I was just ignorant of how great it is.
@jneal4154 5 months ago
Wow. Thanks! Just the kind of work I'm looking for.
@jaideepshekhar4621 5 months ago
The explanations could have been clearer.
@felipelopes3171 5 months ago
For people who want to know more, what Michael Penn is hinting at is called Representation Theory. One very popular line of attack to classify mathematical structures is to represent them as compositions of linear transformations in vector spaces. In many cases of interest, you can prove that if you cannot find a representation with certain properties, then it means that the thing you are trying to study does not have an important property. And since studying representations is much easier than studying the abstract structure, it simplifies things a lot. That's how Fermat's Last Theorem was ultimately conquered. They reduced the problem to the nonexistence of a given structure, and through some long arguments could reduce it to properties of the representations, which could be brute forced to prove no solution would exist.
@Wielorybkek 5 months ago
"In many cases of interest, you can prove that if you cannot find a representation with certain properties, then it means that the thing you are trying to study does not have an important property" - is this some kind of theorem? Could you throw out some keywords so I could learn more?
@felipelopes3171 5 months ago
@@Wielorybkek Look at the Cartan subalgebra of a finite dimensional Lie algebra, in particular it's defined in terms of a property of a representation of a Lie algebra. It's the properties of the Cartan subalgebra that allow us to classify finite dimensional Lie algebras. This is widely considered one of the most powerful results in Lie theory.
@jplikesmaths 5 months ago
Representation theory will be my thesis. Writing about rep theory of special linear groups.
@repbacchista 5 months ago
that was a nice comment! gonna look into it! thanks! =D
@emanuellandeholm5657 5 months ago
I'm reminded of linear filtering done in frequency-like domains instead of convolution in the original domain. Transform your filter kernel, transform your signal, pointwise multiply, inverse transform. Where "transform" is something like the FFT. You can do this operation in blocks and then there are various methods with various tradeoffs used to "stitch" together the inverse transform blocks.
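The pipeline this comment describes - transform the kernel, transform the signal, pointwise multiply, inverse transform - can be sketched in a few lines of NumPy. This is an editor's illustration with made-up data; the block-processing and stitching methods (overlap-add/overlap-save) mentioned above are omitted:

```python
import numpy as np

def circular_convolve_direct(x, h):
    """Circular convolution computed directly in the original domain."""
    n = len(x)
    return np.array([sum(x[j] * h[(i - j) % n] for j in range(n))
                     for i in range(n)])

def circular_convolve_fft(x, h):
    """Transform, pointwise multiply, inverse transform."""
    return np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)).real

rng = np.random.default_rng(0)
x = rng.standard_normal(64)   # signal
h = rng.standard_normal(64)   # filter kernel

# Pointwise multiplication in the frequency domain equals convolution
assert np.allclose(circular_convolve_direct(x, h), circular_convolve_fft(x, h))
```

The FFT route costs O(n log n) against O(n^2) for the direct sum, which is the whole point of filtering in a frequency-like domain.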
@godelianconfucianism8184 5 months ago
"If you can reduce a mathematical problem to a problem in linear algebra, you can most likely solve it, provided you know enough linear algebra". This was a quote in the preface of Linear Algebra and its Applications by the great mathematician Peter D. Lax. It was my first book on the subject and that sentence stuck with me ever since
@trevoro.9731 5 months ago
But for real-life modelling, if you can reduce a problem to a problem in linear algebra, it very likely means that your understanding of the problem is wrong or you have made a huge mistake in your modelling.
@arthurswanson3285 5 months ago
@trevoro.9731 Bro, people have sent rockets to other planets, built trillion-dollar search engines, and built the compression algorithms you're using to watch this YouTube video right now with linear algebra. Huh?
@trevoro.9731 5 months ago
@@arthurswanson3285 First of all, you are talking about data abstraction, not modelling of real processes. I'm not sure about rockets, as that involves non-linear process modelling. As do electronics and other things.
@misterlau5246 5 months ago
🤓 My first linear algebra book was the very easy-to-read Grossman 🤗🤗🤓🤓🤓
@pik910 5 months ago
Wonderful sentence. I think of a lot of math as spatial analogies.
@michakuczynski2987 5 months ago
My quantum mechanics professor once mentioned that "we are very lucky that the fundamental laws of nature are expressed using the language of linear algebra". This video really changed my perspective on this matter...
@iyziejane 5 months ago
An old school perspective on this is that classical mechanics rapidly gives rise to nonlinear differential equations, like the pendulum theta'' = -sin(theta), but the dynamics of quantum systems are always linear equations (time derivative of a state is equal to a matrix applied to the state). The traditional explanation is that the dimension of the matrix grows exponentially with the number of particles, and a compact nonlinear equation is in some sense better to work with than an exponentially large linear set of equations. But yes it could be that quantum mechanics is linear because that's the only part of it we can access (like a tangent approximation to a full theory, Hilbert spaces as approximations to Kahler manifolds).
@TheThreatenedSwan 5 months ago
I hate that way of thinking of things when people mean it. It's kind of like how the question "is light a wave or a particle?" doesn't make sense as a question. People take these conceptions of things that are systematized, that are useful in describing a reality which is immanent, and run with them, getting further and further away from the point of science. It's like how, since science has become high status, a bunch of people now rush in and refer to "the science" or the form of science (sometimes not even that) without the essence.
@boomerzilean 5 months ago
When you learn linear algebra as a student you don't really get this impression, but I think it's an important thing to realise when you study mathematics: linear algebra is conceptually extremely easy and is basically "solved" as a subject.
@Rudenbehr 5 months ago
When I was studying it, it felt like we were just doing different versions of the same matrix multiplication/addition/subtraction at the beginning of the course, but obscured through vocabulary and proofs.
@dougsherman1562 5 months ago
So true, in my path to a physics degree, we really didn't spend lots of time on linear algebra. I was always fascinated by it and appreciate these videos. Once I retire it will be fun to study this again. Thanks for this video!
@Myrslokstok 4 months ago
I passed linear algebra, and I always wondered what real mathematicians do with representation theory; I had no idea I was so close!
@jessewolf7649 4 months ago
Not true. Particularly Applied Linear Algebra. “Applied” here still means pure mathematics: e.g., solving a linear system more efficiently than standard techniques under certain conditions. An extremely vibrant field. Actually used in AI, for example. (Which is a true “application” of pure mathematics).
@aweebthatlovesmath4220 5 months ago
Algebra is a really important area of math, and almost everything in algebra can be understood with linear algebra via representation theory. This makes linear algebra a really powerful tool!
@raphaelreichmannrolim25 5 months ago
If anyone here also likes Number Theory, look up the concept of Arithmetic Space which I invented in my book, Foundations of Formal Arithmetic.
@thirdeyeblind6369 4 months ago
@@raphaelreichmannrolim25 Is there an English version of your Masters Thesis? I am afraid the only copy I can find is in Brazilian Portuguese.
@raphaelreichmannrolim25 4 months ago
@@thirdeyeblind6369 Sadly, there isn't yet. I had the objective of translating it myself, but I haven't yet. The most fundamental concept exposed there is the nilpotent arithmetic space of order N. Its algebra behaves as a finite-dimensional projection of the infinite algebra of arithmetic operations. When I was researching this, I used this finiteness while applying abstract harmonic analysis and Gelfand theory to obtain trigonometric representations of functionals defined on the group of invertible arithmetic operations of these algebras. In particular, we can use this to represent any function obtained by additive and multiplicative convolutions, such as the Mertens function. However, despite the simplicity of the method, I didn't see how this could help us bound the Mertens function. I stopped working on the subject a few months ago.
@ke9tv 5 months ago
I've done a lot of numerical analysis in a long career. I've long claimed that 90% of the job is finding the integral transform that maps your impossible problem into linear algebra, and then letting a computer do the linear algebra. If asked what piece of the subroutine libraries that I would re-implement first if I didn't have them, I'd have to say that it's the singular value decomposition. It's the Swiss Army Knife of numerical linear algebra.
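The "Swiss Army Knife" use the comment alludes to can be illustrated with the classic truncated SVD: the Eckart-Young theorem says keeping the top singular values gives the best low-rank approximation. An editor's sketch with made-up data:

```python
import numpy as np

def best_rank_k(A, k):
    """Best rank-k approximation of A in the least-squares sense (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 5))  # exactly rank 2
A_noisy = A + 1e-6 * rng.standard_normal((6, 5))               # tiny perturbation

A2 = best_rank_k(A_noisy, 2)
# Truncating to rank 2 recovers the underlying signal almost exactly
print(np.linalg.norm(A - A2))
```

The same decomposition underlies least-squares solving, pseudoinverses, PCA, and numerical rank estimation, which is why it is often the first routine one would want back.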
@cosmicspectrum4507 5 months ago
Nice!
@MatheusOliveira-dk9zq 5 months ago
I have a question: why is the SVD important outside of general lossy compression or least norms, and how is it used in those cases? For example, you mentioned calculus; any keywords for those methods?
@christressler3857 5 months ago
Would you be willing to recommend books on this?
@PhilBoswell 5 months ago
At 16:50, should the adjacency matrix be adjusted a bit? It seems to suggest that node 1 is only connected to itself and node 5, missing the connection to node 2, and that node 2 is connected to itself…
@joelklein3501 5 months ago
Yep I think so too
@Alan-zf2tt 5 months ago
Thanks for your post - at first viewing I had involuntarily stopped listening, as Row_n = Column_n for the obvious 1
@spogel9981 5 months ago
And the second column should be 10010, because 2 is connected to 1 and 4.
@spogel9981 5 months ago
Many thanks for this video. Short remark: the second column should be 10010, because 2 is connected to 1 and 4.
@krisbrandenberger544 5 months ago
Yes. Node 2 is connected to 1 and 4, not itself and 4.
@janvesely3279 5 months ago
The best application of linear algebra is certainly functional analysis. It transforms the mess of differential/integral equations into something really elegant and easy to use.
@raphaelreichmannrolim25 5 months ago
If you are also interested in Number Theory, look up the concept of Arithmetic Space which I invented in my book, Foundations of Formal Arithmetic.
@jhfoleiss 5 months ago
That first example is so refreshing!
@user-gs6lp9ko1c 5 months ago
Wow! Great overview! My favorite applications of linear algebra: spherical geometry (makes the equations intuitive), Fourier analysis, multivariate Gaussian distributions, affine transformations of random variables, linear regression, and engineering problems combining some of the above, especially when the matrices can be manipulated to make the solution methods (almost) unreasonably elegant! 🙂
@ejuicy7673 5 months ago
I like to tell people who don't know so much math that pure maths can be divided into two topics: 1. Linear Algebra 2. Turning problems into Linear Algebra
@JalebJay 5 months ago
I remember using it for one-directional (directed) paths, with the trace of A^n being the number of ways to complete a cycle after n steps. Useful in one of my classes back in 2013.
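The fact this comment uses - the trace of the n-th power of an adjacency matrix counts closed walks of length n - can be checked on a tiny example (editor's illustration, using a triangle graph):

```python
import numpy as np

# Adjacency matrix of a triangle (3-cycle)
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])

# trace(A^3) counts closed walks of length 3:
# 3 starting nodes x 2 directions around the triangle = 6
walks3 = np.linalg.matrix_power(A, 3)
print(np.trace(walks3))  # 6
```

Dividing by 6 (3 rotations x 2 directions) gives the number of triangles in the graph, a standard trick in graph algorithms.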
@bigbroiswatchingyou2137 5 months ago
That is indeed a wonderful picture that you've drawn, thanks for the video!
@mathisnotforthefaintofheart 5 months ago
I found the idea of using Cayley-Hamilton to find the four square roots of a 2x2 matrix stunning, because the algebra behind it is ultimately so easy... Linear algebra is a must for every math-inclined person.
@raphaelreichmannrolim25 5 months ago
My favorite application of linear algebra is in the Foundations of Mathematics and Number Theory, through the concept of an Arithmetic Space, which I developed, showing a way to study the Peano axioms using linear algebra. Didn't solve the hardest problems yet, though! You can find it in the book Foundations of Formal Arithmetic
@icenarsin5283 5 months ago
Wow!!! I never saw this connection before. Integrating by using an inverse matrix!!! So awesome. Thank you!
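The trick this comment is excited about can be sketched concretely. On a basis where differentiation acts invertibly - here the span of {e^x cos x, e^x sin x}, an editor's example in the spirit of the video - integration is just applying the inverse of the derivative matrix:

```python
import numpy as np

# d/dx on the span of {e^x cos x, e^x sin x}, written in that basis:
#   d/dx(e^x cos x) =  1 * e^x cos x - 1 * e^x sin x
#   d/dx(e^x sin x) =  1 * e^x cos x + 1 * e^x sin x
D = np.array([[1.0,  1.0],
              [-1.0, 1.0]])

# Integrate e^x cos x (coordinate vector (1, 0)) by inverting D
coeffs = np.linalg.solve(D, np.array([1.0, 0.0]))
print(coeffs)  # [0.5 0.5]  ->  integral = (e^x / 2)(cos x + sin x)
```

This matches the classic integration-by-parts result for ∫ e^x cos x dx, with no integration by parts required.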
@muhammadkumaylabbas8513 5 months ago
Very fascinating. As usual a super interesting video!
@user-qp2ps1bk3b 5 months ago
your videos are a joy to watch!
@scottcentoni7478 5 months ago
My favorite application of linear algebra is quantum mechanics. Quantum chemistry basically is a huge eigenvalue problem. If you use a plane wave basis with periodic boundary conditions, you can do some of the calculations much more efficiently in momentum space using a fast Fourier transform.
@alexdefoc6919 5 months ago
When I started learning determinants, I began wanting to revert determinants back into matrices. So I studied them a lot, and I have a prototype for going from matrix equalities back to equation equalities in the reverse order - basically finding all the solutions of a 3-variable equation.
@kylebowles9820 5 months ago
Finally a video I could keep up with! There's a small error in the adjacency matrix in column 2, but this was a great video. I recently used linear algebra to least-squares fit a function on a non-Euclidean manifold. Linear algebra is unreasonably effective even in intrinsically nonlinear spaces lol
@goodplacetostop2973 5 months ago
17:57 Actually, that would be graph theory. But I also like to show the Fibonacci formulas with matrices! 18:01 Good Place To Stop
@einbatixx4874 5 months ago
How long have you been doing this by now actually?
@goodplacetostop2973 5 months ago
@@einbatixx4874 I thought it was around 2 years, but actually it's three and a half years 🤯
@anonymous_4276 5 months ago
Dude, I got busy with college and stuff so I couldn't really visit this channel much, and now after two years you're still doing this. Respect the dedication!
@ewthmatth 5 months ago
​@@einbatixx4874 "doing this" doing what?
@jarnorajala 5 months ago
Besides the adjacency matrix, a graph can be represented using the closely related Laplacian matrix. This has some mind-blowing applications. For instance, if you take the two eigenvectors corresponding to the two smallest nonzero eigenvalues of the Laplacian and use them as arrays of X and Y coordinates for the nodes, you get a really nice 2D representation of the graph that happens to be the solution to a particular optimization problem where the edges are springs.
@tzimmermann 5 months ago
Thanks, your answer led me to read a few articles on spectral graph theory, a subject I am more or less innocent of. It looks powerful; I'm gonna have a good time geeking this up!
@joshavery 4 months ago
What is the optimization problem called?
@xlerb_again_to_music7908 5 months ago
The concepts and notation of linear algebra were not taught to me at school, so at uni the jump into what the classes were doing hit me like a train and pushed me out of the course. Ten years later, I took a subject-related course (computing) - and it happened again, same problems; I would never pass that module, so I dropped out. This happened to a relative in 2023 - he had to quit after the 1st semester as he was utterly lost: he'd never used LA, and the class was already past what he knew and pulling away rapidly. His subject: finance. I went on to complete a PhD in a related topic - but the thing is, why is LA missing from some schools, yet assumed at uni??
@OmnipotentO 5 months ago
I regret not taking linear algebra in college. I was a biology major, but I love math and physics. I should've taken it as an elective.
@devon9374 5 months ago
Learn it now
@DistortedV12 4 months ago
Check out the Linear Algebra Done Right videos or Gilbert Strang's lectures... it's not too late (you just gotta devote Saturday mornings).
@MrJepreis 5 months ago
Brilliant! as always!
@phenixorbitall3917 4 months ago
Differential Equations is my favorite. WOW! THANK YOU SO MUCH FOR THIS VIDEO SIR ❤🧠
@TheLuckySpades 5 months ago
In a bunch of classes we would reduce parts of the problems to Linear Algebra and the proof would then be written as "proof via LinAlg"
@DaniErik 5 months ago
So hard to choose one application as my favorite, but I think Markov chains are definitely high up on my list.
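Markov chains are linear algebra at heart: the long-run behavior is the eigenvector of the transition matrix for eigenvalue 1. A minimal sketch with a made-up 3-state chain (the numbers are illustrative):

```python
import numpy as np

# Column-stochastic transition matrix: entry (i, j) = P(next = i | current = j)
P = np.array([[0.7, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.1, 0.2, 0.5]])

# The stationary distribution is the eigenvector of eigenvalue 1
vals, vecs = np.linalg.eig(P)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()            # normalize to a probability vector

print(np.round(pi, 3))        # taking one more step leaves pi unchanged
```

By the Perron-Frobenius theorem the eigenvalue 1 is the largest one for a stochastic matrix, so `argmax` picks out the stationary direction.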
@michaelaristidou2605 5 months ago
Excellent video!
@dominiquelaurain6427 5 months ago
My favorite application of linear algebra is intersecting conics in the 2D Euclidean plane. I do Euclidean geometry (with Python code, SageMath, and so on), and because I never found a better way, I use one simple piece of code I never tried to completely understand for getting the intersection points. It is based on the 3x3 matrix representation of conics. My second application is Cayley's conditions for Poncelet configurations. My third application would be quaternions represented as 3x3 matrices... more number theory, or a sequel reviewing Hamilton's work. My fourth application would be graph theory in computational geometry, like the part near the end of your video about paths in a graph (if M is the adjacency matrix with binary elements, then M^n gives paths of length n, and M + M^2 + ... is a matrix whose binary elements decide whether there is ANY path between two vertices).
@JosephCatrambone 5 months ago
The modulo-sum-to-matrix-multiplication trick blew my mind. I wish I'd known that years ago.
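The trick from the video that this comment refers to - representing addition mod N by rotation matrices - can be verified in a few lines (an editor's sketch; the clock size N is arbitrary):

```python
import numpy as np

N = 12  # clock arithmetic

def rep(k):
    """Represent k (mod N) as a rotation by 2*pi*k/N."""
    t = 2 * np.pi * k / N
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

# Adding mod N becomes multiplying matrices:
a, b = 9, 7
assert np.allclose(rep(a) @ rep(b), rep((a + b) % N))
```

The map k -> rep(k) is a group homomorphism from Z/NZ into 2x2 rotation matrices, which is exactly a (faithful) representation in the representation-theory sense discussed above.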
@jennifertate4397 3 months ago
Great vid. Thanks.
@5_inchc594 4 months ago
I’m enlightened. Thanks
@rogerr4220 5 months ago
I like the use of linear algebra to find a closed-form expression for the nth Fibonacci number: solving linear recurrences by turning them into a matrix, diagonalizing it, and computing its powers to give entries of the sequence.
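A quick sketch of what this comment describes: powers of the matrix [[1,1],[1,0]] generate Fibonacci numbers, and diagonalizing it (eigenvalues (1±√5)/2) yields Binet's closed form:

```python
import numpy as np

def fib(n):
    """F(n) read off from the n-th power of the Fibonacci matrix."""
    M = np.array([[1, 1], [1, 0]])  # int64: exact for n up to ~92
    return int(np.linalg.matrix_power(M, n)[0, 1])

# Binet's formula, the payoff of diagonalization
phi = (1 + 5 ** 0.5) / 2
psi = (1 - 5 ** 0.5) / 2
binet = lambda n: round((phi ** n - psi ** n) / 5 ** 0.5)

print([fib(n) for n in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
assert all(fib(n) == binet(n) for n in range(30))
```

Matrix powering also gives an O(log n) algorithm for F(n) via repeated squaring, which is how the "turn the recurrence into a matrix" idea pays off computationally.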
@tommychau1211 5 months ago
Days ago, I was thinking about the volume of a slanted/oblique cone. If it were a cylinder, the volume would obviously be the same as a regular cylinder's, by thinking of a stack of disks pushed sideways. After a little digging: it's called Cavalieri's principle, and it works for cones as well. So, from another perspective, I tried writing this pushing of a stack of disks as a matrix, which formed something called a "shear matrix". Amazingly, the determinant is 1, meaning there is no impact on the volume!
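The observation in this comment checks out immediately in code: a shear matrix is triangular with ones on the diagonal, so its determinant - the volume-scaling factor - is 1 for any amount of shear (the shear factor below is arbitrary):

```python
import numpy as np

# Shearing a stack of disks sideways: x' = x + k*z, y' = y, z' = z
k = 3.7
S = np.array([[1, 0, k],
              [0, 1, 0],
              [0, 0, 1]])

print(np.linalg.det(S))  # 1 (up to float round-off): shearing preserves volume
```

This is Cavalieri's principle in linear-algebra clothing: each horizontal slice is translated, never stretched.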
@sumdumbmick 5 months ago
this is why triangles are 1/2 * b * h regardless of how 'slanted' they are. the basic thing you've (partially) discovered is a shape constant. the shape constant for any rectangular n-dimensional figure is 1, but if you remove material from this to get other shapes, then a shape constant comes into play.

for ellipses it's pi/4, which relates both the area of the ellipse to the circumscribing rectangle and the circumference of the ellipse to the perimeter of the circumscribing rectangle, though if the ellipse has an eccentricity greater than 0 there's another shape coefficient at play as well. this is pretty well approximated by pi/4 * ((1+b/a)^2 + 4), or pi/4 * ((b/a)^2 + 2b/a + 5), where a is the semi-major axis and b is the semi-minor axis of the ellipse.

for triangles the shape constant is 1/2, and this only relates to area. for pyramids the shape constant is 1/3, and this only relates to volume. the reason it's 1/2 and 1/3 is that each is the inverse of the number of dimensions the shape lives in, and it's no coincidence that these are identical to the coefficients that appear in the power rule when taking integrals or derivatives of polynomials.

since a cone is just a pyramid with a circular base, its volume will always be pi/4 * 1/3 of the rectangular prism made of its height and the rectangle circumscribing its base. pi/4 is in fact so much more fundamental than pi that most slide rules that mark something related to pi mark either pi/4 or 4/pi, because it allows finding the areas or volumes of shapes much more easily than trying to use pi usually does.
@nicholaslear7002 5 months ago
Hey, great video! Would you mind sharing what brand of chalkboard you use in your videos?
@euanthomas3423 5 months ago
Fascinating insight. What do the eigenvectors mean in these situations?
@stevepa3416 4 months ago
There was an exercise in a parallel computing textbook that I solved with a pretty fun application of linear algebra. It's especially nice for showing people why we care about linear independence, and why we work over general fields F rather than just R or C.

The problem was: show that all cycles in hypercubes have an even number of edges. I imagine there's a more straightforward way to do it, but the idea is to build a boolean vector space. If your hypercube is d-dimensional, take its vertices as binary strings/tuples of length d; there are 2^d of those tuples. As vector addition take component-wise XOR, and as scalar multiplication logical AND. I'm pretty sure this is just Z mod 2 arithmetic, but idk if there's something I'm forgetting. Tl;dr: use the standard basis e_i, again with the Z mod 2 arithmetic component-wise and as your field, and bam - the hypercube is the span of your basis, and the vectors are the vertices of the graph. For the edges of the hypercube, take the product of the vertex set with itself to get all 2-tuples of vertices; the edges are the subset of pairs whose sum is a basis vector.

Now you're off to the races. Take any cycle of length p; we have to show p is even. The thing we have to work with is that, by definition of a cycle, you end where you start. And how did we define an edge? Adding its two endpoints gives a basis vector. So you can cook up an equation: if v1 is the start, then v1 + (a sum of basis vectors, one per edge) = v1. Add v1 to both sides; vi + vi equals the zero vector in this arithmetic, since any bit XOR itself is 0. So the sum of a bunch of basis vectors is 0 - possibly with many repeats, since you can cycle around a trillion times any which way. Collect the common ones, so you have a1 copies of e1 + a2 copies of e2 + ... + ad copies of ed = the zero vector. The only way that happens is if each ai is 0 mod 2. So all the ai are even, and any finite sum of even numbers is even.

Really fun for pure-math folks who only think of linear algebra for use in analysis, and really fun for CS/EE types who just think of it as a means to crunch matrices.
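The GF(2) setup above is easy to put in code: each edge of the hypercube flips exactly one coordinate (adds a basis vector mod 2), so any closed walk must use each basis vector an even number of times. An editor's empirical sketch in dimension 4, checking random closed walks:

```python
import random

d = 4  # dimension of the hypercube Q_d; vertices are d-bit tuples over GF(2)

def neighbors(v):
    """Each edge of Q_d adds one standard basis vector mod 2 (flips one bit)."""
    return [tuple((v[j] + (j == i)) % 2 for j in range(d)) for i in range(d)]

# Any closed walk sums basis vectors to the zero vector mod 2, so its
# length must be even. Empirical check with random closed walks:
random.seed(0)
for _ in range(200):
    start = v = tuple(random.randint(0, 1) for _ in range(d))
    length = 0
    while True:
        v = random.choice(neighbors(v))
        length += 1
        if v == start:
            break
    assert length % 2 == 0
```

The same parity fact also follows from the hypercube being bipartite (vertices split by parity of their bit count), which is the argument in disguise.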
@TheEternalVortex42 5 months ago
I wonder how much of this is that our puny human brains don't do well with nonlinear concepts. Thus the math that we've gotten good at happens to be the linear stuff. You could imagine an alternate universe in which we are better at nonlinear concepts in which linear math is but a tiny subset of what we focus on.
@Alan-zf2tt 5 months ago
I must admit to wondering if things really are chaotically linear - linearly chaotic? - with chaos inherent within a system merely by nesting a generator in the system. In other words, nested chaotic feedback. Wouldn't it be nice if nature played simply with simple things to beguile us, rather than creating monsters just out of sight?
@kylebowles9820 5 months ago
I deep-dived into Lie algebras for quaternions; everything made a lot of sense. It gives me hope we could one day master the nonlinear (although it's a much larger class than the linear, so maybe it's apples and oranges).
@ulrichtietz1327 5 months ago
our puny human brains can't even formulate the question to the answer "42" 😅
@mattsgamingstuff5867 4 months ago
Not a mathematician - I just like to mess with math sometimes. I was playing with some ideas for fun and stumbled upon modular addition and subtraction looking like rotation, from a different angle (it was an algebraic approach, with no matrices). I was interested in sets that generate their elements cyclically under a given operation (which might also have an inverse). I was playing with the idea that for some functions, repeated application of the derivative will yield the integral, and started trying to generalize and extend the idea (I figured if e^x, e^ix, sin(x), and cos(x) are examples, I could maybe do interesting stuff if I could get a bit more general). After I had a tentative list of axioms (for the behavior I was interested in), modular addition/subtraction and rotations (clockwise and counterclockwise) ended up among the examples satisfying them. Maybe one day I'll get to finish playing around with the ideas - and probably discover I'm going nowhere new and stumble upon something easily googlable, most likely.
@journeymantraveller3338 5 months ago
In statistics we have eigenvalues representing variances in factor analysis, the Cholesky decomposition, Jacobians, the Hessian matrix, the magical linear-regression hat matrix, and variance/covariance matrices.
@aarongracia4555 4 months ago
We need a video about graphs, pls 🙏🏼
@ArjenVreugdenhil 5 months ago
A simple but elegant application is representing f(x) = (ax + b)/(cx + d) as the matrix [[a b][c d]], thus making the group of Möbius functions isomorphic to PGL(2) (or to something much more exciting when working in a module over Z instead of a real/complex vector space).
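The payoff of this representation is that composing Möbius functions becomes matrix multiplication. A minimal sketch (editor's example; exact rational arithmetic via `fractions` to avoid float noise):

```python
from fractions import Fraction

def mobius(m):
    """Turn the matrix (a, b, c, d) into the function x -> (ax + b)/(cx + d)."""
    a, b, c, d = m
    return lambda x: (a * x + b) / (c * x + d)

def matmul(m1, m2):
    """2x2 matrix product, with matrices stored as flat tuples (a, b, c, d)."""
    a1, b1, c1, d1 = m1
    a2, b2, c2, d2 = m2
    return (a1 * a2 + b1 * c2, a1 * b2 + b1 * d2,
            c1 * a2 + d1 * c2, c1 * b2 + d1 * d2)

f = (2, 1, 1, 1)   # f(x) = (2x + 1)/(x + 1)
g = (1, 3, 0, 1)   # g(x) = x + 3
x = Fraction(5)

# Composing the functions = multiplying the matrices
assert mobius(matmul(f, g))(x) == mobius(f)(mobius(g)(x))
```

Note that scaling a matrix by any nonzero constant gives the same function, which is why the function group is the projective group PGL(2) rather than GL(2) itself.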
@Alkis05 5 months ago
That first example reminded me of category theory somehow. I bet there is a functor hiding in that situation, but I'm just a novice in CT.
@NityaKrishnaDas926 50 minutes ago
Thank you so much 🙏🙏🙏🙏🙏🙏🙏🙏
@marcotosini7156 5 months ago
The graphical representation appears to be incorrect, but the video is fantastic!
@mschuhler 3 months ago
great video, loved the derivative example! (as a side note, I think your 2nd column for node 2 in the last example should be {1 0 0 1 0})
@1vootman 4 months ago
My favorite math class in college, particularly because it was the easiest for me! I'm a visual thinker and LA suited my brain.
@shrayanpramanik8985 2 months ago
9:10 no no no no! I'm in love after seeing this.
@ronaldjorgensen6839 4 months ago
thank you
@crimfan 5 months ago
One of my professors in grad school---a famous numerical analyst---said that, with maybe a few exceptions like sorting, any applied problem that can't be turned into linear algebra can't be solved at all.
@OmniArmstrong 5 months ago
A core concept in frequentist statistics is that if you sufficiently understand your problem, then it can always be represented as a conditionally linear problem - maybe excepting problems involving fundamentally discrete data. This seems prima facie similar to the representation theory point brought up below.
@antoniusnies-komponistpian2172 2 months ago
This might actually help me with getting more familiar with analysis/calculus
@musicarroll 4 months ago
Linear algebra can also be thought of as mathematics in the small, i.e., local analysis. A large-scale structure like an n-dimensional smooth manifold looks like Euclidean space when you zoom in, and voilà, you can apply linear algebra!
@ripper5941 5 months ago
Linear algebra is the most fluid and versatile metaphorical skeleton that you can use to solve real-life scenarios.
@budstep7361 4 months ago
7:12 This is only so simple because he set up the V space as the identity matrix! For anyone curious.
@ComplexVariables 5 months ago
I definitely share this view with all my students; linear algebra is a serious power tool.
@KusacUK 5 months ago
And now I have a simple way of deriving the sum of angles formulae for sin and cos!
@jagatiello6900 5 months ago
12:00 Ha! This can be obtained from exp(tW)=I+tW+1/2!(tW)^2+... where I is the 2x2 identity matrix and W has first row (0 1) and second row (-1 0), by working out the matrix products of the expansion and identifying that exp(tW)=K(t) where K has first row (cos(t) sin(t)) and second row (-sin(t) cos(t)) from the Iwasawa decomposition, as in Proposition 2.2.5 of Bump's book Automorphic forms and representations.
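The identity this comment cites is easy to check numerically: with W = [[0,1],[-1,0]] one has W^2 = -I, so the exponential series collapses to exp(tW) = cos(t) I + sin(t) W, the rotation matrix K(t). An editor's sketch summing the series directly (fine for small matrices; `t` is arbitrary):

```python
import numpy as np

def expm_series(M, terms=30):
    """Matrix exponential via its power series (adequate for small matrices)."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k   # M^k / k!
        out = out + term
    return out

t = 0.83
W = np.array([[0.0, 1.0], [-1.0, 0.0]])
K = np.array([[np.cos(t),  np.sin(t)],
              [-np.sin(t), np.cos(t)]])

assert np.allclose(expm_series(t * W), K)
```

This is the one-parameter-subgroup picture: W generates the rotation group, and exponentiating the Lie algebra element t*W lands on the group element K(t).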
@Alan-zf2tt 5 months ago
Truly = I do not know. I have much to learn. I do appreciate its beauty.
@vladislavovich100 4 months ago
Fabulous!!!
@Wise4HarvestTime 5 months ago
I remember taking linear algebra in college and it opening up amazing possibilities in computer graphics, but this is pretty dense. I'm saying this 4 minutes in, and will continue watching to see if I get what he's saying before applying for a job with the sponsor 😅😂😂🤣
@terryrodgers9560 a month ago
I'm not a math professor, but I think trig is one of the most important concepts in mathematics (as a multivariable calculus student).
@carriersignal 3 months ago
I've always found mathematics to be a great subject and have always respected it for both what it is, and its usefulness. However, in the past I have always had trouble understanding some of it. Here lately, I have spent much more time studying the subject, and have realized that with enough effort, time and determination, you can get there.
@ANTONIOMARTINEZ-zz4sp 5 months ago
Machine Learning is a powerful application of linear algebra in the IT ecosystem.
@lanimulrepus 5 months ago
Very good video subject...
@nujuat 5 months ago
I feel like the most intuitive way to think about linear algebra is LTI systems, i.e., amplifiers. Let's say you want to put a sound through an amplifier and mix it with another one. Then it doesn't matter whether you mix the two sounds before or after you put them through the amplifier. That's all linearity means.

Now let's say you're Dr. Dre and want to pump the bass. What is an arbitrary sound going to look like after being put through the amplifier? No idea. But each individual frequency is just going to be multiplied or phase-shifted by some number. Therefore the frequencies are eigenvectors of the amplifier. That's all "eigenvector" means. The eigenvalues are just the multiplications and phase shifts. So you can simplify the calculation of what's going to happen to a sound by transforming to a frequency basis and doing your calculations there. That's all matrix diagonalisation is.
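The claim in this comment - that complex exponentials (pure frequencies) are eigenvectors of any LTI "amplifier" - can be verified directly: an LTI system with periodic boundary is a circulant matrix, and each DFT frequency is an eigenvector whose eigenvalue is the frequency response. An editor's sketch with a random impulse response:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
h = rng.standard_normal(n)  # impulse response of the "amplifier"

# Circulant matrix: applying C performs circular convolution with h
C = np.array([[h[(i - j) % n] for j in range(n)] for i in range(n)])

# Each complex exponential is an eigenvector of C, with eigenvalue equal
# to the frequency response at that frequency (the DFT of h)
freq_response = np.fft.fft(h)
for k in range(n):
    v = np.exp(2j * np.pi * k * np.arange(n) / n)   # pure frequency k
    assert np.allclose(C @ v, freq_response[k] * v)
```

In other words, the DFT diagonalizes every circulant matrix at once - the shared eigenbasis is exactly why filtering is done in the frequency domain.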
@MrFtriana
@MrFtriana 5 months ago
Many problems in physics can be studied with linear algebra. From Newtonian mechanics to the monsters called quantum field theory and relativity (special and general), linear algebra has proved to be a powerful tool for making predictions about Nature, because it has great unifying power. You can study coupled linear oscillatory systems and find their symmetries under linear transformations; in the end, this implies conserved quantities, in accordance with Noether's theorem.
@ArjenVreugdenhil
@ArjenVreugdenhil 5 months ago
Here is a physical application to explore: in geometric optics, represent a light ray by the vector (nu, h), where n = refractive index of the medium, u = slope of the light ray, and h = height at which the ray enters a given surface (relative to the optical axis). An optical system can then be described by a composition of matrices:
* [[1 P] [0 1]] for refraction, where P is the refractive power of the surface
* [[1 0] [-d/n 1]] for travelling through a medium, where d is the horizontal distance
For instance, a typical thin-lens situation is described as a product of four matrices: object distance, entering the lens, leaving the lens, image distance.
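A quick numeric sketch of the ray-matrix idea. Note the assumptions: I use the state vector (slope, height) in air (n = 1) and one common thin-lens sign convention, which differs slightly from the comment's (nu, h) bookkeeping; the focal length and object distance are made-up numbers.

```python
import numpy as np

def translate(d):
    # free propagation by distance d: height picks up d * slope
    return np.array([[1.0, 0.0], [d, 1.0]])

def thin_lens(f):
    # thin lens of focal length f: slope picks up -height / f
    return np.array([[1.0, -1.0 / f], [0.0, 1.0]])

f, s_o = 2.0, 3.0
s_i = s_o * f / (s_o - f)   # thin-lens equation 1/s_o + 1/s_i = 1/f gives 6

# object-space propagation, then the lens, then image-space propagation
M = translate(s_i) @ thin_lens(f) @ translate(s_o)

# Imaging condition: output height is independent of input slope,
# i.e. the slope-to-height entry of the system matrix vanishes.
print(M[1, 0])   # ~ 0
print(M[1, 1])   # transverse magnification
```

That vanishing entry is the whole "image formation" statement, and the remaining diagonal entry reads off the magnification directly from the matrix product.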
@cd-zw2tt
@cd-zw2tt 4 months ago
Could you theoretically use any periodic function instead of sin and cos in the modulo sum example? My thinking is you need pure sine and cosine to get a steady tick around the circle, but with other periodic functions you could have some sort of interesting "weighting function" on the inputs. With sine and cosine it's pure and steady, but with something like a triangle wave or a compound sine wave, you could induce some very strange behavior.
@minecraftermad
@minecraftermad 5 months ago
17:50: your matrix is off; it's saying that 2 connects to itself, when that 1 is supposed to be one slot higher.
@juandesalgado
@juandesalgado 5 months ago
If you ever browse over the "attention" paper on the transformers architecture in LLMs, the sentence about positional encoding that goes "... for any fixed offset k, PE_{pos+k} can be represented as a linear function of PE_{pos}" has some relation to the first application in this video.
@markharder3676
@markharder3676 5 months ago
Thanks for this lecture. The calculus application was something I never learned about before. A real eye-opener, that. In the first example, how do we know that the 4 trig-based functions actually span a 4-D space? What about linear independence? In the graph theory application, it seems to me that vtx 1 is also connected to vtx 2, which you did not include in the matrix.
@vangrails
@vangrails 5 months ago
I think that you need to prove that the spanning set is also a basis; that means you need to prove that those 4 functions are linearly independent. They are, so this spanning set is also a basis.
@alnitaka
@alnitaka 5 months ago
One can represent the elements of the Galois group of an equation as matrices.
@superuser8636
@superuser8636 5 months ago
Hi, how would you encode the +C ? You can point me to reference or calculate directly here, I will comprehend using notation. Also, never saw the integral vector matrix notation but follow you very easily. MS CS with BS CS (ML) BA Math (Stats). Wanted to take more real/complex/advanced matrix theory but ran out of time and had to get my career started before I hit 35. Love staying sharp with this content early mornings. TYVM❤
@user-uw1ut4ss2q
@user-uw1ut4ss2q 4 months ago
The existence of a basis for every vector space is equivalent to the axiom of choice, which seems to be unrelated to linear algebra.
@zachchamp93
@zachchamp93 4 months ago
This is like using Abstractions in Computer programming basically. Just representing and embedding computational algorithms as algebraic expressions
@ianfowler9340
@ianfowler9340 5 months ago
I remember the first time (Grade 13 high school) I saw the complex numbers, a+bi, represented by a 2x2 real matrix:
[ a  -b ]
[ b   a ]
All of the operations on the complex numbers match exactly with matrix operations. So simple and obvious once you see it, but that's what makes it so amazing. Modulus/determinant, De Moivre's theorem, rotation matrix. The list goes on. Even
[ 0  -1 ]
[ 1   0 ]
multiplied by itself gives
[ -1   0 ]
[  0  -1 ]
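That representation is easy to check in a few lines of numpy (a sketch; `cmat` is just a helper name made up here):

```python
import numpy as np

def cmat(a, b):
    # represent a + bi as the 2x2 real matrix [[a, -b], [b, a]]
    return np.array([[a, -b], [b, a]])

I = cmat(0, 1)    # the matrix playing the role of i
print(I @ I)      # equals -identity, the matrix version of i^2 = -1

z, w = cmat(1, 2), cmat(3, -1)
# matrix product matches complex multiplication: (1+2i)(3-i) = 5+5i
print(z @ w)
```

The determinant of cmat(a, b) is a^2 + b^2, which is why modulus and determinant line up.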
@tikeshverma4777
@tikeshverma4777 5 months ago
My fav is least-squares solutions.
@user-bk2fo7ny9s
@user-bk2fo7ny9s 5 months ago
my fav: dx(t)/dt = A x(t) solution is x(t) = exp(At) x(0)
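A small self-contained check of x(t) = exp(At) x(0). The truncated-series `expm` below is a toy substitute for a library routine (fine for small matrices), and the oscillator matrix and time are made-up numbers.

```python
import numpy as np

def expm(M, terms=30):
    # matrix exponential via truncated power series sum M^k / k!
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

A = np.array([[0.0, 1.0], [-1.0, 0.0]])   # harmonic oscillator: x'' = -x
x0 = np.array([1.0, 0.0])
t = 0.7
x_t = expm(A * t) @ x0                    # x(t) = exp(At) x(0)
print(x_t)                                # ≈ (cos t, -sin t) for this A
```

For this A, exp(At) is a rotation matrix, so the solution traces the familiar sinusoidal motion.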
@MS-sv1tr
@MS-sv1tr 5 months ago
Finding the integral by finding the inverse of the derivative matrix was kind of mind-blowing. I know your example was chosen to make it simple, but can this be generally applied as a way to compute integrals?
@rainzhao2000
@rainzhao2000 3 months ago
I remember being mind-blown just like you and wondering the same thing when I first saw this. I went down the rabbit hole of functional analysis and differential equations and I'm still digging. A neat way of thinking about this problem is to view it as solving the differential equation Df=g for the antiderivative f of the function g, and D is the derivative operator. Since differentiation is linear, Df=g now looks like a linear algebra problem where D is a linear operator, and f and g are vectors. In fact, in the context of functional analysis, functions are vectors belonging to vector spaces of functions appropriately called function spaces. If we can find a finite basis for the function space of f and g, then we can represent D as a matrix, g as a coordinate vector, and solve for f by matrix multiplying D inverse with g just like in the video. In general, these function spaces could be infinite dimensional and there's not always a useful basis to represent them, but the field of functional analysis has classified many kinds of function spaces with a variety of useful bases for solving differential equations.
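Here is the same idea on the smallest interesting function space, span{sin x, cos x} (a sketch with made-up coefficients):

```python
import numpy as np

# In the basis (sin x, cos x), the vector (a, b) stands for a sin x + b cos x.
# Differentiating: (a sin + b cos)' = -b sin + a cos, so the derivative
# operator is the matrix D with D (a, b) = (-b, a).
D = np.array([[0.0, -1.0], [1.0, 0.0]])

g = np.array([2.0, 3.0])    # g(x) = 2 sin x + 3 cos x
f = np.linalg.inv(D) @ g    # coordinates of an antiderivative of g
print(f)                    # [3, -2]: f(x) = 3 sin x - 2 cos x, and f' = g
```

On this space D is invertible (unlike on polynomials, where differentiation kills constants), which is exactly why the inverse-matrix trick works so cleanly here.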
@kilianklaiber6367
@kilianklaiber6367 5 months ago
Very nice idea! ;-)
@kilianklaiber6367
@kilianklaiber6367 5 months ago
My favorite application is quantum mechanics. That's why your first example immediately came to mind. Hilbert Spaces!
@Chris-mm6mn
@Chris-mm6mn 4 months ago
Are the bases arbitrarily chosen for each function?
@strikeemblem2886
@strikeemblem2886 5 months ago
The claim that dim V = 4 at 3:15 is unjustified. Wrong adjacency matrix at 16:50.
@D.E.P.-J.
@D.E.P.-J. 5 months ago
Another class of applications comes from algebraic topology. Algebraic topology uses linear algebra to study topological spaces.
@DaddyRaiden
@DaddyRaiden 5 months ago
That is really fucking interesting
@rainerzufall42
@rainerzufall42 5 months ago
The 2-node is messed up in the last example. 2 is not connected to 2, but to 1! So the upper left 2x2 square is not (1, 0; 0, 1), but (1, 1; 1, 0)...
@rainerzufall42
@rainerzufall42 5 months ago
BTW: It's interesting to calculate eigenvalues and eigenvectors of this matrix... For example the 5th eigenvalue is λ_5 = 0 with eigenvector v_5 = (0, -1, 1, 0, 1). On the other hand, the first eigenvalue is the biggest one, λ_1 = (1+sqrt(13))/2, with eigenvector v_1 = (λ_1, 2, 1, λ_1, 1). The other eigenvalues are λ_2 = (1+sqrt(5))/2, λ_3 = (1-sqrt(13))/2, and λ_4 = (1-sqrt(5))/2. But in some cases it may be better to ignore the self-loops; then λ_1 = -sqrt(3), λ_2 = sqrt(3), λ_3 = -1, λ_4 = 1, and λ_5 = 0.
@33_milly93
@33_milly93 5 months ago
Is that matrix at the end an incidence matrix?
@TheF3AR98
@TheF3AR98 5 months ago
Thank you a million BobbyBroccoli, this is the best documentary on us particle physics. I watched it nonstop ❤
@curtiswfranks
@curtiswfranks 5 months ago
Agreed.
@silpheedTandy
@silpheedTandy 3 months ago
whoa. your sponsor itself makes me curious: what future is there for custom LLM chips? if i actually had money, i'd be investing in those companies!
@grizzleyeasy4480
@grizzleyeasy4480 4 months ago
What the hell just happened? Is he a magician? Why did nobody tell me this? I had like 20 math subjects.
@nunoalexandre6408
@nunoalexandre6408 5 months ago
True!!!!!!!!!!!
@mathobey
@mathobey 5 months ago
I don't think that these examples show us "unreasonable effectiveness"; their effectiveness is very reasonable. Spaces of smooth functions naturally have the structure of vector spaces, and linear differential equations by definition arise from linear operators on these spaces. Same story with groups. On the one hand they have strong connections with rings (because there is the group ring construction, and a group action of G gives a ZG-module structure) and so with modules over rings (the theory of modules over rings is a generalization of linear algebra). On the other hand, vector spaces have a natural action of the automorphism group (also known as GL, the general linear group), and for every group G we can find a big enough space V and build a faithful representation G -> GL(V) (a "vectorification" of Cayley's theorem for groups). That's why the connections with group theory and the theory of differential equations are not surprising. What is REALLY surprising is that linear algebra helps us solve a lot of problems from discrete maths. For example the weak Berge conjecture (a graph is perfect iff the complement of the graph is perfect) has a linear-algebraic proof. Also spectral graph theory studies the spectra of graph matrices (a purely combinatorial construction; it's hard to see algebraic meaning in it) and gives us results about (for example) the inner structure of regular graphs. This is what we really can call "unreasonable effectiveness of linear algebra". Sorry for mistakes, English is not my native language.
@mikecaetano
@mikecaetano 5 months ago
Unbelievable would make better sense here than unreasonable. Linear algebra is highly effective; so effective that at times it may be difficult to believe just how effective it is.
@user-qc5qn7yp2z
@user-qc5qn7yp2z 5 months ago
8:46 Isn't that the wrong order of matrix and vector? Seems it should be (1,0,0,0)*D^-1
@quiksilverrandom
@quiksilverrandom 5 months ago
Isn't there a mistake in the matrix at 14:20 and following? The matrix shows no connection between points 1 and 2, but a connection between 2 and itself.
@ArduousNature
@ArduousNature 5 months ago
I think all three entries adjacent to the top-left corner in the final matrix are wrong (they should be flipped).
@ethanlipson1637
@ethanlipson1637 5 months ago
These examples are illuminating, but do we have an idea of why linear algebra is so well-suited as a proxy for understanding other mathematical objects?
@ClaraDeLemon
@ClaraDeLemon 5 months ago
Linear algebra is the study of vector spaces and the linear maps between them (matrices). Vector spaces are those where you can add and multiply by constants. Turns out, adding and multiplying by constants are so fundamental that everything that has a connection with them will be turned into linear algebra. Plus, matrices are very efficient at conveying information, so for example with graphs, any algorithm using adjacency matrices could be reasonably implemented just by inspecting the original graph itself, but matrices are handier to work with, both for humans and machines.
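As a concrete instance of an adjacency-matrix algorithm (the graph here is a made-up 4-cycle): powers of the adjacency matrix count walks, since the entry (A^k)[i, j] sums over all the intermediate vertices of length-k walks.

```python
import numpy as np

# Adjacency matrix of the 4-cycle 0-1-2-3-0 (no self-loops).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])

A3 = np.linalg.matrix_power(A, 3)
print(A3[0, 1])   # 4: the walks 0-1-0-1, 0-1-2-1, 0-3-0-1, 0-3-2-1
```

Counting these walks by hand means enumerating paths in the picture; the matrix power does the same bookkeeping mechanically.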
@claytongroves1044
@claytongroves1044 5 months ago
I've never seen adjacency matrices before, but in the last example, the encoding data, why is the second column (0,1,0,1,0) and not (1,0,0,1,0)? Thanks.
@mikecaetano
@mikecaetano 5 months ago
Looks like you found a mistake to me.