
What do Matrices Represent? - Learning Linear Algebra

4,873 views

Mu Prime Math

1 day ago

This video is about why we use matrices and how every matrix is related to a function that takes vectors as inputs. Understanding what a matrix represents is important in order to learn about the more advanced ideas in linear algebra!
Learning Linear Algebra playlist: • Learning Linear Algebra
Subscribe to see more new math videos!
Music: OcularNebula - The Lopez
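
To make the description concrete, here is a minimal sketch (in Python; an illustration, not code from the video) of a 2-by-2 matrix acting as a function that takes vectors as inputs. The columns of the matrix record where the basis vectors (1, 0) and (0, 1) are sent.

    # A 2x2 matrix viewed as a function from R^2 to R^2.
    A = [[2, 1],
         [0, 3]]

    def apply(matrix, v):
        # Matrix-vector product: entry i of the output is the dot
        # product of row i of the matrix with the input vector.
        return [sum(matrix[i][j] * v[j] for j in range(len(v)))
                for i in range(len(matrix))]

    print(apply(A, [1, 0]))  # [2, 0] -- the first column of A
    print(apply(A, [0, 1]))  # [1, 3] -- the second column of A
    print(apply(A, [1, 1]))  # [3, 3] -- linearity: the sum of the columns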

Comments: 31
@sanelprtenjaca9776
@sanelprtenjaca9776 3 years ago
Bravo! Your channel and 3Blue1Brown are just a perfect combination.
@eswyatt
@eswyatt 3 years ago
The difference is that after watching Hayden, you might actually be able to do some math. Sorry I had to ...
@friedrichbaumgarten8886
@friedrichbaumgarten8886 2 years ago
Mu Prime is better
@impc
@impc 3 years ago
BOOM! Very well explained! Thank you for clearing the confusion that has been in my mind for years!
@bjornchan3083
@bjornchan3083 3 years ago
Holy shit, I'm studying matrices and your video is such a huge help! Thankssssss for uploading this video!!!! Love your channel!
@ohno8774
@ohno8774 3 years ago
Your videos are incredible, they give such good foundational understanding
@BuddyNovinski
@BuddyNovinski 2 years ago
I could have used this in late 1976. I took linear algebra without even knowing what a vector was! Yes, Grant Sanderson has an excellent channel, but now I'm thinking that I wish my professor had been left-handed. David Shirokoff and this fellow seem better at explaining the whole picture of mathematics rather than getting lost in the details. Supposedly our right brain sees the whole picture and dominates the left side. So now I finally understand what a matrix is.
@mrwclasseshazaribagh4105
@mrwclasseshazaribagh4105 3 years ago
You are a very, very good explainer. You are one of the best mathematics teachers on YouTube, like 3Blue1Brown and Mathologer.
@estudematematica
@estudematematica 3 years ago
What a great lesson! Congratulations! 👏👏👏
@flamewings3224
@flamewings3224 2 years ago
When I first learned about matrices, it was like learning a new language: knowing the text without knowing what the words mean. But now I understand. Maybe not exactly, but close to the truth of why I'm learning about matrices. Many thanks for your great work!
@larzcaetano
@larzcaetano 3 years ago
It is always amazing to get back to linear algebra and study matrix-related stuff! I would love to see you explain some tensor algebra someday! Amazing video!!!!
@zahraakhalife9150
@zahraakhalife9150 2 years ago
Wow! Amazing explanation! 😍😍😍
@adolfocarrillo248
@adolfocarrillo248 2 years ago
Crap!😂 I just realized that I don't know anything about linear algebra!!! Thanks for sharing your knowledge; you have an acute sense of reasoning. I'm really dazzled by your insight.
@programmer4047
@programmer4047 1 year ago
Can you please make a video on minors, cofactors, and adjoints, and why we need them? Why do we use them to calculate determinants, and how are they related to determinants?
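
For concreteness, a minimal Python sketch (not from the video) of the cofactor expansion the question refers to: the minor of entry (i, j) is the determinant of the matrix with row i and column j deleted, and expanding along the first row sums each entry times its signed minor.

    def minor(m, i, j):
        # The matrix m with row i and column j removed.
        return [row[:j] + row[j+1:] for k, row in enumerate(m) if k != i]

    def det(m):
        # Base case: the determinant of a 1x1 matrix is its entry.
        if len(m) == 1:
            return m[0][0]
        # Laplace expansion along row 0: sum of entry * cofactor,
        # where the cofactor is (-1)^(0+j) times the minor's determinant.
        return sum((-1) ** j * m[0][j] * det(minor(m, 0, j))
                   for j in range(len(m)))

    print(det([[1, 2], [3, 4]]))  # 1*4 - 2*3 = -2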
@mrboyban
@mrboyban 2 years ago
Thanks a lot, man
@rktiwa
@rktiwa 3 years ago
Some tutor, yaar! Out-and-out awesome!
@joetursi9573
@joetursi9573 2 years ago
Bravo. So complete.
@vasuhardeo1418
@vasuhardeo1418 3 years ago
Amazing vid, love the explanation.
@kingplunger6033
@kingplunger6033 3 years ago
Thank you, really helpful and concise :)
@algoexpert
@algoexpert 3 years ago
This guy is just amazing!!!!
@oxbmaths
@oxbmaths 3 years ago
Brilliant explanation!
@cantcommute
@cantcommute 3 years ago
Oh this explains so much!
@mrwclasseshazaribagh4105
@mrwclasseshazaribagh4105 3 years ago
Can you make a video about the method for finding square roots of numbers? Why does that method work?
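
It is not clear which method the comment means; assuming an iterative one, here is a small Python sketch of Heron's (Newton's) method and why it works: if x overshoots sqrt(n), then n/x undershoots it, so their average is a better estimate.

    def heron_sqrt(n, iterations=20):
        # Assumes n > 0. If x > sqrt(n), then n/x < sqrt(n), so the
        # average of the two is closer to sqrt(n); repeating converges fast.
        x = n
        for _ in range(iterations):
            x = (x + n / x) / 2
        return x

    print(heron_sqrt(2))  # ~1.4142135623730951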
@patolorde
@patolorde 3 years ago
I always wondered that.
@AjayKumar-jb3qe
@AjayKumar-jb3qe 3 years ago
How can I design my studio like yours...
@joetursi9573
@joetursi9573 2 years ago
Now you see what unit vectors are. They are the column vectors that are 2 by 1.
@HolyG-sus
@HolyG-sus 3 years ago
Let's do this
@argamnex
@argamnex 3 years ago
Feels like a 3Blue1Brown video but drier, hmm...
@eswyatt
@eswyatt 3 years ago
Yeah, learning is a bit harder than just having someone trigger a faux-intuitive epiphany that makes you say "wow, I knew it all along!"
@angelmendez-rivera351
@angelmendez-rivera351 2 years ago
In actuality, it is conceptually more fundamental than this. Your explanation of a vector space described vectors as being tuples of numbers. That is not what a vector is. A vector space just consists of arbitrary objects you can add and scale by some numerical factor, an element of a field. As long as the operations are well-defined and satisfy the vector space axioms, it does not matter what the vectors are. They could be tuples of numbers, but they could also be sequences, tuples of tuples, functions of real numbers, and, funnily enough, even matrices. In fact, you can have a vector space whose elements are vector spaces themselves. If I have a collection of apples, and I give this collection of apples some form of addition and a way to scalar multiply, then I have a vector space whose vectors are my apples. That is, fundamentally, what a vector space is. In color science, we deal with infinite-dimensional vector spaces, but the vectors are not tuples of numbers or arrows in space. The vectors are the colors themselves. Color vision is a phenomenon that can be completely formalized as a projection between vector spaces.

A matrix, meanwhile, is a function that takes tuples of indices and maps them to a single number from a field. The tuples do not have to be 2-tuples either. They could be 3-tuples. With 3-tuples, you can have things like 2-by-3-by-4 matrices, whose entries are arranged not in a rectangular 2-dimensional array but in a 3-dimensional array that looks like a rectangular prism. That is just what matrices are. A different question is to explain why matrices are important, and how they are related to vector spaces. If you have an n-dimensional vector space, then the space of n-by-1 column matrices just so happens to be isomorphic to that n-dimensional vector space. If you have a fixed basis, then you can arrange the coefficients of any linear combination of that basis into the entries of the n-by-1 column matrix. That is what allows us to represent multilinear operators between finite-dimensional vector spaces as matrices. The vector space could be R^n, but it could be something completely different.

Also, the basis vectors do not have to be (1, 0) and (0, 1). I could just as easily say the basis vectors are (1/2, 1/2) and (1/2, -1/2). It does not matter. As long as they are linearly independent and they span the space, they can be arbitrarily chosen as basis vectors. The idea is simply this: with multilinear operators, all the information you need to uniquely identify them is what they do to the basis vectors, regardless of what you choose those basis vectors to be. How you represent an operator as a matrix depends on the basis you choose, just as how you represent a vector as a matrix depends on the basis you choose, but the magic of matrices is that regardless of the choice, the structure of the equations will always look the same, and the operations between the matrices will always be the same.

Finally, the importance of linear operators in higher-dimensional vector spaces is the same as that of linear functions in 1-dimensional calculus. Any sufficiently smooth function between two vector spaces, regardless of whether that function is linear or not, can be expanded into a Taylor series, and the linearization that gives rise to this expansion uses linear operators. This gives matrices additional importance as functions between vector spaces.

This is why the definition of the derivative of f at p that you find in many textbooks for functions f : R^n -> R^n is that there exists a linear operator A(p) on R^n such that lim ||f(x) - f(p) - A(p)·(x - p)||/||x - p|| (x -> p) = 0. This definition actually works if f is a function between any two normed vector spaces, but if the vector spaces are finite-dimensional, then A(p) is always representable as a matrix.
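
The limit in the last paragraph can be checked numerically. A small Python sketch (with a hypothetical f chosen for illustration, not anything from the video): for f(x, y) = (xy, x + y^2), the operator A(p) is the Jacobian matrix, and the ratio shrinks as x approaches p.

    import math

    def f(x, y):
        # A hypothetical smooth map R^2 -> R^2, chosen for illustration.
        return (x * y, x + y ** 2)

    def jacobian(x, y):
        # Partial derivatives of f arranged as the linear operator A(p).
        return [[y, x],
                [1, 2 * y]]

    def ratio(p, x):
        # ||f(x) - f(p) - A(p)(x - p)|| / ||x - p||
        A = jacobian(*p)
        d = (x[0] - p[0], x[1] - p[1])
        Ad = (A[0][0] * d[0] + A[0][1] * d[1],
              A[1][0] * d[0] + A[1][1] * d[1])
        fx, fp = f(*x), f(*p)
        num = math.hypot(fx[0] - fp[0] - Ad[0], fx[1] - fp[1] - Ad[1])
        return num / math.hypot(d[0], d[1])

    p = (1.0, 2.0)
    for t in (1e-1, 1e-2, 1e-3):
        print(ratio(p, (p[0] + t, p[1] + t)))  # ~0.1, ~0.01, ~0.001 -> 0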
@MuPrimeMath
@MuPrimeMath 2 years ago
This video considers finite matrices, which represent linear transformations of finite-dimensional vector spaces. Every finite-dimensional vector space is isomorphic to F^n, where F is the field of scalars and n is the dimension of the vector space, and this isomorphism enables us to represent elements of an arbitrary finite-dimensional vector space as tuples of elements in the field of scalars. Matrices in the standard basis for F^n can then be viewed as linear transformations of an arbitrary finite-dimensional vector space via conjugation by this isomorphism.
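
A small sketch of the isomorphism idea in this reply (an illustrative example, assuming the basis {1, x, x^2}): polynomials of degree at most 2 form a 3-dimensional vector space, sending a polynomial to its coefficient tuple is the isomorphism to R^3, and under it the abstract linear map d/dx becomes an ordinary 3-by-3 matrix.

    # In the basis {1, x, x^2}, column j of D holds the coefficients
    # of the derivative of the j-th basis vector:
    # d/dx(1) = 0, d/dx(x) = 1, d/dx(x^2) = 2x.
    D = [[0, 1, 0],
         [0, 0, 2],
         [0, 0, 0]]

    def apply(matrix, v):
        # Ordinary matrix-vector multiplication on coordinate tuples.
        return [sum(matrix[i][j] * v[j] for j in range(len(v)))
                for i in range(len(matrix))]

    # p(x) = 3 + 5x + 7x^2 has coordinates (3, 5, 7);
    # its derivative 5 + 14x should have coordinates (5, 14, 0).
    print(apply(D, [3, 5, 7]))  # [5, 14, 0]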