Expressing a projection onto a line as a matrix-vector product | Linear Algebra | Khan Academy

154,230 views

Khan Academy

Days ago

Expressing a projection onto a line as a matrix-vector product
Watch the next lesson: www.khanacadem...
Missed the previous lesson?
www.khanacadem...
Linear Algebra on Khan Academy: Have you ever wondered what the difference is between speed and velocity? Ever tried to visualize in four dimensions, or six, or seven? Linear algebra describes things in two dimensions, but many of the concepts can be extended into three, four, or more. Linear algebra implies two-dimensional reasoning; however, the concepts covered in linear algebra provide the basis for multi-dimensional representations of mathematical reasoning. Matrices, vectors, vector spaces, transformations, and eigenvectors/eigenvalues all help us to visualize and understand multi-dimensional concepts. This is an advanced course normally taken by science or engineering majors after taking at least two semesters of calculus (although calculus really isn't a prerequisite), so don't confuse this with regular high school algebra.
About Khan Academy: Khan Academy offers practice exercises, instructional videos, and a personalized learning dashboard that empower learners to study at their own pace in and outside of the classroom. We tackle math, science, computer programming, history, art history, economics, and more. Our math missions guide learners from kindergarten to calculus using state-of-the-art, adaptive technology that identifies strengths and learning gaps. We've also partnered with institutions like NASA, The Museum of Modern Art, The California Academy of Sciences, and MIT to offer specialized content.
For free. For everyone. Forever. #YouCanLearnAnything
Subscribe to KhanAcademy’s Linear Algebra channel: / channel
Subscribe to KhanAcademy: www.youtube.co...

Comments: 24
@Vulkura 12 years ago
You explained this better than my professor and teaching assistant. My hat goes off to you this day, sir.
@fashionvella730 1 year ago
Khan Academy, please can you tell me where you find all this stuff in such detail? In books there is always something missing, but in your videos you give the whole proof of everything.
@atiramgine 12 years ago
Khaaaaan. I love you Khan. I can almost feel my Lin Alg grade rising...
@doordie7560 9 years ago
I had never seen tutorials like this.
@stonemysterioserusss 9 years ago
do or die until about a week ago
@ratnakarbachu2954 3 years ago
Thanks, sir. God bless you.
@LAnonHubbard 13 years ago
@EvgenijM86 Sal is applying the projection to the first column of the identity matrix (the "x" basis vector [1 0]). The projection formula was already defined to be (vector_to_project "dot" unit_vector) * unit_vector. In this case vector_to_project is the "x" basis vector.
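A minimal sketch of the idea in this comment, with my own example unit vector u (the specific numbers are not from the video): project each standard basis vector onto the line spanned by u and use the results as the columns of the matrix A.

```python
import numpy as np

u = np.array([2.0, 1.0])
u = u / np.linalg.norm(u)          # unit vector along the line

def proj(v, u):
    """Projection of v onto the line spanned by the unit vector u: (v . u) u."""
    return np.dot(v, u) * u

e1 = np.array([1.0, 0.0])          # first column of the identity matrix (the "x" basis vector)
e2 = np.array([0.0, 1.0])          # second column of the identity matrix

A = np.column_stack([proj(e1, u), proj(e2, u)])   # columns are proj(e1) and proj(e2)
print(A)                            # 2x2 matrix representing the projection
```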
@scientist1492 8 years ago
Hello everyone, could someone explain how the solutions to (x·u)u = Ax could be taken as a span of [1 0] and [0 1], as explained in the video at the 9:47 point? Any links explaining the process would be helpful. Thank you.
@mingtianni 7 years ago
The answer is given in lecture 56. Here is a summary: T(x) = T(I*x) = T(x1*e1 + x2*e2) = x1*T(e1) + x2*T(e2) = [T(e1), T(e2)]*x, where I is the identity matrix and e1 and e2 are the basis vectors. The idea has been that any vector is a linear combination of basis vectors. Applying a linear transformation to that vector is equivalent to applying the same transformation to the basis vectors first and then combining afterwards.
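A quick numerical check of the summary above, using an arbitrary example unit vector u of my own choosing: for the projection T(x) = (x·u)u, the matrix [T(e1), T(e2)] reproduces T(x) as A @ x.

```python
import numpy as np

u = np.array([3.0, 4.0]) / 5.0                 # example unit vector
T = lambda x: np.dot(x, u) * u                 # the projection transformation

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
A = np.column_stack([T(e1), T(e2)])            # A = [T(e1) T(e2)]

x = np.array([2.0, -1.0])                      # any vector in R^2
print(np.allclose(A @ x, T(x)))                # True: A x equals (x . u) u
```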
@zhenyishen5102 7 years ago
Thank you, I have a better understanding of the projection now, and it has a strong connection with the matrix norm.
@alaaabdullahmohammedalsaad7447 3 years ago
You expressed the meaning of projection very well, thank you so much.
@97021840 15 years ago
excellent
@s0m0c 13 years ago
Thank you.
@mt2yo 4 years ago
Bravo, bravo, bravo! Very elegant explanation thank you so much!
@gekorio 15 years ago
I think I understand your thinking, but the dot product gives us a scalar, so it can't be a projection, right?
@alkalait 15 years ago
I've come to think of the dot product also as the projection of a onto b, as if b were a unit vector, but with that projection scaled by the length of b.
@martinhvezda6535 3 years ago
i love you
@alkalait 15 years ago
Yes, apologies. The projection is a vector, my mistake. Replace the word "projection" with "length of projection" in what I said before.
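A small check of the corrected statement above, with arbitrary example vectors of my own: a·b equals the signed length of the projection of a onto the direction of b, scaled by the length of b.

```python
import numpy as np

a = np.array([3.0, 1.0])
b = np.array([2.0, 2.0])

b_hat = b / np.linalg.norm(b)                  # unit vector in the direction of b
proj_length = np.dot(a, b_hat)                 # signed length of the projection of a

print(np.isclose(np.dot(a, b), proj_length * np.linalg.norm(b)))   # True
```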
@suyashneelambugg 5 years ago
You don't explain the reason why you chose the identity matrix for finding A, or what happens if we use any other matrix instead of the identity.
@mt2yo 4 years ago
What happens if you choose another matrix?
@saakethchintha5385 4 years ago
He chose the identity matrix because it's also the standard matrix in R2, meaning whatever could be done to the standard matrix could be done to any other matrix in R2.
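A sketch of my own (not from the video) of what changes if you apply the projection to the columns of some other invertible matrix M instead of the identity: column by column you get A @ M rather than A, so recovering A requires an extra multiplication by M's inverse. Using the identity skips that step.

```python
import numpy as np

u = np.array([1.0, 2.0]) / np.sqrt(5.0)        # example unit vector along the line
T = lambda x: np.dot(x, u) * u                 # projection onto the line
A = np.column_stack([T(np.array([1.0, 0.0])), T(np.array([0.0, 1.0]))])  # from identity columns

M = np.array([[2.0, 1.0],
              [0.0, 3.0]])                     # some other invertible matrix
cols = np.column_stack([T(M[:, 0]), T(M[:, 1])])   # applying T to M's columns gives A @ M, not A
print(np.allclose(cols, A @ M))                # True
print(np.allclose(cols @ np.linalg.inv(M), A)) # recovering A needs M^{-1}
```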
@Judobrinez 8 years ago
... ?
@Judobrinez 8 years ago
***** nice ;D
"Идеальное" преступление
0:39
Кик Брейнс
Рет қаралды 1,4 МЛН
«Жат бауыр» телехикаясы І 26-бөлім
52:18
Qazaqstan TV / Қазақстан Ұлттық Арнасы
Рет қаралды 434 М.
Eigenvectors and eigenvalues | Chapter 14, Essence of linear algebra
17:16
16. Projection Matrices and Least Squares
48:05
MIT OpenCourseWare
480K views
Vector Projections : Data Science Basics
14:58
ritvikmath
70K views
The deeper meaning of matrix transpose
25:41
Mathemaniac
402K views
A Swift Introduction to Geometric Algebra
44:23
sudgylacmoe
903K views
Dot products and duality | Chapter 9, Essence of linear algebra
14:12
3Blue1Brown
2.7M views
Change of basis | Chapter 13, Essence of linear algebra
12:51
3Blue1Brown
2M views