Matrix Norms : Data Science Basics

56,288 views

ritvikmath

1 day ago

Comments: 54
@florawang7603 3 years ago
Thank you this is the clearest video on matrix norms I've watched so far
@ritvikmath 3 years ago
Wow, thank you!
@dansantner 2 years ago
Agree. I have been watching Strang's lectures, and he skips so many conceptual steps that it's sometimes hard to follow. This filled in the gaps.
@robertc6343 3 years ago
We’re saying “OK, matrix, you’re allowed to have...” 🤣🤣🤣🤣 so stressful for her 🤣🤣
@angelmcorrea1704 4 years ago
Thanks for all these lectures, very clear.
@jtm8514 3 years ago
You make studying fun! Thank you so much, I loved watching this. After a bit it wasn't a chore; I was in bliss from how cool the math was.
@hba3415 3 years ago
I was struggling so hard to find anything on the internet. Thank God I found your video.
@sofiyavyshnya6723 4 years ago
Amazing video! Super brief and to the point and super clear! Thanks so much for all your help!
@ritvikmath 4 years ago
Glad it helped!
@teodorvijiianu41 1 year ago
I swear to god I watched the uni lecture 3 times and had no idea what they were talking about. In less than 10 minutes it now makes sense. Thank you!
@jacob_dmn 3 years ago
This channel changed my way of thinking. THANK YOU MAN
@haimteicherteicher4227 3 years ago
Very focused and clear explanation, much appreciated.
@ritvikmath 3 years ago
Glad it was helpful!
@tyflehd 3 years ago
Thank you for your clear and intuitive descriptions :)
@ritvikmath 3 years ago
You're most welcome!
@emmanuelamankwaaadjei3051 2 years ago
Precise yet detailed explanation. Great work.
@ЕгорРудица 3 years ago
Thank you bro! Huge Respect from Ukraine!
@邱越-n6u 3 years ago
Thank you! I am just wondering if a non-square matrix has a norm as well. Why did you use the matrix [ui ui ... ui] rather than the n×1 matrix [ui]?
@bobbysokhi7296 1 month ago
Great explanation.
@ritvikmath 1 month ago
Glad you think so!
@srs.shashank 2 years ago
This would be applicable only to square matrices, right? How do we calculate 2-norms for rectangular (non-square) matrices? Thanks!
@kamelismail3730 3 years ago
this is an absolute gem!
@manhhungnguyen4270 2 years ago
Thank you for the explanation. I have a question: does this spectral norm (2-norm) show the "size" of the matrix just like the Frobenius norm does? I know that the 2-norm of a matrix is its maximum singular value and that the Frobenius norm shows the "size" of a matrix, but I am confused when you use the 2-norm to compare matrices. When I want to compare two matrices, which one is better to use?
@youlongding 3 years ago
Could you give a detailed discussion of the property you mentioned?
@pulkitnijhawan653 3 years ago
Awesome intuitive explanation :)
@ritvikmath 3 years ago
Glad you liked it!
@maciejgarbacz8785 1 year ago
I don't think this video gives a good, human-like explanation. I eventually figured this out from other sources, so I will leave my attempt at an explanation here.

For vectors, you usually want to know how big they are, based on their length. Applying the Pythagorean theorem gives ||v||2 = sqrt(x^2 + y^2 + z^2 + ...), which is called the 2-norm (or Euclidean norm) and measures the length of a vector in n-dimensional space. As it turns out, this is not the only way to measure the length of a vector. There are other norms (and a generalized formula for the p-norm), like the 1-norm (Manhattan norm), which gives the total distance walked along axis-aligned straight lines: ||v||1 = |x| + |y| + |z| + ... . The last important one is the infinity-norm, which, loosely speaking, tells you how many moves a chess king on a plane would need to reach the tip of the vector: ||v||inf = max(|x|, |y|, |z|, ...).

Each of these norms has a visualization: the curve you get by setting the length of the vector to 1 and transforming the formula. For the 2-norm in 2D:
1 = sqrt(x^2 + y^2)
1 = x^2 + y^2
y^2 = 1 - x^2
y = sqrt(1 - x^2) or y = -sqrt(1 - x^2)
(Fun fact: that is the equation of a circle, and you can integrate it to get its area and hence the value of pi, which is what Newton did to calculate a precise value of pi for the first time in history. Watch "The Discovery That Transformed Pi" by Veritasium.)

As it turns out, it is also useful to measure how big the output of a matrix can get! For example, if the maximum length of an output vector were zero, you would know that the matrix sends every vector to 0, so it is useless in many cases. That is literally what a matrix norm is. The matrix 2-norm means that the length of the output vector is measured by the 2-norm method I explained above.

So to get that matrix norm, just plug in every possible vector of length 1 and find the output vector of maximum length. I hope I have helped someone in despair. I was just really frustrated that everyone on the internet reads out the formulas and hopes the viewer will memorize them without any understanding. If there is something wrong in my comment, don't hesitate to reply!
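The recipe in the comment above ("plug in every unit vector and find the output of maximum length") can be sketched numerically. A minimal NumPy sketch (the matrix `A` and all variable names are my own, for illustration only) that computes the three vector norms and brute-force estimates a matrix 2-norm by sampling random unit vectors, then checks against `numpy.linalg.norm`:

```python
import numpy as np

# Vector norms of v = (3, 4)
v = np.array([3.0, 4.0])
two_norm = np.sqrt(np.sum(v**2))   # Euclidean: sqrt(9 + 16) = 5.0
one_norm = np.sum(np.abs(v))       # Manhattan: 3 + 4 = 7.0
inf_norm = np.max(np.abs(v))       # Chebyshev: max(3, 4) = 4.0

# Matrix 2-norm: the maximum of ||A v|| over all unit vectors v.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
rng = np.random.default_rng(0)
samples = rng.normal(size=(100_000, 2))
units = samples / np.linalg.norm(samples, axis=1, keepdims=True)  # unit circle
estimate = np.max(np.linalg.norm(units @ A.T, axis=1))  # max output length

exact = np.linalg.norm(A, 2)  # largest singular value of A
print(two_norm, one_norm, inf_norm)
print(estimate, exact)        # the sample estimate approaches the exact value
```

The brute-force loop is only for intuition; in practice the matrix 2-norm is computed as the largest singular value, which is what `np.linalg.norm(A, 2)` returns.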
@juneshgautam8655 11 months ago
4:22 Could you please point me to a proof of that?
@SeidelMatheus 2 years ago
Great lesson!
@doxo9597 3 years ago
This was great, thank you!
@EmilioGarcia_ 4 years ago
Hi, I really enjoy your content! Quick question here: what do you mean by "bigger output"? Perhaps that with a matrix A you can span most of the y that belong to R^m using the vector x that belongs to R^n? A bit confused here, thanks for your help.
@thehorizontries4759 10 months ago
He means vectors with bigger norms
@user-or7ji5hv8y 3 years ago
Awesome and clear
@KorayUlusan 1 year ago
10/10 video!
@sandrasyperek1945 2 years ago
Great video. I like it. Concrete examples would have been nice. Thank you for making the video. :)
@maksymfigat 1 year ago
Thanks!
@fatihsarac662 1 year ago
Perfect
@Pukimaxim 4 years ago
What do you mean by decay when talking about negative 9 to the power of a large number?
@jako276 3 years ago
I believe he said "point 9", meaning 0.9, and by decay he means that 0.9^2 would be 0.81, 0.9^3 would be 0.729, and so on, thus "decaying" to zero as the exponent grows large.
@sirengineer4780 2 years ago
Great! Keep it up, bro.
@noway4715 4 years ago
Could the definition of a matrix norm come with an example?
@thomasheirbaut6612 5 months ago
insanneeeeeee! Legend
@Posejdonkon 1 year ago
Applicable and accent-free study material. Greatly appreciated!
@박병현-g2e 3 years ago
u r really cooool~~~~~
@tanvirkaisar7245 1 year ago
Could you please give me a detailed proof of the ||A|| ||B|| property?
@epsilonxyzt 4 years ago
Solving an example would be better than so much talk.
@cahitskttaramal3152 1 year ago
Couldn't understand most of it, but thanks anyway.
@bartlmyy 3 years ago
thanks, kisses
@Fat_Cat_Fly 3 years ago
The editing has jump cuts and feels uncomfortable to watch, but the course itself is superb!
@seankeaneylonergan1859 1 month ago
TL;DW: he doesn't show how to calculate the norm of a vector.
@broda680 3 years ago
Has anyone ever told you that you look like Kumar from the movie Harold and Kumar? :D
@psychwolf7590 3 years ago
I was thinking of the same actor!! They are so similar, omg
@aviator1472 3 months ago
Didn't understand anything.
@potreschmotre1118 3 years ago
thank you!