The Moore-Penrose Pseudoinverse - Topic 37 of Machine Learning Foundations

24,118 views

Jon Krohn

A day ago

This video introduces Moore-Penrose pseudoinversion, a linear algebra concept that enables us to invert non-square matrices. The pseudoinverse is a critical machine learning concept because it solves for unknown variables within the non-square systems of equations that are common in machine learning. To show you how it works, we’ll use a hands-on code demo.
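The full notebook is linked below; as a minimal NumPy sketch of the idea (not necessarily the exact code shown in the video), the pseudoinverse A+ = V D+ U^T can be assembled from the SVD like this:

import numpy as np

A = np.array([[-1, 2], [3, -2], [5, 7.]])      # a non-square (3x2) matrix

U, d, VT = np.linalg.svd(A)                    # NumPy returns V-transpose directly

D_plus = np.zeros(A.shape).T                   # 2x3 matrix of zeros
D_plus[:len(d), :len(d)] = np.diag(1 / d)      # reciprocal singular values on the diagonal

A_plus = VT.T @ D_plus @ U.T                   # A+ = V @ D+ @ U^T
print(np.allclose(A_plus, np.linalg.pinv(A)))  # True: matches NumPy's built-in pinv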
There are eight subjects covered comprehensively in the ML Foundations series and this video is from the second subject, "Linear Algebra II: Matrix Operations". More detail about the series and all of the associated open-source code is available at github.com/jonkrohn/ML-foundations
The next video in the series is: • Regression with the Ps...
The playlist for the entire series is here: • Linear Algebra for Mac...
This course is a distillation of my decade-long experience working as a machine learning and deep learning scientist, including lecturing at New York University and Columbia University, and offering my deep learning curriculum at the New York City Data Science Academy. Information about my other courses and content is at jonkrohn.com
Dr. Jon Krohn is Chief Data Scientist at untapt, and the #1 Bestselling author of Deep Learning Illustrated, an interactive introduction to artificial neural networks. To keep up with the latest from Jon, sign up for his newsletter at jonkrohn.com, follow him on Twitter @JonKrohnLearns, and on LinkedIn at linkedin.com/in/jonkrohn

Comments: 20
@EmmanuelPeter-y4d 3 months ago
Thanks, Jon Krohn. Do you have a course that teaches ML from the ground up? Thanks in anticipation of your response; I love your pedagogical skills.
@Jamming0ut 3 years ago
Excellent video, it has helped me immensely. Greetings from Colombia.
@JonKrohnLearns 3 years ago
You're most welcome, Santiago! Glad you found this video helpful :)
@bastianian2939 3 years ago
Insanely helpful video. Hope my comment helps boost this in the YouTube algorithm!
@JonKrohnLearns 3 years ago
Thanks, Bastian!
@Grobulia1 3 years ago
Thank you so much for your clear explanation! It would also be helpful if you posted a link to the Jupyter notebook that is shown in the video so that we could peruse it.
@JonKrohnLearns 3 years ago
You're most welcome, Ksenia! Glad you found my explanation of MPP helpful :) A link to the accompanying open-source code is provided in the video description and, separately, as a text overlay when the notebook is first brought up during the video. For convenience, here is the full URL: github.com/jonkrohn/ML-foundations/blob/master/notebooks/2-linear-algebra-ii.ipynb
@etherioussanjudraganeel3163 3 years ago
Man, you are the best. You don't know how much you helped me through this video. Thank you so much!
@JonKrohnLearns 3 years ago
YES! I am so happy to hear this. You are so welcome and I'm delighted to be able to help :D
@mohamedsamsudeensoofiba8382 8 months ago
First of all, the video and the playlist are super helpful. One small thing I found is that the final output I get from the torch.svd calculations doesn't match torch.pinverse(), but surprisingly the pseudoinverse computed from torch.linalg.svd() does match torch.pinverse(). Is that expected?
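Not an official answer, but a likely explanation, assuming the standard return conventions: torch.svd returns V itself, while torch.linalg.svd returns Vh, i.e. V already transposed, so code written for one convention needs an extra transpose under the other. A minimal sketch:

import torch

A = torch.tensor([[-1., 2.], [3., -2.], [5., 7.]])

# Older interface: reduced SVD, returns V itself (newer PyTorch versions mark it deprecated)
U1, S1, V1 = torch.svd(A)

# Newer interface: returns Vh (V transposed); reduced form via full_matrices=False
U2, S2, Vh2 = torch.linalg.svd(A, full_matrices=False)

# Both give the same (unique) pseudoinverse when the convention is respected
A_plus_1 = V1 @ torch.diag(1 / S1) @ U1.T
A_plus_2 = Vh2.T @ torch.diag(1 / S2) @ U2.T

print(torch.allclose(A_plus_1, torch.linalg.pinv(A), atol=1e-5))  # True
print(torch.allclose(A_plus_2, torch.linalg.pinv(A), atol=1e-5))  # True

Because the pseudoinverse of a matrix is unique, both routes agree with torch.linalg.pinv once the correct matrix is transposed; mixing up V and Vh is the most common reason a hand-built result disagrees with torch.pinverse().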
@Victor-ji1rz 9 months ago
import torch

A_p = torch.tensor([[-1, 2], [3, -2], [5, 7.]])
U, d, V = torch.svd(A_p)                     # reduced SVD: U is 3x2, V is 2x2
UT = torch.transpose(U, 0, 1)
d_diag = torch.diag(d)
d_plus = torch.inverse(d_diag)
torch.matmul(V, torch.matmul(d_plus, UT))    # A+ = V @ D+ @ U.T
@Victor-ji1rz 9 months ago
This works, but can someone explain to me why the matrix U is not the same size as the one returned by the NumPy SVD method?
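One likely reason (an assumption, since the exact NumPy call isn't shown): torch.svd returns the reduced SVD, while np.linalg.svd returns the full factorization unless full_matrices=False is passed. A small sketch of the difference:

import numpy as np
import torch

A_np = np.array([[-1., 2.], [3., -2.], [5., 7.]])
A_pt = torch.tensor(A_np)

U_np, d_np, VT_np = np.linalg.svd(A_np)   # full SVD by default: U_np is 3x3
U_pt, d_pt, V_pt = torch.svd(A_pt)        # reduced SVD: U_pt is 3x2

print(U_np.shape, U_pt.shape)             # (3, 3) torch.Size([3, 2])

U_np_r, d_np_r, VT_np_r = np.linalg.svd(A_np, full_matrices=False)
print(U_np_r.shape)                       # (3, 2) -- now it matches the torch shape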
@NavnilDas-o1n 3 months ago
The code in PyTorch is as follows:

import torch

A_pt = torch.tensor([[-1, 2], [3, -2], [5, 7]]).float()
U_pt, d_pt, Vt_pt = torch.linalg.svd(A_pt)   # full SVD: U_pt is 3x3, Vt_pt is 2x2
U_pt_T = U_pt.T
V_pt = Vt_pt.T
D_pt = torch.diag(d_pt)                      # 2x2 diagonal matrix of singular values
D_plus_pt = torch.linalg.inv(D_pt)
D_conc_plus_pt = torch.cat((D_plus_pt, torch.tensor([[0.], [0.]])), dim=1)  # pad to 2x3
A_plus_pt = torch.matmul(V_pt, torch.matmul(D_conc_plus_pt, U_pt_T))
A_plus_pt

I am getting the following result:

tensor([[-0.0877,  0.1777,  0.0758],
        [ 0.0766, -0.1193,  0.0869]])

Can somebody please tell me if I am correct?
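Not from the video, but one quick way to sanity-check a hand-rolled result like the one above is to compare it with PyTorch's built-in routine:

import torch

A_pt = torch.tensor([[-1., 2.], [3., -2.], [5., 7.]])
print(torch.linalg.pinv(A_pt))
# Should print values matching the result quoted above (up to rounding):
# tensor([[-0.0877,  0.1777,  0.0758],
#         [ 0.0766, -0.1193,  0.0869]])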
@taraskuzyk8985 2 years ago
Are there any resources on why pseudoinversion works so well as an ML learning rule with noisy data (compared to something like Hebbian learning)?
@theseusRJ7 2 years ago
Hey, is it too far-fetched for me to try to recreate a linear regression algorithm on my own at this point?
@JonKrohnLearns 2 years ago
Certainly not! I think this video provides you with precisely a way to do it, as long as the dataset is not too large. In my "Calculus for ML" playlist (which I recommend undertaking after the Linear Algebra one you've been working through), we thoroughly detail how to create a linear regression algorithm using a machine learning approach that scales to any number of data points: kzbin.info/aero/PLRDl2inPrWQVu2OvnTvtkRpJ-wz-URMJx
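For anyone who wants to try it before tackling the Calculus playlist, here is a minimal sketch of linear regression solved directly with the pseudoinverse; the toy data below is made up purely for illustration and is not from the course:

import numpy as np

# Made-up toy data, roughly following y = 2x + 1 with a little noise
x = np.array([0., 1., 2., 3., 4., 5.])
y = np.array([1.1, 2.9, 5.2, 7.1, 8.8, 11.2])

# Design matrix with a column of ones so the model can learn an intercept
X = np.column_stack([x, np.ones_like(x)])

# Least-squares weights in one step: w = X+ @ y
slope, intercept = np.linalg.pinv(X) @ y
print(slope, intercept)   # roughly 2 and 1 for this toy data

For small datasets this closed-form route and a converged gradient-based approach (like the one in the Calculus playlist) arrive at essentially the same weights, since both minimize the same least-squares cost.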
@literallynobody4840 a month ago
You have given it your best shot at explaining... but my brain is not able to catch up 😢
@subhashmishra8665 a year ago
My Frustration level at -2:09 🙂🙂🙂
@ali-qq6cp 2 years ago
V transpose was not in the Moore-Penrose formula, but when you did the calculation, you used V transpose!!
@JonKrohnLearns 2 years ago
Yep, I explain this in the audio: the method we used to create V automatically produces "V transpose", so in order to get an untransposed V (for use in the Moore-Penrose formula), we need to transpose "V transpose"! That is why we annotated the code with "VT.T": we're taking "V transpose" and transposing it to get plain old V.
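Assuming the NumPy SVD call used in the notebook, a tiny sketch of why that extra transpose shows up:

import numpy as np

A = np.array([[-1, 2], [3, -2], [5, 7.]])
U, d, VT = np.linalg.svd(A, full_matrices=False)

print(np.allclose(U @ np.diag(d) @ VT, A))   # True: the routine hands back V-transpose

V = VT.T   # so we transpose it back to get the plain V in A+ = V @ D+ @ U.T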