Principal Component Analysis (PCA) 1 [Python]

  49,646 views

Steve Brunton
1 day ago

Comments: 27
@yenunadeesaselviento 4 years ago
The code cuts off at the edge of the video. Where can we download it? Thanks for sharing this!
@NiKogane 2 years ago
Thank you so much for providing all of this knowledge online for free!
@EladM8a 4 years ago
Why the division in B/np.sqrt(nPoints)?
@anirbanbhattacharjee8093 1 year ago
In the PCA literature, the covariance matrix (B*)B is normalized by nPoints (or, with the Bessel correction, by nPoints - 1, though that doesn't matter here because nPoints is large). So if you instead normalize B by np.sqrt(nPoints), B* also gets normalized by np.sqrt(nPoints), and you end up with C normalized by nPoints.
@anirbanbhattacharjee8093 1 year ago
where C = (B*)B, and B* is the transpose of B
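A minimal sketch of that equivalence, assuming the video's features-by-samples layout, where the covariance is B B*/nPoints (the variable names here are hypothetical stand-ins for the video's code):

```python
import numpy as np

# Hypothetical stand-in for the video's mean-centered data matrix B
rng = np.random.default_rng(0)
nPoints = 1000
B = rng.standard_normal((2, nPoints))
B = B - B.mean(axis=1, keepdims=True)   # mean-center each feature (row)

# SVD of B / sqrt(nPoints): the squared singular values equal the
# eigenvalues of the sample covariance C = B B* / nPoints
U, S, VT = np.linalg.svd(B / np.sqrt(nPoints), full_matrices=0)

C = (B @ B.T) / nPoints                 # 2 x 2 sample covariance
eigvals = np.linalg.eigvalsh(C)[::-1]   # eigenvalues, descending
print(np.allclose(S**2, eigvals))       # True
```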
@jbhsmeta 4 years ago
Hi Mr. Steve, I have one question: why are you dividing B by np.sqrt(nPoints) in U, S, VT = np.linalg.svd(B/np.sqrt(nPoints), full_matrices=0)? Dividing mean-centered data by the square root of the number of data points — I could not understand.
@melvinlara6151 4 years ago
Actually, I have the exact same question. Could you figure it out?
@JoaoVitorBRgomes 4 years ago
@@melvinlara6151 I haven't seen the whole lecture yet, but I guess B is the data with mean = 0, and np.sqrt(nPoints) is probably the standard deviation (the square root of the variance). So he first standardizes the data, then applies the SVD...
@melvinlara6151 4 years ago
@@JoaoVitorBRgomes Hey! I actually figured the same thing out. But thank you!
@JoaoVitorBRgomes 4 years ago
@@melvinlara6151 No problem, Melvin Lara, I am a student of data science too. If you have a Kaggle profile and want to exchange knowledge, my alias is "topapa".
@tomlane6590 3 years ago
A brilliant set of videos. Thank you so much.
@subramaniannk3364 4 years ago
Great lecture, Steve! You explained that "u" in the SVD represents the principal directions and "sigma" represents the loadings. What does "v" represent?
@sambroderick5156 3 years ago
There's a whole series of lectures explaining this (and a book).
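For readers landing on this thread, a minimal sketch of one common interpretation, assuming the video's 2-by-nPoints layout (the data matrix B here is a hypothetical stand-in): the columns of U are the principal directions, S carries the loadings, and the rows of VT give each sample's normalized coordinate along the corresponding direction.

```python
import numpy as np

rng = np.random.default_rng(1)
nPoints = 1000
B = rng.standard_normal((2, nPoints))          # mean-centered data, features x samples
B = B - B.mean(axis=1, keepdims=True)

U, S, VT = np.linalg.svd(B / np.sqrt(nPoints), full_matrices=0)

# Columns of U: principal directions. S: loadings (std. dev. along each direction).
# Rows of VT: each sample's unit-normalized coefficient along that direction.
scores = np.diag(S) @ VT * np.sqrt(nPoints)    # per-sample PC scores
print(np.allclose(U @ scores, B))              # True: B is exactly reconstructed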
@sheiladespard8861 3 years ago
I tried to download the code from the website, but the Python code folder includes only Matlab code :(
@NiKogane 2 years ago
Hi, it was corrected - I downloaded it today!
@muhammadmuneeburrahman1262 3 years ago
You said in the video that each row of X represents an example/record and each column a feature. In your code, X.shape = (2, 1000), where each column represents one data point, and B is passed to the SVD with the same shape. Hence the VT matrix has size (2, 1000), which would mean there are 1000 principal components; that is not possible for 2D data? Am I right or wrong? Please explain.
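For what it's worth, a small shape check under the video's full_matrices=0 (economy) setting: the SVD of a 2 x 1000 matrix returns a VT with only two rows, so there are just two principal components; the 1000 columns are per-sample coefficients, not extra components.

```python
import numpy as np

B = np.random.randn(2, 1000)                  # features x samples, as in the video
U, S, VT = np.linalg.svd(B, full_matrices=0)  # economy SVD

print(U.shape)    # (2, 2)    -> two principal directions
print(S.shape)    # (2,)      -> two singular values
print(VT.shape)   # (2, 1000) -> 2 components, each with 1000 per-sample coefficients
```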
@kanacaredes 3 years ago
Excellent video!!! Thanks!
@Eigensteve 3 years ago
You are welcome!
@nguyenvan-hau9577 5 years ago
Beautiful code!
@1PercentPure 1 year ago
i kneel...
@charlespatterson8412 5 years ago
I would prefer to do this in my head, because I can visualize it and move it around. I am not a mathematician, but many of these are terms for things I am already familiar with. Perhaps I should have kept my TRS-80 and taken Bill's class at Juanita High. I decided to concentrate on 'Salmon Enhancement' and 'European History' instead. It's probably just as well; I find writing code quite boring because I am more into concepts... "Keep up the good work!"
@saitaro 5 years ago
Math is fully about concepts. And how would you visualize something in more than 3 dimensions?
@charlespatterson8412 5 years ago
@@saitaro Extrapolation
@user-iiii234a5gc 5 years ago
Add a time term? Or do four or more dimensions exist only as a theoretical expression?
@yaseenmohammad9600 4 years ago
This technique is generally used when there are large amounts of high-dimensional data, as in image processing. For example, if you take 50 images of size 50*50, you get a 50 x 2500 data matrix, which gives a 2500 x 2500 covariance matrix; PCA is used there to extract eigenfaces. I don't think there are people who can solve the eigenvalue equation for a 2500 x 2500 matrix in their head.
@charlespatterson8412 4 years ago
@@yaseenmohammad9600 Maybe if the variables are 'round' enough I could 'take a shot' at it...
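A minimal sketch of the eigenfaces point above, with hypothetical sizes matching the comment (50 images of 50 x 50 pixels): the economy SVD only ever works with the small 50-image side of the problem and never forms the 2500 x 2500 covariance matrix explicitly.

```python
import numpy as np

nImages, nPixels = 50, 50 * 50             # 50 images, each 50x50, flattened
X = np.random.randn(nImages, nPixels)      # stand-in for the face-image matrix
B = X - X.mean(axis=0)                     # subtract the "average face"

# Economy SVD: U is 50x50, S has 50 entries, VT is 50x2500.
# Rows of VT are the eigenfaces; S**2 gives the 50 nonzero eigenvalues
# of the 2500x2500 covariance, which is never built explicitly.
U, S, VT = np.linalg.svd(B / np.sqrt(nImages), full_matrices=0)

print(VT.shape, (S**2).shape)              # (50, 2500) (50,)
```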