Lecture54 (Data2Decision) Principle Components in R

25,107 views

Chris Mack


Comments: 35
@narak2273 (4 years ago)
Thank you. The best I have seen on PCA.
@AlexButterfield (7 years ago)
This is great. At 13:53 you find the coefficients of the original variables. At that point, are you saying that this essentially creates a function: bodyfat = -4.49(Weight) - 0.36(Chest) + 10.41(Abdomen) - 0.46(Hip) + 1.06(Thigh) + 0.5(Biceps)?
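A minimal sketch of how those coefficients come about, assuming a data frame `bodyfat.df` with the column names used in the lecture and that PCs 1, 2, 4 and 5 are the retained components; the object and column names here are illustrative, not necessarily the exact ones in the video:

```r
# Standardize the predictors and run PCA (prcomp centers and scales here)
X <- bodyfat.df[, c("Weight", "Chest", "Abdomen", "Hip", "Thigh", "Biceps")]
pca <- prcomp(X, scale. = TRUE)

# Regress body fat on the retained principal components (e.g. PCs 1, 2, 4, 5)
keep   <- c(1, 2, 4, 5)
scores <- pca$x[, keep]
fit    <- lm(bodyfat.df$bodyfat ~ scores)

# Coefficients in terms of the (standardized) original variables:
# each PC is a linear combination of the originals, so multiply the
# rotation (loadings) matrix by the PC coefficients
beta.pc  <- coef(fit)[-1]                    # drop the intercept
beta.std <- pca$rotation[, keep] %*% beta.pc # one slope per original variable
beta.std
```

So yes: the result is a set of slopes for the standardized original variables, i.e. the function quoted in the comment applied to the scaled data.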
@Newmarw (6 years ago)
Is there some way to get the coefficients in terms of the real data (not standardized)?
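One way to do this, continuing the sketch above (a sketch only; it assumes the predictors were standardized by `prcomp` with `scale. = TRUE`): divide each standardized-variable coefficient by that variable's standard deviation and adjust the intercept.

```r
# Convert coefficients on standardized variables back to the original units
x.sd   <- pca$scale                # standard deviations used by prcomp
x.mean <- pca$center               # means used by prcomp

beta.raw <- beta.std / x.sd                        # slope per original unit
b0.raw   <- coef(fit)[1] - sum(beta.raw * x.mean)  # adjusted intercept
```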
@pulkit21aug (6 years ago)
Hi, once you have the coefficients for PCR, the equation will be something like y = b0 + b1*PC1 + b2*PC2 + ... Now, when you deploy this model in production, how do you convert this equation back into one in terms of the original variables? Do we calculate the PC1 value using the original data-set values and then feed it into the above equation?
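That workflow is possible: `prcomp` remembers the centering and scaling it used, so `predict()` can turn new raw observations into PC scores. A sketch, continuing the objects above (the data frame `new.df` is hypothetical):

```r
# Score new (production) data: predict() applies the same centering,
# scaling, and rotation that prcomp used on the training data
new.scores <- predict(pca, newdata = new.df[, colnames(X)])      # PC scores
y.hat      <- coef(fit)[1] + new.scores[, keep] %*% beta.pc      # predictions
```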
@SuperFrydensbjerg (5 years ago)
Is it possible to test the significance of the coefficients you get when you multiply the rotation matrix by the coefficient matrix, instead of just getting the betas?
@bluepearl (8 years ago)
Thank you so much for your video, Chris! That was truly a lifesaver!
@sukanyaiyer8776 (7 years ago)
This was great, beautifully explained.
@sushanmodak (4 years ago)
Do you not need to rescale the values from production after you get back to the original coefficients?
@nadiaaddeep1970 (4 years ago)
Ridge regression in R, please (handling multicollinearity with ridge regression).
@shanthankesharaju6716 (6 years ago)
I want to run a regular regression and compare that model with a regression on the top principal components. How can I do it?
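A possible sketch of such a comparison, continuing the objects from the sketches above (how many PCs to keep is your choice; PC1 and PC2 here are just an example):

```r
# Ordinary regression on the original variables vs. a reduced PC regression
fit.full <- lm(bodyfat ~ Weight + Chest + Abdomen + Hip + Thigh + Biceps,
               data = bodyfat.df)
fit.pcr  <- lm(bodyfat.df$bodyfat ~ pca$x[, 1:2])   # keep only PC1 and PC2

summary(fit.full)$adj.r.squared
summary(fit.pcr)$adj.r.squared
anova(fit.pcr, fit.full)   # F-test: do the dropped components matter?
```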
@divyak2976 (6 years ago)
Hi, very informative video, and helpful for me. I just have one question: how do you know which variables make up PC1, which make up PC2, etc.?
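The loadings answer this. Continuing the earlier sketch, each column of the rotation matrix gives the weights of the original variables in that component:

```r
# Rows = original variables, columns = principal components
round(pca$rotation, 2)
round(pca$rotation[, 1], 2)  # PC1 as a weighted sum of the originals
```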
@Alfredo_Ortiz_Bio (4 years ago)
How do you make a PCA with three components?
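If the question is about keeping three components in the regression, a sketch continuing the objects above:

```r
# Regress the response on the first three principal components only
fit3 <- lm(bodyfat.df$bodyfat ~ pca$x[, 1:3])
summary(fit3)
```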
@sudiptapaul2919 (4 years ago)
Thanks, Professor, for the informative video. I still wonder how to interpret the results!
@faezehfazel224 (3 years ago)
AWESOME!
@ziddneyyy (3 years ago)
Thanks for the video.
@Dekike2 (5 years ago)
First of all, thanks Chris for the time you spend teaching statistics; students need more people like you. I wanted to ask you something. In this video you explain how to do a regression on principal components, right? My doubt is how to know whether some of the explanatory variables are meaningless with this procedure. That is, in your example you end up with coefficients for the 6 explanatory variables (weight, chest, abdomen, hip, thigh and biceps) after fitting a linear model between bodyfat (Y) and principal components 1, 2, 4 and 5 (the significant ones). You find the coefficients -4.48, -0.35, 10.40, -0.45, 1.05 and 0.50 for weight, chest, abdomen, hip, thigh and biceps respectively. Can we know whether some of those coefficients are non-significant, and thus not related to bodyfat? In the video you test the significance of the coefficients of the model between Y (bodyfat) and the principal components, but not of the coefficients you later obtain for the explanatory variables. Or, because of the method (regression on principal components), can we not exclude any of the explanatory variables? I'm just curious about how to interpret this. Thanks in advance! :)
@ChrisMack (5 years ago)
If you want to remove a non-significant predictor variable, principal component analysis will not be of much help. See lecture 53 for more details.
@matthewholland9827 (4 years ago)
Thanks so much! Very easy to follow and understand.
@iamjinse (4 years ago)
Thank you so much for this video lesson, sir; it helps me a lot.
@muhammadumerkhan3894 (7 years ago)
Thank you for the video, really helpful, especially for last-night studying.
@venkatsagar3038 (4 years ago)
Hi Chris, this was very informative and helpful. I just have one question: since we are using scaled data, when specifying the model coefficients in terms of the original variables, how can we unscale the coefficients of the model?
@casualcomputer6544 (6 years ago)
Why is the R-squared for the PCA regression and the linear model the same?
@chrismack783 (6 years ago)
The PCA model is the same as the linear model, but uses "rotated" variables, that is, new variables that are just linear combinations of the original variables. Thus, the PCA model will always give the exact same fit as the original model. The goal is then to pick some of the new variables to eliminate because they have little impact on the fit, leaving only the "principal" components.
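A quick way to see this in R, continuing the earlier sketch with the same assumed column names (regressing on all six PC scores reproduces the ordinary fit exactly):

```r
# R-squared is unchanged under a rotation of the predictors
fit.orig   <- lm(bodyfat ~ Weight + Chest + Abdomen + Hip + Thigh + Biceps,
                 data = bodyfat.df)
fit.allpcs <- lm(bodyfat.df$bodyfat ~ pca$x)   # all six principal components
summary(fit.orig)$r.squared
summary(fit.allpcs)$r.squared                  # same value
```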
@fengyuwen4072 (6 years ago)
Thank you very much for the explanation!
@cosimotuena6597 (5 years ago)
Dear Professor, many thanks for this tutorial, it is very helpful. However, I still find it hard to understand what we can tell about the effect of our predictors on the dependent variable. What is the next step after this PCA procedure? Since each PC is composed of all of the predictors, how can we know which of them are relevant for, in this case, body fat (chest, abdomen, hip, etc.)? And which PC should we rely on (I assume PC1, as it is the most explanatory, but the others are significant too)?
@ChrisMack (5 years ago)
PCA will not tell you which individual variable (chest, hip, etc.) is or is not significant. It will only tell you which principal components are significant (using the standard tests for significance).
@sinemsenel6155 (4 years ago)
Hey sir, I have a question: why don't you split your data into test and train sets? I'm wondering, if I work on the same data with a variety of machine learning models, should I work with the PCA data set for all of them, or is it just for one model like linear regression?
@chrismack783 (4 years ago)
For a discussion of data splitting, see Lecture 62: Building Models. kzbin.info/www/bejne/b561dIKkh8d6itk
@bevansmith3210 (6 years ago)
Hi Chris, excellent video! Just a quick question: now that you have your PCs, how do you relate them back to the original variables? How do I know which variables impact the response? Thanks.
@chrismack783 (6 years ago)
You can go back and forth between the original variables and the principal components, since the PCs are linear combinations of the original variables. But PCA doesn't tell you which of the original variables are significant; you get that information from a regular regression. Instead, you want to find out which principal components are significant (that is, which linear combination of original variables has the greatest impact on the output).
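A sketch of going back and forth between the two representations, using the objects defined above (since the rotation matrix is orthogonal, the transformation is exactly invertible):

```r
# Scores from the scaled originals, and originals recovered from the scores
Z      <- scale(X, center = pca$center, scale = pca$scale)
scores <- Z %*% pca$rotation             # same as pca$x
Z.back <- pca$x %*% t(pca$rotation)      # recovers the scaled originals
X.back <- sweep(sweep(Z.back, 2, pca$scale, "*"), 2, pca$center, "+")
```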
@zhou6075 (2 years ago)
Thank you.
@kalyanasundaramsp8267 (6 years ago)
Super, sir.
@naqeebsafi6808 (3 years ago)
Hey sir, I need urgent help from you. Can you help me, please? I will remember you in my prayers.
@OZ88 (4 years ago)
OK, but PCA is useful for linear problems... most problems in machine learning, or real problems...
@ChrisMack (4 years ago)
Of course, not every problem involves linear modeling (linear in the coefficients of the model), but many do. And these are real problems, important problems. For nonlinear modeling, different approaches are required.