Machine Learning Tutorial Python - 19: Principal Component Analysis (PCA) with Python Code

  201,571 views

codebasics

Days ago

Comments: 171
@codebasics
@codebasics 2 жыл бұрын
Check out our premium machine learning course with 2 Industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced
@Rainbow-lj5pp
@Rainbow-lj5pp 7 ай бұрын
This is a really easy to understand and thorough explanation of principal component analysis. Many others I watched were either too technical and math-theory oriented or too basic, showing how to use the function but not what it does. This is a great balance of understanding and practicality.
@mayishamaliha6369
@mayishamaliha6369 Жыл бұрын
Super helpful for newbies, not scaring them off with too many statistical terms or overwhelming them. Thank you so much.
@rakeshbullet7363
@rakeshbullet7363 Жыл бұрын
Awesome videos - simple explanations. A balanced approach to teaching, with the right mixture of theory and practice, and not overwhelming the learners. I loved the approach - after seeing numerous ML training videos from across the spectrum, this is by far the best one I have seen. Thank you for taking the time to create these videos.
@dushyyanta5305
@dushyyanta5305 2 жыл бұрын
You are the best! I am doing PG in DS but still, I watch your videos for better understanding. Kudos! Keep it up!
@AqsaChappalwala
@AqsaChappalwala 6 ай бұрын
Masters in Data Science in the UK and still loves watching only your videos :-)
@mohitupadhayay1439
@mohitupadhayay1439 2 жыл бұрын
The last few minutes were BANG ON! This is what I wanted to hear. Thanks!
@ernestanonde3218
@ernestanonde3218 2 жыл бұрын
This is the best channel on YouTube. You are simply amazing. You just saved my career. Thanks a million.
@ernestanonde3218
@ernestanonde3218 Жыл бұрын
@Karthiktanu I am a student of data science and analytics.
@ernestanonde3218
@ernestanonde3218 Жыл бұрын
@@Thanusree234 yes
@bhaskarg8438
@bhaskarg8438 2 жыл бұрын
Thank you, the PCA concept is clearly explained. I still need to understand what we prioritize in actual real-life scenarios: performance or processing time.
@prakashkoneti7630
@prakashkoneti7630 Жыл бұрын
I really appreciate your hard work in making these videos and decoding the complex into the easy.
@luciamatamorospava4382
@luciamatamorospava4382 Жыл бұрын
It's like the 10th video I'm watching on PCA and the FIRST one I understand, thank you so much!
@BG4INDIA
@BG4INDIA Жыл бұрын
Impressed with the clarity of the explanation.
@codebasics
@codebasics Жыл бұрын
glad you liked it
@akinloluwababalola6666
@akinloluwababalola6666 2 жыл бұрын
Hello codebasics. I usually enjoy your videos, as I learn a lot from them. Can you make a video on association rules, apriori algorithms, and any machine learning model that deals with determining interrelationships amongst variables? Thank you
@maruthiprasad8184
@maruthiprasad8184 2 жыл бұрын
Thank you very much for the simple and great explanation. I got the highest accuracy with SVM = 86.74%; after PCA, I got RF accuracy = 73.06%.
@pranjiljain4500
@pranjiljain4500 2 жыл бұрын
But training time and required machine power also decrease heavily.
@mohammeddanishreza4902
@mohammeddanishreza4902 Жыл бұрын
Can you share the GitHub link for your code, please?
@richardshaw8326
@richardshaw8326 8 ай бұрын
Great explanation on PCA. @codebasics: I must have missed it, but after running the PCA to identify which features will give the results, I didn't catch where one might get those features.
@swagsterfut9992
@swagsterfut9992 2 жыл бұрын
At 17:35, shouldn't we be doing pca.fit_transform() on our scaled dataset (X_scaled in our case) rather than on X?
@geoafrikana
@geoafrikana 2 жыл бұрын
This came to my mind also. Perhaps the accuracy would have been higher if he scaled before pca.
@anirudhgangadhar6158
@anirudhgangadhar6158 2 жыл бұрын
Yes it should be on X_scaled.
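For readers with the same question, this is what the scaled variant looks like; a minimal sketch, assuming the digits dataset from the video and scikit-learn's standard API:

from sklearn.datasets import load_digits
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

digits = load_digits()
X, y = digits.data, digits.target

# Standardize every feature to zero mean and unit variance first
X_scaled = StandardScaler().fit_transform(X)

# Keep enough components to explain 95% of the variance of the *scaled* data
pca = PCA(0.95)
X_pca = pca.fit_transform(X_scaled)
print(X_scaled.shape, "->", X_pca.shape)  # the component count may differ from the unscaled run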
@guillermokinderman8267
@guillermokinderman8267 Жыл бұрын
I was trying to understand PCA, this video helped me a lot
@mukeshkumaryadav350
@mukeshkumaryadav350 2 жыл бұрын
It was an amazing explanation of PCA without much mathematics or the eigenvalues and eigenvectors that scare me. Interesting learning: 1. we can see the variance explained by each PC, which helps.
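The per-component variance shares mentioned here come straight from the fitted PCA object; a small sketch (scikit-learn API, reusing X from the digits example above):

import numpy as np
from sklearn.decomposition import PCA

pca = PCA(n_components=10)
X_pca = pca.fit_transform(X)

# Fraction of the total variance captured by each principal component
print(pca.explained_variance_ratio_)
# Cumulative variance retained by the first k components
print(np.cumsum(pca.explained_variance_ratio_))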
@CharzIntLtd
@CharzIntLtd 3 жыл бұрын
Thanks sir the great work, your explanation makes ML easier for sure 🙏
@boogersincoffee
@boogersincoffee 2 жыл бұрын
Ahhhhhh I've been struggling to understand this and this cleared everything up, thank you
@TK-fx8dh
@TK-fx8dh 3 жыл бұрын
My long-awaited topic!!!! Thank you for posting this PCA lesson
@ogobuchiokey2978
@ogobuchiokey2978 2 жыл бұрын
Your videos have helped me to complete my MSc research. Thank you for being a great teacher. I do have a question, during the explanation, you said we should always use PCA on the scaled data but during implementation, you used the unscaled data. Could you explain this?
@kreativeaman7688
@kreativeaman7688 10 ай бұрын
I had the same question following through.
@kreativeaman7688
@kreativeaman7688 10 ай бұрын
I tried using PCA on the scaled data and used it in SVM, Logistic Regression and RandomForest classifier, but the results were almost the same as to using regular data with PCA.
@gnaneshgn8341
@gnaneshgn8341 3 жыл бұрын
Nice Video sir. Please make a video for the math behind PCA. Thanks in Advance Sir
@geekyprogrammer4831
@geekyprogrammer4831 3 жыл бұрын
kzbin.info/www/bejne/fJjEnI2ta7Bkh7M this should be sufficient if you want to know mathematics
@shubhanshugupta9754
@shubhanshugupta9754 2 жыл бұрын
You can get the best number of PCs by taking log(total features).
@anirudhgangadhar6158
@anirudhgangadhar6158 2 жыл бұрын
Highest accuracy: SVM - 85.83%; after PCA (3 PCs), accuracy was 83.87%. For all 3 models, accuracy dropped slightly after PCA.
@asamadawais
@asamadawais Жыл бұрын
I am watching this video for the 2nd or 3rd time. @Dhaval, you are the best among equals... 👍👍
@wavyjones96
@wavyjones96 2 жыл бұрын
I HAVE SOME QUESTIONS: 1) If you use your PCA data that has been scaled before doing any train/test split... wouldn't it cause data leakage? 2) Shouldn't the target be dropped?
@hrithiksarma1204
@hrithiksarma1204 2 жыл бұрын
I had the same doubt, have you got any update on this ?
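A common way to avoid that leakage is to fit the scaler and the PCA on the training split only and then reuse the fitted transformers on the test split. The target is not part of X, so there is nothing extra to drop. A hedged sketch of that pattern (not the exact notebook code):

from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=30)

scaler = StandardScaler().fit(X_train)                      # statistics learned from train only
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

pca = PCA(0.95).fit(X_train_s)                              # components learned from train only
X_train_p, X_test_p = pca.transform(X_train_s), pca.transform(X_test_s)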
@slainiae
@slainiae 7 ай бұрын
Highest accuracy 0.8729 with SVM (linear) and with PCA n_components = 11.
@dees900
@dees900 Жыл бұрын
Great explanation of PCA. It's an abstract concept to grasp. Well done.
@ogochukwustanleyikegbo2420
@ogochukwustanleyikegbo2420 Жыл бұрын
After completing the assignment, I got a best score of 0.85 with the SVM RBF kernel, and after PCA my best score dropped to 0.68, still with the SVM RBF kernel.
@levimungai1846
@levimungai1846 3 күн бұрын
I've got a question. I understand that what is contained in the PCA array are the loading scores of each feature. Why do we use this as our new training data? What do these figures in the PCA array really represent?
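A short clarification for this thread: the transformed array holds each sample's coordinates along the new component axes (projections), not the loading scores; the loadings themselves live in pca.components_. A sketch, assuming a fitted pca and the X_scaled matrix from earlier:

# pca.components_         -> shape (n_components, n_features): each row is a new axis,
#                            i.e. a weighted mix (loadings) of the original features
# pca.transform(X_scaled)  -> shape (n_samples, n_components): each sample's coordinates
#                            along those axes; this is what the model is trained on
X_new = pca.transform(X_scaled)
print(pca.components_.shape, X_new.shape)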
@namantiwari8251
@namantiwari8251 Жыл бұрын
Sir, can you please tell us which features it reduces? How can I get those particular selected (reduced) features as output?
@talkingbirb2808
@talkingbirb2808 8 ай бұрын
I would add that reducing the number of columns should also help with overfitting.
@leamon9024
@leamon9024 Жыл бұрын
Thanks for this amazing tutorial. Hope you could do one video about when to use feature selection and feature extraction, or even combination of them.
@nastaran1010
@nastaran1010 9 ай бұрын
Hi. I have a question: why, when you performed PCA, did you give X as the input rather than X_scaled?
@arjunprashanth7824
@arjunprashanth7824 6 ай бұрын
Shouldn't X_scaled be passed into the pca.fit_transform() method? Because if you're passing X, there's no point in the scaling we did, right?
@ANASS-AHMADD
@ANASS-AHMADD 4 ай бұрын
Exactly, i was about to ask the same question.
@userhandle-u7b
@userhandle-u7b 4 ай бұрын
I tried both. When passing X to PCA without scaling, I got a higher score. But you're right; I also believe X_scaled should be passed for a fair comparison.
@RijoSLal
@RijoSLal 23 күн бұрын
It's not always necessary , if the variance is smaller you can do pca without rescaling it
@krishnadaskv2197
@krishnadaskv2197 3 жыл бұрын
I am getting around 80 % score when using PCA(0.99999) in exercise, which is higher than the score before using PCA, and also getting a better score without removing outliers.
@codebasics
@codebasics 2 жыл бұрын
That’s the way to go kv, good job working on that exercise
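For anyone repeating this experiment: passing a float between 0 and 1 asks scikit-learn to keep however many components are needed to reach that fraction of the total variance. A small sketch:

from sklearn.decomposition import PCA

for keep in (0.95, 0.99, 0.99999):
    pca = PCA(keep)
    pca.fit(X)
    print(keep, "->", pca.n_components_, "components,",
          round(pca.explained_variance_ratio_.sum(), 5), "variance retained")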
@jjanna07751
@jjanna07751 2 жыл бұрын
Thank you... PCA explained very easily.
@jinks3669
@jinks3669 2 жыл бұрын
Another very informative video. DHANYAVAAD ! :)
@Maniclout
@Maniclout 3 жыл бұрын
Amazing explanation, I understand PCA now.
@sarangali4595
@sarangali4595 2 жыл бұрын
Sir, also make a video on how PCA actually works and what type of information we can gain from the loadings, like how these features affect the label.
@mohammadhosseinkazemi8558
@mohammadhosseinkazemi8558 Жыл бұрын
Thank you for the video. I have one problem, though: Shouldn't we first split the data into training and test sets, then scale each set separately using StandardScaler(), RobustScaler(), etc. ?
@ahsanurrahman8915
@ahsanurrahman8915 Жыл бұрын
Very nicely described! I have a question: in your example PCA(0.95) reduces the dimensionality to 29. But how do we know which dimensions it picked? I am asking because I want to use PCA to determine the principal drivers of the targets.
@akashbhargava906
@akashbhargava906 10 ай бұрын
Hey buddy, PCA doesn't pick any existing dimension. It creates new dimensions which, to the naked eye, won't make much sense to you.
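As the reply says, the components are new axes rather than a subset of columns, but you can still inspect how strongly each original feature contributes to each component. A sketch, assuming the original features sat in a pandas DataFrame df with named columns:

import pandas as pd

loadings = pd.DataFrame(
    pca.components_,                                    # one row per principal component
    columns=df.columns,                                 # original feature names (assumed DataFrame)
    index=[f"PC{i+1}" for i in range(pca.n_components_)],
)
# Original features with the largest absolute weight on the first component
print(loadings.loc["PC1"].abs().sort_values(ascending=False).head(10))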
@nikhilanand9022
@nikhilanand9022 Жыл бұрын
Here are my two questions for you: 1 - Why don't you scale the target column (y)? 2 - For the accuracy score, why don't you compare actual vs. predicted? You pass X_test and y_test to score; why not y_pred and y_test?
@vanshoberoi2154
@vanshoberoi2154 2 ай бұрын
1 - y is the target; it doesn't influence the training the way the X inputs do. The raw values of y are used to calculate errors (e.g., loss functions) directly, so scaling y is generally unnecessary and could alter the model's predictions in unintended ways. 2 - As sir has previously explained, when you use accuracy_score you pass y_pred and y_test, but model.score takes X_test and y_test and internally converts X_test to y_pred.
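To make the second point concrete, the two calls below give the same number for a classifier; a sketch assuming a fitted model and the test split from the video:

from sklearn.metrics import accuracy_score

acc_via_score = model.score(X_test, y_test)        # predicts internally, then scores
y_pred = model.predict(X_test)
acc_via_metric = accuracy_score(y_test, y_pred)    # same value, computed explicitly
print(acc_via_score, acc_via_metric)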
@AkaExcel
@AkaExcel 3 жыл бұрын
@codebasics Thank You for Teaching and helping us!
@punnarahul4068
@punnarahul4068 3 жыл бұрын
Great, looking forward to more videos, Dhaval bhai!
@DrizzyJ77
@DrizzyJ77 5 ай бұрын
Thank you code basics❤
@anonymous-bi6ul
@anonymous-bi6ul 4 ай бұрын
Why didn't you use X_scaled as the parameter to PCA's fit_transform function?
@anirudh7150
@anirudh7150 8 ай бұрын
Thank you Sir. It was really helpful.
@souhamahmoudi7745
@souhamahmoudi7745 Жыл бұрын
Thanks for sharing, it's highly appreciated
@mansijswarnkar4389
@mansijswarnkar4389 2 жыл бұрын
Wonderful, as always - thanks for making this video, it has helped me a lot ! Regards
@purebackend1993
@purebackend1993 2 жыл бұрын
You kill it, amazing!
@Ooo12376
@Ooo12376 3 жыл бұрын
Please also explain the math behind it. You get questions on the math behind PCA in interviews; people ask for the derivation of PCA.
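For interview preparation, the core construction behind PCA can be reproduced in a few lines of NumPy: center the data, build the covariance matrix, and take its top eigenvectors. A sketch of that standard derivation (not the video's code), reusing X_scaled from above:

import numpy as np

Xc = X_scaled - X_scaled.mean(axis=0)          # center the data
cov = np.cov(Xc, rowvar=False)                 # feature-by-feature covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)         # symmetric matrix -> real eigenvalues/vectors

order = np.argsort(eigvals)[::-1]              # sort by explained variance, descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 2
X_proj = Xc @ eigvecs[:, :k]                   # project onto the top-k directions
explained = eigvals[:k] / eigvals.sum()        # analogue of explained_variance_ratio_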
@salahmahmoud2119
@salahmahmoud2119 Жыл бұрын
You are the best!!! 👏
@MohammadYs77
@MohammadYs77 2 жыл бұрын
Very informative and practical.
@youktinathbhowmick4673
@youktinathbhowmick4673 2 жыл бұрын
Thanks for the explanation. I have one question: when you do PCA, you take the whole dataset and only after that do the train/test split. Isn't that a bit unethical? Also, if I do PCA on the train data, can the same PCA be applied to the test data? Is there any way to store the PCA transformation so it can be applied to the test data?
@hrithiksarma1204
@hrithiksarma1204 2 жыл бұрын
I had the same doubt, have you got any update on this ?
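On the last question in this thread: one convenient pattern is to wrap the scaler, PCA, and model in a single Pipeline and persist it, so exactly the same fitted transformation is reapplied to the test set or to future data. A hedged sketch, reusing the train/test split from earlier (the estimator choice and file name are just placeholders):

from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
import joblib

pipe = make_pipeline(StandardScaler(), PCA(0.95), LogisticRegression(max_iter=1000))
pipe.fit(X_train, y_train)            # scaler and PCA are fitted on the training data only
print(pipe.score(X_test, y_test))     # the stored transformation is reused on the test data

joblib.dump(pipe, "pca_model.joblib")             # hypothetical file name
pipe_loaded = joblib.load("pca_model.joblib")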
@tobe7602
@tobe7602 2 жыл бұрын
Hi, good tutorial. I think you should use X_train in pca.fit_transform, not X. Thanks.
@pateltapasvi7277
@pateltapasvi7277 11 ай бұрын
How can I get the selected features in a DataFrame along with their feature names, instead of the numbers 1, 2, 3, etc.?
@sarangali4595
@sarangali4595 2 жыл бұрын
Sir please also make a video on how to find relations using descriptive technique.
@kalluriyaswanthkumar2275
@kalluriyaswanthkumar2275 Жыл бұрын
Sir, you told us that we should scale before PCA, but you are applying PCA to the non-scaled data in the code.
@self.__osman
@self.__osman 2 жыл бұрын
Hi. I might not be making any sense here, but I wanted to know if the same thing could be achieved with entropy and information gain. We know information gain tells you, as a number, which feature carries the most information or importance. Therefore, in theory, we could remove all the features with really low information gain. I think this would work better with discrete data. I don't know if it already exists; if it does, what method does this? If it doesn't, is this solution practical?
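This idea does exist: scikit-learn exposes mutual information (closely related to information gain) as a filter-style feature selector. Unlike PCA it keeps a subset of the original columns. A sketch:

from sklearn.feature_selection import SelectKBest, mutual_info_classif

selector = SelectKBest(score_func=mutual_info_classif, k=20)
X_selected = selector.fit_transform(X, y)      # keeps the 20 most informative original columns

print(selector.get_support(indices=True))      # indices of the retained original features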
@LamNguyen-jp5vh
@LamNguyen-jp5vh 2 жыл бұрын
Hi, I just want to ask why we use StandardScaler instead of MinMaxScaler in the lecture (not exercise). Thank you so much for your help!
@farahamirah2091
@farahamirah2091 Жыл бұрын
I have a question: we can train the model using PCA, but what about an imbalanced dataset? Don't we need to handle the imbalance?
@manjularathore1076
@manjularathore1076 3 жыл бұрын
You are absolutely amazing.
@mr.luvnagpal7407
@mr.luvnagpal7407 2 жыл бұрын
Thankyouu so much for this amazing video
@albertoachavalrodriguez2461
@albertoachavalrodriguez2461 2 жыл бұрын
Great video!
@nriezedichisom1676
@nriezedichisom1676 8 ай бұрын
Thank you
@sohailshaikh786
@sohailshaikh786 2 жыл бұрын
Thanks
@usamaalicraft3646
@usamaalicraft3646 3 жыл бұрын
Thanks sir 😊😊
@tamirat9797
@tamirat9797 8 ай бұрын
Thank you 🙏
@debatradas1597
@debatradas1597 2 жыл бұрын
Thank you so much
@nyangwindicollins1018
@nyangwindicollins1018 2 жыл бұрын
Superb
@BG4INDIA
@BG4INDIA Жыл бұрын
Hi Mr. Dhaval, I am so thankful to you for sharing such a good, informative video. Like "ogobuchiokey2978", I also wanted to know whether there is a specific reason for not selecting X_scaled while fitting the PCA. In the above demo, if I fit the raw X I get 29 new PCA features, but if I fit X_scaled I get 40 new PCA features. Similarly, in your exercise, if I fit X_scaled I get 10 features (only 1 attribute is reduced) with an accuracy of 85%, and if I fit the raw X I get 2 attributes, but accuracy dips down to 69% (Random Forest). I believe this depends on the data as well.
@Cat_Sterling
@Cat_Sterling Жыл бұрын
Should you scale the data before PCA?
@ujjwalchetan4907
@ujjwalchetan4907 2 ай бұрын
Thanks.
@girishtripathy3354
@girishtripathy3354 2 жыл бұрын
Isn't the number of dimensions it should be reduced to just another hyperparameter? For 2 dimensions, sure, you can visualize. For more than 2, visualization is not possible. How can you decide what dimension you should reduce your dataset to?
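It is effectively a hyperparameter, yes. Two common ways to choose it are cross-validating the downstream model and reading the cumulative explained-variance curve; a sketch of the latter, reusing X_scaled from above:

import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

pca_full = PCA().fit(X_scaled)                      # keep all components
cumvar = np.cumsum(pca_full.explained_variance_ratio_)

plt.plot(range(1, len(cumvar) + 1), cumvar, marker="o")
plt.axhline(0.95, linestyle="--")                   # e.g. aim for 95% retained variance
plt.xlabel("Number of components")
plt.ylabel("Cumulative explained variance")
plt.show()

print("Components needed for 95% variance:", int(np.searchsorted(cumvar, 0.95)) + 1)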
@suriyaprakashgopi
@suriyaprakashgopi Жыл бұрын
nicely done
@bommubhavana8794
@bommubhavana8794 2 жыл бұрын
Hello, I have newly started working on a PCR (principal component regression) project. I am stuck at a point and could really use some help as soon as possible. Thanks a lot in advance. I am working in Python.

We created a PCA instance using PCA(0.85) and transformed the input data, then ran a regression on the principal components explaining 85 percent of the variance (say N components). We now have a regression equation in terms of the N PCs, and we tried to express it in terms of the original variables.

To QC the coefficients in terms of the original variables, we took the N components (85% variance), derived the data back from them, and ran a regression on that data, hoping it would give the same coefficients and intercept as the equation derived above. The issue is that the coefficients do not match when we take N components, but when we take all the components the coefficients and intercept match exactly. Also, the R-squared value and the predictions from the two equations are exactly the same even though the coefficients are not.

I am so confused as to why this is happening; I might be missing something about PCA. Any help is greatly appreciated. Thank you!
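One way to sanity-check that back-transformation: scikit-learn's PCA centers the data, so a linear regression on the component scores maps to original-variable coefficients through components_.T, with an intercept correction for the centering. A sketch under those assumptions (X is the original predictor matrix, y the response):

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

pca = PCA(0.85).fit(X)
Z = pca.transform(X)                                       # component scores

reg = LinearRegression().fit(Z, y)

beta_orig = pca.components_.T @ reg.coef_                  # one coefficient per original column
intercept_orig = reg.intercept_ - pca.mean_ @ beta_orig    # adjust for PCA's centering

print(np.allclose(reg.predict(Z), X @ beta_orig + intercept_orig))   # True: identical predictions

As for the QC step above: data reconstructed from only N components lies in a lower-dimensional subspace, so a regression refit on it has non-unique coefficients; that is consistent with seeing matching predictions and R-squared but different coefficients unless all components are kept.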
@aaditya1267
@aaditya1267 Жыл бұрын
nice explanation !!
@babalolamayowamercy186
@babalolamayowamercy186 2 жыл бұрын
Nice video Thank you
@zainnaveed267
@zainnaveed267 2 жыл бұрын
Sir, I have a question: how can one predict target values when PCA creates all-new columns based on its own calculations?
@siddheshmhatre2811
@siddheshmhatre2811 Жыл бұрын
Thanks ❤
@makoriobed
@makoriobed 3 ай бұрын
Is it X_pca = pca.fit_transform(X) or X_pca = pca.fit_transform(X_scaled)?
@HT-xt4cn
@HT-xt4cn 3 ай бұрын
What was the purpose of scaling X at 14:18?
@MonilModi10
@MonilModi10 Жыл бұрын
Why does PCA rotate the axes? What is the significance of that?
@krishnapatel8852
@krishnapatel8852 Жыл бұрын
Hello, if I want to visualize this data in 3D, what will the z-axis be?
@tigrayrimey6418
@tigrayrimey6418 2 жыл бұрын
Nice points.
@mayank66
@mayank66 Жыл бұрын
Amazing
@RiteshKumar-yv8nx
@RiteshKumar-yv8nx 4 ай бұрын
Why didn't you normalise y(i.e. the dataset.target)?
@ShanthoshKumaarSomiRajesh
@ShanthoshKumaarSomiRajesh 10 ай бұрын
I have a query to ask. You said we should pass the data to PCA after scaling but you passed the original X instead of X_scaled. Why ??
@dhineshv2590
@dhineshv2590 2 ай бұрын
Since we are using grayscale images, the pixel values are already more or less scaled.
@gulnawaz9670
@gulnawaz9670 2 жыл бұрын
Hi Sir, very informative video. I have a problem: I loaded a local dataset, and when I use dataset.keys() it shows Index(['Unnamed: 0', 'Flow ID', ...]). Now pd.DataFrame(dataset.data, columns=dataset.feature_names) throws an error, even after I changed data to Unnamed, and the same problem occurs: AttributeError: 'DataFrame' object has no attribute 'data'. Waiting for your kind reply. Thanks.
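The error here is expected: load_digits() returns a Bunch object that has .data and .feature_names, but a local CSV read with pandas is already a DataFrame, so the conversion step is not needed. A tiny sketch (file and column names are hypothetical):

import pandas as pd

df = pd.read_csv("my_local_dataset.csv")        # hypothetical file; already a DataFrame
X = df.drop(columns=["Unnamed: 0", "Label"])    # drop the index column and the (hypothetical) target
# No pd.DataFrame(dataset.data, columns=dataset.feature_names) step is required for a local CSV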
@snehasneha9290
@snehasneha9290 2 жыл бұрын
After reducing the dimensions, is it possible to know which columns were selected after applying PCA? In this example we got 29 features; is it possible to know which 29 they are out of the total DataFrame features?
@kunalchavan4685
@kunalchavan4685 2 жыл бұрын
I am having the same question, but I think these are not the same 29 columns out of 64; they are 29 totally different columns that contain information from all 64 columns.
@RamveerSingh-el6zl
@RamveerSingh-el6zl Жыл бұрын
Can you tell how to do varimax rotation?
@ajaxx627
@ajaxx627 3 жыл бұрын
Please, I have a problem with some work. I was given a list of words, let's say about 200 different words, and I'm meant to create code that generates groups of 3 random words. E.g. wordlist = [a, b, c, d, e, ..., z], and the output should be: a, d, z / c, o, x / and so on. Please, how do I do it?
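Not PCA-related, but since the question is here: Python's random module covers this; a sketch that shuffles the list and prints it in groups of three (the word values below are placeholders for the 200 real words):

import random

wordlist = ["apple", "banana", "cherry", "delta", "echo", "fox"]

random.shuffle(wordlist)                       # random order, no repeats across groups
for i in range(0, len(wordlist) - 2, 3):
    print(", ".join(wordlist[i:i + 3]))

# Or, for a single random group of 3 distinct words:
print(random.sample(wordlist, 3))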
@sharmilasenguptachowdhry509
@sharmilasenguptachowdhry509 2 жыл бұрын
Thanks very much! Can you please help explain eigenvalues and eigenvectors from the data science perspective? Thanks again.
@taimoorneutron2940
@taimoorneutron2940 3 жыл бұрын
Hello sir, I am 27 now and my master's is in progress. I have teaching experience, but now I want to start my career in machine learning or data science. Is that possible? Every company needs fresh newcomers, so what should I do?
@vanshoberoi2154
@vanshoberoi2154 2 ай бұрын
Doesn't 0.95, i.e. 95% retention, mean that 60 out of 64 features should have been retained? Why/how only 25?
@jayuchawla1892
@jayuchawla1892 2 жыл бұрын
You applied PCA on the normal DataFrame, whereas in the theory you explained that we need to apply it on the scaled DataFrame.
@VickyKumar-dk6rd
@VickyKumar-dk6rd 2 жыл бұрын
The feature_names attribute is now not showing up in the load_digits() dataset.
@MyManiratnam
@MyManiratnam 3 жыл бұрын
Hi, I have seen your videos on PCA; they are really informative and your explanation is really cool. I have a doubt: we apply PCA on the dataset and later fit a model, for example a classification model if it is a classification problem. We then validate it with the test set. After that, if I have a new observation, i.e. a new row in the dataset, how do I predict its label?
@kirubakaran6145
@kirubakaran6145 10 ай бұрын
Hello Maniratnam, have you got the answer to the above question?
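To answer the question in this thread: a new row has to go through exactly the same fitted transformers (scaler, then PCA) before model.predict; a sketch assuming scaler, pca, and model were fitted as in the video:

import numpy as np

new_row = np.array([[0.0] * X.shape[1]])    # one new observation with the original feature layout

new_scaled = scaler.transform(new_row)      # reuse the scaler fitted on the training data
new_pca = pca.transform(new_scaled)         # reuse the fitted PCA (do not refit on one row)
print(model.predict(new_pca))               # predicted label for the new observation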
@FutureAIDev2015
@FutureAIDev2015 2 жыл бұрын
I have no idea where to start on the exercise or even what "z-score" means for getting rid of outliers.
@slainiae
@slainiae 8 ай бұрын
Check out video #41 in this video series. That teaches everything about Z Scores.
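For anyone else stuck there: a z-score is how many standard deviations a value sits from its column mean, and rows where any z-score exceeds a threshold (commonly 3) are treated as outliers. A sketch with SciPy, assuming the exercise data is in a DataFrame df:

import numpy as np
from scipy import stats

z = np.abs(stats.zscore(df.select_dtypes(include="number")))   # |z-score| of every numeric cell
df_no_outliers = df[(z < 3).all(axis=1)]                        # keep rows where every |z| < 3
print(df.shape, "->", df_no_outliers.shape)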