PCA Indepth Geometric And Mathematical InDepth Intuition ML Algorithms

112,020 views

Krish Naik

GitHub materials: github.com/kri...
Principal component analysis (PCA) is a popular technique for analyzing large datasets containing a high number of dimensions/features per observation, increasing the interpretability of data while preserving the maximum amount of information, and enabling the visualization of multidimensional data. Formally, PCA is a statistical technique for reducing the dimensionality of a dataset. This is accomplished by linearly transforming the data into a new coordinate system where (most of) the variation in the data can be described with fewer dimensions than the initial data. Many studies use the first two principal components in order to plot the data in two dimensions and to visually identify clusters of closely related data points. Principal component analysis has applications in many fields such as population genetics, microbiome studies, and atmospheric science.
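The two-component projection described above can be sketched in a few lines with scikit-learn; the random matrix below is only a stand-in for a real high-dimensional dataset:

```python
# Minimal PCA sketch using scikit-learn's PCA estimator.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))        # 200 observations, 10 features

pca = PCA(n_components=2)             # keep the first two principal components
X2 = pca.fit_transform(X)             # linear transform into the new coordinate system

print(X2.shape)                       # (200, 2): ready for a 2-D scatter plot
print(pca.explained_variance_ratio_)  # share of total variance each component keeps
```

The two columns of `X2` are what studies plot when they visualize clusters with the first two principal components.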

Comments: 93
@exploreEverything4519 · 1 year ago
I first understood the PCA concept 3 years back from an NPTEL lecture. It was full of mathematics and went far over my head because the theory part was missing. Believe me, with your explanations I can understand his lecture too. No one could explain it the way you have. It was outstanding.
@aditinautiyal4299 · 1 year ago
Thank you so much for not only sharing your knowledge but also putting so much effort to cover each and every point of the particular topic.
@salminsultana9654 · 28 days ago
So I am your new fan, sir. Today I was searching for videos about auto EDA and I found yours. After that I started to watch your videos about PCA, clustering, etc. And you are just amazing... now I can understand properly. Thanks a lot, sir.
@aj_actuarial_ca · 1 year ago
PCA is very well explained in your video, sir. You're really the best teacher ever!!!
@IshanGarg-y1u · 1 year ago
This is a good video. I recommend you first watch the PCA step-by-step guide from StatQuest to get a high-level view with animations, then watch this video for more detail and understanding alongside some code. Then, if you want to know the mathematics behind it, refer to articles online where they explain why we calculate the covariance matrix, build the objective function using a Lagrange multiplier, and then derive why the eigenvalues of the covariance matrix are the desired result.
@pritamrajbhar9504 · 9 months ago
Thanks a lot, Krish. This is the simplest and most detailed video about PCA.
@akashpaul9892 · 1 year ago
You really are a good teacher, brother... Teaching with relatable examples helps to understand each topic so perfectly and easily. Thank you so much, brother. Keep teaching us... Love from Bangladesh.
@man9mj · 11 months ago
Thank you for this elegant effort in explaining PCA.
@Harsh_Yadav_IITKGP · 1 year ago
Krish, your efforts in this ML series are remarkable.....
@adnanshujah6230 · 9 months ago
Best of the best lectures. It covers all the required concepts about the subject. Most videos available only show how to perform PCA, but not why it is required or the concept behind it. Sir Krish, thank you so much for such a detailed lecture and for clearing up the concepts. Highly recommended lecture and channel 🥰🥰🥰
@adnanshujah6230 · 9 months ago
I simply say this one video is enough to get a clear concept; once again, thank you so much, sir Krish.
@syco-brain8543 · 5 months ago
Best video about PCA on the internet so far.
@SanthoshKumar-dk8vs · 2 years ago
Thanks for sharing, Krish, really helpful. For the last two days I have been refreshing this topic only 🤗
@vinothkumar7531 · 1 year ago
You are the greatest teacher I have ever seen in my entire life. The way you teach even turns a lazy or slow learner into a strong learner using the Krish Naik G(ji) Boosting algorithm. Just kidding 😃😃. Hats off to your effort to help people.
@ashwintiwari9642 · 2 years ago
Nowhere else can I find an explanation this good: no confusion, no complex demonstrations, clean use cases, the simplest way to understand PCA in depth. Thanks a lot, Krish. It takes a lot of takes and research to explain a single data science topic this way, and it's all appreciated work.
@taslima5007 · 11 months ago
You are my favourite YouTuber and teacher.
@paneercheeseparatha · 1 year ago
A wonderful attempt to explain PCA without much mathematics. Though it would be great if you also did a video implementing PCA from scratch in Python. Loved your playlist! Kudos to you!
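Since a from-scratch implementation is requested here, the standard recipe (center → covariance matrix → eigendecomposition → project) can be sketched as below; the function name `pca_scratch` is made up for illustration:

```python
import numpy as np

def pca_scratch(X, k):
    """PCA via eigendecomposition of the covariance matrix."""
    Xc = X - X.mean(axis=0)               # 1. center the data
    C = np.cov(Xc, rowvar=False)          # 2. covariance matrix (features x features)
    eigvals, eigvecs = np.linalg.eigh(C)  # 3. eigendecomposition (eigh: C is symmetric)
    order = np.argsort(eigvals)[::-1]     # 4. sort by descending eigenvalue (variance)
    W = eigvecs[:, order[:k]]             # 5. top-k eigenvectors = principal directions
    return Xc @ W                         # 6. project onto the top-k components

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 5))
Z = pca_scratch(X, 2)
print(Z.shape)  # (100, 2)
```

Because the eigenvalues are sorted, the first output column always carries at least as much variance as the second.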
@dipamsarkar6626 · 1 year ago
This guy should be named the "Godfather of Data Science in India", an absolute legend.
@BhagatRaviNarayan · 6 days ago
He used the comparison operators in reverse order; due to the curse of dimensionality it should be Accuracy1 < Accuracy2 < Accuracy3 > Accuracy4 > Accuracy5. 5:35
@yogendrapratap1982 · 1 year ago
Everything in the lecture series has been really resourceful, but this lecture was overly extended: a 30-minute topic was stretched to 1 hour 30 minutes, repeating the same stuff again and again.
@amitx26 · 10 months ago
Sir, one thing I have felt strongly is that you explain and deliver a little better in recorded videos. Thanks for providing such great content for us for free!
@RakshithML-vo1tr · 10 months ago
Hi bro, I am starting data science. How should I start? By following Krish sir's roadmap, and, like you said, should I prefer the recorded videos?
@samareshms4591 · 10 months ago
This guy is single-handedly carrying the AI/ML community in India 🙇‍♂🙇‍♂
@ramakrishnayellela7455 · 9 months ago
Such a good explanation, Krish.
@kvafsu225 · 1 year ago
Excellent presentation.
@viratkumar9161 · 1 year ago
It's quite vague to say that if the Pearson correlation is zero there is no relationship between x and y. For example, consider the line Y = |X|: the Pearson correlation is 0, but there is still a relationship, easily visible after plotting.
@SiddharthSwamynathan · 1 year ago
Correct. Pearson correlation can only capture a linear relationship. A coefficient of 0 means no linear relationship exists, but there may still be a non-linear relationship between the covariates and the target.
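The Y = |X| example from this thread is easy to check numerically; on a grid symmetric around zero the Pearson coefficient comes out at (numerically) zero even though y is fully determined by x:

```python
import numpy as np

x = np.linspace(-1, 1, 201)   # grid symmetric around zero
y = np.abs(x)                 # fully determined by x, but not linearly

r = np.corrcoef(x, y)[0, 1]   # Pearson correlation coefficient
print(r)                      # ~0: no *linear* relationship detected
```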
@pankajray5939 · 2 years ago
PCA is one of the important topics in ML.
@irisshutterwork1411 · 2 years ago
Well explained. Thank you
@unicornsolutiongh2022 · 1 year ago
Powerful lecture. Keep it up, sir.
@harshitsamdhani1708 · 1 year ago
Thank you for the video.
@the-ghost-in-the-machine1108 · 1 year ago
Thanks sir, god bless you!
@AjayPatel-pc1yf · 1 year ago
Amazing, sir, I enjoyed it a lot ❤
@mr.pianist · 6 months ago
Very good lecture, beginner friendly.
@thop9747 · 2 years ago
It was really helpful. Keep up the work, sir.
@bhagyashriakolkar7763 · 1 year ago
Thank you, sir.... nice explanation.
@manikandanm3277 · 2 years ago
In the theory part, to find the eigenvalues you multiply the covariance matrix with a vector. How is that particular vector V chosen and used to multiply with the covariance matrix? I'm confused with this only; otherwise a great lecture, thanks Krish 👍
@priyam39 · 1 year ago
That v is the eigenvector itself that we are looking for. Sir just explained it.
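The equation being discussed here, C v = λ v, can be checked directly with NumPy: the vector v is not chosen by hand, the decomposition finds every (λ, v) pair. The 2×2 matrix below is a made-up covariance-like example:

```python
import numpy as np

C = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # symmetric, covariance-like matrix

eigvals, eigvecs = np.linalg.eig(C)   # solves C v = lambda v for all pairs
v = eigvecs[:, 0]                     # one eigenvector (a column of the result)
lam = eigvals[0]                      # its eigenvalue

print(np.allclose(C @ v, lam * v))    # True: C merely scales its eigenvector
```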
@meghananddussa62 · 19 days ago
Covariance (direction) and the correlation coefficient (strength and direction) both describe a linear relationship only; they do not explain a non-linear relationship.
@baravind6548 · 9 months ago
In extracting from 2D to 1D, if PC1 has the highest variance and PC2 the second highest, is it necessary that PC1 be perpendicular to PC2?
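Yes: because the covariance matrix is symmetric, its eigenvectors (the principal directions) are mutually orthogonal, so PC1 ⊥ PC2 by construction. A quick numerical check, with made-up correlated data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4)) @ rng.normal(size=(4, 4))  # correlated features

pca = PCA(n_components=2).fit(X)
pc1, pc2 = pca.components_        # rows are unit-length principal directions

print(abs(float(pc1 @ pc2)))      # ~0: the two directions are perpendicular
```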
@javeedtech · 2 years ago
Thanks for the video, from FSDS batch 2.
@yachitmahajan3579 · 10 months ago
Best explanation.
@IzuchukwuOkafor-v6e · 11 months ago
Very lucid explanation of PCA.
@samthomas3881 · 11 months ago
Thanks Sir!
@BMVLM- · 2 months ago
Brother, the content is great, but there are so many advertisements; it's really disturbing.
@PAVVamshhiKrishna · 6 months ago
Fantastic
@Nikhillllllllllllll · 1 year ago
How do we get the names of those 2 features we got after feature extraction?
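To answer the question above: the extracted features have no inherited names — they are new axes conventionally called PC1, PC2, ..., each a weighted mix of the original features. The loadings in `components_` show which original features dominate each component; the Iris dataset is used here purely as an illustration:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

iris = load_iris()
pca = PCA(n_components=2).fit(iris.data)

# Each row of components_ lists how the original features mix into one PC.
for i, comp in enumerate(pca.components_, start=1):
    mix = {name: round(float(w), 2) for name, w in zip(iris.feature_names, comp)}
    print(f"PC{i}:", mix)
```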
@eurekad7340 · 4 months ago
If possible, could you please make a video on truncated SVD as well? I searched but couldn't find any video on SVD from you.
@faizannaseem3384 · 4 months ago
See Go Classes' free lectures for SVD.
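For anyone who wants a quick look while waiting for a dedicated video, scikit-learn ships a `TruncatedSVD` estimator; unlike PCA it does not center the data, which is why it also works on sparse matrices:

```python
import numpy as np
from sklearn.decomposition import TruncatedSVD

rng = np.random.default_rng(4)
X = rng.random((100, 30))           # dense here, but sparse input works too

svd = TruncatedSVD(n_components=5)  # note: no centering step, unlike PCA
Z = svd.fit_transform(X)

print(Z.shape)                      # (100, 5)
```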
@shivachauhan2837 · 2 years ago
To improve my resume, what should I try: Kaggle or open source?
@theharvi_ · 9 months ago
❤ Thanks
@BharatDhungana-n4s · 1 year ago
The implementation part is the best.
@sumankumar01 · 1 year ago
Do CampusX and you both refer to the same books, since the example is the same?
@CodeWonders_ · 2 years ago
Can you tell me who will teach in the data science course, you or Sudhanshu sir?
@chayanikaboruha6657 · 11 months ago
Krish, please make a video on how we can use an autoencoder for text data.
@jitendrasahay3847 · 4 months ago
If we have 3 features then we get 3 eigenvectors, and later we combine 2 of them to create 1 eigenvector. Combining here basically means projection. Earlier, when we projected, we got n eigenvectors out of n features; then again we get 2 eigenvectors. Where is the dimensionality reduction happening??? What am I really missing here??? Can anyone help???
@RahulA-b9o · 1 year ago
How do I know that the model is overfitted? Is there any method to find out whether the trained model suffers from the curse of dimensionality???
@lagangupta3193 · 6 months ago
How will we decide the number of features to pass as n_components?
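A common rule of thumb (not from the video itself) is to keep the smallest number of components whose cumulative explained variance crosses a threshold such as 95%; scikit-learn even accepts a float, `PCA(n_components=0.95)`, to do this automatically. A manual sketch:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 20))

pca = PCA().fit(X)                              # fit with all components first
cum = np.cumsum(pca.explained_variance_ratio_)  # cumulative variance kept
k = int(np.searchsorted(cum, 0.95) + 1)         # smallest k reaching 95%

print(k, cum[k - 1])
```

Plotting `cum` against the component index (a "scree"-style curve) makes the same decision visually.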
@user-rx5kq6oo9y · 2 years ago
Bro, can you make a data science cheat sheet, like the multiple DSA sheets on YouTube?
@MamunKhan-px2vb · 2 years ago
Just Great
@MrKhan-xu1vf · 2 years ago
Kind of amazing teaching skills.
@ITSimplifiedinHINDI · 7 months ago
Guruji, why are you writing greater-than as less-than and less-than as greater-than?
@arungireesh686 · 1 year ago
Superb.
@Bitter_Truth-zc4eq · 1 year ago
Which software are you using for writing?
@KRSandeep · 10 months ago
Scrble Ink, which is available for Windows laptops only.
@mohitkumarsingh7318 · 11 months ago
Sir, please also cover SVD, it's a request.
@viratjanghu945 · 1 year ago
Sir, please make a video on independent component analysis and linear discriminant analysis; it is my humble request, sir.
@ramdharavath7542 · 2 years ago
Useful
@shruti9731 · 10 months ago
❤❤
@SohanDeshar-pf6zh · 7 months ago
Good explanation, but it might be a good idea to remove one of the "InDepth"s from the video title.
@muhammadrafiq1720 · 1 year ago
There is an ad every 3 to 4 minutes; it is difficult to concentrate, especially with a slow internet connection.
@baravind6548 · 9 months ago
How do we get the vector v that is to be multiplied by A?
@kunalpandya8468 · 1 year ago
After we get 2 features from PCA, what are the names of those two features?
@mr.patientwolfx5984 · 2 years ago
Sir, what do you think of the GUVI data science program? Can I join?
@AmmarAnjum-h2s · 1 year ago
Sir, why don't you talk point to point? You repeat everything again and miss some things you meant to cover.
@somnath1235 · 2 years ago
What do covariance and correlation decide? Does covariance denote how closely 2 features are related, and does correlation denote whether the features are directly or inversely proportional?
@saisrinivas3066 · 2 years ago
Covariance only describes the type (direction) of the relationship, whereas correlation describes both the type and the strength of the relationship between two numerical variables.
@bhargav1811 · 2 years ago
Correlation is a scaled version of covariance!!!! Range of covariance = (-inf, +inf); range of correlation = [-1, +1].
@Datadynamo · 2 years ago
Covariance is a measure of the joint variability of two random variables. It tells you how two variables are related to each other. A positive covariance means the variables are positively related: as one increases, the other also tends to increase. A negative covariance means the variables are inversely related: as one increases, the other tends to decrease. Correlation is a normalized version of covariance; it measures the strength of the linear relationship between two variables. It ranges from -1 to 1, where -1 is perfect negative correlation, 0 is no correlation, and 1 is perfect positive correlation. Like covariance, it tells you how two variables are related, but it gives a more intuitive sense of the strength of the relationship, since it is scaled between -1 and 1.
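The distinction drawn in these replies — covariance gives direction but is scale-dependent, correlation is the normalized, scale-free version — is easy to demonstrate with made-up data:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=1000)
y = 2 * x + rng.normal(scale=0.5, size=1000)  # positively related to x

cov = np.cov(x, y)[0, 1]        # sign gives direction; magnitude depends on units
corr = np.corrcoef(x, y)[0, 1]  # normalized to [-1, 1]

# Rescaling x (e.g. changing units) changes covariance but not correlation:
cov10 = np.cov(10 * x, y)[0, 1]
corr10 = np.corrcoef(10 * x, y)[0, 1]

print(cov > 0, np.isclose(corr, corr10), np.isclose(cov, cov10))
```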
@shanthan9. · 1 year ago
Good video but too lengthy.
@siddharthmohapatra7297 · 2 years ago
Sir, I want to ask: I have no coding skills or background (B.Com background). Can I do the data science masters from PW Skills? Will everything be taught from the very basics???
@rutvikchauhan1572 · 2 years ago
You can do it. First learn Python, then search for data science courses on YouTube and on various platforms like Udemy, Coursera, Swayam... and enroll.
@siddharthmohapatra7297 · 2 years ago
@rutvikchauhan1572 I have enrolled in PW Skills.
@anuraganand6675 · 1 year ago
@rutvikchauhan1572 What is your feedback on the PW Skills data science course?
@akindia8519 · 8 months ago
@siddharthmohapatra7297 Hi, can you please give us feedback on PW Skills' data science masters program?
@siddhisg · 1 year ago
The greater-than / less-than symbols though 🥲
@jitendrasahay3847 · 4 months ago
I have to say: very short, precise material has been elongated irritatingly. Repetitive statements...
@satyapujari7731 · 1 year ago
There was an advertisement every five minutes, which made it difficult to concentrate while watching.
@rajdama205 · 1 year ago
Use YouTube Vanced, bro.
@satyapujari7731 · 1 year ago
@rajdama205 Great! 🤟
@vishalgupta9620 · 1 year ago
noob knows nothing
@priyotoshsahaThePowerOf23 · 1 year ago
BEST