Mod-01 Lec-30 Principal Component Analysis (PCA)

99,287 views

nptelhrd

Applied Multivariate Statistical Modeling by Dr. J. Maiti, Department of Management, IIT Kharagpur. For more details on NPTEL visit nptel.ac.in

Comments: 60
@prajwalshenoy2735 · 4 years ago
Highly underrated video on PCA. Explained using a combination of the statistical and linear-algebra approaches. Nothing beats this.
@gulabpreetsingh9985 · 1 year ago
Can you explain, at 23:57, how we got z1 = x1 cos θ + x2 sin θ and z2 = -x1 sin θ + x2 cos θ?
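A standard way to see this (a note for anyone stuck at the same step; the derivation below is the usual one, not quoted from the lecture): the new axes are the old axes rotated counter-clockwise by θ, with unit vectors e1' = (cos θ, sin θ)' and e2' = (-sin θ, cos θ)'. Projecting the point x = (x1, x2)' onto each new unit vector gives its coordinates in the rotated frame:

\begin{aligned}
z_1 &= \mathbf{x}^\top \mathbf{e}_1' = x_1\cos\theta + x_2\sin\theta,\\
z_2 &= \mathbf{x}^\top \mathbf{e}_2' = -x_1\sin\theta + x_2\cos\theta.
\end{aligned}

The minus sign in z2 is simply the first component of the rotated unit vector e2'; it is not a quadrant sign rule.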
@ArthurMcGDM · 8 years ago
Superb explanation. The best I can find on the internet or in the classroom. Thank you.
@arunbm123 · 8 years ago
Please tell me, how is the derivative of aj'Saj equal to 2Saj?
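For reference, a sketch of that matrix-calculus step (assuming S is symmetric, which a covariance matrix always is):

\frac{\partial}{\partial a}\, a^\top S a = (S + S^\top)\, a = 2Sa \quad \text{when } S = S^\top.

This follows from writing a'Sa componentwise as \sum_i \sum_k a_i S_{ik} a_k and differentiating with respect to a single component a_j: the i = j terms give \sum_k S_{jk} a_k and the k = j terms give \sum_i a_i S_{ij}, which sum to 2(Sa)_j when S is symmetric.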
@juanfelipeceronuribe2910 · 7 years ago
At 31:23: what makes the transformation orthogonal is A'A = I, not inv(A)A = I (the latter holds for any invertible matrix).
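A quick numerical illustration of that distinction, as a Python/NumPy sketch (not from the lecture):

import numpy as np

theta = 0.7
# A rotation matrix is orthogonal: A'A = I, not merely invertible.
A = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])
print(np.allclose(A.T @ A, np.eye(2)))               # True: orthogonal

# A generic invertible matrix satisfies inv(B)B = I but not B'B = I.
B = np.array([[2.0, 1.0],
              [0.0, 1.0]])
print(np.allclose(np.linalg.inv(B) @ B, np.eye(2)))  # True: merely invertible
print(np.allclose(B.T @ B, np.eye(2)))               # False: not orthogonal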
@worldpulse365 · 4 years ago
Indian teachers are by far the best, mainly because many of them have stuck to traditional forms of instruction: patiently explaining and WRITING DOWN everything step by step. Nothing beats this. As a student you don't want to see everything compressed into slides with all the info already there (a sad norm from lazy professors in 'reputable' universities nowadays). You want to see the puzzle unfold as the teacher explains. Slides should be used sparingly, as they are here.
@manisekar2884 · 9 years ago
Dear sir, I completed my engineering in '98 and lost touch with maths after that. I recently joined a doctoral program in management and was looking for an understanding of PCA. The way you derived the methodology rekindled my interest in M1. Thank you.
@kennetholelewe9932 · 9 years ago
Sir, I love your teaching simplicity. Thank you so much.
@sumanshekhar3197 · 9 months ago
Oh my god! This is an amazing explanation of PCA, hands down!
@VALedu11 · 4 years ago
Thank you sir for this comprehensive explanation of PCA. You made the derivatives look easy as well.
@rakeshnalliboyana2160 · 9 months ago
Nice explanation. Before, I did not have a proper idea of PCA, but I got one after watching your video.
@vikiviky1122 · 4 years ago
He is the answer to why we need teachers. Thanks.
@kinglucos5146 · 1 year ago
It's always the random Indian guy who explains your toughest subjects like a champ.
@kanchandatta4668 · 3 months ago
Sir, one thing is not clear so far: at 23:58, in the projection on z2, why does -sin θ appear? Is it because of the fourth quadrant? We know the sign rules for sin, tan, and cos in the fourth quadrant. Not sure, please respond.
@HelloIamRachnaGupta · 5 years ago
Thank you so much, sir. This video answered my questions about PCA implementation in ML, which no other tutorial or blog did.
@prabhuthomas8770 · 6 years ago
The best lecture on PCA. Thank you.
@vaskoo84 · 1 year ago
Best explanation of PCA. Thank you, sir!
@ashutoshsingh1808 · 3 years ago
What a clear explanation!!! Thank you Sir!
@vishwajeetsingh5444 · 1 year ago
I love the way Maiti sir explained the concept of PCA.
@GIS-Engineer · 5 years ago
What is covariance? What are eigenvalues and eigenvectors? What is correlation? If we have 150 samples of water, how do these all relate?
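Briefly, in the lecture's terms: covariance measures how two variables move together, correlation is covariance rescaled to [-1, 1], and the eigenvectors/eigenvalues of the covariance matrix give the principal directions and the variance along each. A minimal NumPy sketch for a hypothetical 150-sample water dataset (the variable names and data here are invented for illustration):

import numpy as np

rng = np.random.default_rng(0)
# Hypothetical: 150 water samples, 3 variables (say pH, turbidity,
# conductivity), mixed so the variables are correlated.
X = rng.normal(size=(150, 3)) @ np.array([[1.0, 0.5, 0.0],
                                          [0.0, 1.0, 0.3],
                                          [0.0, 0.0, 1.0]])

S = np.cov(X, rowvar=False)        # p x p covariance matrix
R = np.corrcoef(X, rowvar=False)   # p x p correlation matrix

# Eigen-decomposition of S: eigenvectors are the principal component
# directions; eigenvalues are the variances captured along them.
eigvals, eigvecs = np.linalg.eigh(S)
print(eigvals[::-1])               # variances, largest first
print(eigvecs[:, ::-1])            # corresponding loadings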
@kweweli7821 · 8 years ago
destroyed PCA in an hour...perfect job
@chinkirai988 · 6 years ago
Thanks for sharing beautiful maths of PCA.
@gouravkarmakar9606 · 4 years ago
Best explanation of PCA.
@ashumohanty566 · 5 years ago
Long live, sir. You are a god for poor students.
@Dyslexic_Neuron · 1 year ago
How is the variance of X replaced by the covariance of X? I didn't get that part.
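A short note on that step (standard notation, not quoted from the lecture): for a random vector X, the generalization of variance is the covariance matrix Σ = E[(X - μ)(X - μ)'], whose diagonal entries are the variances and whose off-diagonal entries are the covariances. The variance of the scalar projection a'X is then

\operatorname{Var}(a^\top X) = a^\top \Sigma\, a,

so "variance of X" becomes the covariance matrix of X sandwiched between the coefficient vectors.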
@ap3035 · 4 years ago
Does rotating the axes by θ also rotate the data relative to the axes, or not?
@GuilleLissa · 8 years ago
Excellent explanation Dr. J Maiti
@sathviks1420 · 8 months ago
Good lecture! Thank you!
@kadirgulec591 · 8 years ago
Excellent demonstration, but only if you have enough background. Thank you.
@matinhewing1 · 6 years ago
Trying to learn the background as I go along...)))
@extramiles3831 · 9 years ago
Sir, can we use the Mohr circle here and avoid tedious calculations?
@jhareswarmaiti8766 · 9 years ago
+VIVEK SINGH Please explain a bit the use of Mohr circle here.
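For context on this exchange (a side note, not part of the lecture): Mohr's circle is the classical graphical solution of the 2x2 symmetric eigenproblem from mechanics, so for two variables it does yield the principal direction and variances directly:

\tan 2\theta = \frac{2 s_{12}}{s_{11} - s_{22}}, \qquad
\lambda_{1,2} = \frac{s_{11} + s_{22}}{2} \pm \sqrt{\left(\frac{s_{11} - s_{22}}{2}\right)^2 + s_{12}^2},

the same angle and eigenvalues PCA produces for a 2x2 covariance matrix S. For p > 2 variables there is no such shortcut, and the general eigen-decomposition is still needed.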
@anigov · 7 years ago
Jhareswar Maiti, thank you so much, sir, for the wonderful explanation in both videos. It has given me a much better understanding of PCA.
@MeghaSKumar · 5 years ago
Hi sir, thanks a lot for the clear explanation of PCA; it was really helpful for me. I have a doubt at 15:45: the dimension of the data matrix X is n*p, so n rows and p columns. Then why is x p*1 in the immediate next step? Or is that an uppercase P? The final matrix should be m*p, where m
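On the dimension question, a note on the usual convention (assuming the standard setup rather than quoting the lecture): the data matrix X is n x p with one observation per row, while a single observation x is a p x 1 column vector, so the rows of X are the transposed observations:

X = \begin{bmatrix} x_1^\top \\ \vdots \\ x_n^\top \end{bmatrix} \in \mathbb{R}^{n \times p},
\qquad x_i \in \mathbb{R}^{p \times 1}.

Both shapes can legitimately appear a line apart, which is likely what happens at 15:45.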
@idntknw1000 · 9 years ago
Comprehensive explanation. Thank you
@dwivedys · 6 years ago
Excellent! Where is the second part of the lecture?
@shebhk11 · 4 years ago
kzbin.info/www/bejne/pKaVk6KDibCCfKM
@hellomacha4388 · 3 years ago
Super explanation
@amitkumarmaiti5392 · 4 years ago
Great explanation
@jagadeesh2681996 · 5 years ago
Can you tell me the name of the text you are following?
@snehalnair3708 · 8 years ago
Awesome video
@arunbm123 · 8 years ago
At 37:33, a' V(X) aj... how did aj come here?
@juanfelipeceronuribe2910 · 7 years ago
Also bear in mind that V(X) is the matrix of covariances of {x_1,...,x_p}
@hemmapermal532 · 7 years ago
arunbm, we are maximizing Var(z) subject to a'a = 1.
@hemmapermal532 · 7 years ago
V(a'x) = E[(a'x - a'u)^2]. This is the formula for variance; we already know E(a'x) = a'u, where u stands for mu. Try expanding the equation and you will see where that aj comes from.
@Dyslexic_Neuron · 1 year ago
@hemmapermal532 Taking a' out, we get (a')^2 and E[(x - mu)^2], which is Var(x); finally we get (a')^2 * Var(X). But where does aj come from??
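The step this thread is circling, sketched out (standard algebra, not quoted from the lecture): a'x is a scalar, but a is a vector, so the square cannot be split into (a')^2 times (x - mu)^2. Instead, write the squared scalar as itself times its own transpose:

\operatorname{Var}(a^\top x)
= E\big[(a^\top(x-\mu))^2\big]
= E\big[a^\top(x-\mu)(x-\mu)^\top a\big]
= a^\top \Sigma\, a
= \sum_{j=1}^{p}\sum_{k=1}^{p} a_j\,\Sigma_{jk}\,a_k,

which is where each aj enters. Maximizing this subject to a'a = 1 then leads to the eigenvalue problem Σa = λa.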
@sharathraju8849 · 6 years ago
Principal component analysis procedure by Gomez and Gomez... if anybody knows it, please reply...
@SAHILGAJBHIYE60 · 11 months ago
Thank you very much, sir ❤
@jontravolta1625 · 10 years ago
Thanks sir!!
@ridickrolandtakong5358 · 9 years ago
Excellent
@PavanYadiki · 9 years ago
Thank you Sir.
@deborahawe4063 · 6 years ago
Thanks lots
@diehardcynic · 9 years ago
Thank you sir
@rojalin9221 · 7 years ago
NICE SIR
@mswoonc · 7 years ago
durka durka
@sansin-dev · 5 years ago
So much paper wasted. Could have shown everything on slides or board.
@turtlepedia5149 · 4 years ago
These are super old videos; boards were not so common back then, and this method is better for understanding.
@ebenezerr9152 · 3 years ago
If it had been shown on slides, you wouldn't have understood.
Mod-01 Lec-31 PCA -- Model Adequacy & Interpretation
58:46
nptelhrd
16K views
Lecture: Principal Componenet Analysis (PCA)
51:13
AMATH 301
174K views
Mod-01 Lec-10 Multivariate normal distribution
57:33
nptelhrd
124K views
StatQuest: Principal Component Analysis (PCA), Step-by-Step
21:58
StatQuest with Josh Starmer
3M views
Ali Ghodsi, Lec 1: Principal Component Analysis
1:11:42
Data Science Courses
102K views
Lecture 46 - Dimensionality Reduction - Introduction | Stanford University
12:02
Artificial Intelligence - All in One
79K views
Principal Component Analysis (PCA)
26:34
Serrano.Academy
419K views
17: Principal Components Analysis_ - Intro to Neural Computation
1:21:19
MIT OpenCourseWare
39K views