Linear Discriminant Analysis (LDA) made easy

19,955 views

Saptarsi Goswami

1 day ago

Comments: 49
@arghyakusumdas54 · 4 years ago
--Intuition behind LDA
a) It reduces dimensions by constructing new features that are linear combinations of the original features.
b) It uses eigendecomposition.
c) It compresses the features, thereby providing better visualization.
d) It provides better efficiency.
--Difference from PCA
a) PCA is unsupervised and does not consider class labels, whereas LDA is supervised and considers class labels (hence it sometimes provides better separability between classes).
b) PCA finds the axis along which the variability of the data is maximum, whereas LDA finds the axis along which the class separation is maximum.
--Intuition behind scatter
a) The variability within a class is minimized (within-class scatter).
b) The variability between classes is maximized (between-class scatter).
--Characteristics of LDA
a) It is a supervised technique and tries to find the axis that separates the classes the most.
b) The Fisher discriminant ratio, the ratio of between-class scatter to within-class scatter, is the metric, and the intention is to maximize it.
c) For c classes the number of non-zero eigenvalues is (c-1); notably, only (c-1) lines are sufficient to separate c classes in this context.
d) It assumes all classes are generated from Gaussian distributions (sometimes this may not be true).
e) It assumes that the two classes have the same covariance matrix.
Sir, thank you again for this wonderful initiative of yours.
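A minimal NumPy sketch of the steps this summary lists (within-class and between-class scatter, maximizing the Fisher ratio via eigendecomposition, keeping at most c-1 components). The function name and the toy data below are illustrative assumptions, not the video's own example:

import numpy as np

def lda_fit(X, y, n_components=None):
    """Return the top LDA projection directions for data X with labels y."""
    classes = np.unique(y)
    n_features = X.shape[1]
    overall_mean = X.mean(axis=0)
    S_W = np.zeros((n_features, n_features))  # within-class scatter
    S_B = np.zeros((n_features, n_features))  # between-class scatter
    for c in classes:
        X_c = X[y == c]
        m_c = X_c.mean(axis=0)                   # class mean vector
        S_W += (X_c - m_c).T @ (X_c - m_c)
        d = (m_c - overall_mean).reshape(-1, 1)
        S_B += len(X_c) * (d @ d.T)              # weighted by class size N_i
    # Maximizing the Fisher ratio leads to the eigenvectors of S_W^-1 S_B.
    eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
    order = np.argsort(eigvals.real)[::-1]
    if n_components is None:
        n_components = len(classes) - 1          # at most c-1 non-zero eigenvalues
    return eigvecs.real[:, order[:n_components]]

# Usage on two made-up 2-D classes: project onto the single LDA direction.
X = np.array([[4.0, 2.0], [2.0, 4.0], [2.0, 3.0], [3.0, 6.0], [4.0, 4.0],
              [9.0, 10.0], [6.0, 8.0], [9.0, 5.0], [8.0, 7.0], [10.0, 8.0]])
y = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
X_proj = X @ lda_fit(X, y)   # shape (10, 1): best 1-D separation of the classes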
@abhay9994 · 1 year ago
Wow, this video on Linear Discriminant Analysis (LDA) by Instructor Saptarsi Goswami is incredibly informative and well-explained. I truly appreciate how he breaks down the concepts and compares LDA to PCA, highlighting the advantages of LDA. The explanations of the Fisher discriminant ratio, inter-class scatter, within-class scatter, and eigenvalue decomposition have given me a solid understanding of LDA. Thank you, Instructor Saptarsi, for sharing your expertise and helping me improve my knowledge in this area!
@dhoomketu731 · 4 years ago
Exceptionally well explained. Short, crisp, and to the point. From a strictly practical standpoint, the great thing about this lecture is that it avoids delving into the exhaustive mathematics of solving the Fisher optimization problem and instead focuses on the critical takeaways from LDA. Even the lengthy and mathematically rigorous NPTEL lectures do not place as much emphasis on the takeaways from LDA, or if they do, halfway through the lecture the viewer is so exhausted that he or she is in no position to fully grasp them. Another great thing about this lecture is that it is class-agnostic. Other lectures on LDA explain the topic from a binary-classification perspective and only at the end tell you how the same formulation, with minor modifications, extends to problems with more than two classes, which in my opinion convolutes things a bit. I have subscribed to your channel and would recommend it to my friends as well. Regards from Dehradun, Uttarakhand.
@SaptarsiGoswami · 4 years ago
Thank you so much for your elaborate and generous comment. This will motivate us further.
@austinjohn8713 · 4 years ago
Excellent lecture. You know the stuff, because you broke it down for us to see what is happening. After scouring the net for days for the motivation behind the steps in LDA, I breathed a sigh of relief on seeing your video. Thank you for saving me from frustration over something that is actually very simple to understand. Please make more videos. I subscribed and liked!
@SaptarsiGoswami · 4 years ago
Thank you so much
@NehaGupta-lk4ou · 1 year ago
I agree, this video explains the motivation and lays out the steps very simply, cutting out the jargon.
@aashishraina2831 · 4 years ago
Excellent. You have the art of making detailed information so simple.
@SaptarsiGoswami · 4 years ago
Thanks a lot. All the best on your journey of ML and DL.
@nk-dy4hc · 9 months ago
Very good explanation. You deserve more subscribers, sir. YouTube Shorts might bring some users; unfortunately, that is how the algorithm works. All the best.
@sofluzik · 4 years ago
Never seen such a lucid explanation... thank you.
@SaptarsiGoswami · 4 years ago
Thanks a lot. Please do check our other videos and give your feedback.
@aashishadhikari8144 · 3 years ago
Kudos for specifying that m1, m2, and m3 are vectors when using them for S_B. Many sources do not consider this fact.
@SaptarsiGoswami · 3 years ago
Thanks a lot, Aashish.
@jaikishank · 4 years ago
Very good presentation and a nice conceptual explanation. Thanks, Mr. Saptarsi.
@SaptarsiGoswami · 4 years ago
Thanks a lot. Do check out our other videos too, such as the ones on PCA and t-SNE, if they are of interest to you.
@mouleeswarang.s1023 · 3 years ago
This video was highly helpful. Thank you!
@SaptarsiGoswami · 3 years ago
Thanks a lot
@thejaswinim.s1691 · 2 years ago
Great job...
@SaptarsiGoswami · 2 years ago
Thank you
@kausalyaakannan7064 · 4 years ago
Excellent one, sir. Thank you.
@SaptarsiGoswami · 4 years ago
Thanks a lot; do check out our other videos too.
@nintishia · 2 years ago
Excellent exposition, thanks.
@bruteforce8744 · 2 years ago
Excellent video... just a small correction: the mean of the y components of X1 is 3.6, not 3.8.
@piyukr · 2 years ago
It was a very helpful lecture. Thank you, Sir.
@rajdeep1229 · 3 years ago
Very good lecture.
@SaptarsiGoswami · 3 years ago
Thanks, Rajdeep.
@sudhirm4094 · 3 years ago
A difficult concept explained so simply. Thank you, sir.
@SaptarsiGoswami · 3 years ago
Thanks for putting your comments here. Please do share with interested folks and check out our other videos too.
@mmm777ization · 3 years ago
At 11:08, while finding S_B, you have not used the formula you stated at the beginning; the total mean is used nowhere.
@dendyarmanda4417 · 3 years ago
I'd like to ask: when computing S1 and S2, why is the result divided by four, sir? And when computing S_B, the earlier formula mentioned N_i, but it was not used. Need some explanation, sir.
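A possible resolution of both questions above, with the caveat that it reconstructs the standard formulas rather than the video's exact numbers: dividing S1 and S2 by N_i - 1 = 4 (which suggests five samples per class) merely turns raw scatter into sample covariance, and for two equal-sized classes the N_i-weighted, total-mean form of S_B reduces to a multiple of (m1 - m2)(m1 - m2)^T. Neither rescaling changes the LDA directions, as this small NumPy check with made-up data shows:

import numpy as np

rng = np.random.default_rng(0)
X1 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(5, 2))   # class 1, N_1 = 5
X2 = rng.normal(loc=[4.0, 3.0], scale=1.0, size=(5, 2))   # class 2, N_2 = 5
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

S1 = (X1 - m1).T @ (X1 - m1)     # raw within-class scatter of class 1
S2 = (X2 - m2).T @ (X2 - m2)
S_W_raw = S1 + S2
S_W_cov = S1 / 4 + S2 / 4        # covariance form: divide each by N_i - 1 = 4

d = (m1 - m2).reshape(-1, 1)
S_B = d @ d.T                    # the total-mean, N_i-weighted form is (N/2) times this here

for S_W in (S_W_raw, S_W_cov):
    vals, vecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
    w = vecs[:, np.argmax(vals.real)].real
    print(w / np.linalg.norm(w))  # same direction (up to sign) in both cases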
@sofluzik · 4 years ago
Really very nice and lucid, sir.
@SaptarsiGoswami · 4 years ago
Thank you so much
@SoumyaDasgupta · 4 years ago
Good explanation, Saptarshi.
@SaptarsiGoswami · 4 years ago
Thank you
@SoumyaDasgupta · 4 years ago
@SaptarsiGoswami Let's connect for a good ML project if you are free.
@dr.shaminisrivastava4213 · 4 years ago
What is the difference between orthogonal and canonical discriminant techniques?
@azizullah6360 · 3 years ago
What software did you use for the graphical representations?
@iitncompany · 1 year ago
Wrong at 10:40: the S1 and S2 matrix calculations are both wrong.
@abdulrahimshihabuddin1119 · 3 years ago
Is MDA an extension of LDA, or a different method?
@SaptarsiGoswami · 3 years ago
Hey, sorry for the late reply, I missed your question. You can think of MDA as a more generic version of LDA. In LDA, a class comes from a single Gaussian distribution; in MDA, it can come from multiple Gaussian distributions, each of which may represent a subclass. You may also follow the link www.sthda.com/english/articles/36-classification-methods-essentials/146-discriminant-analysis-essentials-in-r/
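To make the reply above concrete, here is a rough sketch of the mixture idea it describes (each class modeled as a mixture of Gaussians rather than a single one), built as a generative classifier with one scikit-learn GaussianMixture per class. This is an illustrative approximation of the MDA concept, not the mda implementation from the linked R tutorial:

import numpy as np
from sklearn.mixture import GaussianMixture

def mda_fit(X, y, n_subclasses=2):
    """Fit one Gaussian mixture per class; return the models and class priors."""
    models, priors = {}, {}
    for c in np.unique(y):
        X_c = X[y == c]
        models[c] = GaussianMixture(n_components=n_subclasses, random_state=0).fit(X_c)
        priors[c] = len(X_c) / len(X)
    return models, priors

def mda_predict(X, models, priors):
    """Assign each sample to the class with the largest prior-weighted log-likelihood."""
    classes = sorted(models)
    scores = np.column_stack([models[c].score_samples(X) + np.log(priors[c])
                              for c in classes])
    return np.array(classes)[np.argmax(scores, axis=1)]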
@abdulrahimshihabuddin1119 · 3 years ago
@SaptarsiGoswami Sorry sir, I meant multiple discriminant analysis. Is that a different method? Are mixture discriminant analysis and multiple discriminant analysis the same?
@SaptarsiGoswami · 3 years ago
Not much is available about this. Maybe I will check Duda's book and comment. I am tempted to say it is LDA with multiple classes, but I will hold my comment until I am sure.
@pritomroy2465 · 4 years ago
You saved the day.
@SaptarsiGoswami · 4 years ago
Thanks a lot.
@mmm777ization · 4 years ago
6:52 thanks
@mmm777ization · 3 years ago
"Within-class" means the same as "intra-class".
@tyronefrielinghaus3467 · 1 year ago
The English is too painful to listen to, I'm afraid.