Linear Discriminant Functions: Basics to Perceptron [E20]

8,308 views

Pratik Jain

A day ago

Comments: 29
@sahiba421 2 years ago
Good content. Simple and clear explanation.
@ritamsarkar3156 3 years ago
I was looking for exactly this kind of explanation and found this video. Surprisingly, it has very few views. Thank you very much!
@pratikian 3 years ago
Thank you for your feedback. It motivates me a lot. 😇
@GamingPotatoHD 4 years ago
Thank you for this explanation!
@aparnagnamboothiri1527 4 years ago
Good explanation 🤗
@harshinisewani800 5 years ago
Thank you for such a good explanation
@pratikian 5 years ago
Happy to help! :-)
@harshinisewani800 4 years ago
@@pratikian Can you please add a video on generalized linear discriminant functions?
@pratikian 4 years ago
@@harshinisewani800 Can you tell me something more about this? Actually, I have not heard of it.
@kenway346 5 years ago
Very nice! Keep up the good work!!
@pratikian 5 years ago
Thank you so much 😊
@yatinarora9650 3 years ago
PERFECT
@utkarshkathuria2931 3 years ago
Is w0 called the bias?
@pratikian 3 years ago
Yes
@utkarshkathuria2931 3 years ago
@@pratikian Can you please tell me why we need it and why it is called the bias?
@utkarshkathuria2931 3 years ago
@@pratikian I also want to ask: the output is g(x) = W^T x, but the size of W is (number of classes x dimension) and the size of x is (dimension x 1), so why do you write it as W^T x and not as Wx? Please tell me if I have got anything wrong.
@pratikian 3 years ago
@Utkarsh Kathuria So think of this in terms of a line. As I have explained in the video, we need a line that separates the two classes. The general equation of a line in 2-D is ax + by + c = 0, where a, b, and c are the parameters that define the line. You can rearrange this equation to y = -(a/b)x - (c/b). Here you can see that a and b determine the slope of the line, and -(c/b) is the y-intercept: it tells you where the line cuts the y-axis. If c is zero, the line passes through the origin. This is the significance of the w0 term; it plays the role of c here, and it lets you move the line around in 2-D. That is the bias.

As for the name: it is called the bias because, when the idea of artificial intelligence first came up, the earliest model was the McCulloch-Pitts neuron, which just checked whether the output was positive or negative, that is, greater than 0 or less than 0. Slowly people realised that this was not enough, so they added a bias. Say the bias is 5; now the threshold becomes greater than 5 or less than 5. Here you can see that the decision is biased towards the positive class, hence the name bias.
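A tiny sketch of the role of w0 (my own illustration, not code from the video), treating the line ax + by + c = 0 as a discriminant g(x) = w^T x + w0:

```python
import numpy as np

w = np.array([1.0, 2.0])   # (a, b): control the orientation/slope of the line
w0 = -5.0                  # c: the bias, shifts the line away from the origin

def g(x):
    """Positive on one side of the line, negative on the other, zero on it."""
    return w @ x + w0

print(g(np.array([1.0, 1.0])))   # 1*1 + 2*1 - 5 = -2.0 -> negative side
print(g(np.array([3.0, 2.0])))   # 1*3 + 2*2 - 5 =  2.0 -> positive side
# With w0 = 0 the decision boundary would be forced through the origin.
```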
@pratikian 3 years ago
Ohh no no, here I have taken only a 2-class problem, and I have defined w as a column vector (d x 1), and x is also a column vector (d x 1); hence I have written w^T x rather than Wx.
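As a quick shape check (hypothetical numbers, just to illustrate the reply above):

```python
import numpy as np

# Both w and x are d x 1 column vectors, so w^T x is a 1 x 1 "scalar".
d = 3
w = np.array([[0.5], [-1.0], [2.0]])   # shape (d, 1)
x = np.array([[1.0], [1.0], [1.0]])    # shape (d, 1)

g = w.T @ x        # (1, d) @ (d, 1) -> (1, 1)
print(g.shape)     # (1, 1)
print(g.item())    # 0.5 - 1.0 + 2.0 = 1.5
# Plain w @ x would raise a ValueError here: (d, 1) @ (d, 1) do not align.
```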
@CEDMAYAKUNTLAPRASANNAKUMAR 4 years ago
Could you please explain how to find the equations of the lines if we have three classes?
@pratikian 4 years ago
So for 3 classes we can use different strategies, as follows (see the code sketch below).

1) One vs. rest
In this, we take one class on one side and group the other 2 classes as one ("not class 1"), then use the algorithm to make a line separating them. Now take class 2 against the "not class 2" group and get a line, and do the same for the 3rd class as well. So now we have 3 lines; given a test point, check it against all 3 and make a decision.
E.g.
Line 1 says class 1
Line 2 says not class 2
Line 3 says not class 3
This means class 1 is the required class.

2) One vs. one
Here we make pairs of 2 classes:
class 1 with class 2
class 2 with class 3
class 3 with class 1
Run the algorithm on each pair and again we get 3 lines. Now choose the class that gets the majority of votes.
E.g.
Line 1 says class 1
Line 2 says class 2
Line 3 says class 1
Then the required class is class 1.
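A rough Python sketch of both strategies (my own illustration, not code from the video; train_binary is a hypothetical stand-in for any two-class learner, here a bare-bones perceptron):

```python
import numpy as np
from itertools import combinations

def train_binary(X, y):
    """Bare-bones perceptron: returns (w, w0) separating y=+1 from y=-1."""
    w, w0 = np.zeros(X.shape[1]), 0.0
    for _ in range(100):                      # fixed number of passes
        for xi, yi in zip(X, y):
            if yi * (w @ xi + w0) <= 0:       # misclassified -> update
                w, w0 = w + yi * xi, w0 + yi
    return w, w0

def one_vs_rest(X, y, classes):
    """One line per class: class c (+1) vs. everything else (-1)."""
    return {c: train_binary(X, np.where(y == c, 1, -1)) for c in classes}

def predict_ovr(models, x):
    # Pick the class whose line is most confidently on the positive side.
    return max(models, key=lambda c: models[c][0] @ x + models[c][1])

def one_vs_one(X, y, classes):
    """One line per pair of classes; prediction is by majority vote."""
    models = {}
    for a, b in combinations(classes, 2):
        mask = (y == a) | (y == b)
        models[(a, b)] = train_binary(X[mask], np.where(y[mask] == a, 1, -1))
    return models

def predict_ovo(models, x):
    votes = {}
    for (a, b), (w, w0) in models.items():
        winner = a if w @ x + w0 > 0 else b
        votes[winner] = votes.get(winner, 0) + 1
    return max(votes, key=votes.get)
```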
@CEDMAYAKUNTLAPRASANNAKUMAR 4 years ago
@@pratikian The equation of the straight line is x * (sigma inverse) * (mu_i) - 0.5 * (transpose) * (sigma inverse) * (mu_i) + log(pi). Is this correct? And what is log(pi) while calculating the line between class 1 and the rest of the classes?
@CEDMAYAKUNTLAPRASANNAKUMAR 4 years ago
Should mu be calculated for class 1, and a combined mean for class 2 and class 3, and then we proceed? If possible, could you make a short video for three classes? Please.
@pratikian 4 years ago
Oh wait, the above reply was about the discriminant boundaries in the perceptron algorithm. I guess the question you are asking is about the discriminant boundaries for Bayesian decision theory, where we have assumed a Gaussian distribution. That is this video: kzbin.info/www/bejne/r6qohZWmo5xrn9E

For 3 classes, the discriminant boundaries are given by
g1(x) = g2(x)
g2(x) = g3(x)
g3(x) = g1(x)
This gives you 3 equations, and each applies over a particular region: the first equation is used for the region between class 1 and class 2, the second for the region between class 2 and class 3, and the third for the region between class 1 and class 3.
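For reference, the standard form of these discriminants for Gaussian classes sharing one covariance matrix Σ (the textbook case this thread seems to be about; not quoted verbatim from the video) is:

```latex
% Linear discriminant for class i when all classes share a covariance \Sigma:
g_i(\mathbf{x}) = \mathbf{x}^{\top}\Sigma^{-1}\boldsymbol{\mu}_i
                - \tfrac{1}{2}\boldsymbol{\mu}_i^{\top}\Sigma^{-1}\boldsymbol{\mu}_i
                + \ln P(\omega_i)
```

Here ln P(ω_i) is the log prior of class i, which is the log(pi) term asked about above; the boundary between classes i and j is the set where g_i(x) = g_j(x), and it is linear in x because the quadratic terms cancel.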
@pratikian 4 years ago
I feel the equation of the straight line that you have written here is either wrong or has a typing mistake. Please check it against the video I mentioned in the previous reply.
@CEDMAYAKUNTLAPRASANNAKUMAR 4 years ago
I am able to find the equations between class 1 and class 2, and between class 1 and class 3, but I am going wrong while finding the equation between class 2 and class 3.