Deep Learning (CS7015): Lec 2.2 McCulloch Pitts Neuron, Thresholding Logic

Views: 135,529

NPTEL-NOC IITM

A day ago

Comments: 46
@sumukhagc5528 4 years ago
This is the first NPTEL lecture I've found that is actually useful.
@onlydj-yq3fk 5 months ago
😂😂😂
@Muskan-wi2zz 3 years ago
Can't thank you enough. The concepts are explained in an absolutely amazing manner.
@AtulSharma-hy9yo 5 years ago
Best explanation I've found on the internet so far.
@kanikagarg 6 months ago
If you've also studied psychology, these concepts are easy and relatable. Great explanation, by the way.
@rushikeshkorde2673 4 years ago
Very nice, sir. Your way of teaching is awesome.
@rishabhsetiya 3 years ago
I have read in several articles that excitatory inputs are assigned weight 1 and inhibitory inputs are assigned weight -1. Just mentioning this for the information of other students.
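For anyone curious, the weighted convention mentioned above can be sketched as follows (this is an illustrative reformulation, not the formulation used in this lecture, where inhibitory inputs simply veto the output): excitatory inputs carry weight +1, inhibitory inputs carry weight -1, and the weighted sum is compared to the threshold.

```python
def weighted_mp(x, w, theta):
    # Weighted variant: +1 for excitatory inputs, -1 for inhibitory ones
    return int(sum(wi * xi for wi, xi in zip(w, x)) >= theta)

# x1 AND !x2 with x1 excitatory (+1), x2 inhibitory (-1), threshold 1
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, weighted_mp([x1, x2], w=[1, -1], theta=1))
# Only (1, 0) produces 1, matching x1 AND !x2.
```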
@aishwaryabimaljoy6642 3 years ago
You are right.
@GHamsa-e5h a month ago
The concepts are given in a very simple and clear manner, thank you. I just need one clarification: for the function x1 AND !x2, the threshold value is given as 1. If x1 = 1 and x2 = 0, the sum is 1, the threshold is satisfied, and the neuron fires -- OK. But if x1 = 0 and x2 = 1, the sum is also 1, the threshold is satisfied, and it fires -- but that is NOT OK. How is this taken care of? The same happens for x1 = 1 and x2 = 1: the sum is 2, the threshold is satisfied, and it fires -- but that is NOT OK. Please clarify.
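The replies further down resolve exactly this with the rule quoted from the bottom of the slide: if any inhibitory input is 1, the output is 0, regardless of the sum. A minimal Python sketch of that convention (the function name and structure here are illustrative, not from the lecture):

```python
def mp_neuron(excitatory, inhibitory, theta):
    """McCulloch-Pitts unit: fires (returns 1) when the excitatory inputs
    reach the threshold theta, unless any inhibitory input is 1, which
    forces the output to 0 no matter what the sum is."""
    if any(inhibitory):                     # inhibitory veto comes first
        return 0
    return int(sum(excitatory) >= theta)

# x1 AND !x2: x1 is excitatory, x2 is inhibitory, threshold 1
for x1 in (0, 1):
    for x2 in (0, 1):
        y = mp_neuron(excitatory=[x1], inhibitory=[x2], theta=1)
        print(f"x1={x1}, x2={x2} -> y={y}")
# Only x1=1, x2=0 fires; the cases (0,1) and (1,1) are vetoed by x2
# before the sum is ever compared to the threshold.
```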
@dhruvinchawda439 10 months ago
Can anyone explain why the threshold for the tautology at 11:02 is zero?
@svk0071 6 months ago
A tautology means the output is always TRUE (1), irrespective of the inputs. So the threshold is 0: even if both x1 and x2 are 0, the sum 0 still meets the threshold, so the output is 1.
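A quick check of that reasoning (an illustrative snippet, not from the lecture): with a threshold of 0, the sum of binary inputs can never fall below the threshold, so the unit fires for every input combination.

```python
theta = 0
for x1 in (0, 1):
    for x2 in (0, 1):
        y = int(x1 + x2 >= theta)   # always 1, since x1 + x2 >= 0
        print(f"x1={x1}, x2={x2} -> y={y}")
```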
@alyaalblooshi9476 5 years ago
Simple and informative.
@UtkarshSinghchutiyaNo1 a month ago
Why don't we have such professors in NITs?
@ganeshwaichal1 2 years ago
Love you, teacher... great explanation.
@harshitarajoria-k8u 2 months ago
OMG... what a teacher he is... 🤌
@mohammadrasheed9247 5 years ago
Great explanation!
@ZakirHussain-nd4fw 9 months ago
The intro music is just like the Doordarshan Shaktimaan TV show.
@sanketkamta106 5 years ago
What are the thresholds for ANDNOT and NOR? Can anyone explain?
@Musical_Era3 5 years ago
@Gokul Gopakumar A threshold of zero means, I think, that it fires for every binary input value.
@chhaprichandu 5 years ago
@@Musical_Era3 While experimenting, I found that the boundaries make more sense if we take the transformed values instead of the raw input values. For example, while plotting x_1 ^ !x_2, if we plot the graph of x_1 vs x_2, the problem with the decision boundary arises, as you mentioned. However, if we plot x_1 vs !x_2, this problem is solved. In the latter case the threshold also changes, which I think can be handled.
@deepakkumarsisodia7092 5 years ago
Think of x2 as a power switch, where x2 = 1 means the power is OFF and x2 = 0 means the power is ON. When the power switch is OFF (x2 = 1), the output is ALWAYS 0, irrespective of the other input and the threshold value. Thus, out of the four possible inputs (00, 01, 10, 11), 01 and 11 are ruled out because x2 is 1 in both. The applicable inputs are 00 and 10 (because x2 is 0 in both, i.e. the power is ON). For input 00 the output is 0; for input 10 the output is 1. Therefore, for all applicable inputs the threshold is 1. **At the bottom of the slide it's clearly written: if any inhibitory input is 1, the output will be 0.**
@mratanusarkar 4 years ago
@@deepakkumarsisodia7092 That power switch analogy was great!!
@mratanusarkar 4 years ago
I was really confused because I didn't get what an inhibitory input implied, and I missed the footnote... finally, I got it: if any inhibitory input is 1, the output becomes zero regardless of the other inputs and conditions of the neuron... so that solves it. I'm leaving a link to an article on towardsdatascience here: towardsdatascience.com/mcculloch-pitts-model-5fdf65ac5dd1
@pranjalnama2420 2 years ago
Amazing lecture.
@dipali0010 3 years ago
Hello, please tell me which book I should refer to.
@madhuvarun2790 3 years ago
Fantastic lecture. I have a doubt: how is the threshold for x1 AND !x2 equal to 1? Since x2 is an inhibitory input it should always be 0, and whether x1 is 1 or 0 the resulting Boolean operation would be (1 AND 0) or (0 AND 0), which is still 0 (because the operation is AND). Could anyone explain, please?
@MrMopuri 3 years ago
When x2 is 0, the resulting Boolean operations become (1 AND 1) and (0 AND 1). Note that it is !x2 (NOT x2) that enters the AND. Hence the threshold is 1 (x1 has to be 1) for the output to be 1. Hope this is clear.
@himanshu5891 3 years ago
x2 is an inhibitory input, meaning that if it is 1 then y = 0, irrespective of the values of the other inputs. So it is effectively connected to the other inputs through an AND operation as !x2 (NOT x2). So we have x1 ANDed with !x2 (the NOT flips the input: if x2 = 0 then !x2 = 1, and vice versa). If x2 = 1 then !x2 = 0, so y = 0 for any value of x1. If x2 = 0 then !x2 = 1, so for x1 = 1 we get y = 1. So the threshold is x1 + x2 = 1 + 0 = 1.
@shyammarjit9994 2 years ago
@@MrMopuri Thanks for the explanation.
@MohitSharma-vd1eh 5 years ago
Nice explanation.
@rohitprasad7418 5 years ago
A very nice explanation. Thank you.
@ashutoshpatil26 5 years ago
Thank you, sir.
@sumanacharya3014 2 years ago
Sir, I want to know: if we use three inputs x1, x2, and x3, then the decision boundary is a plane, because it matches the plane equation x1 + x2 + x3 + d = 0, but you said it is a hyperplane. How, sir? Please explain.
@kamleshkumarsingh9758 a year ago
For more than 3 dimensions, the decision boundary will be a hyperplane.
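For reference, in the lecture's notation where all n inputs are summed and compared to a threshold θ, the decision boundary is

```latex
\sum_{i=1}^{n} x_i = \theta
\quad\Longrightarrow\quad
\begin{cases}
n = 2: & x_1 + x_2 = \theta \quad \text{(a line)}\\
n = 3: & x_1 + x_2 + x_3 = \theta \quad \text{(a plane)}\\
n \ge 4: & \text{an } (n-1)\text{-dimensional hyperplane}
\end{cases}
```

A line and a plane are just the n = 2 and n = 3 special cases of the same (n-1)-dimensional object, which is why the general term "hyperplane" is used.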
@narengs9790 4 years ago
But what will the equation look like for NAND? Can you kindly explain?
@shankaruma489 4 years ago
It's the same as AND!
@Harry-vr5vz 5 years ago
Why weren't weights included?
@thanioruvan4556 a year ago
This is not a perceptron; this is the MP neuron model, which doesn't have weights.
@umang9997 a year ago
@@thanioruvan4556 True. Different videos on YouTube suggest otherwise, but the fact is that the MP neuron does not have weights.
@Harry-vr5vz 5 years ago
Why weren't weights included?
@DerEddieLoL 4 years ago
You don't need weights if there is only one connection going out from each x_i?
@prithvip6360 4 years ago
Because MP neurons don't have weights.
@asjadnabeel a year ago
Because MP neurons are the early model, which doesn't have the concept of weights.
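To make the distinction in this thread concrete, here is a small illustrative contrast (my own naming, not code from the course): the MP neuron aggregates the raw binary inputs with no weights, whereas the perceptron of Lec 2.3 multiplies each input by a weight before thresholding.

```python
def mp_neuron_fire(x, theta):
    # MP neuron: unweighted sum of binary inputs against a threshold
    return int(sum(x) >= theta)

def perceptron_fire(x, w, theta):
    # Perceptron: weighted sum against a threshold (the weights are learnable)
    return int(sum(wi * xi for wi, xi in zip(w, x)) >= theta)

print(mp_neuron_fire([1, 1], theta=2))                    # AND without weights -> 1
print(perceptron_fire([1, 1], w=[0.6, 0.6], theta=1.0))   # AND with weights -> 1
```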