This is the fifth time I am watching this series. Professor Winston, rest in peace. Your lectures were out of this world and second to none! RIP
@sasazaza5226 жыл бұрын
This is the best lecture on probabilistic inference that I have ever come across.
@katateo3282 жыл бұрын
Yeah, exactly. The Prof helped me clarify my big question about the third equation in the axioms. That equation must be an axiom, not a derived property of probability; we cannot prove it, we just accept it because it makes sense from the Venn diagram. The Venn diagram cannot prove the equation, it only illustrates the intuition. Tens of years to find an answer. Great!
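To make the identity the comment refers to concrete, here is a minimal sketch in Python on a toy finite sample space (the die-roll events are made up for illustration; checking it on one finite space illustrates, but of course does not prove, the general statement):

```python
from fractions import Fraction

# Toy sample space: one fair die roll.
universe = set(range(1, 7))
A = {1, 2, 3}        # "roll is at most 3"
B = {2, 4, 6}        # "roll is even"

def p(event):
    """Uniform probability of an event (a subset of the universe)."""
    return Fraction(len(event), len(universe))

# Inclusion-exclusion: P(A or B) = P(A) + P(B) - P(A and B)
lhs = p(A | B)
rhs = p(A) + p(B) - p(A & B)
print(lhs, rhs)  # both 5/6
```

In a finite uniform space the identity follows from simple counting; the commenter's point is about general probability measures, where it is taken as (or derived from) an axiom.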
@samjackgreen2 жыл бұрын
Some markers:
02:15 Making a table
04:20 Keep a tally of the various combinations
08:25 Switching to another situation (Why is the dog barking? Or is it just barking...)
12:00 A LITTLE PROBLEM with the table
13:50 Illustrating the point that PROBABILISTIC INFERENCE has a role to play in understanding human intelligence. "In the old days, I'd have to make a big table."
15:59 On the other hand, there are times when I don't know all the stuff I need to know in order to make the calculation. "Will the child of a Republican be a Republican?"
17:10 We can't record all those numbers, and it is a pain to guess at them. FREQUENTIST VIEW: the probabilities come out of looking at the data. SUBJECTIVE VIEW: we just make up the measurements. We might talk about NATURAL PROPENSITIES (like in quantum mechanics).
17:55 Working out probabilities without working out a full table
18:15 MAP FOR THE REST OF THE LECTURE: (1) basic probability (2) conditional probability (3) chain rule (4) independence (5) conditional independence (6) belief nets (7) joint probability tables
23:23 Part (2): conditional probability
28:00 Part (3): chain rule
30:34 Part (4): independence
33:40 Part (5): conditional independence
36:15 Part (6): belief nets
38:00 Every node in the diagram *DEPENDS* on its parents and on nothing else that is not a descendant; given its parents, every node is independent of all other non-descendants.
40:40 "I want to make a MODEL of what's going to happen here, so let me see what kinds of probabilities I'm going to have to figure out."
40:50 What exactly is the probability that a burglar appears? Let's say it's 1 day in 10. [DOES NOT DEPEND ON ANYTHING]
41:05 What exactly is the probability that a raccoon shows up? Let's say it appears once every two days. [DOES NOT DEPEND ON ANYTHING]
41:25 The dog's barking depends on whether there's a burglar OR a raccoon, i.e. we need to look at 4 combinations.
43:10 How exactly does [dog barks] determine the probability P[police called]?
43:40 How exactly does the [presence of raccoon] determine P[trash can is knocked over]?
44:20 *I had to specify 10 numbers. Otherwise (with a joint probability table) I'd have to specify 32 numbers. Considerable saving.*
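The 10-versus-32 count at 44:20 can be reproduced in a few lines. A minimal sketch in Python (the node names and parent sets follow the lecture's burglar/raccoon example; all variables are binary):

```python
# Parents of each binary variable in the burglar/raccoon belief net.
parents = {
    "burglar": [],
    "raccoon": [],
    "dog_barks": ["burglar", "raccoon"],
    "police_called": ["dog_barks"],
    "trash_knocked_over": ["raccoon"],
}

# Each node needs one number per combination of parent values,
# i.e. P(node=True | parents); P(node=False | ...) follows by complement.
belief_net_numbers = sum(2 ** len(ps) for ps in parents.values())

# A full joint probability table over n binary variables has 2**n entries.
joint_table_numbers = 2 ** len(parents)

print(belief_net_numbers, joint_table_numbers)  # 10 32
```

The saving grows quickly: the belief net's cost is driven by the number of parents per node, while the joint table is exponential in the total number of variables.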
@ruis23457 жыл бұрын
It's just such a pleasure to listen to and watch him articulate this probabilistic way of processing facts and assessing them.
@rj-nj3uk2 жыл бұрын
Wow, amazing. The intuitionist view of probability using area, simply fantastic.
@tarlanahad4 жыл бұрын
RIP Sir Patrick Winston May Allah reward you for such an explanatory lecture!
@supersnowva67178 ай бұрын
This is such an amazing class on probabilistic inference!!! Thank you professor! RIP 🙏
@subashchandrapakhrin35374 жыл бұрын
Rest in peace, Dr. Winston. Great video; your teaching method led me to read for a PhD...
@sanmithramudigonda5 жыл бұрын
Pure genius at work!!
@hyguamo9 жыл бұрын
Great lecture. The best explanation I have seen so far.
@katateo3282 жыл бұрын
Hahah, the big, big gap between MIT and Harvard is sky-high! I love MIT so, so much.
@AhmedIsam5 жыл бұрын
RIP Prof Patrick Winston
@justusbenning16265 жыл бұрын
no. please tell me you're kidding.
@qzorn44408 жыл бұрын
great info, how does Patrick Winston find time to teach at this level? thanks.
@gordonlim23223 жыл бұрын
I think it is more intuitive to view independence as two separate circles in a Venn diagram.
@JakobLemvig2 жыл бұрын
If they are separated, they are highly dependent. E.g., if A happens, you then know B cannot happen.
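Following up on the thread above, a minimal sketch (a toy die-roll space, made up for illustration) showing that two separated (disjoint) events fail the independence test P(A and B) = P(A)P(B):

```python
from fractions import Fraction

universe = set(range(1, 7))   # one fair die roll
A = {1, 2}                    # two disjoint events
B = {5, 6}

def p(event):
    """Uniform probability of an event (a subset of the universe)."""
    return Fraction(len(event), len(universe))

# Independence would require P(A and B) == P(A) * P(B).
print(p(A & B))                  # 0
print(p(A) * p(B))               # 1/9
print(p(A & B) == p(A) * p(B))   # False: disjoint, hence dependent
```

So in a Venn diagram, independence looks like overlapping regions whose overlap is exactly the product of the two areas, not like non-overlapping circles.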
@artlenski81158 жыл бұрын
You can certainly get his axioms 1 and 3 from the Kolmogorov axioms: en.wikipedia.org/wiki/Probability_axioms. Regarding his axiom 2, he probably meant that p(S) = 1 and p(^S) = 0, where S is the universal set (the set of all possible outcomes) and ^S is the impossible event (the empty set).
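For reference, a standard statement of the Kolmogorov axioms mentioned above (the textbook formulation, not a transcription from the lecture):

```latex
\begin{align*}
&\text{1. Non-negativity:}       && P(E) \ge 0 \text{ for every event } E, \\
&\text{2. Normalization:}        && P(S) = 1 \text{ for the sample space } S, \\
&\text{3. Countable additivity:} && P\Big(\bigcup_{i} E_i\Big) = \sum_{i} P(E_i)
   \text{ for pairwise disjoint } E_i.
\end{align*}
```

From these, both $P(\emptyset) = 0$ and the inclusion-exclusion identity $P(A \cup B) = P(A) + P(B) - P(A \cap B)$ follow as theorems rather than axioms, which is consistent with the point made in this thread.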
@gordonlim23223 жыл бұрын
I think I have seen Markov chains several times now, but I'm not sure if I have ever looked into them... Anyway, I was just looking into them to figure out Hidden Markov Models and was reminded of this lecture, more particularly the section around 39:00 where there's a dependency graph. That being said, I'm not sure if there's any connection yet.
@Jirayu.Kaewprateep Жыл бұрын
📺💬 He understands probability and its tools. 🥺💬 I understand you explain the probability table, and the table generates probability results from the selected options, which is a good idea when it reflects results from imported data. 🧸💬 As a basic principle, the total probability is 1.0, so the probability of "otherwise" is 1 minus the event's probability. 🐑💬 How about you guess the seat number you are sitting at in the front rows⁉ 👧💬 That depends on your attention to the lesson and your comfort. 📺💬 To explain: each entry in the probability table depends on its parents in the formula, or on the previous rows.
@vadim648419 ай бұрын
33:26 The explanation of conditional probabilities was not great. Two probability ratios are equal, but that alone tells us nothing about what it means. Could the equality just be a coincidence? If we make the ratios equal by expanding or shrinking the size of U, does that mean the two variables suddenly become dependent?
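One way to ground the ratio argument: conditional independence is usually defined as P(a, b | z) = P(a | z) P(b | z), and the equal-ratio picture is a restatement of that definition, not a coincidence of areas. A minimal sketch with a made-up joint distribution where a and b are dependent overall but independent given z:

```python
from itertools import product

# P(z), and P(a=1|z), P(b=1|z): a and b are independent *given* z by construction.
p_z = {0: 0.5, 1: 0.5}
p_a_given_z = {0: 0.1, 1: 0.9}
p_b_given_z = {0: 0.1, 1: 0.9}

def joint(a, b, z):
    """P(a, b, z) built from the conditional tables above."""
    pa = p_a_given_z[z] if a else 1 - p_a_given_z[z]
    pb = p_b_given_z[z] if b else 1 - p_b_given_z[z]
    return p_z[z] * pa * pb

# Marginally, a and b are NOT independent...
p_a = sum(joint(1, b, z) for b, z in product((0, 1), repeat=2))
p_b = sum(joint(a, 1, z) for a, z in product((0, 1), repeat=2))
p_ab = sum(joint(1, 1, z) for z in (0, 1))
print(abs(p_ab - p_a * p_b) > 1e-9)   # True: dependent

# ...but given z they are: P(a, b | z) == P(a | z) * P(b | z).
for z in (0, 1):
    p_ab_z = joint(1, 1, z) / p_z[z]
    print(abs(p_ab_z - p_a_given_z[z] * p_b_given_z[z]) < 1e-9)  # True
```

Rescaling U rescales every probability in the ratio the same way, so the equality (and hence independence) is unaffected; only changing the overlap structure of the events can change it.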
@EwanThomasM7 жыл бұрын
plz add lec 20....plz plz
@realyogi0076 жыл бұрын
Here are my personal notes I created from this lecture: github.com/Santhosh-KS/MachineLearning_Concepts/blob/master/ProbabilisticInferences.ipynb
@SarveshBhatnagar12146 жыл бұрын
I think, at 42:56, the phantom probabilities are wrong, since they should add up to one...
@flasher7024 ай бұрын
I think he meant P(d)/sum(all P(d)s), which adds up to 1. And I think that is then the probability of all parents being in that state, given that the dependent is true. But I am not actually sure either of those is correct, or that it is the relevant information here.
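What the reply describes is ordinary normalization. A minimal sketch (the joint values below are made up, not the ones from the board): dividing each value P(parent state, d) by their sum, which is P(d), yields a distribution over parent states given d that sums to 1.

```python
# Made-up joint values P(parent_state, d=True) for four parent states.
unnormalized = {
    (False, False): 0.02,
    (False, True):  0.18,
    (True, False):  0.24,
    (True, True):   0.36,
}

total = sum(unnormalized.values())          # this sum is P(d=True)
posterior = {state: v / total for state, v in unnormalized.items()}

print(round(sum(posterior.values()), 10))   # 1.0: properly normalized
```

This is exactly the division by P(d) that Bayes' rule prescribes, so the normalized column can be read as P(parent state | d).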
@usugacadavid215 жыл бұрын
I have a question: at 28:45 he explains the general formula for conditional probabilities. However, if we apply it to the case P(a,b,c), where x1 = a, x2 = b, and x3 = c, we do not obtain the same result as the professor, as the probabilities seem to be shifted the other way. In other words, we finish with P(a) as the last term and not P(c). Can someone explain where my mistake is? Thank you.
@shahinsalehi47805 жыл бұрын
I'm thinking about the exact same thing. It doesn't make sense for i to start at n; it should start at 1.
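For what it's worth, the index direction is only a labeling choice. A minimal sketch with a made-up three-variable distribution, checking that expanding the chain rule forwards, P(a)P(b|a)P(c|a,b), or backwards, P(c)P(b|c)P(a|b,c), recovers the same joint:

```python
from itertools import product
import random

random.seed(0)

# A made-up joint distribution over three binary variables (a, b, c).
weights = {abc: random.random() for abc in product((0, 1), repeat=3)}
Z = sum(weights.values())
P = {abc: w / Z for abc, w in weights.items()}

def marginal(fixed):
    """P of the event fixing some positions, e.g. {0: a, 1: b}."""
    return sum(p for abc, p in P.items()
               if all(abc[i] == v for i, v in fixed.items()))

for a, b, c in product((0, 1), repeat=3):
    # Forward order: P(a) * P(b|a) * P(c|a,b)
    fwd = (marginal({0: a})
           * (marginal({0: a, 1: b}) / marginal({0: a}))
           * (P[(a, b, c)] / marginal({0: a, 1: b})))
    # Reverse order: P(c) * P(b|c) * P(a|b,c)
    rev = (marginal({2: c})
           * (marginal({1: b, 2: c}) / marginal({2: c}))
           * (P[(a, b, c)] / marginal({1: b, 2: c})))
    assert abs(fwd - P[(a, b, c)]) < 1e-12
    assert abs(rev - P[(a, b, c)]) < 1e-12

print("both orderings recover the joint")
```

Each expansion telescopes back to the joint, so writing the product as P(X_1, ..., X_n) or P(X_n, ..., X_1) is equally valid.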
@ebubekirceylan64255 жыл бұрын
best lecture ever
@bobbynazaris7508 жыл бұрын
Correction: chain rule product: it should be P(X_n, ..., X_1), not P(X_1, ..., X_n), I think! Around minute 28 in the clip.
@aSeaofTroubles7 жыл бұрын
It's correct, and the order doesn't matter.
@sohan44656 ай бұрын
Where is the 20th video? It's 21 after 19.
@mitocw5 ай бұрын
* Please note: Lecture 20, which focuses on the AI business, is not available.
@corey333p7 жыл бұрын
Where is lecture 20? I want it.
@mitocw7 жыл бұрын
+corey333p That lecture is not available, most likely due to third party content.
@917228546 жыл бұрын
just realised, AD and BC year && determinant of matrix is found by ad-bc ....... :)
@AkshayAradhya6 жыл бұрын
What does a hack in this context even mean ?
@timelyrain4 жыл бұрын
"Hack" here refers to a category of organized events in which participants try to achieve certain technical feats given an open-ended scope and extremely limited time; such events happen often at MIT.
@samkins770610 жыл бұрын
What happened to lesson 20? I noticed it was omitted.
@mitocw10 жыл бұрын
* Please note: Lecture 20, which focuses on the AI business, is not available.
@Cit0bor10 жыл бұрын
***** Is there a reason why it is not available? Just curious, since in the earlier lectures Professor Winston mentions it is one of the most important lectures in the course, so it is rather odd for it not to be available.
@mitocw10 жыл бұрын
Cit0bor Sorry, no reason was given. It was done at the request of the instructor.
@JohnForbes10 жыл бұрын
***** Is it available in any other medium or as paid content (other than enrolling at MIT)?
@mitocw10 жыл бұрын
John Forbes Sorry, it is not available in another medium on MIT OpenCourseWare. There might be other places it could be found; there seems to be quite a bit of material that shows up in a Google search on this class.
@guywithaname54086 жыл бұрын
Just realised the albino guy at the front is using a little telescope. Very interesting.
@razzlfraz5 жыл бұрын
Apparently albinos can't see well. Many are legally blind.
@AdarshSingh-qk2rj6 жыл бұрын
Did he use MATLAB at the start of the video (5:46), or what kind of software is it?
@gatoquantico39253 жыл бұрын
He created a Java application. You can check it out on the OCW site.
@mentalflow7 жыл бұрын
This is awesome
@yihengzhu73896 жыл бұрын
Is the region of z at 35:25 correct? I think it should cover (a and b)
@G1I2A3N4N4H5S66 жыл бұрын
I think it is already assumed that a and b are independent.
@gordonlim23223 жыл бұрын
what year are these students in?
@mitocw3 жыл бұрын
Mostly Sophomores, Juniors, and Seniors. See ocw.mit.edu/6-034F10 for more info. Best wishes on your studies!
@JNSStudios7 жыл бұрын
*WOAH TECHNOLOGY*
@dostoguven8 жыл бұрын
44:34 fight club, lol.
@pablonapan46986 жыл бұрын
I've never seen a Professor so tired and pissed off while teaching AI.