21. Probabilistic Inference I

93,522 views

MIT OpenCourseWare


Comments: 58
@sepehrsabeti2467 3 years ago
This is the fifth time I am watching this series. Professor Winston, rest in peace; your lectures were out of this world and second to none! RIP
@sasazaza522 6 years ago
This is the best lecture on probabilistic inference that I have ever come across.
@katateo328 2 years ago
Yeah, exactly. The professor helped me clarify my big question about the third equation among the axioms: it must be an axiom, not a provable property of probability. We cannot prove the equation; we can only feel that it makes sense from the Venn diagram, which illustrates it but cannot prove it. Tens of years to find an answer. Great!
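For reference, the equation in question, in the inclusion-exclusion form it takes on the board:

    \[
    P(a \lor b) = P(a) + P(b) - P(a \land b)
    \]

The Venn diagram motivates it (the overlap region would otherwise be counted twice), but within this axiom system it is indeed taken as given rather than derived.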
@samjackgreen 2 years ago
Some markers:
02:15 making a table
04:20 keep a tally of the various combinations
08:25 switching to another situation (Why is the dog barking? or is it just barking...)
12:00 A LITTLE PROBLEM with the table
13:50 Illustrating the point that PROBABILISTIC INFERENCE has a role to play in understanding human intelligence. "In the old days, I'd have to make a big table."
15:59 On the other hand, there are times when I don't know all the stuff I need to know in order to make the calculation. "Will the child of a Republican be a Republican?"
17:10 We can't record all those numbers, and it is a pain to guess at them. FREQUENTIST VIEW: the probabilities come out of looking at the data. SUBJECTIVE VIEW: we just make up the measurements. We might talk about NATURAL PROPENSITIES (like in quantum mechanics).
17:55 Working out probabilities without working out a full table
18:15 MAP FOR THE REST OF THE LECTURE: (1) basic probability (2) conditional probability (3) chain rule (4) independence (5) conditional independence (6) belief nets (7) joint probability tables
23:23 Part (2): conditional probability
28:00 Part (3): chain rule
30:34 Part (4): independence
33:40 Part (5): conditional independence
36:15 Part (6): belief nets
38:00 Every node in the diagram DEPENDS on its parents and on nothing else that is not a descendant. Given its parents, every node is independent of all other non-descendants. "I want to make a MODEL of what's going to happen here, so let me see what kinds of probabilities I'm going to have to figure out." (40:40)
40:50 What exactly is the probability that a burglar appears? Let's say it's 1 day in ten. [DOES NOT DEPEND ON ANYTHING]
41:05 What exactly is the probability that a raccoon shows up? Let's say it appears once every two days. [DOES NOT DEPEND ON ANYTHING]
41:25 The dog's barking depends on whether there's a burglar OR a raccoon, i.e. we need to look at 4 combinations.
43:10 How exactly does [dog barks] determine the probability p[police called]?
43:40 How exactly does [presence of raccoon] determine p[trash can is knocked over]?
44:20 I had to specify 10 numbers. Otherwise (with a joint probability table) I'd have to specify 32 numbers. Considerable saving. (A sketch of this count follows below.)
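To make that 10-versus-32 count concrete, here is a minimal Python sketch of the lecture's belief net. The 1-in-10 burglar and 1-in-2 raccoon probabilities come from the markers above; every other conditional entry is an invented placeholder, not a value from the lecture.

    from itertools import product

    P_burglar = 0.1                    # P(burglar): 1 number, no parents
    P_raccoon = 0.5                    # P(raccoon): 1 number, no parents
    P_bark = {(True, True): 0.9,       # P(dog barks | burglar, raccoon): 4 numbers
              (True, False): 0.8,      # (placeholder values)
              (False, True): 0.6,
              (False, False): 0.1}
    P_police = {True: 0.7, False: 0.01}  # P(police called | dog barks): 2 numbers
    P_trash = {True: 0.8, False: 0.1}    # P(trash knocked over | raccoon): 2 numbers

    def bern(p_true, val):
        """P(variable = val) for a binary variable with P(True) = p_true."""
        return p_true if val else 1.0 - p_true

    def joint(b, r, d, p, t):
        """P(b, r, d, p, t), factored along the net: each node given only its parents."""
        return (bern(P_burglar, b) * bern(P_raccoon, r) * bern(P_bark[(b, r)], d)
                * bern(P_police[d], p) * bern(P_trash[r], t))

    # 1 + 1 + 4 + 2 + 2 = 10 numbers specify the whole model, yet they determine
    # all 2**5 = 32 entries of the full joint table. Sanity check that it normalizes:
    print(sum(joint(*vals) for vals in product((True, False), repeat=5)))  # -> 1.0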
@ruis2345 7 years ago
It's just such a pleasure to listen to and watch him articulate this probabilistic view of the facts at hand and assess it.
@rj-nj3uk 2 years ago
Wow, amazing. The intuitionist view of probability using areas is simply fantastic.
@tarlanahad 4 years ago
RIP Sir Patrick Winston. May Allah reward you for such an explanatory lecture!
@supersnowva6717 8 months ago
This is such an amazing class on probabilistic inference!!! Thank you professor! RIP 🙏
@subashchandrapakhrin3537 4 years ago
Rest in peace, Dr. Winston. Great video; your teaching method led me to read for a PhD...
@sanmithramudigonda 5 years ago
Pure genius at work!!
@hyguamo 9 years ago
Great lecture. The best explanation I have seen so far.
@katateo328 2 years ago
Hahah, the gap between MIT and Harvard is sky-high! I love MIT so, so much.
@AhmedIsam 5 years ago
RIP Prof Patrick Winston
@justusbenning1626 5 years ago
No. Please tell me you're kidding.
@qzorn4440 8 years ago
Great info. How does Patrick Winston find the time to teach at this level? Thanks.
@gordonlim2322 3 years ago
I think it is more intuitive to view independence with two separated circles in a Venn diagram.
@JakobLemvig 2 years ago
If they are separated, they are highly dependent. E.g., if A happens, you then know B cannot happen.
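In symbols, independence is a product condition, which is why separated (disjoint) circles generally fail it:

    \[
    A \perp B \iff P(A \cap B) = P(A)\,P(B)
    \]

Disjoint circles give P(A ∩ B) = 0, which can equal P(A)P(B) only when P(A) = 0 or P(B) = 0.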
@artlenski8115 8 years ago
You can certainly get his axioms 1 and 3 from the Kolmogorov axioms (en.wikipedia.org/wiki/Probability_axioms). Regarding his axiom 2, he probably meant that p(S) = 1 and p(¬S) = 0, where S is the universal set (the set of all possible outcomes) and ¬S, its complement, is the impossible event.
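For reference, a statement of the Kolmogorov axioms cited above:

    \[
    P(E) \ge 0, \qquad P(S) = 1, \qquad
    P\Big(\bigcup_i E_i\Big) = \sum_i P(E_i) \ \text{ for pairwise disjoint } E_i
    \]

From these, the board's inclusion-exclusion axiom follows by splitting a ∪ b into the disjoint pieces a \ b, b \ a, and a ∩ b.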
@gordonlim2322 3 years ago
I think I have seen Markov chains several times now, but I'm not sure I have ever looked into them... Anyway, I was just looking into them to figure out Hidden Markov Models and was reminded of this lecture, particularly the section around 39:00 where there's a dependency graph. That being said, I'm not sure there's any connection yet.
@Jirayu.Kaewprateep 1 year ago
📺💬 He understands probability and its tools. 🥺💬 I understand you are explaining the SHA table probabilities, and that the table generates probability results from the selected options, which is a good idea when it reflects results from the imported data. 🧸💬 Since, as the basic rule commands, the total probability is 1.0, the complement is 1 minus the action's probability. 🐑💬 How about you guess the seat number where you are sitting in the front rows⁉ 👧💬 That lies somewhere between the limits of your attention to the lesson and your comfort. 📺💬 To explain things, the probability table depends on the formula's descendants, or it depends on the previous rows.
@vadim64841 9 months ago
33:26 The explanation of conditional independence was not great. Two probability ratios are equal, but that alone tells us nothing about what the equality means. Can it just be a coincidence? If we make the ratios equal by expanding or shrinking the size of the universe U, does that mean the two variables suddenly become independent?
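For what it's worth, the equality at that point in the lecture is being used as the definition of conditional independence, not as a theorem with a separate meaning:

    \[
    a \perp b \mid z \iff P(a \mid b, z) = P(a \mid z) \iff P(a, b \mid z) = P(a \mid z)\,P(b \mid z)
    \]

So if the equality holds, even by numerical coincidence, the variables are by definition conditionally independent under that particular distribution.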
@EwanThomasM 7 years ago
Please add lecture 20... please, please!
@realyogi007 6 years ago
Here are the personal notes I created from this lecture: github.com/Santhosh-KS/MachineLearning_Concepts/blob/master/ProbabilisticInferences.ipynb
@SarveshBhatnagar1214 6 years ago
I think at 42:56 the phantom probabilities are wrong, since they should add up to one...
@flasher702 4 months ago
I think he meant P(d)/sum(all P(d)s), which adds up to 1. And I think that is then the probability of all parents being in that state, given that the dependent is true. But I am not actually sure that either of those is correct, or that it is the relevant information here.
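One possible reconciliation, offered as an assumption about the board rather than something confirmed in the clip: the four numbers are the entries P(d = true | b, r), one per parent combination, so they form four separate conditional distributions rather than one distribution over the parents. Each normalizes over the child's values, not down the column:

    \[
    P(d \mid b, r) + P(\lnot d \mid b, r) = 1 \quad \text{for each of the four } (b, r) \text{ combinations}
    \]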
@usugacadavid21 5 years ago
I have a question: at 28:45 he explains the general formula for conditional probabilities. However, if we apply it to the case P(a,b,c), where x1 = a, x2 = b, and x3 = c, we do not obtain the same result as the professor; the probabilities seem to be shifted the other way. In other words, we finish with P(a) as the last term and not P(c). Can someone explain where my mistake is? Thank you.
@shahinsalehi4780 5 years ago
I'm thinking about exactly the same thing. It doesn't make sense for i to start at n; it should start at 1.
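For reference, one common way the chain rule is written, which may be the source of the index confusion; any ordering of the variables gives an equally valid factorization:

    \[
    P(x_n, x_{n-1}, \ldots, x_1) = \prod_{i=1}^{n} P\big(x_i \mid x_{i-1}, \ldots, x_1\big)
    \]

For three variables this reads P(a, b, c) = P(c | b, a) P(b | a) P(a); relabeling the variables produces versions that end in P(c) instead, and all of them are correct.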
@ebubekirceylan6425 5 years ago
Best lecture ever.
@bobbynazaris750 8 years ago
Correction: in the chain rule product it should be P(x_n, ..., x_1), not P(x_1, ..., x_n), I think! Around minute 28 in the clip.
@aSeaofTroubles 7 years ago
It's correct, and the order doesn't matter.
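A quick numeric sanity check of that claim, sketched with a made-up joint distribution over three binary variables (none of these numbers come from the lecture): expanding in either order telescopes back to the same joint probability.

    import random
    random.seed(0)

    # An arbitrary normalized joint distribution P(a, b, c) over binary variables.
    keys = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
    raw = {k: random.random() for k in keys}
    total = sum(raw.values())
    P = {k: v / total for k, v in raw.items()}

    def marg(**fixed):
        """Marginal probability that each variable named in `fixed` takes its given value."""
        return sum(p for (a, b, c), p in P.items()
                   if all({'a': a, 'b': b, 'c': c}[name] == val
                          for name, val in fixed.items()))

    a, b, c = 1, 0, 1
    # P(a) * P(b|a) * P(c|a,b): conditioning "forwards"
    fwd = marg(a=a) * (marg(a=a, b=b) / marg(a=a)) * (P[(a, b, c)] / marg(a=a, b=b))
    # P(c) * P(b|c) * P(a|b,c): conditioning "backwards"
    bwd = marg(c=c) * (marg(b=b, c=c) / marg(c=c)) * (P[(a, b, c)] / marg(b=b, c=c))
    print(fwd, bwd, P[(a, b, c)])  # all three values agree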
@sohan4465 6 months ago
Where is the 20th video? It goes from 19 straight to 21.
@mitocw 5 months ago
Please note: Lecture 20, which focuses on the AI business, is not available.
@corey333p 7 years ago
Where is lecture 20? I want it.
@mitocw 7 years ago
@corey333p That lecture is not available, most likely due to third-party content.
@91722854 6 years ago
Just realised: AD and BC years && the determinant of a matrix is found by ad - bc... :)
@AkshayAradhya 6 years ago
What does a "hack" in this context even mean?
@timelyrain 4 years ago
"Hack" refers to a category of organized events in which participants aim to achieve certain technical feats given an open-ended scope and extremely limited time; such events happen often at MIT.
@samkins7706 10 years ago
What happened to lesson 20? I notice it was omitted.
@mitocw 10 years ago
Please note: Lecture 20, which focuses on the AI business, is not available.
@Cit0bor 10 years ago
Is there a reason why it is not available? I'm just curious, since in the earlier lectures Professor Winston mentions it is one of the most important lectures in the course, so it is rather odd for it not to be available.
@mitocw 10 years ago
@Cit0bor Sorry, no reason was given. It was done at the request of the instructor.
@JohnForbes 10 years ago
Is it available in any other medium, or as paid content (other than enrolling at MIT)?
@mitocw 10 years ago
@JohnForbes Sorry, it is not available in another medium on MIT OpenCourseWare. There might be other places it could be found; quite a bit of material on this class shows up in a Google search.
@guywithaname5408 6 years ago
Just realised the albino guy at the front is using a little telescope. Very interesting.
@razzlfraz 5 years ago
Apparently albinos can't see well. Many are legally blind.
@AdarshSingh-qk2rj 6 years ago
Did he use MATLAB at the start of the video (5:46), or what kind of software is it?
@gatoquantico3925 3 years ago
He created a Java application. You can check it out on the OCW site.
@mentalflow 7 years ago
This is awesome
@yihengzhu7389 6 years ago
Is the region of z at 35:25 correct? I think it should cover (a and b).
@G1I2A3N4N4H5S6 6 years ago
I think it is already assumed that a and b are independent.
@gordonlim2322 3 years ago
What year are these students in?
@mitocw 3 years ago
Mostly sophomores, juniors, and seniors. See ocw.mit.edu/6-034F10 for more info. Best wishes on your studies!
@JNSStudios 7 years ago
*WOAH TECHNOLOGY*
@dostoguven 8 years ago
44:34 Fight Club, lol.
@pablonapan4698 6 years ago
I've never seen a professor so tired and pissed off while teaching AI.