How Bayes Theorem works

545,584 views

Brandon Rohrer


Part of the End-to-End Machine Learning School Course 191, Selected Models and Methods at e2eml.school/191
A walk through a couple of Bayesian inference examples.
The blog: brohrer.github....
The slides: docs.google.co...
Follow me for announcements: / _brohrer_

411 Comments
@phytasea 7 years ago
Wow best explanation and example ever I saw ^^ Fantastic.
@marciasola462 5 years ago
Excellent
@romanemul1 6 months ago
Exactly. These patient-disease examples were driving me nuts.
@welcome33333 7 years ago
this is by far the most accessible explanation of Bayes theorem. Well done Brandon!
@Skachmo1 7 years ago
There are never lines at the men's room.
@selamysfendegu8239 6 years ago
haha.
@dinikosama 6 years ago
lol
@Beebo 4 years ago
Unless it's cocaine.
@masterpogi1818 3 years ago
So the probability of this sample being true is 0%, hahhaha
@dhanushram6137 3 years ago
You have never been to developer conferences :)
@WahranRai 3 years ago
0:40 Bayes wrote two books, one about theology and one about probability. He planned to write a third book, inferring the existence or non-existence of God with probability (likelihood distribution = humanity, prior distribution = miracles!).
@danrattray8884 7 years ago
Best explanation of Bayes theorem I have seen. Fantastic teaching.
@viviankoneke1389 4 years ago
"...one of the top 10 math tattoos of all time." 😂
@dorinori8189 7 years ago
When you said small human, I imagined a small adult
@tofu.delivery. 5 years ago
Imagine calling Tyrian cute
@tracyshen430 7 years ago
The math at the 5:39 mark has an error: it should be 0.98 * 4/98 = 0.04. I checked your blog; it was also wrong there. There's no way the product of those two numbers becomes 0.04. Please correct this video and your blog.
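For what it's worth, the disputed product can be checked directly. I'm assuming from this thread (not the video itself) that the two factors on the slide are P(man) = 0.98 and P(long hair | man) = 0.04; the 4/98 version is the commenter's. Both routes agree once rounded to two decimals, so the 0.04 on the slide is plausibly just display rounding:

```python
# Hedged check of the disputed arithmetic at 5:39.
# Values assumed from the thread: P(man) = 0.98, P(long hair | man) = 0.04.
slide_product = 0.98 * 0.04        # P(man) * P(long hair | man) = 0.0392
comment_product = 0.98 * 4 / 98    # the commenter's version = 0.04 exactly

print(round(slide_product, 2), round(comment_product, 2))  # 0.04 0.04
```

So the "error" may come down to whether the slide shows the product at full precision or rounded to two decimal places.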
@Xerxz5115 6 years ago
Easiest way to weigh your dog: first weigh yourself, then hold your dog and weigh again. Then subtract your weight: (dog + you) - you = dog's weight.
@shirshanyaroy287 7 years ago
Brandon I just want to tell you that you are a fantastic teacher.
@BrandonRohrer 7 years ago
Thank you very much Shirshanya. That is a huge compliment. I'm honored.
@shirshanyaroy287 7 years ago
Please make more statistics videos! I have suggested your channel to my biostat teacher.
@fredvin27 6 years ago
I'm confused: how do you calculate the probabilities at 17:56, P(m=13.9 | w=17) and so on?
@richardgordon 3 months ago
Wow! One of the clearest explanations of Bayes Theorem I’ve come across!
@BrandonRohrer 3 months ago
Thanks!
@penponds 5 months ago
Now in 2024, I can't imagine the degree of triggering all these assumption examples would give a certain disturbed minority of the population... Also, I guess it's only because statistics inhabits the furthest recesses of YT land that someone hasn't called for its banning, or demonetization at the very least!
@DigGil3 7 years ago
I'm too dumb for probabilities. That's why I'm here...
@toufikhamdani5044 1 year ago
This is the most accessible explanation about Bayesian Inference. Thank you Brandon for the time taken to prepare this video. You rock !
@GeoffryGifari 1 year ago
How do the shape and mean of our prior distribution give us the best end belief? Our prior knowledge is that the puppy weighed 14.2 lbs last time and that weight changes aren't noticeable. What if the mean of our prior distribution is chosen to be larger or smaller than 14.2 lbs? What if we take that distribution to be narrower or broader (more or less than a 1 lb std)? And as an extreme case, what if we first guess that the puppy's weight is *exactly* 14.2 lbs?
@feifa13 7 years ago
his dog's name makes the video
@keokawasaki7833 6 years ago
i like terror, then!
@hieuhienhoa29 2 years ago
The explanation is very good, but I've been thinking a lot and still can't understand why, at 20:42, you multiply P(w=17) three times. Please help me. 😢
@garychap8384 4 years ago
"When you have excluded the impossible, whatever remains, however improbable, must be -the truth- *possible*." There ya go, Sheerluck Holmes, I've fixed it for 'ya : )
@claudiorio 3 years ago
I know the video is old, but I have to agree with the pinned comment. I already knew Bayes' theorem, but as I don't use it often I have to constantly refresh the details in my mind. The YouTube algorithm recommended this video and it's hands down the best I have ever watched.
@BrandonRohrer 3 years ago
Thank you. I really appreciate that.
@wexwexexort 8 months ago
What I won't forget from this lesson: P(small human | cute) > 0
@QuintinMassey 2 years ago
It's 2022; not all people standing in line for the men's restroom are actually men 🤣 More context needed haha.
@nginfrared 7 years ago
EXCELLENT EXPLANATION!!! I am learning graphical modeling and a lot of these concepts were a bit unclear to me. Examples given here are absolutely to the point and demystify a lot of concepts. Thank you and looking forward to more videos.
@robertoarce-tx8yt 1 year ago
This was a very intuitive explanation. Man, do more!
@rjvaal 7 years ago
Thank you for this excellent explanation. You are a patient and well-spoken teacher.
@nutrinogirl456 7 years ago
This is the first time I've felt like I've actually understood this... and it's such a simple concept! Thank you!
@redserpent 6 years ago
Thank you. Your video has been a great help. I had tried different resources to wrap my head around Bayes' theorem and always got knocked out at the front door. Excellent explanation.
@kimnguyen1227 7 years ago
This is fantastic. Thank you so much! I have been exposed to BT before, but have never understood it. As sad as it sounds, I didn't realize it was composed of joint probability which is composed of conditional probability, and marginal probability. Conditional probability and joint probability, and Bayes theorem all just looked the same. This really helped clarify things for me.
@bga9388 10 months ago
Thank you for this excellent presentation!
@ComtoInterim 11 months ago
Thank you so much for the excellent explanation.
@Blooddarkstar 7 years ago
This video deserves more thumbs up. I understood a lot on a lazy Sunday evening :) Great explanation.
@jeppejwo 7 years ago
Great explanation, but I'm having a hard time understanding why you should use the standard error as the width when you calculate the likelihood, and not the standard deviation. Doesn't this mean that you are calculating the probability that your measurement is the true mean, and not the probability of getting that measurement given your mean? Or perhaps that's what you're supposed to do?
@BrandonRohrer 7 years ago
You hit it spot on, jeppe. The assumption that I glossed over is that Reign's actual weight will be the mean of all the measurements (if we kept taking measurements forever). However, since we only have a few measurements, we have to be content with an estimate of that true mean. The standard error helps us to determine the distribution of that estimate. We don't care about how far an individual measurement tends to be off, we only care about how far our estimate of the weight is off.
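Brandon's distinction, that the spread of individual readings is the standard deviation while the spread of our estimate of the mean is the (smaller) standard error, can be sketched numerically. The three weights are the ones discussed in the video; the use of NumPy here is my own:

```python
import numpy as np

# The three scale readings discussed in the video (lbs).
m = np.array([13.9, 14.1, 17.5])

sd = m.std(ddof=1)            # sample standard deviation: spread of individual readings
se = sd / np.sqrt(len(m))     # standard error: spread of our estimate of the true mean

print(round(sd, 2), round(se, 2))  # 2.02 1.17
```

More measurements would leave the standard deviation roughly where it is but shrink the standard error by a factor of 1/sqrt(n), which is exactly why it is the right width for beliefs about the mean.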
@rfpaintworkz0809 3 years ago
How do you find the value of P(13.9 | 17)?
@razvanastrenie1455 3 years ago
Excellent explanation! This is the manner in which mathematics must be explained: with cases of practical applicability. Good job, Mr. Brandon!
@BrandonRohrer 3 years ago
Thanks Razvan. I'm so happy you enjoyed it.
@maayan1987 6 years ago
I loved the Mark Twain citation at the end! Great video, thanks! I have one question though: do you assume normal distributions for the likelihood? Why so?
@moazelsayed541 1 year ago
That's the best explanation ever❤️❤️❤️❤️
@HollySophia1 3 years ago
'a small human' - a baby?
@evgenykriukov4239 6 years ago
Thanks for the excellent presentation! One question though: at 17:49, P(m=[13.9, 14.1, 17.5] | w=17) is factorized as
P(m=13.9 | w=17) * P(m=14.1 | w=17) * P(m=17.5 | w=17),
then at 20:47, P(m=[13.9, 14.1, 17.5] | w=17) * P(w=17) is expanded into
P(m=13.9 | w=17) * P(w=17) * P(m=14.1 | w=17) * P(w=17) * P(m=17.5 | w=17) * P(w=17).
How do you get that P(m=[13.9, 14.1, 17.5] | w=17) * P(w=17) is equal to P(m=13.9 | w=17) * P(m=14.1 | w=17) * P(m=17.5 | w=17) * P(w=17)^3? Thanks in advance!
@keokawasaki7833 6 years ago
I hate stats because of things like that... My teacher was teaching us utility analysis and said, "Satisfaction is measured in utils." To that I asked, "Tell me how satisfied you are with your job, and answer in the form n utils..." I still haven't got an answer!
@jinshuli4092 6 years ago
Excellent video! One question though: at 20:43, shouldn't we multiply P(w=17) just once? Where do the other two factors of P(w=17) come from?
@bobcrunch 6 years ago
The dashed curve is P(m | w=17). There are three measurements (m), [13.9, 14.1, and 17.5]. All three must be computed and each multiplied by P(w=17) (the dotted curve) separately. The three results are then multiplied together. P(w=17) is very small, and thus P(w = 17 | m = [13.9, 14.1, 17.5] ) is very small.
@dbtmpl1437 6 years ago
@@bobcrunch I would disagree here. I think @Jinshu Li has a valid point. The three measurements are independent w.r.t. w, so we can write them as p(m_1|w) * p(m_2|w) * p(m_3|w). However, this does not affect the prior p(w), so in my opinion the prior should be multiplied only once.
@هشامأبوسارة-ن7و 5 years ago
@db: I can't agree with you more, as p(w) is our prior belief about w before seeing any data (measurements).
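The formulation the last two comments argue for, independent likelihood factors with the prior entering exactly once, can be sketched on a grid of candidate weights. The three measurements and the 14.2 lb prior mean come from the video as discussed above; the 1 lb spreads and the grid are my own assumptions:

```python
import numpy as np

def gauss(x, mu, sigma):
    """Normal probability density."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

w = np.linspace(10.0, 20.0, 1001)     # candidate true weights (lbs)
prior = gauss(w, 14.2, 1.0)           # prior belief: last known weight (spread assumed)

measurements = [13.9, 14.1, 17.5]
likelihood = np.ones_like(w)
for m in measurements:
    likelihood *= gauss(m, w, 1.0)    # independent readings: one factor per measurement

posterior = prior * likelihood        # the prior is multiplied exactly once
posterior /= posterior.sum() * (w[1] - w[0])   # normalizing here plays the role of P(m)

map_weight = w[np.argmax(posterior)]  # peak of the posterior
print(round(map_weight, 2))
```

With these equal assumed spreads, the peak lands at the average of the prior mean and the three readings, about 14.9 lbs; cubing the prior instead would drag the peak further toward 14.2 (to roughly 14.7 under the same assumptions).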
@johneagle4384 2 years ago
Great example! Very easy to follow and understand. On a side note: I showed your video to my students and some of them objected rather "emphatically". They said it was too sexist. Crazy times we live in....Instead of math and statistics, they wanted to discuss gender roles and stereotypes in a Stat class. Gosh!
@BrandonRohrer 2 years ago
Thanks John! I agree with your students. When I watch this now, I cringe. I definitely need to re-do it with a better example, one that doesn't reinforce outdated gender norms.
@johneagle4384 2 years ago
@@BrandonRohrer No... please do not follow the mad crowds... this is an innocent, simple math example. People are getting crazy and finding excuses to feel offended and start meaningless fights!
@Terszel 10 months ago
So sad. Long hair and standing in the women's restroom line, and we can't even use Bayes' theorem to assume it's a woman 😂
@lenkapenka6976 3 years ago
Superb lecture - esp. the MLE explanation!
@attrapehareng 7 years ago
You're very good at explaining, and you also go into some detail, which is nice. Too often YouTube tutorials are too simple. Keep going.
@keokawasaki7833 6 years ago
Now that you have said that (a year ago), I kinda feel like finding the likelihood of a YouTuber making too-simple tutorials!
@mohamedanasselyamani4323 5 years ago
Thank you very much for the best explanation. It's very interesting.
@ukktor 7 years ago
The knowledge that Bayes was a theologian and that his theory requires at least some belief or faith in improbable things earns a "Well played, Mr. Bayes" slow clap. I've been enjoying your videos Brandon, thanks for keeping things approachable!
@themennoniteatheist1255 3 years ago
I'm not sure that Bayesian epistemology requires belief in improbable things. I love this video, but I think that's an overstatement. I do think it requires us to be open to the possibility that improbable things may be true. It does not require me to have faith in anything improbable, but rather to proportion our beliefs to the evidence (probabilities), which is the antithesis of faith, while accepting the possibility of being wrong. To accept as true something that is improbable is intellectually irresponsible and lacks due caution and humility. But to withhold belief (proportionally to the evidence) from improbable things is intellectually responsible and does not exclude being open to surprise, to the possibility that something improbable is true. I don't think Bayesian epistemology intrinsically expects you to hold some improbable thing as true (faith). Abstinence from faith is acceptable in all cases as long as the possibility of error is operative. This suggestion, that it's necessary in Bayesian epistemology to believe something improbable, was the only sloppy part of the video, no? I'm open to correction...
@ahmedemadsamy4244 2 years ago
I think you guys got it the opposite way: the video was trying to say, be open to believing the improbable things that come from the data (evidence), rather than only holding on to a prior belief.
@fahad3802 7 years ago
I had been struggling with Bayesian inference, and your tutorial makes it so easy to understand! Thank you! Keep up the good work.
@BrandonRohrer 7 years ago
I'm very happy to hear it Fahad. Thanks.
@kebman 4 years ago
01:07 2020 Bayesian guesser: "Excuse me ma'am, is this your ticket?" Non-binary person: "Did you just assume my gender?"
@athousandwords5469 7 years ago
"a small human" lol
@rasraster 2 years ago
I don't understand your method of getting the posterior distribution. There is nothing in Bayes' formula that implies a summation over all possible parameters in the likelihood function. It seems to me that the way to get the posterior is to compute the likelihood distribution based on measured samples and then for each possible weight to multiply the value of the (static) likelihood distribution by the value of the prior distribution. Can you explain why your method works? Thank you.
@Atozanycome111 6 years ago
Reign of terror. LOL
@ingobethke2413 5 years ago
Great video. I'm slightly confused about the dog example at around 20:00: why can we use the standard error of the prior distribution in the likelihood computation for the measurements? Don't we have to model the distribution's mean and spread separately, i.e. explore many different standard error values? Generally, is w supposed to represent only one number (e.g. the true dog weight in the example) or an entire distribution characterized by its moments (with the true dog weight simply being its first moment)?
@bharathwajan6079 1 year ago
Hey man, this is the best explanation of conditional probability I have ever heard.
@BrandonRohrer 1 year ago
Thanks!
@joserobertopacheco298 2 years ago
I am from Brazil. What a fantastic explanation!
@BrandonRohrer 2 years ago
Thanks Jose! Welcome to the channel
@dalelu9422 7 years ago
Simply the best ! Thank you Brandon
@syedmurtazaarshad3434 5 months ago
Loved the analogies with real life philosophies, brilliant!
@Blitzkid82 6 years ago
Great... now I'm stuck thinking about puppies. (Still: thanks a lot for the video :-) )
@josuanaiborhunaiborhu 2 years ago
Amazing explanations
@wysiwyg2489 6 years ago
I believe you don't know much about statistics (the impossible thing), but I do believe you really know how to explain Bayesian Inference. Great video.
@liucloud6317 5 years ago
I have viewed many explanations of Bayes' rule, but this is no doubt the best! Thanks Brandon.
@UpayanRoy-n6u 1 month ago
17:57 One query here: how is the P(m | w=17) distribution function calculated? What is its spread (S.D.)? How do we, for example, arrive at a value for the probability of getting m = 15.6 lb given the true weight is 17 lb? Thanks! Nice explanation.
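I can only answer with an assumption, following Brandon's reply elsewhere in this thread: the curve is a normal density centered on the candidate weight, with the standard error of the three readings (about 1.17 lb) as its spread. Reading off a single point then looks like this (everything beyond the 17 lb center is my assumption, not the video's):

```python
import math

def normal_pdf(x, mu, sigma):
    """Normal probability density at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Likelihood density of a 15.6 lb reading if the true weight were 17 lb,
# using an assumed spread of 1.17 lb (the standard error of the three readings).
density = normal_pdf(15.6, 17.0, 1.17)
print(round(density, 3))  # 0.167
```

Strictly speaking, a continuous density gives probability per lb rather than a probability; only areas under the curve are probabilities.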
@GigaMarou 2 years ago
Super explanation!
@adammontgomery7980 5 years ago
I feel like using this to trade stocks now.
@thefilth7368 7 years ago
Did you just assume my gender??
@hamoudrodriguez2702 7 years ago
Thefilth haha
@ralphschraven339 7 years ago
No, he inferred it! :P
@KetanSingh 5 years ago
I thought this would require some sort of integration considering that we were talking about genders ;)
@BrandonRohrer 5 years ago
Yes, I am regretting my choice of example. It is vastly oversimplified and the presentation doesn't do justice to the realities of sex and gender. A lot of people have suffered because of similar oversimplifications. I have my eyes open for a new example and a new presentation.
@NoActuallyGo-KCUF-Yourself 5 years ago
Ha, I had just poured a cup of coffee when the coffee cup distribution example started. What are the chances? New data: I just woke up. Chances are HIGH.
@regozs 5 years ago
This is the best explanation yet, it helped me get a greater intuitive sense of Bayesian inferences.
@brendawilliams8062 2 years ago
Yes, it was great. It seems like running into Feigenbaum maths or similar.
@SiLEXiJP 3 years ago
Hi, at 20:42: since P(m=[13.9, 14.1, 17.5] | w=17) = P(m=13.9 | w=17) * P(m=14.1 | w=17) * P(m=17.5 | w=17), I think P(m=[13.9, 14.1, 17.5] | w=17) * P(w=17) should be P(m=13.9 | w=17) * P(m=14.1 | w=17) * P(m=17.5 | w=17) * P(w=17). Why is it P(m=13.9 | w=17) * P(w=17) * P(m=14.1 | w=17) * P(w=17) * P(m=17.5 | w=17) * P(w=17)? Thanks!
@briseboy 2 months ago
"Cute" is a word used by a HIGH proportion of females, and nearly not at all by males. We should choose a sample containing 50% female, 50% male to estimate the ratio of "cute" meaning, in actuality, ugly. Now, about 90 to near 100% of females use cute to refer to ugly objects and organisms. Organisms which females are highly unlikely to regard as "cute" tend to be hairless organisms, such as slugs, biting insects, slime molds, though some significant percentage of females DO refer to bald males, hairless cats, and babies as cute. These miscategorizations, such as males who shave their heads for the ALWAYS twin purpose of pretending to be socially dominant AND hiding their follicle shutdown, may skew our data, and so such persons must be eliminated from the data sets. "Cute", of course, involves other female miscategorizations. As science has proven that females use the term in reference to sharpei dogs, we KNOW that cute does NOT mean sexually appealing, but instead refers to massive wrinkling. Yet females themselves go to toxic lengths, and still fail, to remove wrinkles from their own skin, even though many species of felids and canids require loose skin in order to transport pups and kits. Applying this latter developmental-period phenomenon, we CAN use some Bayesian inference to estimate that females may very likely be using the otherwise indefinable word to mean that she desires to carry any hairy, wrinkly organism in her teeth. Thus we approach the true, previously unknown meaning of "cute," formerly unknown and indefinable. We can only speculate on the meaning of the word to the male narrator here, as the sample size, one, is too small to interpret any meaning whatsoever. Is the male seeking to "female-speak", imitating the noise uttered by that species, in hopes of attracting a sample, however variable in number, for reproductive, or other, purposes? Does he mock the indigenous female who uses the word to identify the ugly? Is this good? Is it bad?
Or is it Clint Eastwood? Rare events for which mean, median, and mode cannot be established remain difficult to estimate probabilities for. On the one hand, females use the word ubiquitously, as often as that species can fit it into their communication signaling. Thus it MAY be essentially meaningless, and observers might be misconstruing a noise uttered from infantile developmental stages ATTEMPTING but FAILING to form an actual word. "Cute" may thus be a TOTAL misinterpretation of vocalizations, just as the Ancient Greeks, hearing languages foreign to them, heard only "bar-bar-bar" in sophisticated unknown languages and called them barbarians. Is it, finally, that we are hearing a tongue to which we have no Rosetta, so far hidden from us, and females speak profound insights, complex prose, poetries to which we remain oblivious, our ears buried in sharpei skinfolds? Or is it that they merely babble imitatively at one another, "cute-cute-cute", without meaning, as they plot to cast offal upon us more earthbound creatures below? In any case, that single most common female utterance so far escapes us completely. All we can hypothesize is that, used in nearly every female signal, it is either of VAST import, or as meaningless as the apostrophes littered about in YouTube commentary by the fabled monkeys on typewriters, someday, should the Universe persist sufficiently, to become part of profound, Shakespearean works of literature, to be discovered by more intelligent species than ours. Those canids will howl their unearthing of meaning, and caw in their Raven voices, in ecstasy, at last understanding the combination of folded hirsute pelage apparently so attractive to the females of a long-disappeared naked ape.
@theinsanify7802 5 years ago
I don't know what to say. I'm a computer science student and I have never seen an explanation better than this... thank you very much.
@RJOHNWESLEYPHD 6 years ago
Simply great. Saves years.
@shreeniwaz 29 days ago
This video was shot before the contemporary LGBTQ movement; that made it so easy to explain certain facts.
@williamliamsmith4923 11 months ago
Amazing explanation and graphics!
@BrandonRohrer 10 months ago
Thanks!
@Erin-uk2jj 3 years ago
Brandon, this was great, thank you. Very easy to follow and really interesting and concise!
@BrandonRohrer 3 years ago
Thanks Erin!
@frbaucop 7 months ago
Bonjour. Q1: At 5:42, where does the .96 come from? The square "says" P(woman) = 0.02, P(man) = 0.98. Should we read P(man AND long) = P(man) * P(long | man) = 0.98 * 0.04 ≈ 0.04? Q2: At 17:00, I understand that the mean of the normal distribution in the back is 17. OK, but what is the standard deviation? Is it equal to the one calculated from the 3 values (13.9, 17.5, 14.1), do we use the standard error, or something else? This is not yet clear to me. Merci.
@area51xi 3 years ago
You completely lost me at the slide where c1/c2 disappears and is set equal to one. It only takes one unexplained slide like that to throw everything else off. I couldn't buy the rest of the video from that point on.
@area51xi 3 years ago
The entire first half of the video says that A|B is not B|A, but then at 16:55 that is exactly what is written. Please clarify.
@manssternerbostrom3148 1 year ago
Everyone in the comments is just swallowing this BS without hesitation... The latter half makes zero sense, and I'm pretty sure it's plain wrong.
@SamuelShaw1986 6 years ago
Great explanation and video production. The best Bayesian lesson I've found on YouTube.
@knowgnod 6 years ago
Awesome explanation, thanks!!
@karthiksalian5715 4 years ago
Best video I've found, with all the information I needed in one place. Thanks.
@bryancc2012 7 years ago
Finally understood it, for the first time...
@himanshu8006 5 years ago
I must say, this is the best explanation of Bayes' theorem I have ever seen..... PERFECT!!!!!
@hyunseokjeong7994 7 years ago
Great Video. Thanks a lot!
@abdulmukit4420 3 years ago
At 20:42, why is P(w=17) multiplied 3 times? Shouldn't it be P(m=13.9 | w=17) * P(m=14.1 | w=17) * P(m=17.5 | w=17) * P(w=17)?
@jithunniks 6 years ago
The way you connect things with appropriate, easy examples... Amazing...
@pfeeneyp 1 year ago
This video was helpful, thank you 🙏 If I may give feedback: the cancellation of two unequal constants in the top and bottom of Bayes' rule at 16:46 requires comment. Also there is some sloppy interchangeable use of the terms "std dev" and "std error", methinks. But maybe I'm missing something.
@relax2583 3 months ago
This video covers two examples: 1. the theater, and 2. a dog named Reign. The first part is easy to follow, but the dog's weight distribution is not.
@yavdhesh 6 years ago
Superb explanation, Sir.
@maxinelyu7875 7 years ago
When your teacher doesn't make sense and you have to go through teaching videos online and come across this one... Lucky, lucky, lucky! Thank you, Mr!
@AakashBhardwaj-dk3mi 1 year ago
Hey @Brandon Rohrer, at 18:12 the y-axis is likelihood, not probability. Probability is the area under the curve for this graph.
@mariorodriguesperes1501 5 years ago
Great video!! A very nice, easy-to-digest explanation of Bayes' theorem! Thank you very much for sharing this excellent material. I now have a better understanding of how to apply it to my problems. Keep up the great work!
@alekseev.yeskela 6 years ago
Awesome, thank you very much!
@kalevipoeg6916 1 year ago
More helpful than most! I will say, though, that the distribution of heights is really funny, unless you're crazy enough to think the most common measurement would be in the 185 cm range, which is around 6'1"... taller than around 90% of men in the West and 95% of men globally.
@rezNezami 1 year ago
Thanks for the first half of the video. The second half? Hmm... not so much! At 21:05, for example, what does it even mean to multiply the probabilities of the 3 measurements?
@AI_ML_DL_LLM 1 year ago
A great video; I think Bayes is finally sinking in. Two questions, if I may. Q1: At 16:44, shouldn't c1 be equal to c2? Q2: At 16:28, you ignored P(m), saying it is constant. Can you please explain why? In machine learning this term is the most problematic one, and people go to great lengths to address it. Thanks.
@pythonicly2646 2 years ago
At 6:51, can't we simply calculate P(man | long hair) = (total number of men with long hair) / (total number of people with long hair) = 2/27 = 0.07? (For the cinema example.)
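Yes; counting heads and applying Bayes' rule give the same number. A sketch, assuming the counts implied by this comment's 2/27 (50 men of whom 2 have long hair, 50 women of whom 25 do, so 27 long-haired people in all):

```python
# Counts assumed from the comment: 2 long-haired men, 25 long-haired women,
# out of 100 moviegoers split 50/50.
p_man = 0.5
p_long_given_man = 2 / 50
p_long_given_woman = 25 / 50

# Marginal probability of long hair, then Bayes' rule.
p_long = p_man * p_long_given_man + (1 - p_man) * p_long_given_woman
p_man_given_long = p_man * p_long_given_man / p_long

print(round(p_man_given_long, 3))  # 0.074, identical to counting 2/27 directly
```

When you have the full table of counts, the direct ratio is simpler; Bayes' rule earns its keep when you only know the conditional rates and the base rates, not the raw counts.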
@SuperJg007 6 years ago
Amazing. I already knew what Bayes' theorem was, but you have an awesome intro to Bayes. Thanks for the video.
@francescogiacomelli403 5 years ago
Next-level lesson.
@sreemantokesh3999 5 years ago
At 5:25, can someone tell me if P(woman with short hair) should be P(woman)*P(short hair)? P(A|B) = P(A)*P(B). I know I am missing something, but I'd be grateful if someone could brief me. Also, at 7:25, shouldn't the probability that it's a man with long hair simply be P(long hair)*P(man)?
@timdudd5996 5 years ago
Great video!
@canmetan670 3 years ago
Can someone explain what 20:22 means? 1) I get that P(w) is the probability of the weight w under the normal distribution we have chosen; this is our prior. 2) P(m | w) is the probability of the measurement m given a weight w, i.e. where m falls on that normal distribution. 3) What is P(m)?? Thanks for the video and responses.
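On question 3: P(m) is the marginal probability of the measurement, the same prior-times-likelihood product summed over every candidate weight. It is a single number that does not depend on w, which is why dividing by it rescales the posterior but can never move its peak. A sketch with assumed numbers (the grid, the spreads, and the single 15.6 lb reading are all mine, not the video's):

```python
import numpy as np

def gauss(x, mu, sigma):
    """Normal probability density."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

w = np.linspace(10.0, 20.0, 1001)            # candidate weights (lbs)
dw = w[1] - w[0]

prior = gauss(w, 14.2, 1.0)                  # assumed prior belief
unnormalized = prior * gauss(15.6, w, 1.0)   # P(m|w) * P(w) for one assumed reading

p_m = unnormalized.sum() * dw                # P(m): summed over all candidate weights
posterior = unnormalized / p_m               # now integrates to 1

# Dividing by the constant P(m) scales every candidate equally, so the peak is unchanged:
assert np.argmax(posterior) == np.argmax(unnormalized)
print(round(w[np.argmax(posterior)], 2))     # midway between prior mean and the reading
```

This is also why the video can drop P(m) when it only wants the most probable weight: the argmax of prior times likelihood is the same with or without the normalizer.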
@gautamjain2487 5 years ago
Nothing much to say, only thank you! You may have helped me clear my exam!
@SupremeSkeptic 6 years ago
Why did you choose 14.2 lbs as the prior and not 15.2 lbs, or 13.9, 14.1, or 17.5 lbs for that matter? Maybe (14.2 + 15.2)/2 = 14.7 lbs would be a better prior?