How Bayes Theorem works

551,919 views

Brandon Rohrer

1 day ago

Comments: 410
@phytasea 7 years ago
Wow, best explanation and example I ever saw ^^ Fantastic.
@marciasola462 5 years ago
Excellent
@romanemul1 9 months ago
Exactly. These patient disease examples were driving me nuts.
@welcome33333 7 years ago
This is by far the most accessible explanation of Bayes theorem. Well done Brandon!
@claudiorio 3 years ago
I know the video is old but I have to agree with the pinned comment. I already knew Bayes Theorem, but as I don't use it often, I have to be constantly refreshing the details in my mind. The YouTube algorithm recommended this video and it's hands down the best I have ever watched.
@BrandonRohrer 3 years ago
Thank you. I really appreciate that.
@Twopheek 1 year ago
This is the most accessible explanation of Bayesian inference. Thank you Brandon for the time taken to prepare this video. You rock!
@danrattray8884 8 years ago
Best explanation of Bayes theorem I have seen. Fantastic teaching.
@shirshanyaroy287 8 years ago
Brandon I just want to tell you that you are a fantastic teacher.
@BrandonRohrer 8 years ago
Thank you very much Shirshanya. That is a huge compliment. I'm honored.
@shirshanyaroy287 8 years ago
Please make more statistics videos! I have suggested your channel to my biostat teacher.
@richardgordon 6 months ago
Wow! One of the clearest explanations of Bayes Theorem I've come across!
@BrandonRohrer 6 months ago
Thanks!
@maxinelyu7875 7 years ago
When your teacher doesn't make sense, you have to go through teaching videos online, and I came across this one... Lucky lucky lucky! Thank you Mr!
@jithunniks 6 years ago
The way you connect things with appropriate, easy-to-follow examples... Amazing...
@razvanastrenie1455 3 years ago
Excellent explanation! This is the manner in which mathematics must be explained, with cases of practical applicability. Good job Mr. Brandon!
@BrandonRohrer 3 years ago
Thanks Razvan. I'm so happy you enjoyed it.
@theinsanify7802 6 years ago
I don't know what to say. I'm a computer science student and I have never seen an explanation better than this... thank you veryyyyy much
@nginfrared 7 years ago
EXCELLENT EXPLANATION!!! I am learning graphical modeling and a lot of these concepts were a bit unclear to me. Examples given here are absolutely to the point and demystify a lot of concepts. Thank you and looking forward to more videos.
@Skachmo1 7 years ago
There are never lines at the men's room.
@selamysfendegu8239 6 years ago
haha.
@dinikosama 6 years ago
lol
@Beebo 5 years ago
Unless it's cocaine.
@masterpogi1818 4 years ago
So the probability of this sample being true is 0%, hahhaha
@dhanushram6137 4 years ago
You have never been to developer conferences :)
@fahad3802 8 years ago
I have been struggling with Bayesian inference and your tutorial makes it so easy to understand! Thank you! Keep up the good work.
@BrandonRohrer 8 years ago
I'm very happy to hear it Fahad. Thanks.
@nutrinogirl456 7 years ago
This is the first time I've felt like I've actually understood this... and it's such a simple concept! Thank you!
@rjvaal 7 years ago
Thank you for this excellent explanation. You are a patient and well-spoken teacher.
@regozs 5 years ago
This is the best explanation yet; it helped me get a greater intuitive sense of Bayesian inference.
@brendawilliams8062 2 years ago
Yes, it was great. It seems like running into Feigenbaum maths or similar.
@SupremeSkeptic 7 years ago
Definitely the best explanation of the theorem told in an easily understandable way that I can find on the internet...
@syedmurtazaarshad3434 8 months ago
Loved the analogies with real life philosophies, brilliant!
@yurysambale 7 years ago
Stop looking for a decent tutorial... this one is the best!
@rameshmaddali6208 4 years ago
For 5 years I kept Bayes aside; you are the guru in teaching this stuff. God bless you Brandon
@AakashBhardwaj-dk3mi 1 year ago
Hey @BrandonRohrer, at 18:12 the y axis is likelihood, not probability. Probability is the area under the curve for this graph.
@ukktor 7 years ago
The knowledge that Bayes was a theologian, and that his theory requires at least some belief or faith in improbable things, earns a "Well played, Mr. Bayes" slow clap. I've been enjoying your videos Brandon, thanks for keeping things approachable!
@themennoniteatheist1255 3 years ago
I'm not sure that Bayesian epistemology requires a belief in improbable things. I love this video, but I think that's an overstatement. I do think that it requires us to be open to the possibility that improbable things may be true. It does not require me to have faith in anything improbable, but rather to proportion our beliefs to the evidence (probabilities), which is the antithesis of faith, while accepting the possibility of being wrong. To accept something as true that is improbable is intellectually irresponsible and lacks due caution and humility. But to withhold belief (proportionally to the evidence) from improbable things is intellectually responsible, and does not exclude being open to surprise, to the possibility that something improbable is true. I don't think Bayesian epistemology intrinsically expects you to hold as true some improbable thing (faith). Abstinence from faith is acceptable in all cases as long as the possibility of error is operative. This suggestion that it's necessary in Bayesian epistemology to believe something that is improbable was the only sloppy part of the video, no? I'm open to correction...
@ahmedemadsamy4244 2 years ago
I think you guys got it the opposite way; the video was trying to say: be open to believing the improbable things that come from the data (evidence), rather than only holding on to a prior belief.
@himanshu8006 6 years ago
I must say, this is the best explanation of Bayes theorem I have ever seen..... PERFECT!!!!!
@liucloud6317 5 years ago
I have viewed many explanations of Bayes' rule but this is no doubt the best! Thanks Brandon
@bharathwajan6079 1 year ago
Hey man, this is the best explanation of conditional probability I have ever heard.
@BrandonRohrer 1 year ago
Thanks!
@kimnguyen1227 7 years ago
This is fantastic. Thank you so much! I have been exposed to BT before, but have never understood it. As sad as it sounds, I didn't realize it was composed of joint probability, which is composed of conditional probability and marginal probability. Conditional probability, joint probability, and Bayes theorem all just looked the same. This really helped clarify things for me.
@SamuelShaw1986 6 years ago
Great explanation and video lesson production. Best Bayesian lesson I've found on youtube
@taghreedalghamdi6812 5 years ago
I was reading about Bayes Theory for months! And this is the first time I understand the concept!! Wow!! Such an amazing way of teaching!!
@BrandonRohrer 5 years ago
I'm so happy to hear it Taghreed. That was exactly my hope.
@ssundaraju 5 years ago
Great explanation and simplification of a difficult concept. The three quotations at the end are poetic and purposeful. Thanks
@BrandonRohrer 5 years ago
I found them surprisingly relevant too. Thanks Sridhar.
@karthiksalian5715 4 years ago
Best video I found with all the information that I needed in one place. Thanks.
@redserpent 6 years ago
Thank you. Your video has been of great help. I have tried different resources to wrap my head around Bayesian theorem and always got knocked out at the front door. Excellent explanation.
@raghurrai 6 years ago
I wish everyone taught like this. Your presentation was awesome. Thank you
@attrapehareng 7 years ago
You're very good at explaining, and you also go into some details, which is nice. Too often youtube tutorials are too simple. Keep going.
@keokawasaki7833 6 years ago
Now that you have said that (a year ago), I kinda feel like finding the probability of a youtuber making too-simple tutorials!
@WahranRai 3 years ago
0:40 Bayes wrote 2 books, one about theology and one about probability. He planned to write a third book inferring the existence / non-existence of God with probability (likelihood distribution = humanity, prior distribution = miracles!)
@johneagle4384 2 years ago
Great example! Very easy to follow and understand. On a side note: I showed your video to my students and some of them objected rather "emphatically". They said it was too sexist. Crazy times we live in.... Instead of math and statistics, they wanted to discuss gender roles and stereotypes in a Stat class. Gosh!
@BrandonRohrer 2 years ago
Thanks John! I agree with your students. When I watch this now, I cringe. I definitely need to re-do it with a better example, one that doesn't reinforce outdated gender norms.
@johneagle4384 2 years ago
@@BrandonRohrer No.... Please, do not follow the mad crowds... this is an innocent, simple math example. People are getting crazy and finding excuses to feel offended and start meaningless fights!
@Terszel 1 year ago
So sad. Long hair and standing in the women's restroom line, and we can't even use Bayes' theorem to assume it's a woman 😂
@fredvin27 6 years ago
I'm confused: how do you calculate the probabilities at 17:56, P(m=13.9|w=17) and so on?
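On the computation being asked about here: each P(m | w=17) is read off a normal likelihood curve centered at the candidate weight. A minimal sketch, assuming a normal measurement model with a 1.2 lb spread (the video's standard error; the exact value used on screen may differ):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution with mean mu and spread sigma, at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Likelihood of each measurement, given a candidate true weight w = 17 lb.
# sigma = 1.2 lb is an assumed measurement spread, not read off the video.
measurements = [13.9, 14.1, 17.5]
sigma = 1.2
likelihoods = [normal_pdf(m, 17.0, sigma) for m in measurements]

# Independent measurements: the joint likelihood is the product.
joint = 1.0
for p in likelihoods:
    joint *= p
print(likelihoods, joint)
```

Repeating this for every candidate w traces out the likelihood curve the video sweeps through.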
@KunwarPratapSingh41951 6 years ago
I was looking for intuitive content to introduce me to the essence of Bayes theorem in statistics; thanks for this. Luckily I found your blog about machine learning and robotics. That's everything I wanted under one roof: robotics, data science and machine learning.
@SuperJg007 6 years ago
Amazing. I already knew what Bayes theorem was, but you have an awesome intro to Bayes. Thanks for the video.
@ingobethke2413 5 years ago
Great video. I'm slightly confused about the dog example at around 20:00: Why can we use the standard error of the prior distribution in the likelihood computation for the measurements? Don't we have to model the distribution mean and spread separately, i.e. explore many different standard error values? Generally, is w supposed to represent only one number (e.g. the true dog weight in the example) or an entire distribution that can be characterised by its moments (with the true dog weight simply being its first moment)?
@Blooddarkstar 8 years ago
This video deserves more thumbs up. I understood a lot on a lazy Sunday evening :) great explanation.
@dmhowe2001 7 years ago
I thought this was not only a great example of Bayes but also a nice intro for Cox's Theorem. Nice job!
@BrandonRohrer 7 years ago
* quickly looks up Cox's Theorem * Why, yes it does Donna. Thank you! :)
@Explorer4239 6 years ago
Thanks for the excellent presentation! One question though: at 17:49, P(w=17|m=[13.9, 14.1, 17.5]) is factorized as:
P(w=17|m=[13.9, 14.1, 17.5]) = P(m=[13.9, 14.1, 17.5]|w=17) = P(m=13.9|w=17) * P(m=14.1|w=17) * P(m=17.5|w=17)
Then at 20:47, P(m=[13.9, 14.1, 17.5]|w=17) * P(w=17) is expanded into:
P(w=17|m=[13.9, 14.1, 17.5]) = P(m=[13.9, 14.1, 17.5]|w=17) * P(w=17) = P(m=13.9|w=17) * P(w=17) * P(m=14.1|w=17) * P(w=17) * P(m=17.5|w=17) * P(w=17)
How do you get that P(m=[13.9, 14.1, 17.5]|w=17) * P(w=17) is equal to P(m=13.9|w=17) * P(m=14.1|w=17) * P(m=17.5|w=17) * P(w=17)^3? Thanks in advance!
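For what it's worth, the factorization in the first line is the standard independence assumption, and textbook Bayes' rule then applies the prior a single time. A sketch of that standard computation, with assumed spreads (1.2 lb for measurements, and a prior centered on the earlier 14.2 lb weigh-in with a 1 lb spread):

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Independence assumption: the likelihood of the whole list of measurements
# factors into a product of one-measurement likelihoods.
ms = [13.9, 14.1, 17.5]
w, sigma = 17.0, 1.2  # sigma is an assumed spread, not read off the video

joint_likelihood = math.prod(normal_pdf(m, w, sigma) for m in ms)

# Bayes' rule then applies the prior a single time:
#   P(w | ms) ∝ joint_likelihood * P(w)
prior_at_17 = normal_pdf(w, 14.2, 1.0)  # assumed prior: 14.2 lb mean, 1 lb spread
unnormalized_posterior = joint_likelihood * prior_at_17
print(joint_likelihood, unnormalized_posterior)
```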
@keokawasaki7833 6 years ago
I hate stats because of those things... my teacher was teaching us utility analysis and said "satisfaction is measured in utils". To that I asked, "tell me how satisfied you are with your job, and answer in the form: n utils...". I still haven't got an answer!
@UpayanRoy-n6u 4 months ago
17:57 One query here. How is the P(m | w=17) distribution function calculated? What is the spread (S.D.)? How do we, for example, arrive at any certain value of the probability of getting m = 15.6 lb given that the true weight is 17 lb? Thanks! Nice explanation.
@xxlolxx447 6 years ago
This was the best explanation of Bayes I've ever heard. I had such a hard time wrapping my head around it from other sources.
@urielvaknin6904 2 months ago
Great video, many thanks! Can someone explain the part between 19:30 and 21:40? All the calculations and finding the final posterior distribution.
@robertoarce-tx8yt 1 year ago
This was a very intuitive explanation. Man, do more!
@joshuafancher3111 6 years ago
Excellent explanation. At 15:20 and beyond is when everything really started to come together. Also thanks for deriving the formula at the 7:10 mark.
@EANTYcrown 7 years ago
Hands down the best explanation I've seen, thank you
@billgiles9662 7 years ago
Brandon, GREAT explanations!! I am taking a "Math for Data Sciences" class and had been flying through it until the final week and "Bayes Theorem". Achk...... It was poorly explained and very confusing. I was going to drop the class as I just couldn't get it. After watching your YouTube explanation I am excited about the possibilities and understand the way it works - cool stuff! Thank you for all you do!!!
@Zachor-v-Aseh 5 years ago
Excellent. For those for whom this is the first lesson on Bayes, you've left out a few steps here and there. But still excellent. It's difficult to make things understandable, and you're excellent at it.
@BrandonRohrer 5 years ago
Thanks Dov! And good callout - this focuses on the concepts and doesn't tell you quite enough to code it up. That will be the subject of a future course on e2eml.school
@Erin-uk2jj 3 years ago
Brandon, this was great, thank you. Very easy to follow and really interesting and concise!
@BrandonRohrer 3 years ago
Thanks Erin!
@wysiwyg2489 6 years ago
I believe you don't know much about statistics (the impossible thing), but I do believe you really know how to explain Bayesian inference. Great video.
@sreekanthk2911 5 years ago
I have been searching for an explanation like this for some time, and a big WOW to this guy. Wonderful explanation!!
@MonsieurSchue 1 year ago
Wow, can't believe I only came across this video now. This is by far the best explanation of Bayes with great examples! Thanks @BrandonRohrer!! Love the example with the weight of the puppy! May I ask if you have code to deal with multiple priors / multiple events? Say, as an extension of the weight of the puppy: if the weight change is more than one pound, plus she may be showing some other symptoms (say losing appetite), the likelihood of her being sick from something is x. Or even, losing appetite can be just due to the weather being too hot. So the loss of one pound since the last vet visit plus losing appetite may not be significant at all, and doesn't warrant the multiple expensive tests suggested by the vet.
@GeoffryGifari 1 year ago
How do the shape and mean of our prior distribution determine the best end belief? Our prior knowledge is that the puppy weighed 14.2 lbs last time and that weight changes aren't noticeable. What if the mean of our prior distribution is chosen to be larger/smaller than 14.2 lbs? And what if we take that distribution to be narrower/broader (more than 1 lb std)? An extreme case: what if we first guess that the puppy's weight is *exactly* 14.2 lbs?
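These questions can be explored numerically by sweeping the prior's spread and watching where the posterior peaks. A sketch on a weight grid; the 1.2 lb measurement spread and the prior parameters are assumed values, not read off the video:

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def map_estimate(prior_mu, prior_sd, ms=(13.9, 14.1, 17.5), meas_sd=1.2):
    """Grid-search MAP estimate of the weight under a normal prior."""
    grid = [13.0 + 0.01 * i for i in range(501)]  # candidate weights, 13..18 lb
    def unnorm_post(w):
        like = math.prod(normal_pdf(m, w, meas_sd) for m in ms)
        return like * normal_pdf(w, prior_mu, prior_sd)
    return max(grid, key=unnorm_post)

# Broader prior -> the peak is pulled toward the data's mean (~15.2 lb);
# narrower prior -> the peak is pinned near the prior mean of 14.2 lb.
print(map_estimate(14.2, 3.0))   # weak prior
print(map_estimate(14.2, 1.0))   # moderate prior
print(map_estimate(14.2, 0.05))  # nearly "exactly 14.2 lbs": data barely moves it
```

So a prior of "exactly 14.2 lbs" (zero spread) would make the measurements powerless to change the belief; the prior mean and width set how far the data can pull the answer.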
@williamliamsmith4923 1 year ago
Amazing explanation and graphics!
@BrandonRohrer 1 year ago
Thanks!
@mathiasmews1122 4 years ago
You're so much better than my Statistics teacher, thank you so much for this explanation!
@BrandonRohrer 4 years ago
Thanks Mathias!
@canmetan670 4 years ago
Can someone explain what 20:22 means? 1) I get that P(w) is the probability of the weight w on the normal distribution that we have chosen; this is our prior. 2) P(m | w) is the probability of measurement m happening given w, i.e. where it falls on that normal distribution. 3) What is P(m)?? Thanks for the video and responses.
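On question 3: P(m) is the marginal probability of the measurement, obtained by averaging P(m | w) over all candidate weights weighted by the prior; in Bayes' rule it is the normalizer that makes the posterior sum to 1. A discretized sketch (all spreads here are assumed values, not read off the video):

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Discretize the candidate weights and put a normal prior P(w) on them.
step = 0.01
weights = [10.0 + step * i for i in range(1001)]            # 10..20 lb
prior = [normal_pdf(w, 14.2, 1.0) * step for w in weights]  # bin masses, sum ~ 1

m = 15.6       # one observed measurement
meas_sd = 1.2  # assumed measurement spread

# P(m) = sum over w of P(m | w) * P(w): the normalizer.
p_m = sum(normal_pdf(m, w, meas_sd) * pw for w, pw in zip(weights, prior))

posterior = [normal_pdf(m, w, meas_sd) * pw / p_m for w, pw in zip(weights, prior)]
print(p_m, sum(posterior))  # the posterior now sums to ~1
```

Since P(m) doesn't depend on w, it doesn't move the peak of the posterior, which is why the video can drop it when comparing candidate weights.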
@dalelu9422 7 years ago
Simply the best! Thank you Brandon
@Atoyanable 3 years ago
Such a well-thought-through video, very good explanations for every instance, and the ending was the bonus. Loved it, thank you
@BrandonRohrer 3 years ago
Thank you Lilit! I appreciate that.
@joserobertopacheco298 2 years ago
I am from Brazil. What a fantastic explanation!
@BrandonRohrer 2 years ago
Thanks Jose! Welcome to the channel
@jinshuli4092 7 years ago
Excellent video! But one question: at 20:43, shouldn't we multiply by P(w=17) just once? Where do the other two factors of P(w=17) come from?
@bobcrunch 6 years ago
The dashed curve is P(m | w=17). There are three measurements (m): [13.9, 14.1, and 17.5]. All three must be computed and each multiplied by P(w=17) (the dotted curve) separately. The three results are then multiplied together. P(w=17) is very small, and thus P(w=17 | m=[13.9, 14.1, 17.5]) is very small.
@dbtmpl1437 6 years ago
@@bobcrunch I would disagree here. I think @Jinshu Li has a valid point. The three measurements are independent w.r.t. w, so we can write them as p(m_1|w)*p(m_2|w)*p(m_3|w). However, this does not affect the prior p(w). So the prior should be multiplied only once, in my opinion.
@هشامأبوسارة-ن7و 5 years ago
@db: I can't agree with you more, as p(w) is our prior belief about w before seeing any data (measurements).
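To make the two positions in this thread concrete, here is a sketch computing the posterior peak both ways. Textbook Bayes' rule applies the prior once; repeating it per measurement behaves like a tripled-strength prior that pulls the peak further toward the prior mean. The spreads are assumed values, not read off the video:

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

ms = [13.9, 14.1, 17.5]
meas_sd, prior_mu, prior_sd = 1.2, 14.2, 1.0     # assumed values
grid = [13.0 + 0.001 * i for i in range(5001)]   # candidate weights, 13..18 lb

def peak(prior_power):
    """Posterior peak (MAP) when the prior density is raised to `prior_power`."""
    def unnorm(w):
        like = math.prod(normal_pdf(m, w, meas_sd) for m in ms)
        return like * normal_pdf(w, prior_mu, prior_sd) ** prior_power
    return max(grid, key=unnorm)

map_once = peak(1)    # standard Bayes' rule: the prior enters once
map_thrice = peak(3)  # prior repeated per measurement, as in the video's expansion
print(map_once, map_thrice)  # the repeated prior sits closer to 14.2 lb
```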
@hieuhienhoa29 2 years ago
The explanation is very good, but I have been thinking a lot and can't understand why, at 20:42, you multiply by P(w=17) three times. Please help me.😢
@karannchew2534 1 year ago
16:45 P(w) is constant as we assume w to be uniformly distributed. But why is P(m) also a constant?
@maayan1987 7 years ago
I loved the Mark Twain quotation at the end! Great video, thanks! However, I have one question: do you assume normal distributions for the likelihood? Why so?
@thomasprobst8601 1 year ago
Excellent introduction, thanks. Is there also a continuation covering graphical prior selection and Jeffreys priors?
@s45510325 6 years ago
The best explanation I've ever seen! Super clear.
@Nifty-Stuff 2 years ago
Absolutely brilliant! Your presentation, examples, etc. were perfect and applicable! Thanks!
@BrandonRohrer 2 years ago
Thank you very much :)
@lenkapenka6976 3 years ago
Superb lecture - especially the MLE explanation!
@ramkomusique 2 years ago
Very clear examples and explanations, thank you! Do you happen to have some R workshops on how to apply Bayesian inference as well?
@BrandonRohrer 2 years ago
Thanks! I don't have anything like workshops, but if I were looking for them I'd start with Richard McElreath xcelab.net/rm/courses/
@lonandon 6 years ago
At 20:42, why is that normal distribution turned into the flat one? Is it because we have a prior normal distribution and we need to fit 17 to its curve? It confuses me, as it seems like there are two priors: prior(14) is used to make prior(17) flat, and then prior(17) is used to calculate the probability.
@mariorodriguesperes1501 5 years ago
Great video!! A very nice and easy to digest explanation of Bayes theorem! Thank you very much for sharing this excellent material. I have got a better understanding of how to apply it to my problems. Keep up the great work!
@sammorningstar6818 6 years ago
Great video!! But I am confused: at 14:00 we got an SD of 2.0 lb, and in the next slide you took a normal distribution with an SD of 1.2 lb. What is the reason for not using the sample SD and instead using the SE for the normal distribution? If this requires a lot of explanation, a link to any website explaining it would also be very helpful.
@sbz_ 6 years ago
Yeah, I also had the same doubt.
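The two numbers describe different things: the sample standard deviation is the scatter of individual measurements, while the standard error (sd / √n) is the uncertainty in the estimated mean, which is why the narrower width appears on the next slide. A quick check against the three measurements:

```python
import math

ms = [13.9, 14.1, 17.5]
n = len(ms)

mean = sum(ms) / n                                # 15.166...
var = sum((m - mean) ** 2 for m in ms) / (n - 1)  # sample variance (divide by n - 1)
std_dev = math.sqrt(var)                          # ~2.0 lb: spread of single measurements
std_err = std_dev / math.sqrt(n)                  # ~1.2 lb: spread of the estimated mean
print(round(mean, 2), round(std_dev, 2), round(std_err, 2))  # prints 15.17 2.02 1.17
```

These round to the 2.0 lb and 1.16 lb shown on the slide.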
@bga9388 1 year ago
Thank you for this excellent presentation!
@AI_ML_DL_LLM 1 year ago
A great video; I think Bayes is finally going to sink in. Two questions if I may? Q1: at 16:44, shouldn't C1 be equal to C2? Q2: at 16:28, you ignored P(m), saying it is constant. Can you please explain why? In machine learning this term is the most problematic one, and people go to extra lengths to address it. Thanks
@amanjain3341 7 years ago
Thank you sooo much Brandon for explaining the concepts so clearly.
@Blazzerek 7 years ago
14:05 In the computation of the standard deviation, why are you using 15.1 instead of the 15.2 calculated for the mean? And also division by 2 (instead of 3, the number of data points)?
@wexwexexort 7 years ago
Július Marko: In statistics you divide by n-1 to find the sample's std dev. The 15.1 must be nothing but a typo.
@АлександрФедоров-п5э 7 years ago
20:47 Why do we multiply by P(w=17) 3 times instead of once? Update: I found the answer in the comments below. But then shouldn't the equals sign between the first and second lines be dropped? Because they are really not equal, and it doesn't align with Bayes' theorem. P.S. Sorry for my bad English
@jacktretton7815 4 years ago
Best explanation I found on the topic so far. Great work!!!
@jeppejwo 8 years ago
Great explanation, but I'm having a hard time understanding why you should use the standard error as the width when you calculate the likelihood, and not the standard deviation. Doesn't this mean that you are calculating the probability that your measurement is the true mean, and not the probability of getting that measurement given your mean? Or perhaps that's what you're supposed to do?
@BrandonRohrer 8 years ago
You hit it spot on, jeppe. The assumption that I glossed over is that Reign's actual weight will be the mean of all the measurements (if we kept taking measurements forever). However, since we only have a few measurements, we have to be content with an estimate of that true mean. The standard error helps us to determine the distribution of that estimate. We don't care about how far an individual measurement tends to be off; we only care about how far our estimate of the weight is off.
@ssrrapper 7 years ago
At 14:25 you say: "one standard deviation on that curve is our standard error of 1.2 pounds". How can this be? How can the standard error be the same as the standard deviation, when the standard error equals the standard deviation divided by the square root of the number of observations? The standard deviation is 2 pounds, not 1.2 pounds, correct? The previous slide shows: std dev = 2.0 lbs, std err = 1.16 lbs.
@randykintzley5923 6 years ago
Great video. I have a basic question though. At 17:53 you point to y-axis values on the normal curve as probabilities. I thought probability was the area under the curve between two points, i.e. the probability of an individual point on the continuous curve is zero. I'm missing something fundamental here. Maybe you can provide additional explanation? Thanks!
@УэстернСпай 3 years ago
I was wondering the same....
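This is a fair objection: the y-axis values are probability densities, not probabilities. Comparing curve heights at single points is equivalent to comparing the probabilities of tiny equal-width bins around those points, since the common bin width cancels out of the comparison. A sketch (the 1.2 lb spread is an assumed value):

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

mu, sigma = 17.0, 1.2  # assumed measurement model around a candidate weight of 17 lb

# Probability of landing in a small bin around m: density * bin width.
dm = 0.001
p_bin = normal_pdf(15.6, mu, sigma) * dm  # a genuine (tiny) probability

# The whole density curve integrates to ~1 (crude Riemann sum over +/- 6 sigma).
n_steps = int(12 * sigma / 0.001)
xs = [mu - 6 * sigma + 0.001 * i for i in range(n_steps + 1)]
total = sum(normal_pdf(x, mu, sigma) * 0.001 for x in xs)
print(p_bin, total)  # total is ~1.0
```

So the probability of any exact point is indeed zero; the pointwise reading works because every candidate gets the same implicit bin width.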
@rasraster 2 years ago
I don't understand your method of getting the posterior distribution. There is nothing in Bayes' formula that implies a summation over all possible parameters in the likelihood function. It seems to me that the way to get the posterior is to compute the likelihood distribution based on measured samples, and then for each possible weight to multiply the value of the (static) likelihood distribution by the value of the prior distribution. Can you explain why your method works? Thank you.
@houssemguidara4467 4 years ago
Thank you for the video; it helped me understand the concept of Bayesian inference. The concept is simple: in a nutshell, you have an idea about what the quantity is, and then you use the measurements to sharpen your assumption.
@frbaucop 10 months ago
Hello. Q1: At 5:42, where does the .96 come from? The square "says" P(woman) = 0.02, P(man) = .98. Should we read P(man AND long) = P(man) * P(long | man) = 0.98 * 0.04 = 0.04? Q2: At 17:00, I understand the mean of the normal distribution in the back is 17. OK, but what is the standard deviation? Is it equal to the one calculated with the 3 values (13.9, 17.5, 14.1)? Do we use the standard error or something else? This is not yet clear to me. Thanks
@StephenHsiang 7 years ago
Best explanation on youtube so far
@kirankk9565 7 years ago
The best explanation of Bayes theorem I have seen... awesome... thanks a ton....
@specialkender 5 years ago
GOD THANKS FOR EXISTING. Finally somebody that breaks down the most important part of it (namely, how you actually calculate the likelihood in practice). I hope life rewards you beautifully.
@tracyshen430 7 years ago
The math at the 5:39 mark has an error: it should be 0.98 * 4/98 = 0.04. I checked your blog; it was also wrong. There's no way that the product of the two numbers shown becomes 0.04. Please correct this video and your blog.
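Before concluding the slide is wrong, it's worth checking the arithmetic both ways: 0.98 × (4/98) is exactly 0.04, and 0.98 × 0.04 rounds to 0.04, so an on-screen 0.04 is consistent with either reading. A check using only the numbers quoted in these comments (not re-read from the video):

```python
# Two readings of the product at ~5:39, using the numbers quoted in the comments.
a = 0.98 * (4 / 98)  # = 0.04 exactly
b = 0.98 * 0.04      # = 0.0392, which rounds to 0.04
print(a, round(b, 2))
```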
@mariamedrano4348 7 years ago
Excellent examples and explanation! Now everything is so much clearer. :)
@eyesonthetube 6 years ago
Thanks for the excellent video. A good refresher! Keep up the good work!
@lemyul 5 years ago
I like the quotes you put at the end and how you reword them
@lonandon 6 years ago
At 18:23 he said "by the time we are done". What does that mean? Does it mean the sum of the probabilities of those three measurements is at its maximum?
@pythonicly2646 3 years ago
At 6:51, can't we simply calculate P(man | long hair) = (total number of men with long hair) / (total number of people with long hair) = 2/27 = 0.07? (for the cinema example)
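Yes; when the raw counts are available, direct counting and Bayes' theorem are two routes to the same number. A sketch using counts consistent with the 2/27 above (50 men and 50 women, of whom 2 and 25 have long hair; these are reconstructed, assumed counts, not re-read from the video):

```python
# Counts: an assumed reconstruction consistent with the commenter's 2/27.
men, women = 50, 50
long_men, long_women = 2, 25

# Direct counting among long-haired people.
p_counting = long_men / (long_men + long_women)  # 2/27

# Bayes' theorem with the same counts.
p_man = men / (men + women)
p_long_given_man = long_men / men
p_long = (long_men + long_women) / (men + women)
p_bayes = p_long_given_man * p_man / p_long

print(p_counting, p_bayes)  # both ~0.074
```

Bayes' theorem earns its keep when the counts aren't directly available and only the component probabilities are known.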
@balasubramanianilangovan888 4 years ago
Hi Brandon, your video was simple, superb, and stupendous!
@SeverSava 2 years ago
At 7:20 I found it a bit difficult to understand the difference between P(man with long hair) and P(man | long hair). I guess the first refers to the probability of being a man with long hair out of the entire cohort of men and women, whereas the second refers to the probability of being a man with long hair but just from the men's cohort. Is that right?
@AnasHawasli 1 year ago
The best explanation on youtube. Thank you man
@boyangfu820 5 years ago
May I ask why the standard error decreases after applying MAP at 21:25?
@abdulmukit4420 4 years ago
At 20:42, why is P(w=17) multiplied 3 times? Shouldn't it be P(m=13.9 | w=17) * P(m=14.1 | w=17) * P(m=17.5 | w=17) * P(w=17)?