Estimating the posterior predictive distribution by sampling

28,389 views

Ben Lambert

A day ago

Comments: 19
@Taysky
@Taysky 4 years ago
I'm paying 1000s of dollars for a uni course and come here to actually learn what is going on. Thanks!
@NeverHadMakingsOfAVarsityAthle
@NeverHadMakingsOfAVarsityAthle 8 months ago
Fantastic explanation, thank you so much! I failed in understanding so many other explanations, but yours really made it click for me:)
@tranle5614
@tranle5614 1 year ago
Awesome explanation. Thank you so much, Dr. Lambert.
@AshutoshRaj
@AshutoshRaj 9 months ago
🎯 Key Takeaways for quick navigation:
00:00 *Predicting new data*
01:21 *Sampling procedure steps*
10:51 *Dominant uncertainty source*
@engenhariaquimica6590
@engenhariaquimica6590 1 year ago
Awesome!!! Thanks a lot for such valuable information and the clear explanation!
@kiranskamble
@kiranskamble 7 months ago
Excellent Ben! Thank you!
@mirotivo
@mirotivo 3 years ago
It's a bit confusing. In the video you're trying to come up with an approximation, given the sample data, by sampling methods, but you mentioned the left plot is the beta distribution, which is already the posterior. What are we trying to approximate then, and how exactly are the samples drawn?
@budaejjigae-o9v
@budaejjigae-o9v 9 months ago
Thanks for the content. I guess here we are implicitly assuming the predicted value $\tilde{x}_{i}$ does not depend on the data $x$?
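For reference, the assumption in the question above can be written out explicitly. A minimal sketch in standard notation (the symbols below are editorial, not taken from the video): the posterior predictive integrates the likelihood of the new point over the posterior, and the usual assumption is that $\tilde{x}$ is conditionally independent of the observed data $x$ given $\theta$.

```latex
% Posterior predictive distribution for a new observation \tilde{x},
% given observed data x and parameter \theta.
% Key assumption: p(\tilde{x} \mid \theta, x) = p(\tilde{x} \mid \theta),
% i.e. \tilde{x} is conditionally independent of x given \theta.
p(\tilde{x} \mid x)
  = \int p(\tilde{x} \mid \theta, x)\, p(\theta \mid x)\, \mathrm{d}\theta
  = \int p(\tilde{x} \mid \theta)\, p(\theta \mid x)\, \mathrm{d}\theta
```

So $\tilde{x}$ still depends on the data, but only through the posterior $p(\theta \mid x)$; the independence is conditional on $\theta$.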
@gregoryhall9276
@gregoryhall9276 5 years ago
I'm a little confused about how the sampling of the posterior distribution is done. Looking at the Mathematica simulation, I didn't see any samples taken from the right side of the beta(3,9)... is the sampling restricted somehow to only a portion of the posterior distribution? Or are those samples discarded because they have no effect on the marginal?
@jimip6c12
@jimip6c12 4 years ago
The chance of a particular theta being selected depends on the probability density of the posterior distribution. Because the right side of the beta(3,9) has a very low probability density, it's very unlikely to be selected (sampled).
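A quick numerical check of this point, as a minimal sketch: the Beta(3, 9) posterior is taken from the comments above, and the 0.6 threshold is just an arbitrary cut-off for "the right side" of the distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw many theta values from the Beta(3, 9) posterior.
theta_samples = rng.beta(3, 9, size=100_000)

# Values on the right side of the posterior are drawn in proportion to their
# (low) density, so they appear only rarely; they are not discarded.
frac_above = np.mean(theta_samples > 0.6)
print(f"Fraction of samples with theta > 0.6: {frac_above:.4f}")
```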
@GuruprakashAcademy
@GuruprakashAcademy 2 years ago
Thanks Ben, it is a nice video. I am trying to simulate the posterior predictive distribution for an NHPP. I have an expression for $P(\tilde{X} \mid \alpha, \beta)\,P(\alpha, \beta \mid X)$. Can you please help with how I can simulate $\tilde{X}$ using MCMC in R or WinBUGS? Thanks
@abhinavtyagi7231
@abhinavtyagi7231 6 years ago
Really great work, thank you sir for all the videos. When will the solution manual for your book be available?
@SpartacanUsuals
@SpartacanUsuals 6 years ago
Hi, thanks for your comment. It should be available ASAP on the book website (waiting on publisher). If you email me on Ben.c.lambert@gmail.com, however, I can share it with you. Best, Ben
@jacobschultz7201
@jacobschultz7201 4 years ago
Very cool video! So if our posterior was not conjugate and was instead approximated using a Gibbs sampler, could we do something similar? I'm imagining randomly selecting a Gibbs iteration (excluding burn-in) and recording that vector of parameters as a sample from the posterior. Plug these parameters into the likelihood, sample, repeat. It seems especially important to sample the entire vector at once, since the marginal posteriors might not be independent. Sound reasonable? (See the sketch below.)
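That is essentially the standard recipe for MCMC output. A rough sketch of the idea, assuming you already have a matrix of retained post-burn-in draws (one row per iteration) and a likelihood you can simulate from; the normal model with unknown mean and standard deviation used here is purely a placeholder, and the "posterior draws" are faked rather than coming from a real sampler.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical post-burn-in Gibbs output: each row is one joint draw (mu, sigma).
# In a real problem these rows would come from your sampler.
posterior_draws = np.column_stack([
    rng.normal(5.0, 0.3, size=4000),            # mu draws (placeholder)
    np.sqrt(rng.gamma(50, 1 / 25, size=4000)),  # sigma draws (placeholder)
])

# Posterior predictive: for each retained joint draw, simulate one new data
# point from the likelihood. Using whole rows keeps any posterior correlation
# between parameters intact, as the comment above notes.
mu, sigma = posterior_draws[:, 0], posterior_draws[:, 1]
x_tilde = rng.normal(mu, sigma)

print("Posterior predictive mean:", x_tilde.mean())
print("Posterior predictive sd:  ", x_tilde.std())
```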
@ZezaoCH
@ZezaoCH 4 years ago
In practice, how is the posterior distribution related to AQLs and RQLs in real-life sampling?
@Gatitohomicida
@Gatitohomicida 4 years ago
Hi there, do you know if I can take the mean of each parameter in a Gaussian mixture and then obtain the posterior predictive, or should I obtain each Gaussian mixture simulation and then obtain the predictive? Is the result the same?
@abhijithv3047
@abhijithv3047 2 years ago
Hi sir, could you please explain how Bayesian model averaging works, including how the parameters are estimated, in a simple way? And if possible, could you demonstrate it with a problem? Thanks in advance.
@jacobmoore8734
@jacobmoore8734 5 years ago
In your simulation towards the end of the video, I'm having some difficulty keeping track of what each process represents.
Left process output = sampled theta from the actual posterior
Middle process output = sampled x (from some distribution?) using the output of the previous step
Right process output = histogram of the sampled x values from the previous step
Definitely missed something important here, yikes
@holloloh
@holloloh 5 years ago
I think the left process output is the parameter likelihood, the middle is the distribution based on the parameter, and the right is the sampled posterior. If we knew the formula for the actual posterior, there would be no point in sampling it; we already have the formula, so we could compute all the parameters and fits we want from the formula itself. I could be wrong, and I agree that the video was quite confusing, but at least intuitively it kind of makes sense.
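For anyone still puzzling over the three panels, here is one way to reproduce the procedure numerically. This is only a sketch: the Beta(3, 9) posterior comes from the comments above, and the Binomial(10, theta) likelihood is an assumed stand-in for whatever model the video actually uses.

```python
import numpy as np

rng = np.random.default_rng(2)
n_sims = 50_000

# Step 1 (left panel): sample theta from the posterior, here Beta(3, 9).
theta = rng.beta(3, 9, size=n_sims)

# Step 2 (middle panel): for each theta, simulate a new data point from the
# likelihood, assumed to be Binomial(n=10, theta) for this sketch.
x_tilde = rng.binomial(n=10, p=theta)

# Step 3 (right panel): the histogram of simulated x_tilde values approximates
# the posterior predictive distribution.
values, counts = np.unique(x_tilde, return_counts=True)
for v, c in zip(values, counts):
    print(f"x_tilde = {v}: {c / n_sims:.3f}")
```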
An introduction to the Bernoulli and binomial distributions
8:27
An introduction to importance sampling
14:19
Ben Lambert
59K views
The intuition behind the Hamiltonian Monte Carlo algorithm
32:09
Ben Lambert
60K views
An introduction to Gibbs sampling
18:58
Ben Lambert
82K views
Statistical Rethinking 2022 Lecture 02 - Bayesian Inference
1:12:46
Richard McElreath
89K views
29 - Posterior predictive distribution: example Disease
9:41
An introduction to the Random Walk Metropolis algorithm
11:28
Ben Lambert
62K views
How to systematically approach truth - Bayes' rule
19:08
Rational Animations
125K views
Introducing Bayes factors and marginal likelihoods
13:10
Ben Lambert
33K views
Bayesian statistics - the basics
31:35
TileStats
3.4K views
The Key Equation Behind Probability
26:24
Artem Kirsanov
156K views
What the Heck is Bayesian Stats ?? : Data Science Basics
20:30
ritvikmath
70K views