An introduction to the Random Walk Metropolis algorithm

Views: 61,362

Ben Lambert

A day ago

Comments: 32
@collincherubim2698 · 4 years ago
Finally, visuals for MCMC! Highly illuminating, thank you.
@jiachengchen7828 · 4 years ago
This video is 10000x better than the equations in the class.
@distrologic2925 · 2 years ago
Right? I don't understand how actual lecturers can be THAT terrible at conveying knowledge. It's like they don't want people to understand it.
@AmrutaOfAllTrades · 4 years ago
Finally found out why it's called Monte Carlo. This is the best explanation of the algorithm I have ever seen. Thanks for this.
@vman049 · 5 years ago
Best explanation of MH on KZbin. Thank you!
@johnedwardhills4529 · 3 years ago
Thanks Ben. This is a really clear visual representation of what the algorithm is doing and how it works in principle. Excellent stuff!
@MisterCactus777 · 2 years ago
I used this for my Bachelor's thesis to simulate ultracold fermions in a harmonic trap, which was a replication of real experiments! Thank you for explaining, I had forgotten what it did...
@bradh2649 · 3 months ago
Beautifully explained
@lemyul · 5 years ago
thank God there's a video about this
@tergl.s · 7 months ago
the simulation is so helpful! thanks
@mikolajwojnicki2169 · 4 years ago
Great video. Way easier to understand than my uni lectures.
@darcycordell7156 · 3 years ago
Maybe a dumb question, but at 5:46 aren't you using the unknown distribution to calculate r? Isn't the black line on the graph the unknown distribution you are trying to estimate?
@GabeNicholson · 3 years ago
No, that's a good question. The way I understand it, all you need to compare is the ratio of the numerators of Bayes' rule, and you sample through different values of the parameter (theta). You basically run through lots of different possible parameter values, and the walk moves across the graph when it hits a value of the parameter that corresponds to the true posterior distribution. Most of the guesses of the parameter are wrong and don't lead to anything at all, which is why the animation at around 8 minutes has many more wrong guesses than correct ones. The numerator of Bayes' rule dictates where those true parameter spots are; the denominator decides their height, and that height is what this sampling method estimates. So, for your question about r: all we know is the numerator, not the denominator.
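
To make the point about the ratio concrete, here is a minimal Python sketch of Random Walk Metropolis (not the Mathematica code from the video). The target function passed in is only the unnormalized posterior, likelihood × prior, so the denominator of Bayes' rule is never computed; the data, likelihood and prior below are purely hypothetical placeholders.

```python
import numpy as np

def rw_metropolis(log_unnorm_post, theta0, sigma, n_steps, seed=0):
    """Random Walk Metropolis using only the unnormalized posterior
    (likelihood x prior); the marginal likelihood never appears."""
    rng = np.random.default_rng(seed)
    samples = np.empty(n_steps)
    theta, logp = theta0, log_unnorm_post(theta0)
    for i in range(n_steps):
        proposal = theta + sigma * rng.normal()      # symmetric jump
        logp_prop = log_unnorm_post(proposal)
        # r compares likelihood x prior at the proposal vs. at the current value;
        # the denominator of Bayes' rule cancels, so it is never needed.
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = proposal, logp_prop        # accept the proposal
        samples[i] = theta                           # otherwise keep the current value
    return samples

# Hypothetical example: normal likelihood with known sd = 1, N(0, 10) prior on the mean.
data = np.array([1.2, 0.7, 1.9, 1.4])
def log_unnorm_post(mu):
    log_lik = -0.5 * np.sum((data - mu) ** 2)    # log-likelihood up to a constant
    log_prior = -0.5 * mu ** 2 / 10.0            # log-prior up to a constant
    return log_lik + log_prior

draws = rw_metropolis(log_unnorm_post, theta0=0.0, sigma=0.5, n_steps=5000)
print(draws[1000:].mean())                       # posterior-mean estimate after burn-in
```
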
@هشامأبوسارة-ن7و · 7 months ago
Very insightful.
@mikotokitahara9923 · 3 years ago
Best one on KZbin, thanks a lot.
@distrologic2925 · 2 years ago
Don't gaps in the true distribution skew the samples to the borders of these gaps because the random walk is less likely to cross the gap, especially with a low sigma in the jumping distribution?
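
In theory the accepted samples still target the true distribution, but with well-separated modes and a small proposal sigma the chain can take an extremely long time to cross the gap, so any finite run is effectively skewed towards the mode it started in. A small sketch with a hypothetical bimodal target (not from the video) illustrates this:

```python
import numpy as np

def log_bimodal(theta):
    # Unnormalized log-density of an equal mixture of N(-3, 0.5^2) and N(3, 0.5^2):
    # two narrow modes separated by a region of very low probability (the "gap").
    return np.logaddexp(-0.5 * ((theta + 3) / 0.5) ** 2,
                        -0.5 * ((theta - 3) / 0.5) ** 2)

def rw_chain(sigma, n_steps=20000, seed=0):
    rng = np.random.default_rng(seed)
    theta, logp = -3.0, log_bimodal(-3.0)        # start in the left-hand mode
    out = np.empty(n_steps)
    for i in range(n_steps):
        prop = theta + sigma * rng.normal()
        lp = log_bimodal(prop)
        if np.log(rng.uniform()) < lp - logp:
            theta, logp = prop, lp
        out[i] = theta
    return out

for sigma in (0.2, 2.0):
    frac_right = np.mean(rw_chain(sigma) > 0)    # a long-enough run should give about 0.5
    print(f"sigma={sigma}: fraction of samples in the right-hand mode = {frac_right:.2f}")
```
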
@Penrodyn · 5 years ago
Are the Mathematica programs you used in the video available? I particularly liked the last one where you showed a more complicated surface. I also just ordered your book; are they available with that?
@karimaelouahmani7078 · 2 years ago
Brilliant honestly.
@DJRaagaMuffin · 5 years ago
Great explanation. Thank you
@ahmedjunaidkhalid3929 · 5 years ago
I have a question. Suppose rather than having just one value theta, I have multiple values [A,B,C] in my state. Each variable can only have four values [0,1,2,3]. How would I choose a new state from the previous one? Would I calculate a new value for each variable and call it a proposed state and then calculate the value for the complete system?
@jelmerdevries7827 · 5 years ago
Probably too late, but for a multi-variable model you want to use a Gibbs sampler.
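
For intuition, one common alternative to proposing a whole new state at once is to update one component at a time, Metropolis-within-Gibbs style. Below is a hypothetical sketch for a state [A, B, C] where each variable takes values in {0, 1, 2, 3}; log_unnorm_post stands in for whatever unnormalized posterior the model defines over the full state.

```python
import numpy as np

def metropolis_discrete(log_unnorm_post, state0, n_steps, seed=0):
    """Update a discrete state such as [A, B, C] one component at a time:
    propose a new value for a single randomly chosen variable, then accept or
    reject with the usual Metropolis ratio (the uniform proposal is symmetric)."""
    rng = np.random.default_rng(seed)
    state = np.array(state0, dtype=int)
    logp = log_unnorm_post(state)
    samples = np.empty((n_steps, state.size), dtype=int)
    for i in range(n_steps):
        prop = state.copy()
        j = rng.integers(state.size)      # pick which of A, B, C to change
        prop[j] = rng.integers(4)         # propose a value from {0, 1, 2, 3}
        lp = log_unnorm_post(prop)
        if np.log(rng.uniform()) < lp - logp:
            state, logp = prop, lp
        samples[i] = state
    return samples

# Toy usage with a made-up unnormalized log-posterior that simply favours larger values:
toy_logp = lambda s: float(np.sum(s))
chain = metropolis_discrete(toy_logp, state0=[0, 0, 0], n_steps=2000)
print(chain[-5:])
```
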
@FluxProGaming · 5 years ago
Subscribed. Good voice, good explanations!!
@raycyst-k9v · 8 months ago
where can i find the code for this?
@mojiheydari · 3 years ago
awesome
@wunderjahr · 3 years ago
👏👏👏
@siarez · 6 years ago
I don't get where the likelihood term and the prior term come from. Here we assume they exist. What is an example of a practical application where we have these two terms but don't have the posterior?
@GeoffRuddock · 6 years ago
@Siarez the difficult part in calculating the posterior is usually the denominator (marginal distribution). This algorithm uses the ratio of unnormalized posteriors, so the cumbersome marginal distribution cancels out.
@AP-rs5wz · 5 years ago
Yes, the marginal can be quite costly to compute, as you have to integrate out so many (potentially) unknown dimensions.
@payam-bagheri · 4 years ago
I agree with you. There's a disconnect in the explanation in the video. The video mentions that the prior can be anything (just to have a starting point) but doesn't explain where the likelihood comes from.
@jimbocho660 · 3 years ago
@@payam-bagheri The likelihood is computed in the usual way from your data and proposed data-generating model. This video is about how to sample from your unnormalized posterior once you obtain an expression for it as likelihood × prior.
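
In symbols, the point made in this thread is that, for a symmetric proposal, the acceptance ratio only involves likelihood × prior, because the marginal likelihood p(x) cancels:

```latex
r = \frac{p(\theta' \mid x)}{p(\theta \mid x)}
  = \frac{p(x \mid \theta')\, p(\theta') / p(x)}{p(x \mid \theta)\, p(\theta) / p(x)}
  = \frac{p(x \mid \theta')\, p(\theta')}{p(x \mid \theta)\, p(\theta)}
```
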
@amenaalhassan2807 · 2 years ago
I wish I had money to celebrate Tết.
The intuition behind the Hamiltonian Monte Carlo algorithm
32:09
Ben Lambert
Views: 58K
An introduction to Gibbs sampling
18:58
Ben Lambert
Views: 81K
Constrained parameters? Use Metropolis-Hastings
13:14
Ben Lambert
Views: 11K
An introduction to rejection sampling
10:37
Ben Lambert
Views: 52K
Metropolis - Hastings : Data Science Concepts
18:15
ritvikmath
Views: 103K
Rejection Sampling - VISUALLY EXPLAINED with EXAMPLES!
15:27
Kapil Sachdeva
Views: 27K
"A Random Variable is NOT Random and NOT a Variable"
29:04
Dr Mihai Nica
Views: 22K
Statistical Rethinking 2023 - 08 - Markov Chain Monte Carlo
1:16:17
Richard McElreath
Views: 19K
Introduction to Bayesian Statistics - A Beginner's Guide
1:18:47
Woody Lewenstein
Views: 86K
Importance Sampling
12:46
Mutual Information
Views: 62K