The Cramer-Rao Lower Bound ... MADE EASY!!!

4,146 views

Learn Statistics with Brian

1 day ago

Comments: 16
@huanranchen • 24 days ago
I'm so grateful for all the videos you make that inspire our curiosity!
@statswithbrian • 24 days ago
Thank you! :)
@JuhiMaurya-ym3ud • 4 days ago
Perfect and easiest explanation on YouTube... thank you so much, sir, it is really helpful.
@chrisolande1061 • 1 month ago
Exactly the kind of video I was looking for. Perfect explanation.
@QuantNovice • 2 months ago
That video is gold for every stats student! Thanks a lot for this amazing content!
@santiagodm3483 • 5 months ago
Nice videos. I'm now preparing for my master's, and they will be quite useful; the connection between the CRLB and the standard error of the MLE estimates makes this very nice.
@RoyalYoutube_PRO • 4 months ago
Fantastic video... preparing for IIT JAM MS
@jayjain1033 • 23 days ago
So good! Didn't get it in class at all.
@ligandro • 4 months ago
Thanks for uploading all this content. I'm about to begin my master's in data science soon, and I was trying to grasp some of the math theory, which is hard for me coming from a CS background. Your videos make it so simple to digest all these topics.
@LAQ24 • 1 month ago
Brilliant!
@swatigoel5387 • 1 month ago
Super helpful video! Thank you :)
@jayanthiSaibalaji • 3 months ago
Many thanks 🙏
@ridwanwase7444 • 4 months ago
The Fisher information is the negative of the expected value of the second derivative of log L, so why do we multiply by n to get it?
@statswithbrian • 4 months ago
I was assuming the L here is the likelihood of a single data point. In that case, you just multiply by n at the end to get the information of all n observations. If L is the likelihood of all n data points, then the answer will already contain the n and you don't have to multiply at the end. The two methods are equivalent when the data is independent and identically distributed.
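To spell out the step described in this reply, assuming iid data so the joint log-likelihood is a sum of n identical terms:

$$
I_n(\theta) \;=\; -\,\mathbb{E}\!\left[\frac{\partial^2}{\partial \theta^2} \log L_n(\theta)\right]
\;=\; -\sum_{i=1}^{n} \mathbb{E}\!\left[\frac{\partial^2}{\partial \theta^2} \log f(X_i \mid \theta)\right]
\;=\; n\, I_1(\theta),
$$

so computing the single-observation information $I_1(\theta)$ and multiplying by $n$ gives the same answer as differentiating the full joint likelihood.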
@ridwanwase7444 • 4 months ago
@statswithbrian Thanks for replying so quickly! I have another question: is the MLE of the population mean always guaranteed to have the CRLB variance?
@statswithbrian • 4 months ago
Hmm, I don't think this is true in general. At some level, it's certainly not true if we're talking about the CRLB for unbiased estimators, because the MLE is sometimes biased. For example, in a uniform distribution on [0, theta], the MLE is biased, and the Fisher information is not even defined. My guess is that this holds for some "location families", which the normal, binomial, and Poisson would all be. For a "scale family" like the exponential distribution, in the parameterization where the mean is 1/lambda, I do not believe the MLE meets the CRLB.
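A minimal simulation sketch of the attainable case (illustrative, not from the video; all parameter values are arbitrary assumptions): for a normal mean with known variance, the MLE is the sample mean, which is unbiased and hits the CRLB exactly.

```python
import numpy as np

# Sketch: for X_1..X_n iid N(mu, sigma^2) with sigma known, the MLE of mu is
# the sample mean, which is unbiased and attains the CRLB: Var(xbar) = sigma^2/n.
rng = np.random.default_rng(0)
mu, sigma, n, reps = 2.0, 3.0, 50, 200_000  # assumed values, chosen arbitrarily

samples = rng.normal(mu, sigma, size=(reps, n))
mle = samples.mean(axis=1)         # the MLE of mu in each replicate

fisher_per_obs = 1 / sigma**2      # I_1(mu) for a normal with known sigma
crlb = 1 / (n * fisher_per_obs)    # CRLB for unbiased estimators of mu

print(f"empirical Var(MLE): {mle.var():.5f}")   # approx 0.18000
print(f"CRLB (sigma^2/n):   {crlb:.5f}")        # exactly 0.18000
```

For the uniform-on-[0, theta] example in the reply above, no such comparison is even available, since the regularity conditions behind the CRLB fail and the Fisher information is undefined.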
Related videos:

The Rao-Blackwell Theorem Explained
15:12
Learn Statistics with Brian
1.3K views
Cramer-Rao Lower Bound / Inequality
9:44
statisticsmatt
6K views
Method of Moments Estimation
3:59
math et al
191K views
Maximum Likelihood - Cramer Rao Lower Bound Intuition
8:00
Ben Lambert
131K views
Likelihood Ratio Tests Clearly Explained
18:28
Learn Statistics with Brian
468 views
Maximum Likelihood Estimation ... MADE EASY!!!
9:12
Learn Statistics with Brian
24K views
Sufficient Statistics and the Factorization Theorem
15:19
Learn Statistics with Brian
9K views
Cramer Rao Inequality
22:02
Hopefully Helpful Mathematics Videos
9K views
Chebyshev's Inequality ... Made Easy!
9:46
Learn Statistics with Brian
16K views
p-values explained, in 5 levels of complexity
10:19
Learn Statistics with Brian
3.2K views