Central Limit Theorems: An Introduction

44,365 views

Ben Lambert

11 years ago

This video provides an introduction to central limit theorems, in particular the Lindeberg–Lévy CLT. Check out ben-lambert.com/econometrics-... for course materials and information regarding updates on each of the courses. Quite excitingly (for me at least), I am about to publish a whole series of new videos on Bayesian statistics on YouTube. See here for information: ben-lambert.com/bayesian/ Accompanying this series, there will be a book: www.amazon.co.uk/gp/product/1...
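For reference, the Lindeberg–Lévy CLT can be stated as follows, in standard notation that may differ slightly from the notation used in the video:

% Lindeberg–Lévy CLT: X_1, ..., X_n i.i.d. with E[X_i] = \mu and Var(X_i) = \sigma^2 < \infty
\sqrt{n}\,\left(\bar{X}_n - \mu\right) \;\xrightarrow{d}\; \mathcal{N}\!\left(0, \sigma^2\right),
\qquad \text{where } \bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i .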

Comments: 19
@Nathan-sw8kb 8 years ago
Brilliant, your videos have literally gotten me through an entire econometrics degree.
@zihanchen4312 3 years ago
I think this video is definitely one of the best videos on YouTube explaining the CLT and the WLLN. I don't know why the number of views is so small. Anyway, thank you so much for the clear explanation.
@afifkhaja 11 years ago
Clear and concise as usual. Thank you
@d_15745 7 years ago
amazing! very clear explanations
@tteej 5 years ago
Ditto Nathan’s comment, a lot of students owe you a great deal of thanks!
@ranadeepkundu5357 6 years ago
brilliant explanation
@bobo0612 3 years ago
thank you for the nice explanation
@RealMcDudu 5 years ago
Great video! I'm not sure I understand the misconception you mentioned. If you divide by sqrt(n) you get a distribution whose variance goes to 0 as n goes to infinity, so both sides turn out to be constants, as a distribution with variance 0 is essentially a constant.
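For concreteness, the variance calculation behind this point is standard algebra (not taken from the video):

\operatorname{Var}\!\left(\bar{X}_n - \mu\right) = \operatorname{Var}\!\left(\bar{X}_n\right) = \frac{\sigma^2}{n} \;\longrightarrow\; 0 \quad \text{as } n \to \infty,

so without the \sqrt{n} scaling the limit is degenerate (a spike at 0), whereas multiplying by \sqrt{n} keeps the variance fixed at \sigma^2.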
@Hatallula 4 years ago
Thank you, really.
@raulabcastroc 8 years ago
The graphs at 6:48 are also PDFs, right? Thanks for your videos. They're awesome!
@ARMofficial 3 years ago
Dear @Ben Lambert, would you mind explaining the maths behind the concept at 11:04? I get the result, but not why multiplying by 1/sigma would lead to 1...
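A guess at the algebra being asked about: dividing the statistic by \sigma standardises it so the limiting variance equals 1. In my notation (which may differ from the video's):

\operatorname{Var}\!\left(\frac{\sqrt{n}\,(\bar{X}_n - \mu)}{\sigma}\right)
  = \frac{1}{\sigma^2}\,\operatorname{Var}\!\left(\sqrt{n}\,(\bar{X}_n - \mu)\right)
  = \frac{n}{\sigma^2}\cdot\frac{\sigma^2}{n} = 1 .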
@alex_8704 6 years ago
Dear Ben, I started studying the R language and decided to check the CLT on i.i.d. variables whose PDF is defined over a range limited at one or both ends. I started by applying the CLT-based CI formulas to i.i.d. variables with beta distributions and uniform 0-to-1 distributions with the sample mean close to 1, and got 95% CIs that extend beyond 1. Then I ran Monte Carlo simulations in R with different parameters, for instance: i.i.d. lognormal variables with μ = −0.5, σ = 2, and 10 000 iterations of drawing a random sample of size 10 000. The simulation gives me a difference between the mean and the median of the sample-mean distribution equal to 12.7% of the SEM, and a difference between (mean − 5th percentile) and (95th percentile − mean) of the sample means equal to 32% of the SEM. It seems the CLT just doesn't work... at least for i.i.d. variables that don't take all values between −∞ and +∞.
@alex_8704 6 years ago
My code in R: x1
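The posted code appears to have been cut off by the comment formatting. A minimal R sketch of the simulation described in the parent comment (lognormal with meanlog = −0.5, sdlog = 2; 10 000 replications of samples of size 10 000) might look like the following; the variable names (n_rep, n_samp, xbar, sem) are mine, not necessarily the commenter's, and only base-R functions (rlnorm, replicate, quantile) are used:

set.seed(1)
n_rep  <- 10000    # Monte Carlo replications
n_samp <- 10000    # size of each i.i.d. sample
# sample means of i.i.d. lognormal(meanlog = -0.5, sdlog = 2) draws
xbar <- replicate(n_rep, mean(rlnorm(n_samp, meanlog = -0.5, sdlog = 2)))
sem <- sd(xbar)                                   # empirical standard error of the mean
(mean(xbar) - median(xbar)) / sem                 # mean-minus-median gap, as a fraction of SEM
q <- quantile(xbar, probs = c(0.05, 0.95))
((mean(xbar) - q[[1]]) - (q[[2]] - mean(xbar))) / sem   # tail asymmetry, as a fraction of SEM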
@alex_8704 6 years ago
Now I decided to run a Monte Carlo simulation for several sample sizes to see if the asymmetry of the sample-mean distribution diminishes as the sample size grows, and it seems to do so. So I assume that the sample-mean distribution does converge to a more symmetrical distribution. However, the degree of asymmetry at sample sizes up to the range of 10 000-100 000 is still significant, which compromises the validity of using z (or t) statistics to precisely estimate the location of the population mean or in hypothesis testing. Below is my R code (it takes many minutes to finish the calculations): m1
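This code also seems truncated; a rough sketch of a version that varies the sample size, purely illustrative and with a reduced replication count to keep the run time manageable (sample_sizes, n_rep, and asym are names I have chosen, not the commenter's):

set.seed(1)
sample_sizes <- c(100, 1000, 10000, 100000)
n_rep <- 2000                                    # fewer replications to keep the run time down
asym <- sapply(sample_sizes, function(n) {
  xbar <- replicate(n_rep, mean(rlnorm(n, meanlog = -0.5, sdlog = 2)))
  q <- quantile(xbar, probs = c(0.05, 0.95))
  # tail asymmetry of the sample-mean distribution, as a fraction of SEM
  ((mean(xbar) - q[[1]]) - (q[[2]] - mean(xbar))) / sd(xbar)
})
data.frame(n = sample_sizes, tail_asymmetry_vs_sem = asym)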
@alex_8704 6 years ago
In other words, even with the sample size of 100 000, if you try to find a 90% CI for the population mean with the z-score of 1.64485363 or t-score of 1.64486886, the distance from the lower bound of CI to the mean point estimate should be around 5.8% shorter than the distance from the upper bound of CI to the mean point estimate. This is pretty significant, isn't it?
@henryalferink1941 4 years ago
At about 2:40 you ask, "what happens if we increase the size of the sample?" My question is: what is the difference between increasing the size of the samples and increasing the number of samples taken, or are they basically the same thing? Also, if converging in distribution leads to convergence to a constant, is that the same as converging in probability? Thanks for the video, very useful!
@nelswas2869 3 years ago
Usually we only consider one sample from the population; here we are concerned with the theoretical distribution of an estimator for a given sample size. The second statement is not correct in general: convergence in probability implies convergence in distribution, but not vice versa. (The special case is a constant limit: convergence in distribution to a constant does imply convergence in probability.)
@redr2222 3 years ago
Surely dividing the variance by n is not "nonsense"... at least, it helps with my intuition. It means (xbar − mu) converges to a N(0, 0) distribution which, as you say, is a degenerate distribution and can (and should) be written as 0. But it is not entirely "wrong", as for finite sample sizes it surely is correct? It is just in the realm of asymptotics that it becomes degenerate.
@DJGOULDING 4 years ago
Can you stop saying "sort of" in every sentence? It's like when Americans say "like" all the time.
Characteristic functions introduction
6:06
Ben Lambert
70K views
Central Limit Theorem
7:09
Ben Lambert
43K views
Chebyshev's Inequality intuition - part 2
5:11
Ben Lambert
24K views
Markov's Inequality
5:49
Ben Lambert
138K views
Centered versus non-centered hierarchical models
20:28
Ben Lambert
10K views
(PP 5.5) Law of large numbers and Central limit theorem
11:08
mathematicalmonk
38K views