Why We Divide by N-1 in the Sample Variance (Standard Deviation) Formula | The Bessel's Correction

8,553 views

DataMListic

1 day ago

In this video we discuss why and when we divide by n-1 instead of n in the sample variance and sample standard deviation formulas, a method known in the statistics literature as Bessel's correction. It corrects the bias in the estimation of the population variance, but only partially corrects the bias in the estimation of the population standard deviation (and that's why I didn't include the standard deviation in this video).
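
As a concrete illustration of that bias, here is a minimal simulation sketch (not taken from the video; the sample size, seed, and variable names are my own choices) that draws many small samples from the Gaussian used in the video's example (mean 175, standard deviation 6, so a true variance of 36) and averages the two estimators:

```python
# Minimal sketch: compare the divide-by-n (biased) and divide-by-(n-1)
# (Bessel-corrected) variance estimators on repeated samples drawn from
# the Gaussian used in the video's example: mean 175, std 6 (variance 36).
import numpy as np

rng = np.random.default_rng(seed=0)
sigma2 = 6.0 ** 2            # true population variance = 36
n, trials = 5, 100_000       # small samples make the bias easy to see

biased, corrected = [], []
for _ in range(trials):
    sample = rng.normal(loc=175.0, scale=6.0, size=n)
    ss = np.sum((sample - sample.mean()) ** 2)   # sum of squared deviations
    biased.append(ss / n)                        # E[.] = (n-1)/n * sigma^2
    corrected.append(ss / (n - 1))               # E[.] = sigma^2

print(f"true variance:              {sigma2:.2f}")
print(f"average biased estimate:    {np.mean(biased):.2f}  (theory: {(n - 1) / n * sigma2:.2f})")
print(f"average corrected estimate: {np.mean(corrected):.2f}")
```

With n = 5, the divide-by-n average settles near 28.8 = (4/5)·36, while the Bessel-corrected average settles near the true 36.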
References
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Expected value of sample variance proof: proofwiki.org/wiki/Bias_of_Sa...
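For readers who want the gist without opening the link, here is a compressed sketch of that proof (standard notation, not quoted verbatim from the reference): for i.i.d. samples X_1, ..., X_n with mean μ, variance σ², and sample mean X̄,

```latex
\begin{align*}
\sum_{i=1}^{n}(X_i-\bar{X})^2 &= \sum_{i=1}^{n}(X_i-\mu)^2 - n(\bar{X}-\mu)^2
  && \text{(the cross term vanishes)}\\
\mathbb{E}\!\left[\frac{1}{n}\sum_{i=1}^{n}(X_i-\bar{X})^2\right]
  &= \frac{1}{n}\left(n\sigma^2 - n\cdot\frac{\sigma^2}{n}\right)
   = \frac{n-1}{n}\,\sigma^2 ,
\end{align*}
```

so rescaling by n/(n-1), i.e. dividing the sum of squared deviations by n-1, yields an unbiased estimator of σ².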
Related Videos
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Why neural networks are universal functions approximators: • Why Neural Networks Ca...
Bagging vs Boosting: • Bagging vs Boosting - ...
Why we need activations in neural nets: • Why We Need Activation...
Bias variance Trade-off: • Why Models Overfit and...
Neural networks on tabular data: • Why Deep Neural Networ...
Contents
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
00:00 - Intro
00:19 - Population vs Sample Statistics
01:22 - Population vs Sample Biased Variance Example
02:13 - Expected Value of the Biased Variance
03:34 - Bias Source Intuition
04:38 - Degrees of Freedom
05:55 - Outro
Follow Me
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
🐦 Twitter: @datamlistic
📸 Instagram: @datamlistic
📱 TikTok: @datamlistic
Channel Support
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
The best way to support the channel is to share the content. ;)
If you'd like to also support the channel financially, donating the price of a coffee is always warmly welcomed! (completely optional and voluntary)
► Patreon: / datamlistic
► Bitcoin (BTC): 3C6Pkzyb5CjAUYrJxmpCaaNPVRgRVxxyTq
► Ethereum (ETH): 0x9Ac4eB94386C3e02b96599C05B7a8C71773c9281
► Cardano (ADA): addr1v95rfxlslfzkvd8sr3exkh7st4qmgj4ywf5zcaxgqgdyunsj5juw5
► Tether (USDT): 0xeC261d9b2EE4B6997a6a424067af165BAA4afE1a
#variance #standarddeviation #bias #statistics

Comments: 36
@datamlistic • 1 year ago
I had a small cold while recording the video, hope my voice didn't sound too weird...
@JoaqoRiquelme • 9 months ago
You were great! Thanks for the effort. I learned a lot.
@datamlistic • 9 months ago
@@JoaqoRiquelme Thanks a lot! I am happy to hear you found the video helpful! :)
@pushkargarg4946 • 1 month ago
I have been going through other bs videos people be putting using the same number line example, no one even talking about the DOF reason. Great vid man thanks.
@datamlistic • 1 month ago
Many thanks! Glad you think so about this explanation! :)
@sayangoon2180 • 6 months ago
Crisp and clear. Thank you!
@datamlistic • 6 months ago
Thanks! Glad you liked it! :)
@iacobsorina6924 • 1 year ago
Very clean explanation! Thank you!😊
@datamlistic • 1 year ago
Glad it was helpful! :)
@AlbinJames • 1 year ago
Thanks for this excellent presentation! It's good to see material on such subtle aspects of statistics, and your friendly way of building an intuition around it :) This was way more accessible than the Wikipedia entry and encourages one to continue studying it.
@datamlistic • 1 year ago
Thank you for your kind words! I am happy to hear that you found this video useful. :)
@alnune03 • 9 months ago
Thank you! Excellent explanation!
@datamlistic • 9 months ago
Thanks! Happy you enjoyed it! :)
@gmaxmath • 6 months ago
Beautiful explanation. I will share this with my students.
@datamlistic • 6 months ago
Thanks! Happy to hear that you liked it! :)
@bharathishankar4870 • 6 months ago
Very Well Explained
@datamlistic • 6 months ago
Thanks! Glad you liked it! :)
@klevisimeri607 • 8 months ago
Very nice!
@datamlistic • 8 months ago
Thanks!
@tutorchristabel • 1 year ago
Well explained
@datamlistic • 1 year ago
Thank you!
@user-ot2vx8pd5u • 10 months ago
Very well explained 😢
@benlee3545 • 2 months ago
Hi DataMListic, I'm not sure if you can share your dataset so that I can see how this population mean = 175 is calculated. I did try a few samples and yes, they are all below 175, but how is this 175 derived? Also, the USL and LSL, if provided, would be extremely helpful. Thank you in advance.
@datamlistic • 2 months ago
Thanks for the question! I simply defined a Gaussian with a mean of 175 and a std of 6 and took samples from it, nothing more, nothing less.
@benlee3545 • 2 months ago
@@datamlistic Sir, noted and thank you very much for your reply.
@benlee3545 • 2 months ago
Hi DataMListic, I tried a sample of 160, 155, 180, 190 and I get a variance of 204, which is far larger than 36. By the way, when we do sampling, we never know the population mean since the population is so big. So how do I know when to use n-1, i.e. the Bessel correction?
@datamlistic • 2 months ago
Hi Ben Lee! Thanks for the question. How did you calculate the variance, and what formula did you use? It seems to be really off. To answer your second question, the population is more of a theoretical concept in statistics, and you almost never have access to it. I suggest always using the Bessel correction to compute the variance, unless it's clearly stated that the samples you've got are the entire population.
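
For a quick numeric check, here is a minimal sketch (my own, not computed in the thread) applying both formulas to the sample quoted above:

```python
# Both variance formulas applied to the sample 160, 155, 180, 190
# quoted in the comment above.
import numpy as np

sample = np.array([160.0, 155.0, 180.0, 190.0])
ss = np.sum((sample - sample.mean()) ** 2)   # sum of squared deviations = 818.75

print(ss / len(sample))        # divide by n   -> ~204.69 (biased)
print(ss / (len(sample) - 1))  # divide by n-1 -> ~272.92 (Bessel-corrected)
```

The 204 quoted above matches the divide-by-n result; the Bessel-corrected value is about 273, and with only four points either estimate can land far from the population variance of 36.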
@benlee3545 • 2 months ago
@@datamlistic Sir, thank you very much.
@datamlistic • 2 months ago
Glad I could help! :)
@benlee3545 • 2 months ago
@@datamlistic Yes! That is a big help.
@dennisestenson7820 • 3 months ago
5:00
@datamlistic • 3 months ago
?
@szngyun8891 • 4 months ago
Thanks for your proofing sir
@datamlistic • 4 months ago
You're welcome! Glad you liked it! :)
Kullback-Leibler (KL) Divergence Mathematics Explained
3:21
DataMListic
1.9K views
Dividing By n-1 Explained
14:18
PsychExamReview
4.1K views
But what is the Central Limit Theorem?
31:15
3Blue1Brown
3.4M views
The Bayesian Trap
10:37
Veritasium
4M views
Measures of Variability (Range, Standard Deviation, Variance)
9:30
Daniel Storage
302K views
Why Dividing By N Underestimates the Variance
17:15
StatQuest with Josh Starmer
124K views
Variance and Standard Deviation: Why divide by n-1?
13:47
zedstatistics
268K views
How We’re Fooled By Statistics
7:38
Veritasium
3.6M views