What is meant by entropy in statistics?

34,985 views

Ben Lambert

Comments: 24
@sarrae100 · 5 years ago
You surely know this already, but let me say it once more: you are a genius at explaining things.
@ernsthenle8431 · 4 years ago
Excellent video. Using log base 2, the entropy for theta = 0.75 is 0.81, not 0.56 as stated in the video; you get 0.56 if you use the natural log.
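The base discrepancy the comment points out is easy to check numerically: the same distribution has entropy 0.56 in nats (natural log) but 0.81 in bits (log base 2). A minimal sketch (the function name is my own, not from the video):

```python
import math

def bernoulli_entropy(theta, base=math.e):
    """Entropy of a Bernoulli(theta) variable in the given log base."""
    if theta in (0.0, 1.0):  # 0 * log(0) is taken as 0 by convention
        return 0.0
    return -(theta * math.log(theta, base)
             + (1 - theta) * math.log(1 - theta, base))

print(round(bernoulli_entropy(0.75), 2))          # 0.56 nats (natural log)
print(round(bernoulli_entropy(0.75, base=2), 2))  # 0.81 bits (log base 2)
```

So both numbers are right; they just live in different units, related by a factor of ln 2.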
@rojanshrestha822 · 3 years ago
Thanks
@cerioscha · 2 years ago
I like this explanation, well done. I'm surprised people don't adopt an intuition of entropy based on the degree of "even mixture" (even population or occupation) of the system's possible states after the process is repeated, i.e. across multiple instances.
@chalize1 · 5 years ago
Thank you for providing your explanation! But some subheaders, and not cramming it all onto one slide, would make it so much more digestible.
@andrewlee1032 · 4 years ago
I agree 100%. It gets confusing.
@joed3325 · 6 years ago
Excellent explanations as always.
@justinw8370 · 3 years ago
When in doubt, taking a derivative and setting it equal to 0 is your friend.
@eannaohara9082 · 3 years ago
So true lol
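For the Bernoulli entropy discussed in the video, the derivative trick works out as follows (a quick sketch in nats):

```latex
H(\theta) = -\theta \ln \theta - (1-\theta)\ln(1-\theta)

\frac{dH}{d\theta} = -\ln\theta - 1 + \ln(1-\theta) + 1 = \ln\frac{1-\theta}{\theta}

\ln\frac{1-\theta}{\theta} = 0 \;\Rightarrow\; \theta = \tfrac{1}{2},
\qquad H\!\left(\tfrac{1}{2}\right) = \ln 2 \approx 0.693 \text{ nats} = 1 \text{ bit}
```

So entropy is maximized at the fair coin, which matches the intuition that a 50/50 flip is the hardest outcome to predict.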
@VignettesinPhysics · 3 years ago
Sir, very nice explanation. A small doubt: I have two series of data points, X and Y. X is independent of Y, but Y depends on X. To see the dependency of Y on X, I calculated Pearson's correlation coefficient, but one of my friends suggested I calculate entropy to better capture the correlation between the distributions. Can you guide me on how to do that?
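The friend's suggestion is presumably mutual information, the entropy-based dependence measure covered in the channel's follow-up video: it is zero only under independence and, unlike Pearson's coefficient, also picks up nonlinear dependence. A rough plug-in sketch using binned samples (the function and bin count are my own illustrative choices):

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Rough plug-in estimate of I(X; Y) in nats from binned samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # empirical joint distribution
    px = pxy.sum(axis=1, keepdims=True)       # marginal of X
    py = pxy.sum(axis=0, keepdims=True)       # marginal of Y
    nz = pxy > 0                              # zero cells contribute nothing
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
print(mutual_information(x, x + 0.1 * rng.normal(size=5000)))  # large: Y depends on X
print(mutual_information(x, rng.normal(size=5000)))            # near zero: independent
```

Note the plug-in estimate is biased upward for small samples, so the "independent" value is small but not exactly zero; bias-corrected or k-nearest-neighbour estimators are used in practice.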
@rembautimes8808 · 3 years ago
Good explanation, thanks.
@AK-fj4tb · 5 years ago
Most of your videos are very good, but I have watched this one five times and still have little idea what the value of entropy is. I can do the math, but I am confused as to the purpose. What does the recipient know beforehand? How does telling someone the outcome of a coin toss "reduce uncertainty"? What exactly were they uncertain about? Any insights would be appreciated. Thank you.
@AK-fj4tb · 5 years ago
On my sixth viewing I now think I have it. Please verify if correct: the information recipient knows the distributional form and the value of theta ex ante. A coin is then flipped but the outcome hidden. The entropy is how much uncertainty about the outcome of the flip is removed by showing the coin to the recipient.
@ziliestarrive · 3 years ago
I have the same doubt. If someone could verify, it'd be greatly appreciated.
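One way to make that reading concrete (a sketch of the standard interpretation, not a quote from the video): before the reveal, the recipient's uncertainty about the flip is H(theta); after seeing the coin it is zero, so the reveal conveys H(theta) bits on average. The fair coin is the extreme case where a full bit is learned:

```python
import math

def entropy_bits(theta):
    """Uncertainty (in bits) about one coin flip with P(heads) = theta."""
    h = 0.0
    for p in (theta, 1 - theta):
        if p > 0:  # 0 * log(0) treated as 0
            h -= p * math.log2(p)
    return h

# Before the reveal the recipient's uncertainty is H(theta);
# after the reveal it is 0, so revealing the flip conveys H(theta) bits.
print(entropy_bits(0.5))   # fair coin: 1 bit gained
print(entropy_bits(0.99))  # near-certain coin: almost nothing left to learn
```

This is why a heavily biased coin carries low entropy: the recipient could already predict the outcome, so the reveal teaches them very little.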
@happygrace8065 · 1 year ago
Thank you 😊
@jayalekshmis4977 · 5 years ago
Superb explanation, thank you.
@CGKittenz · 4 years ago
Omg! Thank you for this.
@awr359 · 5 years ago
Thank you!
@shijingsi8288 · 4 years ago
It should be the expected value of negative log p(x).
@FndfnnFnefnfn · 3 months ago
Oh no... you don't know where it comes from. It is the expected value of surprise, which is equal to log(1/p(x)).
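Both commenters are describing the same quantity, since log(1/p(x)) = -log p(x): entropy is the expected surprise of an outcome drawn from the distribution. A minimal sketch (function names are mine):

```python
import math

def surprise(p):
    """Surprise (information content) of an outcome with probability p, in nats."""
    return math.log(1 / p)  # equivalently -log(p)

def entropy(dist):
    """Entropy as expected surprise: H = sum over x of p(x) * log(1/p(x))."""
    return sum(p * surprise(p) for p in dist if p > 0)

dist = [0.75, 0.25]                 # the video's theta = 0.75 coin
print(round(entropy(dist), 4))      # 0.5623 nats
```

Rare outcomes are highly surprising but rarely occur, so entropy weights each outcome's surprise by its probability.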
@indurthi · 4 years ago
Thank you
@gunjansethi2896 · 4 years ago
Brilliant!
@zoozolplexOne · 3 years ago
Cool !!
@vlaaady · 4 years ago
Pretty unintuitive. There is a much more intuitive interpretation based on the depth of a binary tree.
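The binary-tree intuition the comment alludes to is, I assume, the source-coding one: entropy in bits lower-bounds the average leaf depth of an optimal binary (Huffman) code tree, and matches it exactly when probabilities are powers of 1/2. A small sketch using the standard merge trick, where each merge of the two smallest weights adds their sum to the average depth:

```python
import heapq
import math

def huffman_avg_depth(probs):
    """Expected codeword length (average leaf depth) of a Huffman binary tree."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b              # each merge deepens its leaves by one level
        heapq.heappush(heap, a + b)
    return total

def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # dyadic, so the bound is met exactly
print(huffman_avg_depth(probs))     # 1.75 bits
print(entropy_bits(probs))          # 1.75 bits
```

For non-dyadic distributions the tree's average depth exceeds the entropy by less than one bit, which is what makes "average depth of the best yes/no question tree" a serviceable intuition for entropy.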
Related videos:

An introduction to mutual information · Ben Lambert · 8:33 · 70K views
The intuition behind the Hamiltonian Monte Carlo algorithm · Ben Lambert · 32:09 · 59K views
Shannon Entropy and Information Gain · Serrano.Academy · 21:16 · 206K views
Why Information Theory is Important - Computerphile · Computerphile · 12:33 · 154K views
The Most Misunderstood Concept in Physics · Veritasium · 27:15 · 16M views
How Quantum Entanglement Creates Entropy · PBS Space Time · 19:36 · 1.1M views
KL Divergence - How to tell how different two distributions are · Serrano.Academy · 13:48 · 7K views
The better way to do statistics · Very Normal · 17:25 · 256K views
Information Theory Basics · Intelligent Systems Lab · 16:22 · 69K views
Bayes theorem, the geometry of changing beliefs · 3Blue1Brown · 15:11 · 4.5M views
Entropy (for data science) Clearly Explained!!! · StatQuest with Josh Starmer · 16:35 · 629K views
The Misunderstood Nature of Entropy · PBS Space Time · 12:20 · 1.2M views