Information entropy | Journey into information theory | Computer Science | Khan Academy

327,220 views

Khan Academy Labs

A day ago

Comments: 189
@vandanachandola322 4 years ago
I remember my teacher in high school defined entropy as "the degree of randomness". I decided it was an abstract concept that I don't get. Now learning about information entropy in my master's class, I found this video and I'm so glad I did!! Thanks, it's very well explained :)
@itsRAWRtime007 9 years ago
good video. i like the way it shows the intuition behind the concept, that is, the reason why the concept actually exists, rather than plainly defining it and then showing its properties.
@youngsublee1102 4 years ago
couldn't agree more
@XavierGlenKuei 6 years ago
at 1:24, I would argue the 3rd question (i.e., the question on the right of the 2nd level) should be "Is it C?" (or "Is it D?") rather than "Is it B?" (i think this is so because, as the answer to the 1st question ["Is it AB?"] was "No", it essentially rules out both A and B, leaving only C (or D) as the possible outcome; hence no role for "B" anymore)
@marcuschiu8615 5 years ago
yea, I agree with you. damn, so many mistakes in this video, at 1:24 and 3:50. makes me question their reliability... good video though
@dien2971 4 years ago
I thought I understood wrong lol. Thank you!
@muhaymenulislam1942 2 years ago
But here the probability of D is 25%, which is more than 12.5%, so the second question they ask is "Is it D?".
@pedrogorilla483 5 years ago
I have asked several professors in different universities and countries why we adopted a binary system to process information, and they all answered: because you can modulate it with electricity, the state on or off. This never satisfied me. Today I finally understand the deeper meaning and the brilliance of binary states in computing and its interfacing with our reality.
@kempisabel9945 4 years ago
this video blew my mind away. Thank you! I love these intelligent yet fun videos!
@mathaha2922 4 years ago
This is one of the most informative -- and I use that term advisedly -- videos I have ever seen. Thank you!
@Dhanush-zj7mf 4 years ago
1:24 You are asking about B twice: you already asked "Is it A or B?" at the root, and if the answer is no, it must be either C or D, yet the sub-branch asks again whether it is B or not. It should be either "Is it C?" or "Is it D?".
@MohdFirdaus-fk6no 3 years ago
yes, you are correct
@salrite 6 years ago
What a Beautiful Explanation!!!
@MaryMary-ep4hd 4 years ago
Ingenious interpretation! I applaud!
@someshsharma6683 6 years ago
Awesome explanation with a very intuitive example. Thanks a lot...
@boredomgotmehere 1 year ago
Makes it all so super clear and easy to follow. Love this.
@boredomgotmehere 10 months ago
Just a tiny error at 3:50 - the final calculation should be 0.25*2.
@antoinecantin1780 3 years ago
What a formidable way of visualizing and introducing information entropy. Your contributions are deeply appreciated
@suryacharan5184 4 years ago
What a video!!....This is how education should be.
@YYchen713 2 years ago
This is such a great way to explain information entropy! Classic!
@YuriValentines 3 years ago
This video has explained entropy better than any teacher I've had in my entire life. It makes me so angry to think of all my time wasted in endless lectures, listening to people with no communication skills.
@twoplustwo5 7 months ago
Kudos for linking number of bounces -> binary tree -> log. And overall a very nice explanation. That's like the 3rd explanation of info entropy I've liked.
@raultellegen5512 8 years ago
Amazing video. Seldom seen a better explanation of anything. Thanks!
@SAGEmania-q8s 5 months ago
Thank you so much. It explains the entropy so well.
@BambiOnIce19 2 years ago
Perfectly well explained. The best video on information entropy I’ve seen so far
@youngsublee1102 4 years ago
Wonderful idea of "bounce" that expresses the amount of information. It's so exciting.
@daihung3824 3 years ago
I have one question: say p(A)=0.45, p(B)=0.35, p(C)=0.15, p(D)=0.05. Then 1/p(D)=20, and log base 2 of 20 ≈ 4.3; however, shouldn't the number of bounces remain 3? Would anyone mind explaining this possible difference? Thanks a lot!
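One way to check this numerically (a minimal Python sketch of my own; the helper huffman_depths is mine, not anything from the video): build the optimal yes/no question tree with Huffman's algorithm. log2(1/p) need not be an integer; the actual depth of D comes out to 3, matching the intuition above, and the entropy is only a lower bound on the average number of questions.

```python
import heapq
import math

def huffman_depths(probs):
    """Depth of each symbol (number of yes/no questions to reach it)
    in an optimal binary question tree, via Huffman's algorithm."""
    heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    depths = {s: 0 for s in probs}
    counter = len(heap)  # tie-breaker so tuples never compare the lists
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:  # every symbol under a merge gains one level
            depths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return depths

probs = {"A": 0.45, "B": 0.35, "C": 0.15, "D": 0.05}
depths = huffman_depths(probs)
avg = sum(probs[s] * d for s, d in depths.items())
H = sum(p * math.log2(1 / p) for p in probs.values())
print(depths)                        # {'A': 1, 'B': 2, 'C': 3, 'D': 3}
print(f"avg questions = {avg:.2f}")  # 1.75
print(f"entropy       = {H:.2f}")    # 1.68 (the lower bound)
```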
@potatocoder5090 2 years ago
Brilliant explanation. So simple yet so profound. Thanks!
@hrivera4201 2 years ago
previous lesson: kzbin.info/www/bejne/jaqkpYKnm6iceNk next lesson: kzbin.info/www/bejne/iqnOcmiLjZmen9U
@phycosmos 4 months ago
thanks
@bouzouidjasidahmed1203 3 years ago
Very comprehensible, thank you!! It's very helpful.
@tythedev9582 4 years ago
Yessss I finally got the concept after this video.
@mostafaomar5441 5 years ago
Thank you. Explains the intuition behind Entropy very clearly.
@kartikbansal6439 4 years ago
Loved the piano bit towards the conclusion!
@Ewerlopes 10 years ago
Perfect explanation! :)
@waylonbarrett3456 1 year ago
I found a few errors. Am I the only one seeing this?
@argha-qi5hf 2 years ago
I can't imagine how someone could ever come up with such abstract ideas.
@miketor2011 3 years ago
Great video, but is it just me or is there an error at 3:49? The correct calculation for the number of bounces should be 0.5*1+0.125*3+0.125*3+0.25*2 = 1.75; instead the video shows 0.5*1+0.125*3+0.125*3+0.25*4 = 2.25. Any thoughts?
@musicmaker33428 3 years ago
I was just thinking this. Thank you for pointing it out. I thought maybe I misunderstood something fundamental.
@rkiyanchuk 2 months ago
Yep, that's a typo, number of bounces for P(D) = 2.
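For anyone who wants to verify the corrected arithmetic, here is a minimal Python sketch using the probabilities quoted in this thread:

```python
import math

p       = {"A": 0.5, "B": 0.125, "C": 0.125, "D": 0.25}
bounces = {"A": 1,   "B": 3,     "C": 3,     "D": 2}  # depth of each outcome in the tree

avg = sum(p[s] * bounces[s] for s in p)           # 0.5*1 + 0.125*3 + 0.125*3 + 0.25*2
H   = sum(p[s] * math.log2(1 / p[s]) for s in p)  # Shannon entropy in bits
print(avg, H)  # 1.75 1.75 -- they match exactly because every p is a power of 1/2
```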
@gaofan2856 3 years ago
The most beautiful explanation of entropy
@jonathan.gasser 5 years ago
Wow, what a presentation!
@vlaaady 4 years ago
The most intuitive explanation
@swazza9999 4 years ago
This should have more likes!
@nomann5244 2 years ago
you are truly a genius.
@daviddeleon292 5 years ago
Huh??? Why am I just now finding out that information entropy is a concept? MIND BLOWN!!!
@tingwen524 3 years ago
Great video! I totally understood entropy!
@ГримМорген 5 years ago
The concept had been presented to me in some online course, but until this video I didn't really understand it. Thank you!
@karrde666666 3 years ago
why can't textbooks or lectures be this easy
@Hopemkhize-d2i 5 months ago
Tell me about it😢
@csaracho2009 4 months ago
I have an answer for that: The zen pupil asks the master, is the flag moving with the wind? The master replies: neither the flag nor the wind moves, it is your mind that moves.
@FrancescoDeToni 8 years ago
Isn't there a mistake at 3:50? Shouldn't it be 0.25 x 2 instead of 0.25 x 4?
@philtrem 8 years ago
+Francesco De Toni yup!
@sighage 5 years ago
Yes, it's 2.25 I guess
@rah2023 5 years ago
It's indeed a mistake
@nikhilsrajan 4 years ago
@@sighage no, it's 1.75, there was just a typo. you get 1.75 with 0.25 x 2
@yudong8820 3 years ago
Really good one, thanks!
@russianescapist5262 3 years ago
I loved the surreal music and real-life objects moving in a grey, 60s-like atmosphere. :)
@edwardjurkowitz1663 3 years ago
Excellent video. I think one point mistakenly refers to "information" when the author means 'entropy.' Machine 2 requires fewer questions. It produces more information and less entropy. Machine one produces maximum entropy and minimum information. Information is 'negative entropy.'
@lovenishkavat46 8 days ago
My problem from 10 years ago has been solved, you are superman
@osobliwynick 2 years ago
Great explanation.
@shelendrasharma9680 6 years ago
Best explanation, salute ....
@Puneethmypadi 3 years ago
Now I understand decision trees properly
@mohammadrezamoohebat9407 10 years ago
It was perfect. thx
@malevip 3 years ago
Another way to look at entropy: a measure of how spread out the probability is across a probability distribution.
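That reading checks out numerically. Here is a minimal Python sketch with made-up distributions (the helper H is mine): entropy is largest when probability is spread evenly and shrinks as it concentrates.

```python
import math

def H(ps):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    return sum(p * math.log2(1 / p) for p in ps if p > 0)

print(H([0.25, 0.25, 0.25, 0.25]))   # 2.0   -- fully spread: maximum for 4 outcomes
print(H([0.5, 0.25, 0.125, 0.125]))  # 1.75  -- partly concentrated: lower
print(H([0.97, 0.01, 0.01, 0.01]))   # ~0.24 -- nearly certain: close to zero
```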
@science_electronique 9 months ago
Powerful, clear explanation!
@n0MC 7 years ago
this is wonderful. thank you
@Chrls5 3 years ago
Nice!
@leoyuk-tingcheung3587 8 years ago
Could anyone help explain why less uncertainty means less information (Machine 2)? Isn't it the other way round? Many thanks.
@TheOmanzano 8 years ago
there is less uncertainty in machine 2 because on "average" there will be fewer questions... meaning after many trials, on average, 1.75 questions are needed to get the right result, meaning there is less variety, randomness, and chaos in machine 2 due to the fact that "A" will occur a lot more than the other letters
@hirakmondal6174 6 years ago
Think of it as a Hollywood film.. where a police inspector interrogates a criminal who must speak the truth each and every time. After 175 questions the inspector found out that he knows no more than that, whereas when he interrogated another criminal in an adjacent cell he found out that after asking 175 questions that one could still answer 25 more.. Now U Tell Me Who Has More Information? . . . . U are welcome!! 8)
@Exhora 6 years ago
HIRAK MONDAL That was a great example! Thank you so much!!!
@salrite 6 years ago
Less uncertainty means your chance of predicting the outcome is higher, aka predictability increases, and hence less information: you don't require as many bits (say, to represent the outcome) as you would when uncertainty was high and the outcome was purely random. Make sense?
@rcgonzalezf 5 years ago
I also have this question. I guess we need to define information in this context; for the answers and the video itself, I think they're referring to data, and as I posted below, data is different than information. More questions = more data to get the same information (the output), but I might be missing something.
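A concrete way to see it, using the two machines from the video: each yes/no answer carries at most one bit, so identifying 100 symbols takes

$$\text{machine 1: } 100 \times 2 = 200 \text{ answers}, \qquad \text{machine 2: } 100 \times 1.75 = 175 \text{ answers}$$

Machine 1's output takes more answers to pin down, and that is exactly the sense in which each of its symbols carries more information.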
@chandragupta2828 5 years ago
awesome video!
@hiteshjambhale301 2 years ago
Hey, there is one mistake at timestamp 1:26..... the question should be "Is it C?" instead of "Is it B?"
@souvikmajumdar5604 6 years ago
Thank You
@jorgeleirana-alcocer5642 3 years ago
The equation at 3:48 results in 2.25, not 1.75: (0.5*1)+(0.125*3)+(0.125*3)+(0.25*4) = 2.25. I think it should have been (0.5*1)+(0.125*3)+(0.125*3)+(0.25*2)
@jayrar6645 5 years ago
so just to clarify, is the reason the decision tree for machine B is not the same as for A that you ask fewer questions overall? and how do you ensure that the structure of the decision tree is such that it asks the minimum number of questions?
@ahmedelsharkawy1474 6 years ago
just awesome
@hingaglaiawong7815 2 years ago
at @3:15 I think there's a typo? The last term should be 0.25*2 instead of 0.25*4 I guess.
@juanpablovaca-lago5659 3 years ago
Is there a direct analogy between the second and third laws of thermodynamics and information entropy?
@AhmedKMoustafa2 6 years ago
great explanation bro :)
@mikibellomillo 5 months ago
note on number of bounces: entropy is maximum when all outcomes are equally likely. when you introduce predictability, the entropy must go down. thanks for sharing this video! God bless you!🎉
@ChusKon1 3 years ago
Beautiful
@sholi9718 5 years ago
can someone explain, in #bounces = p(a)x1 + p(b)x3 + p(c)x3 + p(d)x2 at 3:44, where the numbers 1, 3, 3 & 2 come from?
@achrafamrani2730 3 years ago
The 1 is the number of bounces needed to reach outcome A (the steps for the disc to fall in case A). The 3s are the bounces needed to reach B and C separately. And it takes 2 bounces for the disc to fall into D.
@shepbryan4315 5 years ago
Why is the number of bounces the log of the outcomes?
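A short answer in equation form (notation mine, assuming the fair 50/50 bounces from the video): each bounce is a binary split, so the number of distinguishable outcomes doubles with every bounce, and inverting that exponential gives the log:

$$\text{outcomes} = 2^{\#\text{bounces}} \;\Longrightarrow\; \#\text{bounces} = \log_2(\text{outcomes})$$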
@monicarenas 3 years ago
At minute 3:51, I guess there is a mistake: for p_D, the value should be 2 instead of 4, shouldn't it?
@alexhsia9510 5 years ago
What do they mean by number of outcomes? Can someone give me an example using the ABCD examples they used?
@temenoujkafuller4757 2 years ago
Yes, I asked myself this question and watched it twice. (5:45) Count the number of branches at the bottom: the number of final outcomes = 2^(number of bounces). Therefore, since the inverse of the exponential is the logarithm: the number of bounces = the number of questions = log_2(number of outcomes).
@btsandtxtloverstraykidzfan3486 2 years ago
What are some good books on this topic?
@_crispins 7 years ago
nice!
@FARHANSUBI 16 days ago
Can someone please explain why machine 2 is producing less information? Shouldn't it be more information because we're asking fewer questions? Or is it that the more questions we ask, the more information we receive?
@kawaikaede2269 2 years ago
cool
@sanjayrakshit8797 5 years ago
Heckin Shannon
@sanadarkia2724 5 years ago
can't we just ask one question? is it abc or d ? edit: nevermind, i just figured that 1 bit removes uncertainty of 1/2
@suliu2933 6 years ago
Great video! I can follow it but I have trouble understanding the problem statement. Why "the most efficient way is to pose a question which divides the possibility by half"?
@vandanachandola322 4 years ago
Too late, but maybe because we're trying to ask the minimum no. of questions (and therefore going with the higher probability first)?
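That intuition can be made precise (a sketch, not something stated in the video): if a yes/no question gets the answer "yes" with probability q, the expected information in the answer is the binary entropy H(q), and it is maximized exactly at the even split:

$$H(q) = q\log_2\tfrac{1}{q} + (1-q)\log_2\tfrac{1}{1-q}, \qquad \max_q H(q) = H(\tfrac{1}{2}) = 1 \text{ bit}$$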
@ArthurShelby-y8s 1 year ago
👏🏻
@dissdad8744 8 years ago
Good explanation! If I wanted to calculate the entropy with log2, which calculator can do this? Is there an online calculator for this? What would be the best approach?
@ElectricChaplain 7 years ago
Hans Franz Too late now, but log2 b = ln b / ln 2, or more generally log2 b = (log base a of b) / (log base a of 2).
@gustavomartins007 5 months ago
Very good
@최로봇 4 years ago
if it makes us ask fewer questions, doesn't it mean it provides more information?
@assylblog 5 years ago
Cool beat
@hirakmondal6174 6 years ago
Why is the outcome 1/p?
@anirbanmukherjee4577 6 years ago
Probability of an outcome = 1/(number of possibilities)
@betoib1504 6 years ago
Wow! (¡Órale!)
@samhe331 6 months ago
I think the math at 3:52 is wrong.. it should be 0.25 x 2 instead of 0.25 x 4, but the result, 1.75, is right
@FGNiniSun 3 years ago
Hello, please, why does the number of outcomes at a level equal 1/probability?
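A worked version of the relationship, assuming the fair 50/50 bounces used in the video: each bounce halves the probability, so an outcome reached after b bounces has probability p = (1/2)^b, and the level at depth b holds 2^b equally likely outcomes:

$$p = \left(\tfrac{1}{2}\right)^{b} \;\Longrightarrow\; \text{outcomes at that level} = 2^{b} = \frac{1}{p}$$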
@youssefdirani 4 years ago
4:45 Markoff or Markov?
@mansigupta 7 years ago
An excellent excellent excellent video. I finally get it.
@abdelrahmangamalmahdy 7 years ago
I don't understand why he always divides by 2!!
@mansigupta 7 years ago
Because of the way the questions are framed, they allow for only two possible answers - Yes or No
@YTBxd227 5 years ago
still confused why #outcomes = 1/p_i
@CZRaS 3 years ago
because you need to "build" a binary tree to simulate bounces. E.g. you have probability p = 1/2 (50%). From that, outcomes = 1/(1/2) = 2. If you have p = 1/8 (12.5%), you get outcomes = 8. From that you can take log2, which is basically the level on which the value sits in the binary tree.
@pablobiedma 5 years ago
So if I recall correctly, the one with the highest entropy is the least informative one. Then, if a machine generates symbols and we apply the formula for each symbol, which symbol provides the most information? The one with the least number of bits? How does that make sense, isn't it the one with the highest number of bits, calculated by p log(1/p)?
@zkhandwala 4 years ago
Not to knock this, but I do want to voice an issue that I have with it and every other video I've found on the topic: They always use probabilities that are an integral power of 1/2, which greatly simplifies the explanation, but doesn't generalize well to understanding the majority of real-world scenarios, for which things are not adequately covered by this simplified exposition. I worry, then, that people come away thinking they understand the topic better than they actually do. Of course, I'm open to the perspective of others here...
@daihung3824 3 years ago
I agree with your statement. I had a go at changing the probabilities: say p(A)=0.45, p(B)=0.35, p(C)=0.15, p(D)=0.05. Then 1/p(D)=20, and log base 2 of 20 ≈ 4.3; however, shouldn't the number of bounces for D still be 3?
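One way to reconcile the two (a Python sketch of my own extension; huffman_avg_len is my helper, and this goes beyond what the video covers): for probabilities that aren't powers of 1/2, the best whole-question tree sits strictly above the entropy, but letting the machine emit pairs of symbols and building one tree over the pairs pushes the per-symbol average toward the entropy. This is the direction Shannon's source-coding theorem formalizes.

```python
import heapq
import math
from itertools import product

def huffman_avg_len(probs):
    """Average codeword length (expected number of yes/no questions) of an
    optimal binary tree: the sum of the weights of all Huffman merges."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b  # each merge adds one level above its whole subtree
        heapq.heappush(heap, a + b)
    return total

p = [0.45, 0.35, 0.15, 0.05]
H = sum(q * math.log2(1 / q) for q in p)  # ~1.6774 bits/symbol

one_at_a_time = huffman_avg_len(p)        # 1.75 questions/symbol
in_pairs = huffman_avg_len([a * b for a, b in product(p, p)]) / 2  # ~1.69

print(H, one_at_a_time, in_pairs)  # the pair code closes part of the gap
```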
@MNKPrototype 6 years ago
Did anyone else notice the DEATH NOTE music at 4:40?
@GiuseppeRomagnuolo 4 years ago
I was wondering what that music was, I really like it. Do you have any link? I found this kzbin.info/www/bejne/nnzJfIyml8Zjmqc following your comment. Is that it? If so, can you point me to the right minute? Tnx
@zainulabydeen2809 4 years ago
Can anyone explain how the answer becomes 3/2 in the solved example? Any help will be appreciated.
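If the solved example being asked about is a three-symbol machine with probabilities 1/2, 1/4, 1/4 (an assumption on my part; the comment doesn't quote the example), the arithmetic is

$$H = \tfrac{1}{2}\log_2 2 + \tfrac{1}{4}\log_2 4 + \tfrac{1}{4}\log_2 4 = \tfrac{1}{2} + \tfrac{1}{2} + \tfrac{1}{2} = \tfrac{3}{2} \text{ bits}$$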
@betbola5209 8 years ago
How do you calculate the entropy of a text? And what can we do with that?
@infernocaptures8739 2 years ago
4:36 **less** information?
@xxxxxx-wq2rd 3 years ago
is it valid to say less entropy = less effort required?
@sammyz1128 4 years ago
Why can't we ask whether it is AB, for the second distribution, same as the first distribution?
@sammyz1128 4 years ago
OHH I get it. If we do that, the average number of questions we ask will be bigger
@mikechristian-vn1le 1 year ago
Language is a much more powerful invention than the alphabet, and written language -- Chinese and the Japanese syllabary don't use alphabets -- is more powerful than the alphabet. And written language includes numbers and mathematical symbols . . .
4 years ago
The Vietnamese subtitles at 4:34 are wrong; machine 2 produces less information than machine 1.
@tag_of_frank 4 years ago
Why are entropy and information given the same symbol H? And why does the information formula given in video 5 of the playlist include an "n" for the number of symbols transmitted, while this one does not?
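On the second question, a plausible reading (an assumption; I haven't cross-checked video 5 of the playlist): the n there scales the per-symbol entropy H up to a whole message of n independent symbols, so the two formulas agree up to that factor:

$$H_{\text{total}} = n \sum_i p_i \log_2 \tfrac{1}{p_i} = nH$$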
@Apolytus 7 months ago
At 3:50 you have mistakenly written 0.25*4 instead of 0.25*2.