Information entropy | Journey into information theory | Computer Science | Khan Academy

330,651 views

Khan Academy Labs

Finally we arrive at our quantitative measure of entropy.
Watch the next lesson: www.khanacadem...
Missed the previous lesson? www.khanacadem...
Computer Science on Khan Academy: Learn select topics from computer science - algorithms (how we solve common problems in computer science and measure the efficiency of our solutions), cryptography (how we protect secret information), and information theory (how we encode and compress information).
About Khan Academy: Khan Academy is a nonprofit with a mission to provide a free, world-class education for anyone, anywhere. We believe learners of all ages should have unlimited access to free educational content they can master at their own pace. We use intelligent software, deep data analytics and intuitive user interfaces to help students and teachers around the world. Our resources cover preschool through early college education, including math, biology, chemistry, physics, economics, finance, history, grammar and more. We offer free personalized SAT test prep in partnership with the test developer, the College Board. Khan Academy has been translated into dozens of languages, and 100 million people use our platform worldwide every year. For more information, visit www.khanacademy.org, join us on Facebook or follow us on Twitter at @khanacademy. And remember, you can learn anything.
For free. For everyone. Forever. #YouCanLearnAnything
Subscribe to Khan Academy’s Computer Science channel: / channel
Subscribe to Khan Academy: www.youtube.co...

Comments: 190
@vandanachandola322 4 years ago
I remember my high school teacher defined entropy as "the degree of randomness". I decided it was an abstract concept I just didn't get. Now, learning about information entropy in my master's class, I found this video and I'm so glad I did!! Thanks, it's very well explained :)
@itsRAWRtime007 9 years ago
Good video. I like the way it shows the intuition behind the concept, that is, the reason the concept actually exists, rather than plainly defining it and then showing its properties.
@youngsublee1102 4 years ago
couldn't agree more
@bobbobdog 6 years ago
At 1:24, I would argue the 3rd question (i.e., the question on the right of the 2nd level) should be "Is it C?" (or "Is it D?") rather than "Is it B?". Since the 1st machine answered "No" to the 1st question ("Is it AB?"), both A and B are ruled out, leaving only C (or D) as the possible outcome; hence there is no role for "B" anymore.
@marcuschiu8615 5 years ago
Yeah, I agree with you. Damn, so many mistakes in this video (1:24 and 3:50); it makes me question its reliability... good video though.
@dien2971 4 years ago
I thought I had understood it wrong lol. Thank you!
@muhaymenulislam1942 2 years ago
But here the probability of D is 25%, which is more than 12.5%, so the second question they ask is "Is it D?".
@pedrogorilla483 5 years ago
I have asked several professors in different universities and countries why we adopted a binary system to process information, and they all answered: because you can modulate it with electricity, the state being on or off. This never satisfied me. Today I finally understand the deeper meaning and the brilliance of binary states in computing and its interfacing with our reality.
@kempisabel9945 5 years ago
This video blew my mind. Thank you! I love these intelligent yet fun videos!
@hrivera4201 3 years ago
Previous lesson: kzbin.info/www/bejne/jaqkpYKnm6iceNk
Next lesson: kzbin.info/www/bejne/iqnOcmiLjZmen9U
@phycosmos 6 months ago
thanks
@mathaha2922 4 years ago
This is one of the most informative -- and I use that term advisedly -- videos I have ever seen. Thank you!
@MaryMary-ep4hd 4 years ago
Ingenious interpretation! I applaud!
@salrite 6 years ago
What a Beautiful Explanation!!!
@someshsharma6683 6 years ago
Awesome explanation with a very intuitive example. Thanks a lot...
@antoinecantin1780 3 years ago
What a formidable way of visualizing and introducing information entropy. Your contributions are deeply appreciated
@boredomgotmehere A year ago
Makes it all so super clear and easy to follow. Love this.
@boredomgotmehere 11 months ago
Just a tiny error at 3:50: the final term should be 0.25*2.
@Dhanush-zj7mf 4 years ago
1:24 You are asking the same question twice: you already asked "Is it A or B?" at the root, and if the answer is no, it must be either C or D, yet the sub-branch asks again whether or not it is B. It should be either "Is it C?" or "Is it D?".
@MohdFirdaus-fk6no 4 years ago
Yes, you are correct.
@suryacharan5184 4 years ago
What a video!! This is how education should be.
@twoplustwo5 9 months ago
Kudos for linking number of bounces -> binary tree -> log. Overall a very nice explanation; that's like the 3rd explanation of info entropy I've liked.
@YYchen713 3 years ago
This is such a great way to explain information entropy! Classic!
@daihung3824 3 years ago
I have one question: say p(A) = 0.45, p(B) = 0.35, p(C) = 0.15, p(D) = 0.05. Then 1/p(D) = 20 and log base 2 of 20 ≈ 4.3; however, the number of bounces should remain 3, shouldn't it? Would anyone mind explaining this difference? Thanks a lot!
@Ewerlopes 10 years ago
Perfect explanation! :)
@waylonbarrett3456 A year ago
I found a few errors. Am I the only one seeing this?
@youngsublee1102 4 years ago
Wonderful idea of the "bounce" that expresses the amount of information. It's so exciting.
@raultellegen5512 8 years ago
Amazing video. I've seldom seen a better explanation of anything. Thanks!
@BambiOnIce19 2 years ago
Perfectly well explained. The best video on information entropy I’ve seen so far
@karrde666666 3 years ago
why can't textbooks or lectures be this easy
@Hopemkhize-d2i 7 months ago
Tell me about it😢
@csaracho2009 5 months ago
I have an answer for that: the zen pupil asks the master, "Is the flag moving with the wind?" The master replies: "Neither the flag nor the wind moves; it is your mind that moves."
@SAGEmania-q8s 6 months ago
Thank you so much. It explains entropy so well.
@YuriValentines 3 years ago
This video has explained entropy better than any teacher I've had in my entire life. It makes me so angry to think of all my time wasted in endless lectures, listening to people with no communication skills.
@miketor2011 3 years ago
Great video, but is it just me or is there an error at 3:49? The correct calculation for the number of bounces should be 0.5*1 + 0.125*3 + 0.125*3 + 0.25*2 = 1.75, but the video shows 0.5*1 + 0.125*3 + 0.125*3 + 0.25*4 = 2.25. Any thoughts?
@musicmaker33428 3 years ago
I was just thinking this. Thank you for pointing it out. I thought maybe I misunderstood something fundamental.
@rkiyanchuk 4 months ago
Yep, that's a typo, number of bounces for P(D) = 2.
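A quick way to check the corrected sum from this thread: a minimal Python sketch, using the probabilities and per-symbol question counts stated in the comment above.

```python
from math import log2

p     = {"A": 0.5, "B": 0.125, "C": 0.125, "D": 0.25}
depth = {"A": 1,   "B": 3,     "C": 3,     "D": 2}  # questions (bounces) per symbol

# Expected number of questions: 0.5*1 + 0.125*3 + 0.125*3 + 0.25*2
avg_questions = sum(p[s] * depth[s] for s in p)

# Shannon entropy in bits: sum of p * log2(1/p)
entropy = sum(p[s] * log2(1 / p[s]) for s in p)

print(avg_questions, entropy)  # 1.75 1.75 -- equal because every p is a power of 1/2
```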
@bouzouidjasidahmed1203 3 years ago
Very comprehensible, thank you!! It's very helpful.
@potatocoder5090 2 years ago
Brilliant explanation. So simple yet so profound. Thanks!
@tythedev9582 4 years ago
Yessss I finally got the concept after this video.
@vlaaady 4 years ago
The most intuitive explanation
@gaofan2856 3 years ago
The most beautiful explanation of entropy
@mostafaomar5441 5 years ago
Thank you. Explains the intuition behind Entropy very clearly.
@jonathan.gasser 5 years ago
Wow, what a presentation!
@kartikbansal6439 5 years ago
Loved the piano bit towards the conclusion!
@nomann5244 2 years ago
you are truly a genius.
@yudong8820 3 years ago
Really good one, thanks!
@osobliwynick 2 years ago
Great explanation.
@edwardjurkowitz1663 3 years ago
Excellent video. I think one point mistakenly refers to "information" when the author means 'entropy'. Machine 2 requires fewer questions. It produces more information and less entropy. Machine 1 produces maximum entropy and minimum information. Information is 'negative entropy'.
@tingwen524 3 years ago
Great video! I totally understood entropy!
@FrancescoDeToni 8 years ago
Isn't there a mistake at 3:50? Shouldn't it be 0.25 x 2 instead of 0.25 x 4?
@philtrem 8 years ago
+Francesco De Toni yup!
@sighage 5 years ago
Yes, it's 2.25 I guess
@rah2023 5 years ago
It's indeed a mistake
@nikhilsrajan 4 years ago
@@sighage No, it's 1.75; there was just a typo. You get 1.75 with 0.25 x 2.
@argha-qi5hf 2 years ago
I can't imagine how someone could ever come up with such abstract ideas.
@swazza9999 4 years ago
This should have more likes!
@ГримМорген 5 years ago
The concept had been presented to me on some online course, but until this video I didn’t really understand it. Thank you!
@russianescapist5262 3 years ago
I loved the surreal music and real-life objects moving in a grey, 60s-like atmosphere. :)
@lovenishkavat46 A month ago
My problem was solved 10 years ago; you are Superman.
@Puneethmypadi 4 years ago
Now I understand decision trees properly.
@shelendrasharma9680 6 years ago
Best explanation, salute...
@mohammadrezamoohebat9407 10 years ago
It was perfect. thx
@daviddeleon292 6 years ago
Huh??? Why am I only now finding out that information entropy is a concept? MIND BLOWN!!!
@science_electronique 11 months ago
Powerful, clear explanation.
@chandragupta2828 5 years ago
awesome video!
@Chrls5 3 years ago
Nice!
@n0MC 7 years ago
this is wonderful. thank you
@jayrar6645 5 years ago
So just to clarify: is the reason the decision tree for machine B is not the same as for A that you ask fewer questions overall? And how do you ensure that the structure of the decision tree is such that it asks the minimum number of questions?
@FARHANSUBI 2 months ago
Can someone please explain why machine 2 is producing less information? Shouldn't it be more information, because we're asking fewer questions? Or is it that the more questions we ask, the more information we receive?
@souvikmajumdar5604 6 years ago
Thank You
@malevip 4 years ago
Another way to look at entropy: a measure of how spread out the probability is across a probability distribution.
@AhmedKMoustafa2 7 years ago
great explanation bro :)
@leoyuk-tingcheung3587 9 years ago
Could anyone help explain why less uncertainty means less information (Machine 2)? Isn't it the other way round? Many thanks.
@TheOmanzano 8 years ago
There is less uncertainty in machine 2 because on "average" there will be fewer questions... meaning after many trials, on average 1.75 questions are needed to get the right result, so there is less variety, randomness, and chaos in machine 2, due to the fact that "A" will occur a lot more often than the other letters.
@hirakmondal6174 7 years ago
Think of it as a Hollywood film, where a police inspector interrogates a criminal who must speak the truth every time. After 175 questions the inspector finds out that the criminal knows no more than that, whereas when he interrogates the other criminal in the adjacent cell, he finds that after 175 questions that one can still answer 25 more. Now you tell me who has more information? . . . You are welcome!! 8)
@Exhora 6 years ago
HIRAK MONDAL That was a great example! Thank you so much!!!
@salrite 6 years ago
Less uncertainty means your chance of predicting the outcome is higher, i.e., predictability increases, and hence less information: you don't require as many bits (say, to represent the outcome) as you would when uncertainty was high and the outcome was purely random. Makes sense?
@rcgonzalezf 5 years ago
I also have this question. I guess we need to define information in this context; for the answers and the video itself, I think they're referring to data, and as I posted below, data is different from information. More questions = more data to get the same information (the output), but I might be missing something.
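One concrete way to read this thread: turn the question tree into a prefix code, where each codeword is the sequence of yes/no answers along the path. A minimal sketch, with a code assignment matching the depths discussed in this thread (the exact bit labels are an assumption):

```python
# Likelier symbols get shorter codewords, so fewer bits per symbol on average.
code = {"A": "0", "D": "10", "B": "110", "C": "111"}  # prefix-free
p    = {"A": 0.5, "D": 0.25, "B": 0.125, "C": 0.125}

avg_bits = sum(p[s] * len(code[s]) for s in code)
print(avg_bits)  # 1.75 bits/symbol, vs. 2 bits/symbol for four equally likely symbols
```

In this sense machine 2 "produces less information": on average its output can be described in fewer bits.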
@ahmedelsharkawy1474 6 years ago
just awesome
@btsandtxtloverstraykidzfan3486 2 years ago
What are some good books on this topic?
@shepbryan4315 5 years ago
Why is the number of bounces the log of the outcomes?
@juanpablovaca-lago5659 3 years ago
Is there a direct analogy between the second and third laws of thermodynamics and information entropy?
@jorgeleirana-alcocer5642 3 years ago
The equation at 3:48 should result in 2.25, not 1.75: (0.5*1)+(0.125*3)+(0.125*3)+(0.25*4) = 2.25. I think it should have been (0.5*1)+(0.125*3)+(0.125*3)+(0.25*2).
@ChusKon1 3 years ago
Beautiful
@hiteshjambhale301 2 years ago
Hey, there is one mistake at timestamp 1:26... the question should be "Is it C?" instead of "Is it B?".
@dissdad8744 8 years ago
Good explanation! If I wanted to calculate the entropy with log2, which calculator can do this? Is there an online calculator for this? What would be the best approach?
@ElectricChaplain 7 years ago
Hans Franz Too late now, but log2(b) = ln(b) / ln(2), or more generally log2(b) = log_a(b) / log_a(2).
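Following the reply above, the change-of-base identity is easy to check in any environment with natural logs; a minimal Python example:

```python
from math import log, log2

print(log2(20))          # 4.3219...
print(log(20) / log(2))  # the same value, via the change-of-base identity
```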
@FGNiniSun 3 years ago
Hello, why does the number of outcomes at a level equal 1/probability?
@hingaglaiawong7815 2 years ago
At 3:15 I think there's a typo: the last term should be 0.25*2 instead of 0.25*4, I guess.
@hirakmondal6174 7 years ago
Why is the number of outcomes 1/p?
@anirbanmukherjee4577 6 years ago
The probability of an outcome = 1/(number of possibilities), so the number of possibilities = 1/p.
@alexhsia9510 5 years ago
What do they mean by number of outcomes? Can someone give me an example using the ABCD example they used?
@temenoujkafuller4757 2 years ago
Yes, I asked myself this question and watched it twice. (5:45) Count the number of branches at the bottom: the number of final outcomes = 2^(number of bounces). The inverse of the exponential is the logarithm, so the number of bounces = the number of questions = log_2(number of outcomes).
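The reply above, restated as a compact derivation (assuming equally likely outcomes, each with probability p):

```latex
\text{outcomes} = 2^{\#\text{bounces}}
\quad\Longrightarrow\quad
\#\text{bounces} = \log_2(\text{outcomes}) = \log_2\frac{1}{p}
```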
@_crispins 7 years ago
nice!
@YTBxd227 5 years ago
Still confused why #outcomes = 1/p_i.
@CZRaS 3 years ago
Because you need to "build" a binary tree to simulate bounces. E.g., if you have probability p = 1/2 (50%), the number of outcomes = 1/(1/2) = 2. If you have p = 1/8 (12.5%), you get 8 outcomes. From that you can take the log2, which is basically the level on which the value sits in the binary tree.
@suliu2933 6 years ago
Great video! I can follow it, but I have trouble understanding the problem statement. Why is the most efficient way to pose a question one that divides the possibilities in half?
@vandanachandola322 4 years ago
Too late, but maybe because we're trying to ask the minimum no. of questions (and therefore going with the higher probability first)?
@xxxxxx-wq2rd 4 years ago
is it valid to say less entropy = less effort required?
@최로봇 4 years ago
If it makes us ask fewer questions, doesn't that mean it provides more information?
@mikibellomillo 6 months ago
Note on the number of bounces: entropy is maximum when all outcomes are equally likely; when you introduce predictability, the entropy must go down. Thanks for sharing this video! God bless you!🎉
@gustavomartins007 7 months ago
Very good
@kawaikaede2269 2 years ago
cool
@sholi9718 6 years ago
Can someone explain, in #bounces = p(A)×1 + p(B)×3 + p(C)×3 + p(D)×2 at 3:44, where the numbers 1, 3, 3 & 2 come from?
@achrafamrani2730 3 years ago
The 1 is the number of bounces needed to get to point A (steps needed for the disc to fall in case A); the 3s are the numbers of bounces to get to points B and C separately; and it takes 2 bounces to fall into D.
@sanadarkia2724 5 years ago
Can't we just ask one question: is it ABC, or D? Edit: never mind, I just figured out that 1 bit removes uncertainty of 1/2.
@pablobiedma 6 years ago
So if I recall correctly, the one with the highest entropy is the least informative one. Then, if a machine generates symbols and we apply the formula to each symbol, which symbol provides the most information? The one with the fewest bits? How does that make sense? Isn't it the one with the highest number of bits, calculated by p log(1/p)?
@divermike8943 9 months ago
What if machine 1 had 6 letters ABCDEF? How many questions on average would you have to ask? Using the bounce analogy I get 16/6 = 2.6667, but -log2(1/6) = 2.585. What am I doing wrong? "Is it ABC?" If yes, "Is it A or B?" If yes, "Is it A?" That's 3 questions each for A and B, but only 2 for C ("Is it ABC?" yes; "Is it A or B?" no; so it's C). Likewise for D, E, F: 3 questions each for D and E ("Is it ABC?" no; "Is it D or E?" yes; "Is it D?"), 2 for F. 3+3+2+3+3+2 = 16, so 16/6 = 2.6667 on average, not log2(6) = 2.585.
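Nothing is wrong in the arithmetic above: log2(6) ≈ 2.585 is only a lower bound, and no yes/no tree can reach it exactly because 1/6 is not a power of 1/2. A sketch using a standard Huffman-style merge (the helper name is mine, not from the video) confirms that 16/6 is the best achievable average:

```python
import heapq
from math import log2

def optimal_avg_questions(probs):
    """Average depth of an optimal binary question tree (Huffman's algorithm).

    Each merge of the two least likely groups adds their combined
    probability to the expected number of questions.
    """
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b
        heapq.heappush(heap, a + b)
    return total

print(optimal_avg_questions([1/6] * 6))  # 2.666... = 16/6
print(log2(6))                           # 2.585..., the entropy lower bound
```

Asking about blocks of several spins at once can push the average closer to the bound.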
@monicarenas 3 years ago
At minute 3:51, I guess there is a mistake: for p_D, the value should be 2 instead of 4, shouldn't it?
@sanjayrakshit8797 5 years ago
Heckin Shannon
@assylblog 5 years ago
Cool beat
@samhe331 8 months ago
I think the math at 3:52 is wrong: it should be 0.25 x 2 instead of 0.25 x 4, but the result, 1.75, is right.
@mikechristian-vn1le A year ago
Language is a much more powerful invention than the alphabet, and written languages (Chinese and the Japanese syllabary don't use alphabets) are more powerful than the alphabet. And written language includes numbers and mathematical symbols . . .
@mansigupta 7 years ago
An excellent, excellent, excellent video. I finally get it.
@abdelrahmangamalmahdy 7 years ago
I don't understand why he always divides by 2!!
@mansigupta 7 years ago
Because of the way the questions are framed: they allow for only two possible answers, yes or no.
@tag_of_frank 4 years ago
Why are entropy and information given the same symbol H? And why does the information formula given in video 5 of the playlist include an "n" for the number of symbols transmitted, while this one does not?
@betbola5209 8 years ago
How do you calculate the entropy of a text? And what can we do with that?
@poisonpotato1 6 years ago
What if we used ternary?
@zainulabydeen2809 5 years ago
Can anyone explain how the answer becomes 3/2 in the solved example? Any help will be appreciated.
@youssefdirani 4 years ago
4:45 Markoff or Markov?
@infernocaptures8739 3 years ago
4:36 **less** information?
@betoib1504 6 years ago
¡Órale!
@zkhandwala 4 years ago
Not to knock this, but I do want to voice an issue I have with it and every other video I've found on the topic: they always use probabilities that are integer powers of 1/2, which greatly simplifies the explanation but doesn't generalize well to the majority of real-world scenarios, which aren't adequately covered by this simplified exposition. I worry, then, that people come away thinking they understand the topic better than they actually do. Of course, I'm open to the perspective of others here...
@daihung3824 3 years ago
I agree with your statement. I tried changing the probabilities, say p(A) = 0.45, p(B) = 0.35, p(C) = 0.15, p(D) = 0.05. Then 1/p(D) = 20 and log base 2 of 20 ≈ 4.3; however, the number of bounces for D should still remain 3, shouldn't it?
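A sketch of what happens with these probabilities (an illustration under the commenter's assumptions, not something worked in the video): the entropy formula averages each symbol's surprise log2(1/p), but a real question tree needs whole-number depths, so entropy is only a lower bound on the average number of questions.

```python
from math import log2

p = {"A": 0.45, "B": 0.35, "C": 0.15, "D": 0.05}

H = sum(q * log2(1 / q) for q in p.values())
print(H)  # ~1.68 bits

# The best yes/no tree here has depths A:1, B:2, C:3, D:3
# ("Is it A?", then "Is it B?", then "Is it C?"), so D sits at
# depth 3 even though log2(1/0.05) = log2(20) ~ 4.32.
avg = 0.45 * 1 + 0.35 * 2 + 0.15 * 3 + 0.05 * 3
print(avg)  # 1.75 questions on average, above the 1.68-bit bound
```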
@ignatutka6202 3 years ago
How come machine two's entropy is more than one, if entropy's maximum is one?
@sammyz1128 5 years ago
Why can't we ask whether it is AB for the second distribution, the same as for the first distribution?
@sammyz1128 5 years ago
OHH, I get it. If we did that, the average number of questions we ask would be bigger.
@ArthurShelby-y8s A year ago
👏🏻
@phuocnguyenlethanh3104 A year ago
The number of bounces is not equivalent to the number of questions asked.
Entropy (for data science) Clearly Explained!!!
16:35
StatQuest with Josh Starmer
661K views
Solving Wordle using information theory
30:38
3Blue1Brown
10M views
Information Theory Basics
16:22
Intelligent Systems Lab
74K views
A better description of entropy
11:43
Steve Mould
2.2M views
Why Information Theory is Important - Computerphile
12:33
Computerphile
159K views
What is NOT Random?
10:00
Veritasium
8M views
But what is a neural network? | Deep learning chapter 1
18:40
3Blue1Brown
18M views
Shannon Entropy and Information Gain
21:16
Serrano.Academy
210K views