Information entropy | Journey into information theory | Computer Science | Khan Academy

314,451 views

Khan Academy Labs

10 years ago

Finally we arrive at our quantitative measure of entropy
Watch the next lesson: www.khanacademy.org/computing...
Missed the previous lesson? www.khanacademy.org/computing...
Computer Science on Khan Academy: Learn select topics from computer science - algorithms (how we solve common problems in computer science and measure the efficiency of our solutions), cryptography (how we protect secret information), and information theory (how we encode and compress information).
About Khan Academy: Khan Academy is a nonprofit with a mission to provide a free, world-class education for anyone, anywhere. We believe learners of all ages should have unlimited access to free educational content they can master at their own pace. We use intelligent software, deep data analytics and intuitive user interfaces to help students and teachers around the world. Our resources cover preschool through early college education, including math, biology, chemistry, physics, economics, finance, history, grammar and more. We offer free personalized SAT test prep in partnership with the test developer, the College Board. Khan Academy has been translated into dozens of languages, and 100 million people use our platform worldwide every year. For more information, visit www.khanacademy.org, join us on Facebook or follow us on Twitter at @khanacademy. And remember, you can learn anything.
For free. For everyone. Forever. #YouCanLearnAnything
Subscribe to Khan Academy’s Computer Science channel: / channel
Subscribe to Khan Academy: kzbin.info_...

Comments: 184
@vandanachandola322 3 years ago
I remember my teacher in high school defined entropy as "the degree of randomness". I decided it was an abstract concept that I didn't get. Now, learning about information entropy in my master's class, I found this video and I'm so glad I did!! Thanks, it's very well explained :)
@itsRAWRtime007 8 years ago
Good video. I like the way it shows the intuition behind the concept, i.e. the reason the concept actually exists, rather than plainly defining it and then showing its properties.
@youngsublee1102 3 years ago
couldn't agree more
@amirizaiah7179 2 years ago
I guess I'm kind of randomly asking, but does anybody know of a good site to watch newly released series online?
@andrewrandall9989 2 years ago
@Amir Izaiah Flixportal :D
@amirizaiah7179 2 years ago
@Andrew Randall Thank you, I signed up and it seems like a nice service :) I appreciate it!!
@andrewrandall9989 2 years ago
@Amir Izaiah you are welcome xD
@XavierGlenKuei 6 years ago
At 1:24, I would argue the 3rd question (i.e., the question on the right of the 2nd level) should be "Is it C?" (or "Is it D?") rather than "Is it B?". Since the machine answered "No" to the 1st question ("Is it A or B?"), both A and B are already ruled out, leaving only C (or D) as the possible outcome, so there is no role for "B" anymore.
@marcuschiu8615 4 years ago
Yeah, I agree with you. Damn, so many mistakes in this video (1:24 and 3:50); it makes me question their reliability... good video though.
@dien2971 4 years ago
I thought I had understood it wrong lol. Thank you!
@muhaymenulislam1942 2 years ago
But here the probability of D is 25%, which is more than 12.5%, so for the second question they ask "Is it D?".
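To check a tree like this in numbers, here is a minimal sketch in Python; the per-symbol depths are my reading of the video's two trees, so treat them as assumptions:

```python
# Expected number of yes/no questions for a question tree:
# each symbol costs its depth (number of questions) in the tree.

def expected_questions(probs, depths):
    """Average questions = sum of p(symbol) * depth(symbol)."""
    return sum(probs[s] * depths[s] for s in probs)

# Machine 1 (uniform): "Is it A or B?", then one more question -> depth 2 everywhere.
uniform = {"A": 0.25, "B": 0.25, "C": 0.25, "D": 0.25}
print(expected_questions(uniform, {"A": 2, "B": 2, "C": 2, "D": 2}))  # 2.0

# Machine 2 (skewed): ask "Is it A?" first, then "Is it D?", then "Is it B?".
skewed = {"A": 0.5, "B": 0.125, "C": 0.125, "D": 0.25}
print(expected_questions(skewed, {"A": 1, "D": 2, "B": 3, "C": 3}))  # 1.75
```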
@kempisabel9945 4 years ago
This video blew my mind. Thank you! I love these intelligent yet fun videos!
@mathaha2922 3 years ago
This is one of the most informative -- and I use that term advisedly -- videos I have ever seen. Thank you!
@someshsharma6683 6 years ago
Awesome explanation with a very intuitive example. Thanks a lot...
@mostafaomar5441 4 years ago
Thank you. Explains the intuition behind Entropy very clearly.
@MaryMary-ep4hd 3 years ago
Ingenious interpretation! I applaud!
@raultellegen5512 7 years ago
Amazing video. Seldom seen a better explanation of anything. Thanks!
@antoinecantin1780 2 years ago
What a formidable way of visualizing and introducing information entropy. Your contributions are deeply appreciated
@youngsublee1102 3 years ago
Wonderful idea of the "bounce" that expresses the amount of information. It's so exciting.
@YYchen713 2 years ago
This is such a great way to explain information entropy! Classic!
@kartikbansal6439 4 years ago
Loved the piano bit towards the conclusion!
@BambiOnIce19 1 year ago
Perfectly well explained. The best video on information entropy I’ve seen so far
@daihung3824 3 years ago
I have one question: say p(A) = 0.45, p(B) = 0.35, p(C) = 0.15, p(D) = 0.05. Then 1/p(D) = 20 and log base 2 of 20 ≈ 4.3; however, the number of bounces should remain a whole number like 3, shouldn't it? Would anyone mind explaining this difference? Thanks a lot!
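One way to see the gap: log2(1/p) is the idealized per-symbol cost and need not be a whole number; an actual tree has to use whole questions, and entropy is only the probability-weighted average of the ideal costs. A small sketch, assuming the probabilities from the comment above:

```python
import math

probs = {"A": 0.45, "B": 0.35, "C": 0.15, "D": 0.05}

# Idealized cost per symbol: log2(1/p) "bounces", not necessarily whole numbers.
for s, p in probs.items():
    print(s, math.log2(1 / p))  # D -> ~4.32, although a real tree uses whole levels

# Entropy = probability-weighted average of the idealized costs.
H = sum(p * math.log2(1 / p) for p in probs.values())
print(H)  # ~1.68 bits: a lower bound on the average number of questions
```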
@potatocoder5090 1 year ago
Brilliant explanation. So simple yet so profound. Thanks!
@tythedev9582 4 years ago
Yessss I finally got the concept after this video.
@pedrogorilla483 4 years ago
I have asked several professors at different universities and in different countries why we adopted a binary system to process information, and they all answered: because you can modulate it with electricity, the state being on or off. This never satisfied me. Today I finally understand the deeper meaning and the brilliance of binary states in computing and their interfacing with our reality.
@Dhanush-zj7mf 3 years ago
1:24 You are asking the same question twice. You already asked "Is it A or B?" at the root; if the answer is no, it must be either C or D, yet you ask again whether it is B in the sub-branch. It should be either "Is it C?" or "Is it D?".
@MohdFirdaus-fk6no 3 years ago
yes, you are correct
@user-vr1so7tc7x 4 years ago
The concept had been presented to me in an online course, but until this video I didn't really understand it. Thank you!
@salrite 6 years ago
What a Beautiful Explanation!!!
@n0MC 7 years ago
this is wonderful. thank you
@shelendrasharma9680 5 years ago
Best explanation, salute...
@jonathan.gasser 4 years ago
Wow, what a presentation!
@bouzouidjasidahmed1203 2 years ago
Very comprehensible, thank you!! It's very helpful.
@suryacharan5184 4 years ago
What a video!!... This is how education should be.
@hrivera4201 2 years ago
previous lesson: kzbin.info/www/bejne/jaqkpYKnm6iceNk next lesson: kzbin.info/www/bejne/iqnOcmiLjZmen9U
@tingwen524 3 years ago
Great video! I totally understood entropy!
@vlaaady 3 years ago
The most intuitive explanation
@mohammadrezamoohebat9407 9 years ago
It was perfect. thx
@daviddeleon292 5 years ago
Huh??? Why am I only now finding out that information entropy is a concept? MIND BLOWN!!!
@twoplustwo5 1 month ago
Kudos for linking number of bounces -> binary tree -> log. Overall a very nice explanation. That's like the 3rd explanation of info entropy I've liked.
@boredomgotmehere 1 year ago
Makes it all so super clear and easy to follow. Love this.
@boredomgotmehere 4 months ago
Just a tiny error at 3:50: the final calculation should be 0.25*2.
@gaofan2856 2 years ago
The most beautiful explanation of entropy
@Ewerlopes 9 years ago
Perfect explanation! :)
@waylonbarrett3456 1 year ago
I found a few errors. Am I the only one seeing this?
@yudong8820 3 years ago
Really good one, thanks!
@YuriValentines 2 years ago
This video has explained entropy better than any teacher I've had in my entire life. It makes me so angry to think of all my time wasted in endless lectures, listening to people with no communication skills.
@AhmedKMoustafa2 6 years ago
great explanation bro :)
@letsgame514 3 years ago
just amazing.
@argha-qi5hf 2 years ago
I can't imagine how someone could ever come up with such abstract ideas.
@russianescapist5262 2 years ago
I loved the surreal music and the real-life objects moving in a grey, 60s-like atmosphere. :)
@ahmedelsharkawy1474 5 years ago
just awesome
@fangyuanlin8966 9 days ago
3:51 Typo when computing the entropy for machine 2
@maierdanefan6998 4 years ago
Thank you!
@science.20246 3 months ago
Powerful and clear explanation.
@souvikmajumdar5604 5 years ago
Thank You
@edwardjurkowitz1663 3 years ago
Excellent video. I think one point mistakenly refers to "information" when the author means 'entropy.' Machine 2 requires fewer questions. It produces more information and less entropy. Machine one produces maximum entropy and minimum information. Information is 'negative entropy.'
@osobliwynick 1 year ago
Great explanation.
@nomann5244 1 year ago
you are truly a genius.
@_crispins 6 years ago
nice!
@swazza9999 4 years ago
This should have more likes!
@Puneethmypadi 3 years ago
Now I understand decision trees properly.
@chandragupta2828 4 years ago
awesome video!
@TheLegendOfCockpunch 5 years ago
The 'M' and 'W' are switched and upside down, while the 'Z' is just a sideways 'N'... my vote is intentional 6:32
@miketor2011 3 years ago
Great video, but is it just me or is there an error at 3:49? The correct calculation for the number of bounces should be 0.5*1 + 0.125*3 + 0.125*3 + 0.25*2 = 1.75; instead the video shows 0.5*1 + 0.125*3 + 0.125*3 + 0.25*4 = 2.25. Any thoughts?
@musicmaker33428 2 years ago
I was just thinking this. Thank you for pointing it out. I thought maybe I misunderstood something fundamental.
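The arithmetic is quick to verify; a throwaway sketch:

```python
# The video's slide vs. the corrected last term (0.25*2, not 0.25*4).
shown     = 0.5*1 + 0.125*3 + 0.125*3 + 0.25*4  # 2.25 (the on-screen typo)
corrected = 0.5*1 + 0.125*3 + 0.125*3 + 0.25*2  # 1.75 (matches the narration)
print(shown, corrected)
```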
@betbola5209 8 years ago
How do you calculate the entropy of a text? And what can we do with that?
@malevip 3 years ago
Another way to look at entropy: a measure of how spread out the probability is across the outcomes of a distribution.
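In that spirit, a minimal entropy function makes the "spread" reading concrete; a small sketch in plain Python:

```python
import math

def entropy(probs):
    """Shannon entropy H = sum of p * log2(1/p), skipping zero-probability outcomes."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0  -> probability fully spread out
print(entropy([0.5, 0.125, 0.125, 0.25]))  # 1.75 -> machine 2, less spread
print(entropy([1.0]))                      # 0.0  -> no spread, no uncertainty
```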
@juanpablovaca-lago5659 2 years ago
Is there a direct analogy between the second and third laws of thermodynamics and information entropy?
@jayrar6645 4 years ago
So just to clarify, is the reason the decision tree for machine B is not the same as for machine A that you ask fewer questions overall? And how do you ensure that the decision tree is structured so that it asks the minimum number of questions?
@FrancescoDeToni 8 years ago
Isn't there a mistake at 3:50? Shouldn't it be 0.25 x 2 instead of 0.25 x 4?
@philtrem 8 years ago
+Francesco De Toni yup!
@sighage 5 years ago
Yes, it's 2.25 I guess
@rah2023 5 years ago
It's indeed a mistake
@nikhilsrajan 3 years ago
@@sighage No, it's 1.75; there was just a typo. You get 1.75 with 0.25 x 2.
@karrde666666 2 years ago
why can't textbooks or lectures be this easy
@sanadarkia2724 5 years ago
Can't we just ask one question: is it ABC or D? Edit: never mind, I just figured out that 1 bit removes uncertainty of 1/2.
@ChusKon1 3 years ago
Beautiful
@Chrls5 2 years ago
Nice!
@assylblog 5 years ago
Cool beat
@dissdad8744 7 years ago
Good explanation! If I wanted to calculate the entropy with log2, which calculator can do this? Is there an online calculator for this? What would be the best approach?
@ElectricChaplain 7 years ago
Hans Franz Too late now, but log2(b) = ln(b) / ln(2), or more generally log2(b) = log_a(b) / log_a(2).
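Most calculators only expose ln or log10, so the change-of-base form above is exactly what you'd type. In code the same computation looks like this (a trivial sketch):

```python
import math

x = 20
print(math.log(x) / math.log(2))  # change of base: ln(x) / ln(2)
print(math.log2(x))               # dedicated base-2 logarithm
print(math.log(x, 2))             # math.log also accepts an explicit base
```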
@leoyuk-tingcheung3587 8 years ago
Could anyone help explain why less uncertainty means less information (Machine 2)? Isn't it the other way round? Many thanks.
@TheOmanzano 8 years ago
There is less uncertainty in machine 2 because on "average" there will be fewer questions: after many trials, on average 1.75 questions are needed to get the right result. There is less variety, randomness, and chaos in machine 2 because "A" will occur a lot more often than the other letters.
@navketanbatra1522 8 years ago
+TheOmanzano Yeah, so since the number of questions we need to ask to guess the symbol is smaller on average for machine 2, shouldn't this imply that machine 2 is giving us 'more' information 'per answer to a question asked'? I'm really confused by the physical interpretation of information gained.
@navketanbatra1522 8 years ago
Right! Got it! So it's not that we're getting 'more information per answer'; we get the same amount of information per question asked on either machine. The fact that we have to ask fewer questions is because there is 'less uncertainty' in the outcome; we already have some 'idea' or 'prediction' of the outcome, implying less information is gained when the outcome is observed. *phew*
@hirakmondal6174 6 years ago
Think of it as a Hollywood film, where a police inspector interrogates a criminal who must speak the truth every time. After 175 questions the inspector has learned everything that criminal knows, whereas when he interrogates another criminal in an adjacent cell, he finds that after asking 175 questions he can still ask 25 more.. Now U Tell Me Who Has More Information? . . . . U are welcome!! 8)
@Exhora 6 years ago
HIRAK MONDAL That was a great example! Thank you so much!!!
@suliu2933 6 years ago
Great video! I can follow it, but I have trouble understanding the problem statement. Why is "the most efficient way" to pose a question that divides the possibilities in half?
@vandanachandola322 3 years ago
Too late, but maybe because we're trying to ask the minimum number of questions (and therefore going with the higher probability first)?
@shepbryan4315 4 years ago
Why is the number of bounces the log of the outcomes?
@YTBxd227 5 years ago
Still confused why #outcomes = 1/p_i
@CZRaS 3 years ago
Because you need to "build" a binary tree to simulate bounces. E.g. with probability p = 1/2 (50%), outcomes = 1/(1/2) = 2. With p = 1/8 (12.5%), you get outcomes = 8. From that you can take log2, which is basically the level at which the value sits in the binary tree.
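Put differently: at depth d of a binary tree there are 2^d equally likely leaves, so a leaf with probability p = 1/2^d sits at depth d = log2(1/p). A tiny sketch of that correspondence:

```python
import math

for depth in range(1, 5):
    outcomes = 2 ** depth       # equally likely leaves at this level
    p = 1 / outcomes            # probability of each leaf
    print(depth, outcomes, p, math.log2(1 / p))  # log2(1/p) recovers the depth
```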
@betoib1504 6 years ago
Wow!
@kawaikaede2269 1 year ago
cool
@btsandtxtloverstraykidzfan3486 2 years ago
What are some good books on this topic?
@hirakmondal6174 6 years ago
Why is the number of outcomes 1/p?
@anirbanmukherjee4577 5 years ago
Probability of an outcome = 1/(number of possibilities), so the number of possibilities = 1/p.
@pablobiedma 5 years ago
So if I recall correctly, the machine with the highest entropy is the least informative one. If a machine generates symbols and we apply the formula to each symbol, which symbol provides the most information: the one with the fewest bits? How does that make sense; isn't it the one with the most bits, as calculated by p log(1/p)?
@sanjayrakshit8797 4 years ago
Heckin Shannon
@alexhsia9510 5 years ago
What do they mean by number of outcomes? Can someone give me an example using the ABCD examples they used?
@temenoujkafuller4757 1 year ago
Yes, I asked myself this question and watched it twice. (5:45) Count the number of branches at the bottom: the number of final outcomes = 2^(number of bounces). Therefore, since the inverse of the exponential is the logarithm, the number of bounces = the number of questions = log_2(number of outcomes).
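You can also check the bounce picture empirically by walking the tree at random and counting bounces. A quick simulation; the tree layout for machine 2 is my reading of the video, so treat it as an assumption:

```python
import random

def machine2_bounce():
    """One disc drop through machine 2's tree: returns (symbol, bounces)."""
    if random.random() < 0.5:      # 1st bounce: A with probability 0.5
        return "A", 1
    if random.random() < 0.5:      # 2nd bounce: D with probability 0.25
        return "D", 2
    # 3rd bounce: B or C, probability 0.125 each
    return ("B" if random.random() < 0.5 else "C"), 3

bounces = [machine2_bounce()[1] for _ in range(100_000)]
print(sum(bounces) / len(bounces))  # ~1.75 on average, matching the entropy
```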
@user-ut4ii6ge2t 8 months ago
👏🏻
@poisonpotato1 5 years ago
What if we used ternary?
@hingaglaiawong7815 2 years ago
At @3:15 I think there's a typo? The last term should be 0.25*2 instead of 0.25*4, I guess.
@FGNiniSun 3 years ago
Hello, please, why does the number of outcomes at a level equal 1/probability?
@robot_boi 3 years ago
If it makes us ask fewer questions, doesn't that mean it provides more information?
@jorgeleirana-alcocer5642 2 years ago
The equation at 3:48 should result in 2.25, not 1.75: (0.5*1)+(0.125*3)+(0.125*3)+(0.25*4) = 2.25. I think it should have been (0.5*1)+(0.125*3)+(0.125*3)+(0.25*2).
@zainulabydeen2809 4 years ago
Can anyone explain how the answer becomes 3/2 in the solved example? Any help will be appreciated.
@MNKPrototype 6 years ago
Did anyone else notice the DEATH NOTE music at 4:40?
@GiuseppeRomagnuolo 4 years ago
I was wondering what that music was; I really like it. Do you have any link? Following your comment I found this: kzbin.info/www/bejne/nnzJfIyml8Zjmqc. Is that it? If so, can you point me to the right minute? Tnx
@lookstheory2.0 2 years ago
3:51 There is an error: you have a Pd*4 term instead of Pd*2.
@gomdol_archive 4 years ago
Super easy to understand.
@youssefdirani 4 years ago
4:45 Markoff or Markov?
@tag_of_frank 3 years ago
Why are entropy and information given the same symbol H, and why does the information formula in video 5 of the playlist include an "n" for the number of symbols transmitted, while this one does not?
@mansigupta 6 years ago
An excellent excellent excellent video. I finally get it.
@abdelrahmangamalmahdy 6 years ago
I don't understand why he always divides by 2!!
@mansigupta 6 years ago
Because of the way the questions are framed, they allow only two possible answers: yes or no.
@xxxxxx-wq2rd 3 years ago
is it valid to say less entropy = less effort required?
@sholi9718 5 years ago
Can someone explain, in #bounces = p(A)x1 + p(B)x3 + p(C)x3 + p(D)x2 at 3:44, how the numbers 1, 3, 3 & 2 came about?
@achrafamrani2730 2 years ago
The 1 is the number of bounces needed to reach A (the steps needed for the disc to fall into slot A); B and C each take 3 bounces; and D takes 2 bounces.
@monicarenas 3 years ago
At minute 3:51, I guess there is a mistake: for p_D, the value should be 2 instead of 4, shouldn't it?
@infernocaptures8739 2 years ago
4:36 **less** information?
@sammyz1128 4 years ago
Why can't we ask whether it is A or B for the second distribution, the same as for the first distribution?
@sammyz1128 4 years ago
OHH I get it. If we did that, the average number of questions we ask would be bigger.
@phuocnguyenlethanh3104 6 months ago
the number of bounces is not equivalent to the number of questions asked
@ignatutka6202 2 years ago
How come machine 2's entropy is more than 1, if entropy's maximum is 1?
@zkhandwala 4 years ago
Not to knock this, but I do want to voice an issue I have with it and every other video I've found on the topic: they always use probabilities that are an integral power of 1/2, which greatly simplifies the explanation but doesn't generalize well to the majority of real-world scenarios, which aren't adequately covered by this simplified exposition. I worry, then, that people come away thinking they understand the topic better than they actually do. Of course, I'm open to the perspective of others here...
@daihung3824 3 years ago
I agree with your statement. I had a go at changing the probabilities: say p(A) = 0.45, p(B) = 0.35, p(C) = 0.15, p(D) = 0.05. Then 1/p(D) = 20 and log base 2 of 20 ≈ 4.3; however, the number of bounces for D should still be a whole number like 3, shouldn't it?
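One concrete way to see what happens away from powers of 1/2 is to build an actual optimal question tree (Huffman coding, which goes a bit beyond what the video covers) and compare its average question count with the entropy. A rough sketch, assuming the probabilities from the comment above:

```python
import heapq
import math

def huffman_depths(probs):
    """Depth (question count) of each symbol in an optimal yes/no question tree."""
    heap = [(p, [s]) for s, p in probs.items()]
    heapq.heapify(heap)
    depths = {s: 0 for s in probs}
    while len(heap) > 1:
        p1, syms1 = heapq.heappop(heap)   # merge the two least likely subtrees;
        p2, syms2 = heapq.heappop(heap)   # their symbols all move one level down
        for s in syms1 + syms2:
            depths[s] += 1
        heapq.heappush(heap, (p1 + p2, syms1 + syms2))
    return depths

probs = {"A": 0.45, "B": 0.35, "C": 0.15, "D": 0.05}
depths = huffman_depths(probs)
avg = sum(probs[s] * depths[s] for s in probs)
H = sum(p * math.log2(1 / p) for p in probs.values())
print(depths)   # {'A': 1, 'B': 2, 'C': 3, 'D': 3}: whole-number depths only
print(avg, H)   # 1.75 vs ~1.68: entropy <= average questions < entropy + 1
```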
@user-nf6jl9cg1t 7 months ago
why is the number of bounces 1/p
4 years ago
The Vietnamese subtitles at 4:34 are wrong: machine 2 produces less information than machine 1.