I remember my high school teacher defined entropy as "the degree of randomness". I decided it was an abstract concept that I didn't get. Now, learning about information entropy in my master's class, I found this video and I'm so glad I did!! Thanks, it's very well explained :)
@itsRAWRtime0079 жыл бұрын
good video. i like the way it shows the intuition behind the concept, that is, the reason why the concept actually exists, rather than plainly defining it and then showing its properties.
@youngsublee11024 жыл бұрын
couldn't agree more
@XavierGlenKuei6 жыл бұрын
at 1:24, i would argue the 3rd question (ie, the question on the right of the 2nd hierarchy) should be "Is it C?" (or "Is it D?") rather than "Is it B?" (i think this is so because, as the 1st machine answered "No" to the 1st question ["Is it AB?"], it essentially rules out both A and B, leaving only C (or D) as the possible outcome; hence no role for "B" anymore)
@marcuschiu86155 жыл бұрын
yeah, I agree with you. damn, so many mistakes in this video, 1:24 and 3:50. makes me question their reliability... good video though
@dien29714 жыл бұрын
I thought I understood wrong lol. Thank you!
@muhaymenulislam19422 жыл бұрын
But here the probability of D is 25%, which is more than 12.5%, so in the second question they ask "Is it D?".
@pedrogorilla4835 жыл бұрын
I have asked several professors in different universities and countries, why we adopted a binary system to process information and they all answered because you can modulate it with electricity, the state on or off. This never satisfied me. Today I finally understand the deeper meaning and the brilliance of binary states in computing and its interfacing with our reality.
@kempisabel99454 жыл бұрын
this video blew my mind away. Thank you! I love these intelligent yet fun videos!
@mathaha29224 жыл бұрын
This is one of the most informative -- and I use that term advisedly -- videos I have ever seen. Thank you!
@Dhanush-zj7mf4 жыл бұрын
1:24 You are asking the same question twice: you already asked "Is it A or B?" at the root, and if the answer is no, that means "it will be either C or D", but in the sub-branch you are asking again whether it is B or not. It should be either "Is it C?" or "Is it D?".
@MohdFirdaus-fk6no3 жыл бұрын
yes, you are correct
@salrite6 жыл бұрын
What a Beautiful Explanation!!!
@MaryMary-ep4hd4 жыл бұрын
Ingenious interpretation! I applaud!
@someshsharma66836 жыл бұрын
Awesome explanation with a very intuitive example. Thanks a lot...
@boredomgotmehere Жыл бұрын
Makes it all so super clear and easy to follow. Love this.
@boredomgotmehere10 ай бұрын
Just a tiny error at 3:50 - the final calculation should be 0.25*2.
@antoinecantin17803 жыл бұрын
What a formidable way of visualizing and introducing information entropy. Your contributions are deeply appreciated
@suryacharan51844 жыл бұрын
What a video!!....This is how education should be.
@YYchen7132 жыл бұрын
This is such a great way to explain information entropy! Classic!
@YuriValentines3 жыл бұрын
This video has explained entropy better than any teacher I've had in my entire life. It makes me so angry to think of all my time wasted in endless lectures, listening to people with no communication skills.
@twoplustwo57 ай бұрын
Kudos for linking number of bounces -> binary tree -> log. And overall a very nice explanation. That's about the 3rd explanation of info entropy I've liked.
@raultellegen55128 жыл бұрын
Amazing video. Seldom seen a better explanation of anything. Thanks!
@SAGEmania-q8s5 ай бұрын
Thank you so much. It explains the entropy so well.
@BambiOnIce192 жыл бұрын
Perfectly well explained. The best video on information entropy I’ve seen so far
@youngsublee11024 жыл бұрын
Wonderful idea of "bounce" that express the amount of information. It's so exciting.
@daihung38243 жыл бұрын
I have one question: Say p(A)=0.45, p(B)=0.35, p(C)=0.15, p(D)=0.05. Then 1/p(D)=20, and log base 2 of 20 = approx 4.3; however, shouldn't the number of bounces remain 3? Would anyone mind explaining this possible difference? Thanks a lot!
@potatocoder50902 жыл бұрын
Brilliant explanation. So simple yet so profound. Thanks!
@hrivera42012 жыл бұрын
previous lesson: kzbin.info/www/bejne/jaqkpYKnm6iceNk next lesson: kzbin.info/www/bejne/iqnOcmiLjZmen9U
@phycosmos4 ай бұрын
thanks
@bouzouidjasidahmed12033 жыл бұрын
Very comprehensible thank you !! it very helpful
@tythedev95824 жыл бұрын
Yessss I finally got the concept after this video.
@mostafaomar54415 жыл бұрын
Thank you. Explains the intuition behind Entropy very clearly.
@kartikbansal64394 жыл бұрын
Loved the piano bit towards the conclusion!
@Ewerlopes10 жыл бұрын
Perfect explanation! :)
@waylonbarrett3456 Жыл бұрын
I found a few errors. Am I the only one seeing this?
@argha-qi5hf2 жыл бұрын
I can't imagine how someone could ever come up with such abstract ideas.
@miketor20113 жыл бұрын
Great video, but is it just me or is there an error at 3:49? The correct calculation for the number of bounces should be 0.5*1+0.125*3+0.125*3+0.25*2 = 1.75, but instead the video shows 0.5*1+0.125*3+0.125*3+0.25*4 = 2.25. Any thoughts?
@musicmaker334283 жыл бұрын
I was just thinking this. Thank you for pointing it out. I thought maybe I misunderstood something fundamental.
@rkiyanchuk2 ай бұрын
Yep, that's a typo, number of bounces for P(D) = 2.
@gaofan28563 жыл бұрын
The most beautiful explanation of entropy
@jonathan.gasser5 жыл бұрын
Wow, what a presentation!
@vlaaady4 жыл бұрын
The most intuitive explanation
@swazza99994 жыл бұрын
This should have more likes!
@nomann52442 жыл бұрын
you are truly a genius.
@daviddeleon2925 жыл бұрын
Huh??? Why am I only now finding out that information entropy is a concept? MIND BLOWN!!!
@tingwen5243 жыл бұрын
Great video! I totally understood entropy!
@ГримМорген5 жыл бұрын
The concept had been presented to me on some online course, but until this video I didn’t really understand it. Thank you!
@karrde6666663 жыл бұрын
why can't textbooks or lectures be this easy
@Hopemkhize-d2i5 ай бұрын
Tell me about it😢
@csaracho20094 ай бұрын
I have an answer for that: The zen pupil asks the master, is the flag moving with the wind? The master replies: neither the flag nor the wind moves, it is your mind that moves.
@FrancescoDeToni8 жыл бұрын
Isn't there a mistake at 3:50? Shouldn't it be 0.25 x 2 instead of 0.25 x 4?
@philtrem8 жыл бұрын
+Francesco De Toni yup!
@sighage5 жыл бұрын
Yes, it's 2.25 I guess
@rah20235 жыл бұрын
It's indeed a mistake
@nikhilsrajan4 жыл бұрын
@@sighage no, it's 1.75; there was just a typo. You get 1.75 with 0.25 x 2
@yudong88203 жыл бұрын
Really good one, thanks!
@russianescapist52623 жыл бұрын
I loved the surreal music and real-life objects moving in a grey, 60s-like atmosphere. :)
@edwardjurkowitz16633 жыл бұрын
Excellent video. I think one point mistakenly refers to "information" when the author means 'entropy.' Machine 2 requires fewer questions. It produces more information and less entropy. Machine one produces maximum entropy and minimum information. Information is 'negative entropy.'
@lovenishkavat468 күн бұрын
My problem was solved 10 years ago. You are superman!
@osobliwynick2 жыл бұрын
Great explanation.
@shelendrasharma96806 жыл бұрын
Best explanation, salute...
@Puneethmypadi3 жыл бұрын
Now I understand decision tree properly
@mohammadrezamoohebat940710 жыл бұрын
It was perfect. thx
@malevip3 жыл бұрын
Another way to look at entropy: a measure of how spread out the probability is across a probability distribution.
@science_electronique9 ай бұрын
Powerful and clear explanation
@n0MC7 жыл бұрын
this is wonderful. thank you
@Chrls53 жыл бұрын
Nice!
@leoyuk-tingcheung35878 жыл бұрын
Could anyone help explain why less uncertainty means less information (Machine 2)? Isn't it the other way round? Many thanks.
@TheOmanzano8 жыл бұрын
there is less uncertainty in machine 2 because on "average" there will be fewer questions... meaning after many trials, on average 1.75 questions will be needed to get the right result, meaning there is less variety, randomness, chaos in machine 2, due to the fact that "A" will occur a lot more than the other letters
@hirakmondal61746 жыл бұрын
Think of it as a Hollywood film... where a police inspector interrogates a criminal who must speak the truth every time. After 175 questions the inspector found out that he knows no more than that, whereas when he interrogated the other criminal in an adjacent cell he found out that after asking 175 questions he can still answer 25 more.. Now U Tell Me Who Has More Information? . . . . U are welcome!! 8)
@Exhora6 жыл бұрын
HIRAK MONDAL That was a great example! Thank you so much!!!
@salrite6 жыл бұрын
Less uncertainty means your chances of predicting the outcome are higher, aka predictability increases, and hence less information: you don't require as many bits (say, to represent the outcome) as you would when uncertainty was high or the outcome was purely random. Make sense?
@rcgonzalezf5 жыл бұрын
I also have this question. I guess we need to define information in this context, for the answers and the video itself I think they're referring to data and as I posted below, data is different than information. More questions = more data to get the same information (the output), but I might be missing something.
@chandragupta28285 жыл бұрын
awesome video!
@hiteshjambhale3012 жыл бұрын
Hey, there is one mistake at timestamp 1:26..... the question should be "Is it C?" instead of "Is it B?"
@souvikmajumdar56046 жыл бұрын
Thank You
@jorgeleirana-alcocer56423 жыл бұрын
The equation at 3:48 should result in 2.25, not 1.75: (0.5*1)+(0.125*3)+(0.125*3)+(0.25*4) = 2.25. I think it should have been (0.5*1)+(0.125*3)+(0.125*3)+(0.25*2)
@jayrar66455 жыл бұрын
so just to clarify, is the reason the decision tree for machine B is not the same as for machine A that you ask fewer questions overall? and how do you ensure that the structure of the decision tree is such that it asks the minimum number of questions?
@ahmedelsharkawy14746 жыл бұрын
just awesome
@hingaglaiawong78152 жыл бұрын
at @3:15 I think there's a typo? The last term should be 0.25*2 instead of 0.25*4 I guess.
@juanpablovaca-lago56593 жыл бұрын
Is there a direct analogy between the second and third laws of thermodynamics and information entropy?
@AhmedKMoustafa26 жыл бұрын
great explanation bro :)
@mikibellomillo5 ай бұрын
note: number of bounces - entropy is maximum when all outcomes are equally likely. when you introduce predictability, the entropy must go down. thanks for sharing this video! God bless you!🎉
@ChusKon13 жыл бұрын
Beautiful
@sholi97185 жыл бұрын
can someone explain, in #bounces = p(a)x1 + p(b)x3 + p(c)x3 + p(d)x2 at 3:44, where the numbers 1, 3, 3 & 2 came from?
@achrafamrani27303 жыл бұрын
The 1 is the number of bounces needed to get to point A (steps needed for the disc to fall in case A). The two 3s are the numbers of bounces needed to get to points B and C separately, each equal to 3. And it takes 2 bounces to fall in D.
@shepbryan43155 жыл бұрын
Why is the number of bounces the log of the outcomes?
@monicarenas3 жыл бұрын
At 3:51, I guess there is a mistake: for p_D, the value should be 2 instead of 4, shouldn't it?
@alexhsia95105 жыл бұрын
What do they mean by number of outcomes? Can someone give me an example using the ABCD examples they used?
@temenoujkafuller47572 жыл бұрын
Yes, I asked myself this question and watched it twice. (5:45) Count the number of branches at the bottom: the number of final outcomes = 2^(number of bounces). Therefore, since the inverse function of the exponential is the logarithm, the number of bounces = the number of questions = log_2 (number of outcomes)
@btsandtxtloverstraykidzfan34862 жыл бұрын
What are some good books on this topic ?
@_crispins7 жыл бұрын
nice!
@FARHANSUBI16 күн бұрын
Can someone please explain why machine 2 is producing less information? Shouldn't it be more information because we're asking fewer questions? Or is it the more number of questions we ask, the more information we receive?
@kawaikaede22692 жыл бұрын
cool
@sanjayrakshit87975 жыл бұрын
Heckin' Shannon
@sanadarkia27245 жыл бұрын
can't we just ask one question? is it abc or d ? edit: nevermind, i just figured that 1 bit removes uncertainty of 1/2
@suliu29336 жыл бұрын
Great video! I can follow it but I have trouble understanding the problem statement. Why "the most efficient way is to pose a question which divides the possibility by half"?
@vandanachandola3224 жыл бұрын
Too late, but maybe because we're trying to ask the minimum no. of questions (and therefore going with the higher probability first)?
@ArthurShelby-y8s Жыл бұрын
👏🏻
@dissdad87448 жыл бұрын
Good explanation! If I wanted to calculate the entropy with log2, which calculator can do this? Is there an online calculator for this? What would be the best approach?
@ElectricChaplain7 жыл бұрын
Hans Franz Too late now, but log2 b = ln b / ln 2, or more generally log2 b = (log base a of b) / (log base a of 2).
@gustavomartins0075 ай бұрын
Very good
@최로봇4 жыл бұрын
if it makes us ask fewer questions, doesn't that mean it provides more information?
@assylblog5 жыл бұрын
Cool beat
@hirakmondal61746 жыл бұрын
Why is the number of outcomes 1/p?
@anirbanmukherjee45776 жыл бұрын
Probability of an outcome = 1 / number of possibilities, so the number of possibilities = 1/p
@betoib15046 жыл бұрын
Right on!
@samhe3316 ай бұрын
I think the math at 3:52 is wrong.. it should be 0.25 x 2 instead of 0.25 x 4, but the result, 1.75, is right
@FGNiniSun3 жыл бұрын
Hello, please, why does the number of outcomes at a level equal 1/probability?
@youssefdirani4 жыл бұрын
4:45 Markoff or Markov ?
@mansigupta7 жыл бұрын
An excellent excellent excellent video. I finally get it.
@abdelrahmangamalmahdy7 жыл бұрын
I don't understand why he always divides by 2 !!
@mansigupta7 жыл бұрын
Because the way the questions are framed, allow for only two possible answers - Yes or No
@YTBxd2275 жыл бұрын
still confused why #outcomes = 1/p_i
@CZRaS3 жыл бұрын
because you need to "build" a binary tree to simulate bounces. E.g. if you have probability p = 1/2 (50%), then outcomes = 1/(1/2) = 2. If you have p = 1/8 (12.5%), you get outcomes = 8. From that you can take the log2, which is basically the level of the value in the binary tree.
@pablobiedma5 жыл бұрын
So if I recall correctly, the one with the highest entropy is the least informative one, then the, if a machine generates symbols, and apply the formula for each symbol, which symbol provides the most information? the one with the least amount of bits? how does that make sense, isn't it the one with the highest amount of bits? calculated by p log( 1/p)
@zkhandwala4 жыл бұрын
Not to knock this, but I do want to voice an issue that I have with it and every other video I've found on the topic: They always use probabilities that are an integral power of 1/2, which greatly simplifies the explanation, but doesn't generalize well to understanding the majority of real-world scenarios, for which things are not adequately covered by this simplified exposition. I worry, then, that people come away thinking they understand the topic better than they actually do. Of course, I'm open to the perspective of others here...
@daihung38243 жыл бұрын
I agree with your statement. I had a go at changing the probabilities, say p(A)=0.45, p(B)=0.35, p(C)=0.15, p(D)=0.05. Then 1/p(D)=20, and log base 2 of 20 = approx 4.3; however, shouldn't the number of bounces for D still remain 3?
@MNKPrototype6 жыл бұрын
Did anyone else notice the DEATH NOTE music at 4:40.
@GiuseppeRomagnuolo4 жыл бұрын
I was wondering what that music was, I really like it. Following your comment I found this: kzbin.info/www/bejne/nnzJfIyml8Zjmqc. Is that it? If so, can you point me to the right minute? Tnx
@zainulabydeen28094 жыл бұрын
Can anyone explain how the answer becomes 3/2 in the solved example? Any help will be appreciated
@betbola52098 жыл бұрын
How do you calculate the entropy of a text? And what can we do with that?
@infernocaptures87392 жыл бұрын
4:36 **less** information?
@xxxxxx-wq2rd3 жыл бұрын
is it valid to say less entropy = less effort required?
@sammyz11284 жыл бұрын
Why can't we ask whether it is AB, for the second distribution, same as the first distribution?
@sammyz11284 жыл бұрын
OHH i get it. If we did that, the average number of questions we ask would be bigger
@mikechristian-vn1le Жыл бұрын
Language is a much more powerful invention than the alphabet, and written language -- Chinese and the Japanese syllabary don't use alphabets -- is more powerful than the alphabet. And written language includes numbers and mathematical symbols . . .
4 жыл бұрын
The Vietnamese subtitles at 4:34 are wrong; machine 2 produces less information than machine 1
@tag_of_frank4 жыл бұрын
Why are entropy and information given the same symbol H? And why does the information formula given in video 5 of the playlist include an "n" for the number of symbols transmitted, but this one doesn't?
@Apolytus7 ай бұрын
At 3:50 you have mistakenly written 0.25*4 instead of 0.25*2.