Shannon Entropy and Information Gain

209,996 views

Serrano.Academy

Comments: 336
@sphengle 5 years ago
This was exactly the baby step I needed to get me on my way with entropy. Far too many people try to explain it by going straight to the equation. There's no intuition in that. Brilliant explanation. I finally understand it.
@jankinsics 5 years ago
Sean Walsh, I feel the same way.
@user-or7ji5hv8y 4 years ago
How does one turn something so complicated into something so intuitive that others can finally see the picture? Your explanation is itself an amazing feat.
@josephbolton8092 4 months ago
an amazing teacher is an invaluable thing
@AlexMcClung97 7 years ago
Excellent explanation, very clear and concise! I have always pondered the significance of the log in cross-entropy loss function. The explanation (particularly: "products are small and volatile, sums are good") completely clears this up.
@effemmkay 4 years ago
I have been scared of delving into entropy in detail for so long because the first time I studied it, it wasn’t a good experience. All I want to say is THANK YOU!!!!!! I should have been supplementing the udacity ND lesson videos with these since the beginning.
@freemanguess8634 6 years ago
With great knowledge comes low entropy
@SerranoAcademy 6 years ago
Hahaaa, love it!!!
@fantomraja9137 5 years ago
lol
@hyperduality2838 4 years ago
@@SerranoAcademy Repetition (redundancy) is dual to variation -- music. Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle. Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics. Randomness (entropy) is dual to order (predictability) -- "Always two there are" -- Yoda.
@B2T7RID2QGLEHH5UZFB0T 3 years ago
And low entropy is easier to rig
@lani0 3 years ago
You win
@carnivalwrestler 6 years ago
Luis, you are such an incredibly gifted teacher and so meticulous in your explanations. Thank you for your hard work.
@RyanJensenEE 2 years ago
Good video! Minor correction of the calculations: at 5:50, the probability of getting the same configuration is 0.25. This is because there are only 4 possible configurations of the balls (there is only one blue ball and only four slots, so only 4 places the blue ball can be). This can also be calculated by selecting the red balls first and multiplying 0.75 * 0.66667 * 0.5 = 0.25. Similarly, at 6:58, the probability is 1/6 because there are 6 possible configurations. We can calculate it by multiplying (2/4) * (1/3) = (2/12) = (1/6) ~= 0.166667.
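For anyone who wants to verify those counts, here is a quick Python sketch. It assumes, as in the comment above, that every ordering of the balls is equally likely, so the probability of one specific ordering is 1 divided by the number of distinct arrangements:

```python
from math import comb

def prob_specific_order(n_red, n_blue):
    """Probability of one specific ordering when all orderings are equally likely."""
    n = n_red + n_blue
    arrangements = comb(n, n_blue)  # distinct ways to place the blue balls among n slots
    return 1 / arrangements

print(prob_specific_order(3, 1))  # 0.25       (4 arrangements)
print(prob_specific_order(2, 2))  # 0.1666...  (6 arrangements)
```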
@elmoreglidingclub3030 4 years ago
Excellent! Great explanation. Enjoyable video (except YT’s endless, annoying ads). Thank you for composing and posting.
@TheGenerationGapPodcast 3 years ago
Confession: I was a math kiddie; I knew how to use the math, but I often missed the deeper meaning and intuition. Your videos are turning me into a math hacker.
@Asli_Dexter 7 years ago
I wish I had this lecture during my college exams... still, it's nice to finally understand the intuition behind the formulas I already knew.
@pixboi 6 years ago
Teaching should be like this, from practice to theory - not the other way around!
@dyutinrobin 10 months ago
Thank you so much. This was the only video on YouTube that clarified all my doubts regarding the topic of entropy.
@NoOne-uz4vs 5 years ago
I'm studying Decision Tree (Machine Learning Algorithm) and it uses Entropy to efficiently build the tree. I finally understand the details. Thank you!!
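A rough sketch of how a decision tree scores a split with entropy is shown below; the class counts are made up purely for illustration, and the information gain is simply the parent's entropy minus the size-weighted entropy of the children:

```python
from math import log2

def entropy(counts):
    """Shannon entropy (in bits) of a class distribution given as counts."""
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c > 0)

def information_gain(parent_counts, children_counts):
    """Entropy of the parent minus the size-weighted entropy of the children."""
    n = sum(parent_counts)
    weighted_children = sum(
        (sum(child) / n) * entropy(child) for child in children_counts
    )
    return entropy(parent_counts) - weighted_children

# Hypothetical split: 10 positives and 10 negatives in the parent node,
# split into a (9, 1) child and a (1, 9) child.
print(information_gain([10, 10], [[9, 1], [1, 9]]))  # ~0.531 bits
```

The tree-building algorithm simply tries candidate splits and keeps the one with the largest gain.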
@drakZes 5 years ago
Great work. Compared to my textbook you explained it 100 times better, Thank you.
@jackallread A year ago
Thanks for the relationship between knowledge and entropy, that was very helpful. Your explanation of statistics is also good! Though I am only halfway through the video at this point, I will finish it! Thanks
@123liveo 6 years ago
2nd time I found this video and loved it both times. Much better description than the prof at the uni I am at!!!
@eprabhat 7 years ago
Luis, you have a great way of explaining. At times, I like your videos more than even some highly rated professors'.
@sdsa007 A year ago
Wow! Awesome. So many books, encyclopedias, and biographies of Shannon just to understand what you explained so clearly! Thank you!
@msctube45 4 years ago
I needed this video to get me up to speed on entropy. Great job Luis!
@Skandar0007 5 years ago
That moment when you realize you don't need to search for another video because you got it from the first time. What I'm trying to say is Thank You!
@ketlebelninja 5 years ago
This was one of the best explanations on entropy. Thanks
@sasthra3159 2 years ago
Great clarity. Have never got this idea about the Shannon Entropy. Thank you. Great work!
@Bvic3 6 years ago
At 13:44 it's not 0.000488 but 0.00006103515! There is a computation error. The entropy is correct: 1.75.
@SerranoAcademy 5 years ago
Thank you for the correction! Yes, you're right.
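For reference, a small sketch that reproduces both numbers. It assumes the example at that point in the video uses an 8-letter sequence drawn from probabilities {1/2, 1/4, 1/8, 1/8}, which also matches the 1.75 bits of entropy; the exact sequence below is a guess for illustration only:

```python
from math import log2

# Hypothetical reconstruction: an 8-letter sequence over {A: 1/2, B: 1/4, C: 1/8, D: 1/8},
# containing 4 A's, 2 B's, 1 C and 1 D.
probs = {"A": 1/2, "B": 1/4, "C": 1/8, "D": 1/8}
sequence = "AAAABBCD"

product = 1.0
for letter in sequence:
    product *= probs[letter]          # probability of this exact sequence

entropy = -sum(p * log2(p) for p in probs.values())

print(product)   # 6.103515625e-05, i.e. ~0.000061 (not 0.000488)
print(entropy)   # 1.75 bits per letter
```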
@SenhorMsandiFelipe 3 years ago
Thank you. Very clear, sir. I have been struggling to wrap my head around this and you just made it easy. Thank you.
@Johncowk 5 years ago
You made a mistake/approximation by saying the entropy is equal to the number of questions that need to be asked in order to find out which letter it is. If I take a scenario with only three letters, all equiprobable, the entropy is about 1.59, but the average number of questions needed to find the correct letter is about 1.66. Your presentation gives a great way to gain an intuitive feeling for entropy, but maybe you should include a small disclaimer on this point.
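A quick check of those two numbers, assuming three equally likely letters and the best yes/no strategy (first ask about a single letter, then about one of the remaining two):

```python
from math import log2

p = 1 / 3  # three equally likely letters

entropy = 3 * p * log2(1 / p)        # log2(3) ~= 1.585 bits
avg_questions = p * 1 + (2 * p) * 2  # 1 question if it's the first letter, else 2 -> 5/3 ~= 1.667

print(entropy, avg_questions)
```

So the entropy is a lower bound on the average number of questions; the two only coincide when every probability is a power of 1/2.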
@victorialeigh2726 3 years ago
Hi Luis, stupendous, spectacular, excellent!
@mau_lopez 6 years ago
What a great explanation! I wish I had a teacher like you, Luis, everything would be way easier! Thanks a lot
@patricklemaire225 6 years ago
Great video! Now I understand what Claude Shannon discovered and how useful and essential maths are in Computer Science.
@AJK544 4 years ago
Your explanation is perfect. Even though I am not good at listening to English, I can understand everything :)
@dianafarhat9479 10 months ago
Can you make a part 2 with the full proof, not just the intuition behind the formula? Your explanation's amazing & would love to see a part 2.
@jordyb4862 A year ago
I find sum(p*log(p^-1)) more intuitive. Inverse p (i.e. 1/P) is the ratio of total samples to this sample. If you ask perfect questions you'll ask log(1/p) questions. Entropy is then the sum of these values, each multiplied by the probability of each, which is how much it contributes to the total entropy.
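A tiny sketch showing that this form and the usual minus-sign form compute the same thing (the distribution below is just an example):

```python
from math import log2

p = [0.5, 0.25, 0.125, 0.125]  # any probability distribution works here

h_inverse = sum(pi * log2(1 / pi) for pi in p)  # sum of p * log2(1/p)
h_minus = -sum(pi * log2(pi) for pi in p)       # the usual -sum of p * log2(p)

print(h_inverse, h_minus)  # both 1.75
```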
@hanaelkhalifa2630 4 years ago
Thank you for the excellent explanation of the entropy concept first, and then reaching the final equation step by step. It is a really good and simple approach.
@shekelboi 6 years ago
Thanks a lot Luis, just had an exam about this Wednesday and your video helped me a lot to understand the whole concept.
@haimmadmon3531 4 years ago
Very good explanation - hope to hear more of your videos
@Vuvuzella16 4 years ago
This video is helping to keep me floating in my Data Science course; thank you so much for your time!
@therealsachin 7 years ago
The best explanation about Shannon entropy that I have ever heard. Thanks!
@kleberloayza7839 5 years ago
Hi Luis, nice to meet you. I am reading the Deep Learning book by Ian Goodfellow, and I needed to watch your video to understand chapter 3.13, Information Theory. Thanks very much.
@mehmetzekeriyayangn3782 5 years ago
You are the best. Such a great explanation, better than lots of textbooks.
@poxyu_was_here 7 years ago
Easy and Great explanation! Thank you very much, Luis
@mulangonando2942 A year ago
I love the explanation of the negative sign in the entropy equation that many people wonder about.
@JohnsonChen-t9r 5 years ago
It's very helpful for me to introduce the concept of entropy to students. Thank you for your clear presentation of entropy.
@TheZilizopendwa 3 years ago
Excellent presentation for an otherwise complex concept.
@SixStringTheory6 7 years ago
Wow ..... I wish more people could teach like you this is so insightful
@eka2213 5 years ago
So, after watching the video, the entropy of whether I would give you a thumbs up and subscribe to your channel was 0 - i.e., great explanation!
@amperro 3 years ago
I watched it straight through. Very good.
@christinebraun9610 5 years ago
Great explanation. But I think what’s still missing is an explanation of why we use log base 2....didn’t quite get that
@olivercopleston 5 years ago
In the last minute of the video, he explains that using log base 2 corresponds to the depth of a decision tree, which is the number of yes/no questions you'd have to ask to determine a value.
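To make that concrete, a small sketch assuming equally likely values: a balanced yes/no tree over N equally likely options has depth log2(N) (for N a power of two), which is also the entropy of the uniform distribution over N symbols:

```python
from math import log2

def questions_needed(n_values):
    """Depth of a balanced yes/no question tree over n equally likely values."""
    questions = 0
    remaining = n_values
    while remaining > 1:
        remaining = (remaining + 1) // 2  # each question halves the candidates
        questions += 1
    return questions

for n in (2, 4, 8, 16):
    print(n, questions_needed(n), log2(n))  # for powers of two, depth equals log2(n)
```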
@MatheusSilva-dragon 6 years ago
Wow, thank you, man. I needed that information! There are many ways to teach the same stuff! That number-of-questions idea is great! It's good to have more than one way to measure something!
@clarakorfmacher7394 4 years ago
Great video! I really liked the intuitive approach. My professor's was waaaay messier.
@rajudey1673 4 years ago
Really, you have given us outstanding information.
@RenanCostaYT 4 years ago
Great explanation, greetings from Brazil!
@hanaizdihar4368 4 years ago
What a great explanation! And so i subscribed😊
@subhashkonda5000 7 years ago
It's always hard to understand the equations, but you made it so simple :-)
@patriciof.calatayud9861 3 years ago
I think the Huffman compression used at the end of the video gets close to the entropy value but is not exactly the same.
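That's true in general. The sketch below builds a Huffman code with the standard construction (not necessarily the exact code from the video) and compares its average length to the entropy; the two coincide only when all probabilities are powers of 1/2, and otherwise the Huffman average is slightly larger but still within one bit:

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Code length (in bits) per symbol from the standard Huffman construction."""
    heap = [(p, [sym]) for sym, p in probs.items()]
    heapq.heapify(heap)
    lengths = {sym: 0 for sym in probs}
    while len(heap) > 1:
        p1, syms1 = heapq.heappop(heap)
        p2, syms2 = heapq.heappop(heap)
        for sym in syms1 + syms2:  # every merge adds one bit to these symbols' codes
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, syms1 + syms2))
    return lengths

def compare(probs):
    lengths = huffman_lengths(probs)
    avg_len = sum(p * lengths[sym] for sym, p in probs.items())
    entropy = -sum(p * log2(p) for p in probs.values())
    print(f"entropy={entropy:.4f} bits, Huffman average={avg_len:.4f} bits")

compare({"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125})  # equal: 1.75 vs 1.75
compare({"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1})       # Huffman slightly longer
```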
@aryamahima3 4 years ago
Thank you so much for such an easy explanation... respect from India...
@karinasakurai9867 5 years ago
Brilliant lecture! I learn so much with this explanation. Thanks from Brazil :)
@Dennis12869 5 years ago
Best explanation I found so far
@Darnoc-sudo 3 years ago
Very nice video. Insightful, intuitive, and very well explained. Thank you!
@tilugulilwa 4 years ago
Superb step by step explanation
@MH_HD 6 years ago
This is the best explanation I have come across in a long time. Can you please explain how we can use entropy to find the uncertainty of a naive Bayesian classifier with, let's say, 4 feature variables and a binomial class variable?
@pkittali 7 years ago
Lovely explanation...Superb
@YoussefAhmed-uv7ti 5 years ago
Actually, there is something wrong here. Entropy and information in information theory represent the same thing: how much information we get after decoding the random message. So in the case of the balls in the box, if all are the same color we get no information after decoding the message, since the probability of red is 1, hence low entropy and low information.
@emrahyener402 3 years ago
Thanks for this perfect explanation 👏👏👏👍
@nijunicholas631 5 years ago
Thanks..Got the intuition behind Entropy
@KayYesYouTuber 4 years ago
Superb explanation. I like your teaching style. Thank you very much :-)
@ravikumar376 A year ago
Sir, good explanation, thank you very much. But in sequence 3, how do you get 8/8 log2? The 1/4 result is 2.
@logosfabula 7 years ago
Luis, you really are a great communicator. Looking forward to your other explanations.
@francismcguire6884 5 years ago
Best instructor there is! Thanks
@kingshukbanerjee748 6 years ago
very lucid explanation - excellent, intuitive build-up to Shannon's theorem from scratch
@kasraamanat5453 3 years ago
best, as always ❤️ thank you Luis❤️
@nassimbahri 5 years ago
For the first time in my life i understand the real meaning of the Entropy
@amitkumarmaiti5392 4 years ago
Great Intuition Luis
@amatya.rakshasa 3 years ago
Is there a construction, characterization, or description of how to ask the smartest questions every time?
@user-or7ji5hv8y 4 years ago
Wow, another great and insightful presentation. Really helps to build intuition.
@VC-zo9mt 3 years ago
I know this may be easier for others to understand, but could you show an explanation of the actual symbols of this formula and an example with numbers plugged in, to see which numbers go where? I am not familiar with log other than that it's related to exponents. The minus aspect of it is also unfamiliar.
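As a worked example with numbers plugged in, assuming the bucket from the video with 3 red balls and 1 blue ball (so p(red) = 3/4 and p(blue) = 1/4): the minus sign is there only because the log of a probability is negative, so it flips the sum back to a positive number of bits.

```python
from math import log2

p_red, p_blue = 3/4, 1/4   # assumed bucket: 3 red balls and 1 blue ball

# H = -(p_red * log2(p_red) + p_blue * log2(p_blue))
h = -(p_red * log2(p_red) + p_blue * log2(p_blue))

print(log2(p_red), log2(p_blue))  # -0.415..., -2.0 (logs of probabilities are negative)
print(h)                          # 0.811... bits of entropy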
@themightyquinn100 2 years ago
At 13:34 the product does not equal 0.000488. It is approximately 0.000061035. You are missing the last 1/8 factor.
@cariboux2 3 years ago
Luis, thank you so much for this brilliant elucidation of information theory & entropy. Merely as an avocation, I have been toying around with a pet evolutionary theory about belief systems and societies. In order to test it - if that is even possible - I felt I needed to develop some sort of computer program as a model. Since I have very little programming experience and only mediocre math skills, I have been teaching myself both (with a lot of help from the web). It was purely by accident that I stumbled upon Claude Shannon and information theory, and I immediately became fascinated with the topic, and have a hunch that it may somehow be relevant to my own research. Regardless, I am now interested in it for its own sake. I had an ephemeral understanding of how all the facets (probability, logs, choices, etc.) were related mathematically, but it wasn't until after watching your video that I believe I fully grok the concept. At one point early on, I found myself shouting, "If he brings up yes/no questions, I know I understand this!" And then you did. It was such a wonderful moment for someone who finds math so challenging, and it is greatly appreciated! I shall check out your other videos later. You're a very good teacher!
@Faustus_de_Reiz 2 years ago
For your work, I would look into some of the work by Loet Leydesdorf.
@cariboux2 2 years ago
@@Faustus_de_Reiz Thank you! I shall.
@scherwinn 6 years ago
Very clever explanation of mighty ENTROPY.
@justinphilpott 2 months ago
Great video, thanks!
@xThomas1995 4 years ago
Thank you for the very good video. Easiest to understand so far.
@yhat314 6 years ago
Lovely job Luis! Very very good!
@namename6435 5 years ago
Your explanation was crystal clear. If possible, share some real examples from data mining where entropy and the Gini index are used.
@paulstevenconyngham7880 6 years ago
this is a really great explanation, thanks so much for sharing mate!
@carlitos5336 4 years ago
Excellent explanation! Thanks for sharing it.
@micahdelaurentis6551 3 years ago
you killed it. Great video
@sosoboy77 5 years ago
Best video this week
@jaeimp 2 years ago
Excellent job, Luis! Plain and simple: the log base 2 gives the number of bifurcations to arrive at the answer, and the probability of the answer serves to temper down the chaos introduced into the system by very rare events. Genius!
@rolfbecker4512 4 years ago
Thank you very much for this beautiful and clear explanation!
@bismeetsingh352 5 years ago
That was highly intuitive, thank you, sir, I appreciate the effort behind this.
@蔡小宣-l8e 3 years ago
Many thanks! Thank you very much, Luis.
@shakeelurrahman1846 A year ago
thanks a lot for such a beautiful explanation..!
@scottsara123 5 years ago
Easy and excellent explanation. Please do one for loss and cost functions as well (convex).
@hyperduality2838 4 years ago
Syntropy is dual to increasing entropy -- The 4th law of thermodynamics! Thesis is dual to anti-thesis -- The time independent Hegelian dialectic. Schrodinger's cat: Alive (thesis, being) is dual to not alive (anti-thesis, non being) -- Hegel's cat. Syntropy is the process of optimizing your predictions to track targets or teleological physics. Teleological physics (syntropy) is dual to non teleological physics (entropy, information).
@Omsip123 A year ago
Very well explained, thank you
@paulinagc6986 3 years ago
Thank you so much. You are such a good teacher, really :D :D :D
@jonathanfrancis 4 years ago
Wow. Amazing video.
@bhupeshrao2359 4 years ago
As you said, the game is to get red, red, red, blue. But in the first case we have all reds, hence the probability of winning in the first case should be 1*1*1*0 (the probability of blue in the first bucket). How did you calculate 1*1*1*1? Please explain.
@miguelfernandosilvacastron3279 5 years ago
Thank you. Nice, concise explanation.
@meshackamimo1945 6 years ago
Hi. Thanks a million times for simplifying a very complicated topic. Kindly find time and post a simplified tutorial on MCMC (Markov chain Monte Carlo)... I am overwhelmed by your unique communication skills. God bless you.
@TheGenerationGapPodcast 3 years ago
Help us smash Markov chain Monte Carlo
@YugoGautomo 5 years ago
Hi Luis, thanks for your explanation. I guess you're wrong at minutes 6:29 and 7:41. I think the probability of winning for bucket 1 should be 0, since there are no blue balls in the bucket and the expected outcome of the game should be R, R, R, B. Am I right?
@mortezaaslanzadeh3735 4 years ago
Awesome explanation, you rock