Entropy (for data science) Clearly Explained!!!

632,664 views

StatQuest with Josh Starmer

1 day ago

Comments: 1,300
@statquest 2 years ago
If you'd like to learn more about Entropy, and for more details about why the log is used, check out the original manuscript: people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf Support StatQuest by buying my book, The StatQuest Illustrated Guide to Machine Learning, or a Study Guide or Merch!!! statquest.org/statquest-store/
@yakubsadlilseyam5166 2 years ago
Sir, have you included entropy in your book? I couldn't find it
@statquest 2 years ago
@@yakubsadlilseyam5166 No, it's not in the book because, while it's nice to know about, it's not essential since there are other, easier to understand options that you can use.
@sandeepgill4282 1 year ago
@@statquest Could you please provide sources to gain an understanding of all types of entropies?
@statquest 1 year ago
@@sandeepgill4282 en.wikipedia.org/wiki/Entropy_(information_theory)
@rikodewaner 1 year ago
It's a helpful introduction, Josh....👍
@luischinchilla-garcia4840 3 years ago
This is quite possibly the best explanation of entropy I've ever seen. This is even better than Shannon's own paper!
@statquest 3 years ago
BAM! :)
@DarXPloit 3 years ago
@@statquest Big Bang BAM
@SaffaGains 3 years ago
@@statquest DOUBLE BAM
@adipurnomo5683 3 years ago
When did Shannon write that paper?
@statquest 3 years ago
@@adipurnomo5683 I believe it was 1948
@felixvanderspek1293 3 years ago
"Simplicity is the ultimate sophistication." - Leonardo da Vinci. Thanks for explaining in such simple terms!
@statquest 3 years ago
My pleasure!
@sade922 2 years ago
9 minutes of your video explained everything better than 2 hours of my professor giving a lecture... Thank you!!!
@statquest 2 years ago
Glad it helped!
@Kerem-Ertem 1 year ago
Make it 3 hours, man
@PunmasterSTP 8 months ago
How'd the final go?
@eduardoh.m2072 3 years ago
You, sir, are the very first person to actually explain this subject and not just repeat some random definition without giving any thought to it. I'm amazed by the number of people who confuse rambling on about the topic with actually explaining it. Thank you!
@statquest 3 years ago
Thanks!
@dvo66 3 years ago
Best entropy explanation. I took a 500-level ML class last spring in my master's, and this is a better explanation than my prof's (no disrespect to him, he is amazing too).
@statquest 3 years ago
Thanks!
@deepakmehta1813 3 years ago
Amazing video on Entropy Josh. Thank you. I am certainly more addicted to statquest than Netflix. I really liked the way you have introduced the notion of surprise, how you used it to pedagogically explain entropy. It is certainly now easy to think and remember the definition of entropy.
@statquest 3 years ago
Awesome, thank you!!!!
@shahf13 3 years ago
@@statquest As a heavy YouTube addict with 40 sub-channels, you are the only one I put a bell on
@statquest 3 years ago
@@shahf13 BAM! :)
@GokulAchin 3 years ago
Please give Josh a Nobel Prize for not getting a single dislike on many of his videos and for his contribution to the ML and stats community. I have to forgive myself for not finding this channel earlier, when I first got interested in data science. You are definitely inspiring me to teach many people the same content you taught us.
@statquest 2 years ago
Thank you!!! However, this video actually has 13 dislikes. For some reason YouTube no longer shows the number of dislikes. However, with 3,402 likes, that means 99.6% of the people like this video, which is pretty good.
@GokulAchin 2 years ago
@@statquest DOUBLE BAM!!!!
@buihung3704 11 months ago
@@statquest The entropy, or the expected surprise when you randomly pick a like from your reaction pool, is pretty low :)))
@statquest 11 months ago
@@buihung3704 bam!
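To put numbers on the joke above: treating the 3,402 likes and 13 dislikes quoted in this thread as the reaction pool, a quick Python sketch (the counts are just the ones mentioned above) shows how tiny the expected surprise is:

```python
import math

def entropy(probs):
    # Shannon entropy in bits: the expected surprise, sum of p * log2(1/p)
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

likes, dislikes = 3402, 13               # counts quoted earlier in this thread
total = likes + dislikes
p_like, p_dislike = likes / total, dislikes / total

print(f"P(like) = {p_like:.4f}")                                        # ~0.9962
print(f"Surprise of a like    = {math.log2(1 / p_like):.4f} bits")      # ~0.0055
print(f"Surprise of a dislike = {math.log2(1 / p_dislike):.4f} bits")   # ~8.04
print(f"Entropy = {entropy([p_like, p_dislike]):.4f} bits")             # ~0.036
```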
@matthewfox1561 10 months ago
kzbin.info/www/bejne/i33GZ61petiLi9Usi=wtYPowFO-N_iDqdS bazinga
@hemanthhariharan5105 1 year ago
I'm truly amazed by the power of simplicity and intuition. Hats off, Josh!
@statquest 1 year ago
Thank you!
@saidisha6199 1 year ago
One of the best explanations of entropy. I had been struggling with this concept for a while, and there was no intuitive way I could understand and remember the formula; your video made it possible. Great video!
@statquest 1 year ago
Glad it helped!
@AtiqurRahman-uk6vj 2 years ago
Your self-promotion is not shameless, it's a gift to humanity. Free content that explains things way better than paid content on Coursera. Thanks for helping out poor guys like us, Josh
@statquest 2 years ago
Thank you very much! :)
@pietronickl8779 1 year ago
so true
@bagavathypriya4628 2 years ago
You are the BEST teacher!! Thanking God that you exist.
@statquest 2 years ago
Wow, thank you!
@danielpaul65 2 years ago
Starting the video with a message declaring that we can understand Entropy is the best starting line I have ever seen from any teacher in my life. Great work!!!
@statquest 2 years ago
Thank you very much! :)
@wolfgangi 3 years ago
I freaking love these videos, Josh has a gift for explaining things so vividly
@statquest 3 years ago
Thank you very much! :)
@emmydistortion3997 2 years ago
Awesome world-class teaching... Thank you!
@statquest 2 years ago
Thanks!
@CellRus 2 years ago
Absolutely amazing. I always come back to your videos from time to time for simple (but absolutely useful) explanations of complicated concepts that I find in papers. They have all helped me a lot, and I feel I'm better at communicating these concepts to other researchers too.
@statquest 2 years ago
Hooray!!! I'm glad the videos are so helpful.
@felixlaw8377 10 months ago
Being able to derive entropy and show it to us simply, in a funny way, is just mind-blowing... Hats off to you, sir!!
@statquest 10 months ago
Thanks!
@gustavorm5686 1 year ago
The best explanation of entropy I've seen, after browsing through tens of videos. Well done, prof!!
@statquest 1 year ago
Wow, thanks!
@DanishArchive 1 year ago
I was awestruck when I finally understood what on earth entropy is. In most algorithms, I hear entropy must be low, and I felt it was some weird value the model gives that we have to tune down. But now, sitting here, watching this video felt like an eye-opener. What a simple and beautiful way to explain complicated concepts. You truly are amazing, Josh!!! Super BAMMM!!
@statquest 1 year ago
Thank you!
@loay1844 1 year ago
Wow, I'm sooo impressed. It's been a week of trying to understand entropy, and I really thought I was never going to get it. This video is arguably the best video on YouTube! Not just on entropy, but overall!! Thank you so much
@statquest 1 year ago
Glad I could help!
@DubZenStep 2 years ago
The world needs an army of people like you, man. This explanation is outstanding. A triple bam.
@statquest 2 years ago
Thank you!
@OwenMcKinley 3 years ago
😊😊 15:45 hahaha "psst… the log of the inverse of the probability..." Josh, this was a fantastic tutorial. Love how I can just wake up and see content like this fresh in my YT recommendations. We all appreciate it
@statquest 3 years ago
Thank you very much! :)
@jeffnador9594 3 years ago
You can also say "negative log of the probability". Since 1/(x^c) = x^(-c), taking c = 1 gives log(1/x) = log(x^(-1)) = -log(x)
@statquest 3 years ago
@@jeffnador9594 Yep. But, to me, that form makes it just a little bit harder to see what's going on.
@jeffnador9594 3 years ago
@@statquest Agreed! But the more you can keep people guessing, the higher the surprise value of the statement...
@lbognini 2 years ago
@@jeffnador9594 The point is to get an intuition and derive the formula, not to manipulate mathematical terms.
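For anyone who wants to check the identity discussed in this thread numerically, here is a tiny Python sketch (using base-2 logs) showing that log2(1/p) and -log2(p) give the same surprise values:

```python
import math

# Two equivalent ways of writing Surprise: log(1/p) and -log(p).
def surprise_inverse(p):
    return math.log2(1 / p)

def surprise_negative(p):
    return -math.log2(p)

for p in [0.9, 0.5, 0.1, 0.01]:
    a, b = surprise_inverse(p), surprise_negative(p)
    print(f"p = {p:>4}: log2(1/p) = {a:.4f}, -log2(p) = {b:.4f}")
    assert math.isclose(a, b)  # the two forms always agree
```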
@bushraw66 7 months ago
I can't believe this guy made entropy fun and understandable. The intro song really lowered my anxiety about passing my exam. Thank you so much for your content
@statquest 7 months ago
Hooray! :)
@michaelgeorgoulopoulos8678 3 years ago
The inverse probability is a much better way of putting it than the minus sign. It was in front of me all this time and I didn't notice. Thank you!
@statquest 3 years ago
Thanks!
@thegt 1 year ago
Simply amazing... I have been using CrossEntropy for months and only now do I understand where the word Entropy in CrossEntropy comes from
@statquest 1 year ago
bam! :)
@shamshersingh9680 1 year ago
How can it be!! How can you simplify such complex topics into such simple explanations? Hats off, man. I seriously wish I could have had a maths teacher like you back in school. I have become a fan of your videos. Your videos are the first and last stop for all my doubts. Thanks, Josh. You are a boon to learners like us. Impressed.
@statquest 1 year ago
Thank you! :)
@sammitiyadav6914 9 months ago
I follow most of your videos; not sure how I missed this gold! This is just the best entropy video I've ever seen.
@statquest 9 months ago
Wow, thanks!
@pbawa2003 2 years ago
This is the simplest way to explain entropy. Way to go, Josh. Love your videos!!!
@statquest 2 years ago
Glad you like them!
@anibaldk 2 years ago
A priceless channel for anyone interested in statistics. Just BRILLIANT.
@statquest 2 years ago
Thank you!
@korkutkaynardag9147 2 years ago
After my parents, I love you the most in the world.
@statquest 2 years ago
bam!
@RESPECT-bu1fr 4 months ago
@korkutkaynardag9147 what about God
@lchandalier7976 1 month ago
@@RESPECT-bu1fr bro wtf
@berryesseen 18 days ago
The more common name for surprise in information theory is "information". People like to use the notation i(X) or \imath(X) for it. So, i(X) = log(1/P_X(X)). Its expected value is the Shannon entropy H(X). The study of this random variable plays a critical role in source coding (the first paper on this is Shannon's original paper). Let me also explain why we have the logarithm of 1 over the probability and not an arbitrary decreasing function. The reason lies in the operational meaning of description lengths. Let's say that we have independent random variables X_1 and X_2 with corresponding description lengths L_1 and L_2. The probability of (X_1, X_2) is P(X_1) * P(X_2). The description length of (X_1, X_2) should be L_1 + L_2. Because the logarithm is the only function that converts multiplication into summation, it has to appear in the "information" function. Shannon's source coding theorem says this: if I have n copies of a random variable X (which can be anything), I can represent any possible outcome using about n * H(X) coin flips with a probability approaching 1 as n approaches infinity.
@statquest 18 days ago
Thanks!
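A small Python sketch of the idea in that comment, using a biased coin whose 0.9/0.1 probabilities are assumed here purely for illustration: the information (surprise) of each outcome is log2(1/p), the entropy is its expected value, and the assert at the end checks the additivity property described above.

```python
import math

def information(p):
    # Surprise / self-information of an outcome with probability p, in bits
    return math.log2(1 / p)

def entropy(dist):
    # Shannon entropy H(X): the probability-weighted average of the surprise
    return sum(p * information(p) for p in dist.values() if p > 0)

# A biased coin, with probabilities assumed only for illustration
coin = {"heads": 0.9, "tails": 0.1}

for outcome, p in coin.items():
    print(f"{outcome}: p = {p}, surprise = {information(p):.3f} bits")
print(f"Entropy H(X) = {entropy(coin):.3f} bits")   # ~0.469 bits

# Additivity for independent outcomes: the surprise of the pair is the sum of
# the surprises, because the log turns multiplication into addition
p1, p2 = 0.9, 0.1
assert math.isclose(information(p1 * p2), information(p1) + information(p2))
```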
@dylansatow3315 1 year ago
Wow, this was amazing. I've never seen entropy explained this clearly before
@statquest 1 year ago
Glad to hear it!
@viethoalam9958 7 months ago
This is so smooth and easy to understand, connecting "surprise", an emotion, with a mathematical theory and a number.
@statquest 7 months ago
:)
@Shionita 3 years ago
I feel so happy because I just learned something new, thanks as always!! 😁
@statquest 3 years ago
I'm so glad!
@marcoventura9451 3 years ago
me too!!!
@ramwisc1 5 months ago
Wow - this is the best explanation of entropy that I've ever seen. My light bulb turned on when I saw 1/probability, and the transformation into log(1/probability) sealed things nicely!
@statquest 4 months ago
BAM! :)
@shahedmahbub9013 3 years ago
Thanks for all your efforts in creating a smart, funny and, most importantly, CLEAR explanation. This was awesome.
@statquest 3 years ago
Glad you liked it!
@tomasalmeida5306 1 year ago
My surprise after getting one heads is always 100%. Great video, very helpful, thanks!
@statquest 1 year ago
Thanks!
@tranquil123r 3 years ago
Loved it. The best explanation I've come across on entropy. Thanks, Josh!
@statquest 3 years ago
Glad you enjoyed it!
@tomashernandezacosta9715 2 years ago
This is THEE single BEST explanation of entropy that I have ever heard. After this video I bought your book instantly. TRIPLE BAM!!
@statquest 2 years ago
Wow! Thank you very much for your support!
@OlehPopeskul 1 year ago
This is such a fascinating video. I'm learning the theory of ML, and I can certainly say you are a gifted person. Your perfect understanding of probability, math and ML gives you the ability to explain them in the best way in the entire world. I'm amazed by your explanation skills
@statquest 1 year ago
Wow, thank you!
@forresthu6204 2 years ago
This is the BEST explanation of entropy I have come across.
@statquest 2 years ago
Thank you!
@eliyahubasa9401 3 years ago
Thanks, I'd been waiting for a good explanation of entropy for a long time. Thanks :)
@statquest 3 years ago
Thank you!
@ichimatsu8422 2 years ago
The absolute GOAT when it comes to stats on YouTube
@statquest 2 years ago
Thank you!
@nataliatenoriomaia1635 3 years ago
Awesome as always, Josh! Thank you for continuing to share high-quality content with us. You're a very talented teacher. I wish you all the best!
@statquest 3 years ago
Thank you very much! :)
@chaitu2037 2 years ago
This is by far the best explanation of entropy that I have ever come across. Thanks so much!
@statquest 2 years ago
bam!
@kevalan1042 2 years ago
You had me at "let's talk about chickens"
@statquest 2 years ago
bam! :)
@devrus265 2 years ago
This is by far the best explanation I have heard on entropy.
@statquest 2 years ago
Thank you!
@TheParkitny 1 year ago
If only textbooks explained things this way. Life would've been easier as an undergrad.
@statquest 1 year ago
:)
@Kerem-Ertem 1 year ago
Entropy is kind of complicated, especially in data science, and this explanation was pure. Thanks!
@statquest 1 year ago
Glad you enjoyed it!
@ceseb23 3 years ago
Hello, thanks for this video, it's really helpful as always :D. Quick question: why not use Surprise = 1 - P(x), since it scales inversely with the probability and the surprise of a sure event is 0, as required?
@statquest 3 years ago
Maybe it doesn't make sense for Surprise to be 1 when the probability is 0.
@AAA-tc1uh 2 years ago
@@statquest I would expect a deeper answer than that, as the [0,1] range can be scaled with any constant to give any large-enough surprise value to probability 0. It's just that the function would be linear now.
@statquest 2 years ago
@@AAA-tc1uh Well, then you're stuck with figuring out what that constant should be. Infinity? But that kind of opens another can of worms, because anything times infinity is infinity. Thus, another advantage of using the log function is that the limit as x goes to zero is -infinity.
@AAA-tc1uh 2 years ago
@@statquest Sure, I understand, but my rebuttal would be: we already use the [0,1] range for the probability distribution, with 0 = "would never happen" and 1 = "always happens" (not entirely correct, I know, e.g. continuous distributions), so in the same way we could treat a Surprise value of 0 as "no surprise at all" and 1 as "maximum surprise". And we have a nice, well-behaved range with no infinities or undefined behavior. Skimming Shannon's original paper, I see he argues for the use of the logarithmic function in the opening paragraphs but never provides a deep reason other than convenience and practicality in engineering usage (another point for the linear function suggested above). Edit: the real reason is the characterization of such a function, see en.wikipedia.org/wiki/Entropy_(information_theory)#Characterization, which is apparently only satisfied by the entropy function in this form, using logarithms, as proved by Shannon.
@statquest 2 years ago
@@AAA-tc1uh Nice!
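One way to see the thread's conclusion concretely: the linear candidate 1 - P(x) does decrease with probability, but it doesn't add up across independent events the way log(1/P(x)) does. A minimal Python sketch with illustrative probabilities:

```python
import math

def surprise_log(p):
    return math.log2(1 / p)   # the definition used in the video

def surprise_linear(p):
    return 1 - p              # the candidate suggested in this thread

p1, p2 = 0.5, 0.25            # two independent events (illustrative values)
p_both = p1 * p2              # probability that both happen

# With the log version, the surprise of "both" equals the sum of the surprises:
print(surprise_log(p_both), surprise_log(p1) + surprise_log(p2))          # 3.0 3.0

# With the linear version it does not (0.875 vs 1.25):
print(surprise_linear(p_both), surprise_linear(p1) + surprise_linear(p2))
```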
@bhuriwataunguranun6371 1 day ago
The best ML course I've ever seen.
@statquest 21 hours ago
Thank you!
@lan30198 3 years ago
Fuck, I never understood entropy before watching this video. You are amazing
@statquest 3 years ago
bam!
@lan30198 3 years ago
@@statquest double bam!!!
@varuntejkasula748 1 year ago
Absolutely clear. Can't expect a clearer explanation than this
@statquest 1 year ago
Glad you think so!
@TuNguyen-ox5lt 3 years ago
This is definitely the most intuitive way to really grasp the idea of entropy. You're just wonderful. Thank you so much
@statquest 3 years ago
Thank you very much! :)
@naven084k 5 months ago
The best explanation I have ever seen. Thanks, Josh. I won't forget this now. Keep teaching the people ❤
@statquest 5 months ago
Thank you! Will do!
@sidd1454 1 year ago
This has got to be the best video made in the history of YouTube for me. I don't care about others.
@statquest 1 year ago
Thanks!
@nirmithrjain6265 1 year ago
Seriously, you are the best teacher I have ever had
@statquest 1 year ago
Thank you!
@jiaweizhang6189 3 years ago
I have studied ML for a long time, but I never clearly knew what cross-entropy or entropy was until now. This is the best explanation of entropy!
@statquest 3 years ago
Thank you! :)
@nikhilgupta4859 2 years ago
Hey Sir, I have been your subscriber for the past 1.5 years, and I feel honoured to tell you that after following you I finally made a job transition to senior data scientist at an MNC 6 months back. Now I understand the data science project ecosystem in my company. You are one of the contributors to my success. Thanks a ton!!!!! I would also like to help other learners, so learners, you can tag me with any doubts. I would be more than happy to help you.
@statquest 2 years ago
Congratulations!!! TRIPLE BAM!!! I'm so glad the videos were helpful and that you are willing to help others as well! HOORAY! :)
@nikhilgupta4859 2 years ago
@@statquest Thank you Sir!! Bamm :)
@aryanafraz 1 year ago
I barely comment on YouTube videos, BUT SERIOUSLY, your whole channel is the best channel I have ever seen on YouTube.
@statquest 1 year ago
Thank you very much! :)
@crackedatcurry 5 months ago
This man deserves a prize for how well he taught this. BAMMMM!!!!
@statquest 5 months ago
Thanks!
@michaelzavin969 2 years ago
Just wow! I've watched my prof's lecture (1.5 h long) 3 times and did not understand anything, and here you come with a 15-minute video and BAM and medium BAM!! and I finally got it. THANK YOU!!!
@statquest 2 years ago
BAM!
@jsebdev1539 1 year ago
I'm so happy this channel exists! hurray!!!
@statquest 1 year ago
Thank you! :)
@shaikhmoin849 2 months ago
I have never seen this much clarity! Thank you so much
@statquest 2 months ago
Thanks! :)
@ankitchakraborty4552 1 year ago
Arguably the best mathematics YouTuber of our generation
@statquest 1 year ago
Aww! Thank you very much! :)
@mohammedlabeeb 2 years ago
Really great video, right to the point. I met with one of my coworkers, who is very seasoned in data science, to help me work on a project and use entropy for the first time. After one hour I was as confused as could be. But this video really helped. I wish I had seen it before my meeting.
@statquest 2 years ago
Glad it was helpful!
@muskanmahajan04 2 years ago
By far the best explanation I've seen. You are a true saviour.
@statquest 2 years ago
Thank you!
@RoRight 3 years ago
I was NOT surprised by the high quality of this video, given StatQuest's high probability of producing awesome videos.
@statquest 3 years ago
Bam!
@cls8895 3 years ago
WOW, it's SUPER EASY and well explained!! I had only known about entropy in physics, but now I can see how to calculate it. THANK YOU for your hard work making it easy to understand, from S. Korea!
@statquest 3 years ago
Hooray! I'm glad the video was helpful! :)
@ayuktverma6367 2 months ago
One of the best channels for explaining statistics concepts ..........❤❤
@statquest 2 months ago
Thank you!
@chenmarkson7413 10 months ago
You might like to know that I am sharing this video with my whole class of CSC311 Introduction to Machine Learning at the University of Toronto. You are doing phenomenal work explaining concepts in such an intuitively understandable way! Hugest thanks!
@statquest 10 months ago
Thank you very much! I'm so happy the video is helpful! :)
@tiago9617 2 years ago
I can't understand how it's possible to be so good at teaching something
@statquest 2 years ago
bam! :)
@guliyevshahriyar 10 months ago
This is phenomenal work for ALL OF DATA SCIENCE! Thank you a lot.
@statquest 10 months ago
You're very welcome!
@thers9297 11 months ago
You might be the greatest teacher on YouTube
@statquest 11 months ago
Thank you!
@bakyt_yrysov 1 year ago
This is the beeeeeeeeeest explanation of entropy! THANK YOU!!
@statquest 1 year ago
Thanks!
@rangjungyeshe 2 years ago
Fantastically clear explanation of a notoriously tricky subject. Apparently Johnny von Neumann told Shannon to call his measure of information "entropy", since "no one really knows what entropy is, so in a debate you will always have the advantage." I suspect JvN wouldn't have said that if he'd known about your video...
@statquest 2 years ago
BAM!!! Thank you so much for your support!!! :)
@pietronickl8779 1 year ago
Thanks for these super clear explanations - you really manage to break down complex concepts until they seem simple and (almost) intuitive. Also really appreciate the pace, i.e. the patience of going step by step and not making any crazy leaps 2 mins before the end 👏👏👏
@statquest 1 year ago
Thank you! :)
@azamatbagatov4933 2 years ago
I am surprised by how easily understandable entropy is! Thanks!
@statquest 2 years ago
Glad it was helpful!
@kaushaljani6769 2 years ago
Bam!!! Hats off, man, that was the easiest explanation I've ever come across... Thanks for making these kinds of tutorials.
@statquest 2 years ago
Glad you liked it!
@ralphchien184 2 years ago
This is the most excellent explanation that I have ever seen. Impressive! Impressive! Impressive! Three times I must give it. Thanks a lot!
@statquest 2 years ago
Wow, thanks!
@areebahmad7012 1 year ago
This is absolutely amazing. I took a whole course on probability and statistics at my university, but there was so much chaos. Now, as I am learning machine learning, this makes things a lot clearer.
@statquest 1 year ago
Glad it helped!
@surendrabarsode8959 3 years ago
This is the best-ever explanation of entropy I have seen!! The real surprise is the totally innovative idea of 'surprise'! Thanks, with entropy of zero!!!
@statquest 3 years ago
bam! :)
@alinazem6662 1 year ago
Your videos are as straightforward as Y=mX. Thanks, Josh.
@statquest 1 year ago
Glad you like them!
@lakminiwijesekara8662 3 years ago
The best explanation I've ever come across.
@statquest 3 years ago
Wow, thanks!
@qZoneful 1 year ago
I am a biology student trying to understand the entropy concept for my species distribution models, and I choose to believe that this video was uploaded from heaven with the consensus of all the statisticians who have passed on
@statquest 1 year ago
BAM! :)
@qZoneful 1 year ago
@@statquest Are you planning to make a video about the Maximum Entropy Principle (MaxEnt)? :))
@statquest 1 year ago
@@qZoneful I'll keep that in mind.
@vaibhavnakrani2983 11 months ago
I bow down to you, sir. It was truly amazing in a simple way.
@statquest 11 months ago
Thank you!
@alialthiab7527 2 years ago
You are awesome. I finally understood the concept of entropy without any equations. Big love 😍
@statquest 2 years ago
Thanks!
@KleineInii 2 years ago
Thank you so much for sharing this great explanation with us! I stopped the video after you derived the formula and then derived it again on my own. It makes so much sense!!! I am giving a talk at a conference in 2 weeks, and in my presentation there is a formula using mutual information. I was asked to explain this in my practice presentation and was not able to. Now, after seeing your video, I am clear about the concept of entropy and feel much more confident when I need to explain it :)
@statquest 2 years ago
Good luck with your presentation! BAM! :)
@jaysonl3685 2 years ago
Absolutely amazing and intuitive explanation, Josh! I couldn't have understood it without you, huge thanks :D
@statquest 2 years ago
Thank you!
@SonSon-rq5dj 2 years ago
Solid video, solid explanation. Best channel out there for your data mining needs
@statquest 2 years ago
Thank you!
@Stilzel 11 months ago
Josh, thank you so much for your videos, you are a GREAT teacher. I wish you all the best, and thank you again!
@statquest 11 months ago
Thank you very much! :)
@sumitmishra8449 3 years ago
Thank you, Josh, you literally are the best teacher out there. I got a job as a Data Analyst, and I only watched your videos for all the explanations and understanding. Made a lot of notes as well. Sincerely, thank you. PS: The first thing I'm gonna do with my salary is buy a membership!! Infinite Bam!!
@statquest 3 years ago
Congratulations!!!! TRIPLE BAM!!! :)
@nakul___ 3 years ago
Been looking forward to this one for a while and was not disappointed at all - thanks!
@statquest 3 years ago
bam!
@namratachavan6024 3 years ago
Entropy is very well explained with a simple example. Thank you so much. And most importantly, your promotion is not shameless; it is actually worth it for everyone like me to understand the concept in the easiest manner.
@statquest 3 years ago
Thank you! :)
@pingmelody8609 1 year ago
You are so talented! I'm so thankful the YouTube recommendation system guided me to your videos, it's a whole new world! Every data scientist should watch your videos!!!! bam!!
@statquest 1 year ago
Wow, thank you!
@dafeiwang7797 3 years ago
The easiest-to-understand teacher I have ever seen 👨‍🏫 Great
@statquest 3 years ago
Thank you! :)
@emsdy6741 3 years ago
DOUBLE BAM! Thanks for the video. I liked how you derived the formula for entropy, and now it is easier to understand.
@statquest 3 years ago
Hooray!
@shaahinfazeli9095 3 years ago
You are truly amazing at simply explaining complicated things!
@statquest 3 years ago
Thank you! :)
@PunmasterSTP 8 months ago
I always found entropy to be quite a confusing concept, but this video handled it expertly!
@statquest 8 months ago
BAM! :)
@dabestpilot4157 8 months ago
Wonderful video, sending this to my group that's supposed to be giving a presentation on this next week.
@statquest 8 months ago
Good luck with the presentation!