This is the BEST video on trees, gain and entropy on YouTube.
@Madeinchinaagain 7 years ago
Very helpful. Videos like these make procrastination and cramming possible.
@ArminBishop 7 years ago
I owe you my exam. You are my saviour. Best explanation ever!!!
@keithmaly904 2 months ago
Very nice walkthrough of the concepts of entropy and gain. Ty.
@davidinfante4887 5 years ago
Good explanation. I am going to give a diploma course on data science to several colleagues (I am the head of a group in embedded systems with expertise in data mining and visual analytics) and I was looking for easy examples to explain it to them. This is the example with 0 entropy and highest gain that I have found. All the best.
@JoseFerreira-dk3gv 8 years ago
Good video. At minute 22:50 you pick Temperature as the one with maximum information gain (0.570). Shouldn't you pick Humidity (0.970)? I know that in terms of entropy you want to pick the one closer to 0.5... but that value refers to gain.
@aujoeyd 7 years ago
You're probably right.
@snehanshus7039 6 years ago
You're right... max gain should be picked... slip of the tongue perhaps.
@sivakumarrallabhandi4548 5 years ago
That's correct. Excuse the typo and appreciate the video.
@channagirijagadish1201 9 years ago
Clear and Concise Explanation. Appreciate the good work.
@fiacobelli 5 years ago
Entropy is chaos: not knowing what decision to make. An entropy of 1 means that with the information you have, you can't make a good decision. For example, guessing which side a coin will land on has full entropy: you don't know. Now, if I tell you that the coin is loaded and it falls on heads 70% of the time, there's less entropy, because the information you have allows you to make a more precise guess.
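The loaded-coin intuition in the comment above can be checked numerically. A minimal sketch (the `entropy` helper below is my own, not from the video):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: entropy is exactly 1 bit.
print(entropy([0.5, 0.5]))  # 1.0

# A coin loaded 70/30 is easier to guess, so its entropy is lower.
print(entropy([0.7, 0.3]))  # ~0.881
```

The more lopsided the probabilities, the closer the entropy gets to 0, matching the "more precise guess" point above.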
@KumarKumar-fc2gc 1 year ago
Thank you so much for the crisp and clear explanation....
@harisk.1466 3 years ago
21:29 I think you should say you chose "outlook" because it has the highest information gain and the lowest entropy among all the other features? Correct me if I am wrong!
@cdm891 6 years ago
21:30 where does the entropy of .970 come from? Gain(sunny, Humidity) = .970 - (3/5) ... etc. Where did you get the .970? Thanks.
@arjunsrinivasan3751 5 years ago
It's the entropy of the set Sunny. All the Gain examples are for Sunny.
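To make the reply above concrete: assuming the classic play-tennis dataset the comments reference, Outlook = Sunny covers 5 examples, 2 labelled "yes" and 3 labelled "no", which is where the 0.970 comes from:

```python
import math

# Assumed counts for the Outlook = Sunny subset: 2 yes, 3 no.
yes, no = 2, 3
total = yes + no
probs = [yes / total, no / total]

# Entropy of the Sunny subset, the .970 used in every Gain(sunny, ...) term.
h_sunny = -sum(p * math.log2(p) for p in probs if p > 0)
print(round(h_sunny, 3))  # 0.971
```

The exact value is 0.9709..., which the video truncates to .970.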
@datascientist2958 4 years ago
Can the Gini index work with information gain to rank features? Or does the Gini index work with Gini impurity to rank features?
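For context on the question above: Gini impurity is an alternative purity measure (used by CART) that plays the same role entropy plays in information gain. A hedged sketch, not from the video:

```python
def gini(pos, neg):
    """Gini impurity: 1 - sum(p_i^2). 0 = pure; 0.5 = worst case for binary."""
    total = pos + neg
    return 1 - (pos / total) ** 2 - (neg / total) ** 2

# Same 2-yes / 3-no subset as the entropy examples: impure, so high score.
print(round(gini(2, 3), 3))  # 0.48

# A pure subset (all one class) scores 0 under Gini, just as under entropy.
print(gini(4, 0))  # 0.0
```

Both measures rank purer splits lower, so trees built with either criterion usually look similar; Gini is just cheaper to compute (no logarithm).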
@kulin6522 3 years ago
Really good video, congratulations to the creator
@vyshakhunnikrishnan 7 years ago
22:40, you calculated all the gains correctly but didn't choose the highest-gain attribute. Constructing a tree in this fashion would make it deeper and more complex, when we are looking for simplicity.
@fiacobelli 7 years ago
Vyshakh Unnikrishnan, correct. I should have picked Humidity. My bad.
@fiacobelli 7 years ago
Correct. I picked the wrong one. I should have picked humidity
@IgraphyRupage 4 years ago
This video is great! Thank you!
@pruthvirajsuresh3467 8 years ago
Crystal clear. Thank you very much.
@OskarCeso 7 years ago
Made it clear for me. Thanks!
@cdm891 6 years ago
I'm really confused; can someone please help me understand? First you get the entropy of each attribute, then the entropy of each subset of that attribute, then you find the gain... but to construct the tree you look at the entropy of the attribute, not the gain? What is the purpose of finding the gain if you don't use it to construct the tree? Thank you.
@snehanshus7039 6 years ago
You use the gain to decide where to split the tree.
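To make the reply above concrete, here is a minimal sketch of gain-driven splitting. The attribute names and class counts are assumed from the classic play-tennis example the comments reference (9 yes / 5 no overall), not taken directly from the video:

```python
import math

def entropy(pos, neg):
    """Entropy of a set with `pos` positive and `neg` negative examples."""
    total = pos + neg
    h = 0.0
    for count in (pos, neg):
        if count:
            p = count / total
            h -= p * math.log2(p)
    return h

def gain(parent, subsets):
    """Information gain = parent entropy - weighted entropy of the children.
    `parent` and each subset are (pos, neg) counts."""
    n = sum(p + q for p, q in subsets)
    weighted = sum((p + q) / n * entropy(p, q) for p, q in subsets)
    return entropy(*parent) - weighted

# Assumed play-tennis figures: 9 yes / 5 no overall.
parent = (9, 5)
candidates = {
    "Outlook": [(2, 3), (4, 0), (3, 2)],  # sunny / overcast / rain
    "Wind":    [(6, 2), (3, 3)],          # weak / strong
}

# The tree splits on whichever attribute yields the highest gain.
best = max(candidates, key=lambda a: gain(parent, candidates[a]))
print(best)  # Outlook
```

The per-attribute entropies are only intermediate values; the gain is what actually ranks the candidate splits, which answers the question above.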
@pericharlasai7242 5 years ago
What is entropy?
@LutfarRahmanMilu 7 years ago
This is a great tutorial! Thank you!
@johnstyl 3 years ago
21:06 you forgot to say to also subtract the P of the negative result multiplied by the log base two of that, for the entropy of high wind. Great vid aside from that.
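The point in the comment above matters numerically: binary entropy needs both class terms. A small sketch (the 3 yes / 3 no counts for Wind = strong are an assumption from the classic play-tennis table):

```python
import math

# Assumed counts for the Wind = strong subset: 3 yes, 3 no.
p_yes = 3 / 6
p_no = 3 / 6

# Keeping only the positive-class term gives half the true entropy:
partial = -p_yes * math.log2(p_yes)                          # 0.5 -- incomplete
full = -p_yes * math.log2(p_yes) - p_no * math.log2(p_no)    # 1.0 -- correct
print(partial, full)  # 0.5 1.0
```

Dropping the negative term here would report 0.5 instead of the correct maximum entropy of 1 bit.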
@cyrilchubenko2657 7 years ago
Thank you very much! At last I get the idea of those variables