entropyAndGain

59,399 views

Francisco Iacobelli

1 day ago

Comments: 32
@erythsea 5 months ago
This is the BEST video on trees, gain and entropy on YouTube.
@Madeinchinaagain 7 years ago
Very helpful. Videos like these make procrastination and cramming possible.
@ArminBishop 7 years ago
I owe you my exam. You are my saviour. Best explanation ever!!!
@keithmaly904 2 months ago
Very nice walkthrough of the concepts of entropy and gain. Ty.
@davidinfante4887 5 years ago
Good explanation. I am going to give a diploma course on data science to several colleagues (I am the head of the embedded systems group, with expertise in data mining and visual analytics) and I was looking for easy examples to explain it to them. This is the example with 0 entropy and the highest gain I have found. All the best.
@JoseFerreira-dk3gv 8 years ago
Good video. At minute 22:50 you pick Temperature as the one with maximum information gain (0.570). Shouldn't you pick Humidity (0.970)? I know that in terms of entropy you want to pick the one closer to 0.5, but those numbers refer to gain.
@aujoeyd 7 years ago
You're probably right.
@snehanshus7039 6 years ago
You're right, the max gain should be picked... a slip of the tongue, perhaps.
@sivakumarrallabhandi4548 5 years ago
That's correct. Excuse the typo and appreciate the video.
@channagirijagadish1201 9 years ago
Clear and Concise Explanation. Appreciate the good work.
@fiacobelli 5 years ago
Entropy is chaos: not knowing what decision to make. An entropy of 1 means that with the information you have, you can't make a good decision. For example, guessing which side a coin is going to fall on has full entropy; you don't know. Now, if I tell you that the coin is loaded and falls on heads 70% of the time, there's less entropy, because the information you have allows you to make a more precise guess.
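As a rough illustration of the coin example above (a minimal Python sketch, not taken from the video), computing the binary entropy of a fair coin versus a 70/30 loaded coin:

```python
import math

def binary_entropy(p):
    """Entropy in bits of a two-outcome event where one outcome has probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # fair coin: 1.0 bit (full entropy)
print(binary_entropy(0.7))  # loaded coin: ~0.881 bits (less uncertainty)
```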
@KumarKumar-fc2gc 1 year ago
Thank you so much for the crisp and clear explanation....
@harisk.1466 3 years ago
21:29 I think you should say you chose "Outlook" because it has the highest information gain and the lowest entropy among all the features? Correct me if I am wrong!
@cdm891 6 years ago
21:30 Where does the entropy of .970 come from? Gain(Sunny, Humidity) = .970 - (3/5)... etc. Where did you get the .970? Thanks
@arjunsrinivasan3751 5 years ago
It's the entropy of the Sunny subset. All the Gain examples there are for Sunny.
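A quick worked check of where that number comes from, assuming the usual play-tennis data in which the Sunny branch has 5 rows (2 "yes", 3 "no"):

```python
import math

# Assumed class counts for the Sunny subset of the play-tennis data: 2 yes, 3 no.
p_yes, p_no = 2 / 5, 3 / 5
entropy_sunny = -p_yes * math.log2(p_yes) - p_no * math.log2(p_no)
print(round(entropy_sunny, 3))  # 0.971 -- the ~0.970 used in Gain(Sunny, Humidity)
```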
@datascientist2958 4 years ago
Can you clarify: does the Gini index work with information gain to rank features, or does the Gini index work with Gini impurity to rank features?
@kulin6522 3 years ago
Really good video, congratulations to the creator
@vyshakhunnikrishnan 7 years ago
22:40: you calculated all the gains correctly but didn't choose the highest-gain attribute. Constructing a tree this way makes it deeper and more complex, when we are looking for simplicity.
@fiacobelli 7 years ago
Vyshakh Unnikrishnan correct. I should have picked humidity. My bad.
@fiacobelli 7 years ago
Correct. I picked the wrong one. I should have picked humidity
@IgraphyRupage 4 years ago
This video is great! Thank you!
@pruthvirajsuresh3467 8 years ago
crystal clear. thank you very much.
@OskarCeso 7 years ago
Made it clear for me. Thanks!
@cdm891 6 years ago
I'm really confused; can someone please help me understand... First you get the entropy of each attribute, then the entropy of each subset of that attribute, then you find the gain... but to construct the tree you look at the entropy of the attribute, not the gain? What is the purpose of finding the gain if you don't use it to construct the tree? Thank you
@snehanshus7039 6 years ago
You use the gain to decide which attribute to split on.
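As a minimal sketch of that idea (not the video's code), assuming examples stored as dictionaries with a "play" label: the attribute chosen for each split is the one whose information gain is largest.

```python
import math
from collections import Counter

def entropy(examples, label="play"):
    """Entropy of the label distribution over a list of example dicts."""
    counts = Counter(ex[label] for ex in examples)
    total = len(examples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(examples, attribute, label="play"):
    """Entropy of the whole set minus the weighted entropy of each value's subset."""
    total = len(examples)
    remainder = 0.0
    for value in set(ex[attribute] for ex in examples):
        subset = [ex for ex in examples if ex[attribute] == value]
        remainder += (len(subset) / total) * entropy(subset, label)
    return entropy(examples, label) - remainder

def best_split(examples, attributes, label="play"):
    """Pick the attribute with the highest information gain to split on."""
    return max(attributes, key=lambda a: information_gain(examples, a, label))
```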
@pericharlasai7242 5 years ago
What is entropy?
@LutfarRahmanMilu 7 years ago
This is a great tutorial! Thank you!
@johnstyl 3 years ago
21:06 you forgot to say to also subtract the probability of the negative result multiplied by the log base two of that probability, for the entropy of high wind. Great vid aside from that.
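For reference, the full two-term formula the comment is pointing at (the standard definition, with p_+ and p_- the proportions of positive and negative examples in the subset):

```latex
H(S) = -p_{+}\log_{2} p_{+} \;-\; p_{-}\log_{2} p_{-}
```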
@cyrilchubenko2657 7 years ago
Thank you very much! At last I get the idea of those variables
@sidk5919 7 years ago
Thanks, you made it really clear!
@melkyewereta8113 7 years ago
I found it very useful. Thank you.
@sukursukur3617 2 years ago
Wauwwww