Decision Tree (Basic Intuition - Entropy, Gini Impurity & Information Gain) | NerdML

  47,205 views

NerdML

4 years ago

This video will help you understand the basic intuition behind Entropy, Information Gain & Gini Impurity, which are used for building the Decision Tree algorithm. We will solve the problem mathematically. I have divided the Decision Tree tutorial into several parts covering basic intuition, classification problem solving & regression problem solving.
The topics below are explained in this video:
1). Agenda (00:40)
2). Introduction to Decision Tree (01:27)
3). Entropy (03:40)
4). Information Gain (07:53)
5). Gini Impurity (11:12)
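As a quick companion to the three measures listed above, here is a minimal Python sketch of entropy, Gini impurity & information gain. The 9-Yes/5-No counts and the binary split below are illustrative example data, not taken from the video:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy in bits of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini impurity: 1 minus the sum of squared class proportions."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

def information_gain(parent, children):
    """Entropy of the parent minus the weighted entropy of the child splits."""
    n = len(parent)
    weighted = sum(len(ch) / n * entropy(ch) for ch in children)
    return entropy(parent) - weighted

# Example: a 9 Yes / 5 No parent node split on a binary attribute.
parent = ["Y"] * 9 + ["N"] * 5
left  = ["Y"] * 6 + ["N"] * 2
right = ["Y"] * 3 + ["N"] * 3

print(round(entropy(parent), 3))                        # 0.940 bits
print(round(gini(parent), 3))                           # 0.459
print(round(information_gain(parent, [left, right]), 3))  # 0.048
```

A decision tree would compute this gain for every candidate attribute and split on the one with the highest value.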
Do subscribe to my channel and hit the bell icon to never miss an update in the future:
/ @nerdml
Please find the previous Video link -
Support Vector Machine Kernel Trick (Part - 4) | NerdML : • Support Vector Machine...
For connectivity with Machine Learning enthusiasts & professionals hit the link :
/ ntirawen1
Prerequisites
Basic understanding of Linear Algebra, Probability, Calculus, Matrices & Python programming, including pandas, NumPy, scikit-learn & some visualization tools.
------------------------------------------------------
Creator : Rahul Saini
Please write back to me at rahulsainipusa@gmail.com for more information
Instagram: / 96_saini
Facebook: / rahulsainipusa
LinkedIn: / rahul-s-22ba1993
#DecisionTree, #MachineLearning, #NerdML, #Entropy, #InformationGain, #GiniImpurity, #Mathematics

Comments: 49
@ronitroy2887
@ronitroy2887 4 years ago
Hey buddy, I was waiting for this video. This is really good. Keep up the good work👍
@NerdML
@NerdML 4 years ago
Thanks buddy!!
@thangtran145
@thangtran145 2 years ago
Great video, subscribed! Clear, concise, friendly, and easy-going!
@NerdML
@NerdML 2 years ago
Thanks for liking it. Happy Learning!!
@raghunathanp3734
@raghunathanp3734 3 years ago
Brilliant stuff.. keep going.. very helpful for my ML exam in 2 days
@raghunathanp3734
@raghunathanp3734 3 years ago
One more doubt: what does a high value of each of entropy, Gini, and information gain indicate?
@NerdML
@NerdML 3 years ago
All the best
@NerdML
@NerdML 3 years ago
Information gain & entropy are inversely proportional to each other, so both values cannot be high at the same time. (Ideally, information gain should be high & entropy should be low.)
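The inverse relationship described in this reply can be illustrated with a small sketch (binary entropy only; the child-node proportions are made-up examples):

```python
import math

def binary_entropy(p):
    """Entropy in bits of a node whose positive-class proportion is p."""
    if p in (0.0, 1.0):
        return 0.0           # a pure node has zero entropy
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Parent node: a 50/50 class mix has the maximum entropy of 1 bit.
parent_h = binary_entropy(0.5)

# The purer a split leaves its children, the lower their entropy, so the
# information gain (parent entropy minus child entropy) rises as child
# entropy falls -- the two quantities move in opposite directions.
for p_child in (0.5, 0.3, 0.1, 0.0):
    child_h = binary_entropy(p_child)   # both children symmetric here
    print(f"child entropy={child_h:.3f}  gain={parent_h - child_h:.3f}")
```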
@lavanyarao5650
@lavanyarao5650 3 years ago
Thank you so much
@NerdML
@NerdML 3 years ago
My pleasure
@arpitcruz
@arpitcruz 3 years ago
Buddy, I want to tell you that there are a hell of a lot of channels for data science, but no one is better than you in the way you explain
@NerdML
@NerdML 3 years ago
Thanks, keep sharing the content with your community
@siddheshpowar4208
@siddheshpowar4208 2 years ago
Nice explanation ✌️...all the things are nicely presented 👌.
@NerdML
@NerdML 2 years ago
Thanks dude👍🏻 Keep supporting
@naivelearner6357
@naivelearner6357 2 years ago
amazingly explained...
@NerdML
@NerdML 2 years ago
Oh thanks😊
@mohinikumari819
@mohinikumari819 2 years ago
👍
@in5minutes27
@in5minutes27 3 years ago
Thank you so much
@NerdML
@NerdML 3 years ago
Keep supporting 👍
@akshara_K.
@akshara_K. 2 years ago
Awesome video with detailed explanations. Can you suggest other topics/concepts that need to be learnt for Data Science? If they are already included in your channel, that would be great.
@NerdML
@NerdML 2 years ago
Thanks Akshara, glad you liked the content. Recently I uploaded a video on "Complete Roadmap for Learning Data Science from Scratch". You can watch that.
@akshara_K.
@akshara_K. 2 years ago
@@NerdML Thank you very much! Much appreciated!
@NerdML
@NerdML 2 years ago
Pleasure is mine!!
@hannahnguyen4865
@hannahnguyen4865 3 years ago
Thank you
@NerdML
@NerdML 3 years ago
My pleasure
@ayushsingh-qn8sb
@ayushsingh-qn8sb 3 years ago
awesome explanation
@NerdML
@NerdML 3 years ago
Thanks dude, keep supporting 👍
@debabratasahoo2588
@debabratasahoo2588 2 years ago
Hello sir, your content is awesome. I gained much clarity through your good way of teaching. Thanks sir
@NerdML
@NerdML 2 years ago
Thanks to you for liking it. Happy Learning!!
@VarunSharma-ym2ns
@VarunSharma-ym2ns 2 years ago
Great session 👌
@NerdML
@NerdML 2 years ago
Thanks Varun!!
@topperbizzare564
@topperbizzare564 4 years ago
*NerdML* 💕💐 excellent video 💕 I watched it, it is really good 💕 thanks for this 💕
@NerdML
@NerdML 4 years ago
Thanks mate!!
@tymothylim6550
@tymothylim6550 2 years ago
Thank you very much for the video! Really clear and helpful :)
@NerdML
@NerdML 2 years ago
Thanks and keep supporting
@patrickmckenna6520
@patrickmckenna6520 3 years ago
Where do nodes E, F and G come from? Is each node not based off an attribute?
@NerdML
@NerdML 3 years ago
Sorry for the late response. Nodes E, F & G are branches of the impure sub-trees B & C. Since node B has 2 Yes & 4 No, we further divide this node into D & E and assign 2 Yes, 0 No to D and 0 Yes, 4 No to E so that we can reach pure leaf nodes. Definitely, each & every node is based on an attribute; that's why we find the entropy of each node & take the best node for further processing. Hope this makes sense to you!
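The pure-leaf split described in this reply can be checked numerically; a minimal sketch using the 2 Yes / 4 No counts from the answer:

```python
import math

def entropy(yes, no):
    """Entropy in bits of a node holding `yes` and `no` samples."""
    total = yes + no
    h = 0.0
    for count in (yes, no):
        if count:
            p = count / total
            h -= p * math.log2(p)
    return h

# Node B from the reply: 2 Yes & 4 No, so it is impure.
h_b = entropy(2, 4)
# Its children D (2 Yes, 0 No) and E (0 Yes, 4 No) are pure leaves.
h_d, h_e = entropy(2, 0), entropy(0, 4)
# Weighted child entropy drops to zero: the split fully purifies node B.
weighted_children = (2 / 6) * h_d + (4 / 6) * h_e

print(f"H(B)={h_b:.3f}  H(D)={h_d}  H(E)={h_e}  weighted={weighted_children}")
```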
@uditarpit
@uditarpit 2 years ago
Entropy measures impurity!!
@vaddadisairahul2956
@vaddadisairahul2956 3 years ago
May I know why the value of entropy is shown in bits? Anything we should know about it?
@NerdML
@NerdML 3 years ago
See, it's standard practice to express it in bits (binary digits) due to the choice of the base-2 logarithm
@NerdML
@NerdML 3 years ago
If we use the natural logarithm instead of base-2, then the unit will be nats
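A small sketch of the bits-vs-nats point (the probability distribution is an arbitrary example):

```python
import math

p = [0.5, 0.25, 0.25]  # an arbitrary example distribution

# Base-2 logarithm -> entropy measured in bits.
h_bits = -sum(pi * math.log2(pi) for pi in p)
# Natural logarithm -> entropy measured in nats.
h_nats = -sum(pi * math.log(pi) for pi in p)

print(h_bits)                 # 1.5 bits
print(h_nats)                 # ~1.0397 nats
print(h_bits * math.log(2))   # bits convert to nats by multiplying by ln(2)
```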
@BalaMurugan-cm6ev
@BalaMurugan-cm6ev 3 years ago
Wow.. Nice explanation. Even if we calculate Gini impurity, we still calculate information gain... am I right?
@NerdML
@NerdML 3 years ago
Yes, you can; there is only a minor difference between entropy and the Gini index
@kaustubhdwivedi1729
@kaustubhdwivedi1729 2 years ago
I just want to say one thing: I ENJOYED IT!
@NerdML
@NerdML 2 years ago
Thank you🙏🏻
@vaishnavikalidass1544
@vaishnavikalidass1544 3 years ago
Finally I understood these 😂😂😂
@NerdML
@NerdML 3 years ago
Good to know...keep supporting 👍
@vaishnavikalidass1544
@vaishnavikalidass1544 3 years ago
@@NerdML can you post more videos, like a step-by-step preparation for Data Science
@NerdML
@NerdML 3 years ago
Sure I will, whenever I get some free time.
Entropy (for data science) Clearly Explained!!!
16:35
StatQuest with Josh Starmer
591K views
Shannon Entropy and Information Gain
21:16
Serrano.Academy
203K views
Part 1-Decision Tree Classifier Indepth Intuition In Hindi| Krish Naik
34:17
Decision and Classification Trees, Clearly Explained!!!
18:08
StatQuest with Josh Starmer
712K views
Decision Tree Classification Clearly Explained!
10:33
Normalized Nerd
641K views
7.6.2. Entropy, Information Gain & Gini Impurity - Decision Tree
18:23
Regression Trees, Clearly Explained!!!
22:33
StatQuest with Josh Starmer
624K views
The Gini Impurity Index explained in 8 minutes!
8:39
Serrano.Academy
38K views