Part 1 - Decision Tree Classifier In-Depth Intuition in Hindi | Krish Naik

117,605 views

Krish Naik Hindi



Comments: 80
@krishnaikhindi 2 years ago
We are happy to announce that iNeuron is coming up with a 6-month Live Full Stack Data Analytics batch with job assistance and internship, starting from 18th June 2022. The instructors of the course will be me and Sudhanshu. The course price is really affordable: 4000 INR including GST. The course content will be available for lifetime along with pre-recorded videos. You can check the course syllabus below. Course link: courses.ineuron.ai/Full-Stack-Data-Analytics From my side you can avail an additional 10% off by using the Krish10 coupon code. Don't miss this opportunity and grab it before it's too late. Happy learning!!
@AmeliaMelia-tj3kc 5 months ago
best teacher
@harshdeepjaggi9715 1 year ago
Wonderful explanation, sir... I'm already enrolled in Data Science with one of the edtechs of India. No doubt the teachers there also teach well, but the English content doesn't settle in my mind properly in one go. This Hindi content has gone in so well that I will always remember it now. Thank you for your efforts.
@netviz8673 5 months ago
Decision trees work for both classification and regression; here it is classification. Two algorithms are covered: ID3 and CART, and in CART the decision tree splits into binary trees. Two ideas matter: (a) entropy and the Gini index, which measure the purity of a split, and (b) information gain, which decides which feature the tree splits on. To check for a pure split, the two measures used are entropy and Gini impurity; information gain is the second technique, used to select the features. When H(S) is 0 the split is pure, and when H(S) is 1 the split is completely impure, i.e. an equal distribution (e.g. 3 yes and 3 no), so for a binary split entropy ranges from 0 to 1. For a completely impure binary split the Gini impurity comes out to 0.5 and for a pure split it is 0, so Gini impurity ranges from 0 to 0.5. Gini impurity is preferable over entropy because entropy involves a log, which can slow computation down. When you have multiple features, you use information gain to decide how to build the tree from the given features, i.e. which feature to split on first and which ones follow. The feature whose split gives the highest information gain is the one the decision tree should start with (see the short code sketch below).
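For readers who want to see those measures concretely, here is a minimal Python sketch (my own illustration, not code from the video; the function names and the 3-yes/3-no toy data are assumed purely for the demo):

```python
import numpy as np

def entropy(labels):
    """H(S) = -sum(p * log2(p)) over the class probabilities of a node."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))  # p > 0 by construction, so log2 is safe

def gini(labels):
    """Gini impurity = 1 - sum(p^2); 0 for a pure node, 0.5 at worst for a binary node."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def information_gain(parent, children):
    """Entropy of the parent node minus the weighted entropy of its child nodes."""
    n = len(parent)
    weighted = sum(len(c) / n * entropy(c) for c in children)
    return entropy(parent) - weighted

# 3 yes / 3 no: the completely impure case mentioned in the comment
parent = ["yes", "yes", "yes", "no", "no", "no"]
print(entropy(parent))  # 1.0 -> maximally impure
print(gini(parent))     # 0.5 -> maximally impure (binary case)
# A candidate split that separates the classes perfectly
print(information_gain(parent, [["yes"] * 3, ["no"] * 3]))  # 1.0
```

The feature whose split gives the largest value of information_gain is the one the tree splits on first, exactly as the summary says.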
@techinicalocean8733 3 months ago
Thank you so much buddy God bless you
@aravindkumar6631 1 month ago
God bless you, friend
@prathameshekal7308 2 years ago
We are grateful!
@adityapradhan8474 8 days ago
Thank you so much sir. All these maths concepts are helping a lot! I am able to visualize clearly what's happening in the background of all these ML algorithms.
@jasonbourn29 2 years ago
Thanks sir, in your Hindi explanations you tend to cover topics better (the English videos are also of far better quality than anyone else's).
@pritamrajbhar9504 1 year ago
It is one of the best and simplest explanations so far.
@GopalKrishna-p4m 1 year ago
Even after an era, no one will beat you, sir!! Incredible explanation, thank you so much sir.
@patelanjali2909 1 year ago
Your teaching skill is awesome.
@RiteshBhalerao-b2v 1 year ago
Great explanation... hard to find anywhere else 👌👌
@akashlinganwar4810 2 years ago
Hello Krish sir... your explanation is easy to understand and anyone can learn easily. Thank you sir... 😊
@utkarsh5165 9 months ago
Thank you Krish for Crystal Clear Explanation.❤
@adityatiwari7287 10 months ago
Awesome Explanation....Thanks A Lot....Keep It Up !!!
@maukaladka4100 2 years ago
Worth watching😍😍😍
@sandeepbehera3605 2 years ago
Wonderful explanation sir, no one can explain like you... 🙏🙏🙏 Thank you, sir 😇
@palakgrover9925 2 months ago
Amazing Explanation 😃
@Keep_Laughfing 2 years ago
Sir, please keep making videos like this. You probably don't even realize how big a help this is for DATA SCIENCE lovers. Heartfelt thanks 🙏🙏🙏
@Er_IT_prashantjha 8 months ago
You explain very well, sir 👌🏻💯
@aamiransarii 1 year ago
Many, Many Thanks .....so lovely of you
@mainakseal5027 1 year ago
What an amazing tutorial... hats off, sirji!!!
@DrZubairulIslam 4 months ago
Thanks Krish, best video ever. Wow.
@sauravsahay8803 9 months ago
You make everything look so easy
@IrfanSaleem541 6 months ago
Thanks a lot. Love and Respect from Oman
@AbhishekSharma-cj9to 3 months ago
this is really amazing sir 🙏
@SumitJindal-e3k 1 year ago
That was awesome
@mohammadshad4168 1 month ago
Thanks sir for the great explanation
@ruhiraj9551 2 years ago
Hello Krish sir, thank you so much 🙏 for a very excellent explanation.
@maths_impact 2 years ago
Wonderful explanation given by you sir, in Hindi.
@ketanitaliya9493 1 year ago
Thanks it is really helpful and easy to understand
@vidvaanwithromi5213 1 year ago
wonderfully explained sir!
@arunchougale5927 2 years ago
Very well explained by you. It helps me a lot. Thank you very much.
@dhavalsukhadiya8654 9 months ago
Great explanation sir
@nisho404 1 year ago
in one word bosssss
@jaiprakashsingh756 2 years ago
sir, I really find your videos very helpful. thanks a lot.
@osamaosama-vh6vu 2 years ago
You're a legend, dear sir. Thank you, be happy 😍
@gurpreetkaur-pf1bf 8 months ago
Amazing ❤
@bryan4592 5 months ago
Amazing video sir
@SonuK7895 2 years ago
Very well explained sirjiiii
@girishgogate2733 2 years ago
Thankyou krish sir ........
@AjmalShah-s7n 2 months ago
Thank you sir G
@tuhinbarai7592 1 year ago
Thank you sir.. it's easy to understand...
@BikashKonwar-w7q 1 year ago
In Entropy formula summation of p(x) * log2(p(x))
@katw434 2 years ago
Thanks sir please continue this series
@abhiWorldIN 5 months ago
Awesome video
@mrityunjayupadhyay7332 2 years ago
Great explanation
@meenalpande 1 year ago
Nice explanation
@jasanimihir4994 2 years ago
As always, very well explained. I have one query, sir. You told us that if the dataset is very big then use the Gini index, otherwise entropy is fine. But finding the entropy is a must for information gain, as there is no mention of the Gini index in the information gain formula. So is it possible to use the Gini index to find information gain? Kindly throw light on that. 😊
@shaiqmahmood 1 year ago
There is a way to calculate the Information Gain using Gini index as well.
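A minimal sketch of that idea, under the same toy-data assumptions as the earlier snippet (the name gini_gain is my own, not a standard library function): the "gain" is just the parent's Gini impurity minus the weighted Gini impurity of the child nodes, which is essentially the impurity reduction CART maximizes instead of entropy-based information gain.

```python
import numpy as np

def gini(labels):
    """Gini impurity: 1 - sum(p^2) over class probabilities."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def gini_gain(parent, children):
    """Impurity reduction: parent Gini minus the weighted Gini of the child nodes."""
    n = len(parent)
    weighted = sum(len(c) / n * gini(c) for c in children)
    return gini(parent) - weighted

parent = ["yes", "yes", "yes", "no", "no", "no"]
# A perfect split removes all of the parent's 0.5 impurity
print(gini_gain(parent, [["yes"] * 3, ["no"] * 3]))  # 0.5
```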
@krishj8011 8 months ago
nice tutorial
@razashaikh8698 2 years ago
Sir, can we find information gain using Gini impurity?
@abhinayjangde 1 month ago
completed 😘
@anusuyaghosal6961 9 months ago
Sir, you said H(S) is the entropy of the root node, but I think it is the entropy of the target attribute.
@Pankaj_Khanal_Joshi 1 year ago
Sir, please make a video on sklearn and seaborn. Thank you.
@neelkalyani7849 8 months ago
In calculating information gain, can we use gini impurity instead of entropy?
@maths_impact 2 years ago
Wonderful
@asifmahmoud126 29 days ago
Start at 21:17
@pkumar0212 6 months ago
👌
@sadiqpurki104 23 days ago
I understand
@sugandhaarora8174 5 months ago
Is this required for a data analyst role?
@abhishekbhad4029 11 months ago
Very nice explanation, sir. I have one question: how do I get an internship, since no one is hiring freshers?
@shreyashnarvekar6391 1 month ago
Where is the practical implementation? Can anyone guide me on where I can find it?
@priyanshupokhariya8866 1 year ago
0*log(0) is undefined, so how is it coming out as 0??
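For reference, the reason that term is treated as 0: the entropy calculation uses the convention that 0·log₂(0) = 0, justified by the limit

$$\lim_{p \to 0^{+}} p \log_2 p = 0,$$

so a class with zero probability contributes nothing to H(S).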
@VikasSingh-nq5yx 2 years ago
Sir.... ❤ I have a question 🙋! If an interviewer asks why we use the minus (-) sign in entropy, what should we say? Please reply........ ❤
@saranshsehrawat8544 2 years ago
It's the formula.
@neeraj.kumar.1 2 years ago
Don't worry, they don't ask these types of mathematical formulas. They may ask what Gini impurity is.
@rushikeshwaghmare3446 1 year ago
Sir, according to external sites, Gini impurity ranges from 0 to 1.
@rushikeshwaghmare3446 1 year ago
Please confirm this…
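Both statements can be reconciled: for a node with k equally likely classes the Gini impurity is 1 − 1/k, so it tops out at 0.5 for binary classification (the case in this video) and only approaches 1 as the number of classes grows. A quick check, reusing the same kind of gini() helper sketched earlier (again my own illustration, not code from the video):

```python
import numpy as np

def gini(labels):
    # Gini impurity: 1 - sum(p^2) over the class probabilities
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

print(gini(["yes", "no"]))         # binary worst case: 0.5
print(gini(["a", "b", "c", "d"]))  # 4 uniform classes: 1 - 1/4 = 0.75
```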
@prerakchoksi2379 2 years ago
wow
@vatsalshingala3225 1 year ago
❤❤❤❤❤❤❤❤❤❤
@SpectreWarfare 1 year ago
The video volume is very low. It is difficult to listen to.
@ShivamSharma-if1oh 2 years ago
My answer for the entropy is coming out to 0.6, not 1.
@aaqibafraaz6215 2 years ago
Hello sir
@aaqibafraaz6215 2 years ago
You are doing a great job.
@importantforlifeyasharora9042 2 months ago
What a definition: entropy ranges from 0 to 1 and Gini impurity ranges from 0 to 0.5. 😂
@roshanYadav-y8f 2 months ago
❤❤❤
@roshanYadav-y8f 2 months ago
What brother, are you also preparing for data science?