Assuming the tree is balanced, the worst case is a tree with log(n) levels, where n is the number of training samples. Inference cost is measured by the depth of the tree, which is log(n). Therefore: single-sample inference is O(log(n)), and inference for m samples is O(m * log(n)).
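To make the counting concrete, here is a minimal sketch (the `Node` class and `predict` helper are illustrative, not taken from the video): each query does one comparison per level, so a balanced tree over n samples answers in O(log n).

```python
class Node:
    """A decision-tree node: leaves carry a label, internal nodes a split."""
    def __init__(self, feature=None, threshold=None,
                 left=None, right=None, label=None):
        self.feature, self.threshold = feature, threshold
        self.left, self.right, self.label = left, right, label

def predict(node, x):
    """Single-sample inference: one comparison per level of the tree,
    so a balanced tree with log2(n) levels costs O(log n) per query."""
    while node.label is None:  # descend until we reach a leaf
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.label
```

For m queries the walk is simply repeated, giving O(m * log n).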
@xyz_1238 · 12 months ago
Why are we considering xq a feature vector, when it is just an instance for which we have to predict the label?
@tejaskumar3759 · 3 months ago
Can we have 2 or more node vectors in an ndata schema?
@spotlight2203 · 6 months ago
Can you provide Colab file?
@panchajanyanaralasetty7370 · 8 months ago
O(n log(n) * d) has to be multiplied by the number of splits too, right?
@srisaisubramanyamdavanam9912 · 2 months ago
No, that is not a good way. Instead, you sort the features once at the start and reuse the sorted order throughout the training process. That is more efficient.
@frankl1 · 8 months ago
This is only the time complexity needed to sort the features; it has to be repeated for each node in the tree.
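The presorting idea discussed in this thread can be sketched as follows (a plain-Python illustration, not code from the video; it assumes `X` is a list of n rows with d features each). Sorting every column once up front costs O(d * n log n) in total, and the sorted orders can then be reused at every node instead of re-sorting.

```python
def presort_features(X):
    """Sort each feature column once, before training starts.

    Returns, for every feature j, the list of sample indices in
    ascending order of that feature's value.  Split search at each
    node can then scan these precomputed orders rather than
    re-sorting the node's samples from scratch.
    """
    n, d = len(X), len(X[0])
    return [sorted(range(n), key=lambda i: X[i][j]) for j in range(d)]
```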
@zma3465 · 9 months ago
Very very clear, thanks man!
@sand9282 · 9 months ago
It would be good if you shared your notebooks so that we can refer to the code.
@pranavyeleti3499 · 10 months ago
Should we avoid this in random forests as well?
@rombuk74 · 10 months ago
I think it is the opposite of what is said in the video: when k increases, impurity gets lower, and that's why those features are not selected.
@hyahyahyajay6029 · 11 months ago
Thank you so much for your tutorials!! Sharing them with my class :)
@SiyamSajnanChowdhury · 1 year ago
Really clear explanation. Thank you!
@luis0283 · 1 year ago
I found gold
@A.Salehzadeh · 1 year ago
The best. Thanks a lot!
@amacodes7347 · 1 year ago
Interesting, but I'm wondering how to import data in CSV format for a bipartite graph in which the features for both node types are in one dataframe.
@hemalatanayak7144 · 1 year ago
Sir, can you please make a video on how to prepare a DGL dataset for link prediction? It would be really helpful.
@jayurbain · 1 year ago
Are the Colab notebooks for your video series available? Thanks
@sahilverma3882 · 1 year ago
Hey mate, how do I contact you?
@PoojaSingh-tv5kg · 1 year ago
Please make more videos on GNNs.
@frankl1 · 1 year ago
The training time complexity in the video should be multiplied by the number of nodes in the tree, which is O(n) in the worst case. So the final training time complexity would be O(n^2 * log(n) * d).
@jinxscript · 8 months ago
Isn't it n log2(n)?
@srisaisubramanyamdavanam9912 · 2 months ago
We define the time complexity for a single split, not for the whole process.
@hamzawi2752 · 1 month ago
@jinxscript You are right. It should be O(n * d * log(n)^2).
@anjalichoudhary2093 · 1 year ago
Well explained!! Kudos :) Do you also have a Git repo for this?
@Jas-l9q · 2 years ago
As far as I know, no other channel has explained decision trees the way you have. Thanks a lot!
@meganfoxyou · 2 years ago
Very helpful!
@gabeblanco4812 · 2 years ago
Great explanation. Why are you squaring the values within the Gini impurity calculation? I thought it was 1 - p(i) only.
@sriharshai6173 · 2 years ago
Thanks. This video on Gini impurity should clarify your doubt: kzbin.info/www/bejne/f6bUgoSpd5Wdetk
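For reference, the squaring asked about above comes from the definition itself, G = 1 - sum_i p_i^2, where p_i is the fraction of samples in class i. A minimal sketch (illustrative only):

```python
from collections import Counter

def gini_impurity(labels):
    """Gini impurity G = 1 - sum_i p_i^2.

    p_i is the fraction of samples belonging to class i; squaring the
    class probabilities is part of the definition, not an extra step.
    G = 0 for a pure node, and grows toward 1 as classes mix evenly.
    """
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())
```

For a 50/50 split of two classes this gives 1 - (0.5^2 + 0.5^2) = 0.5, the maximum for binary labels.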
@gowripriyathota438 · 2 years ago
Thank you, brother, for making entropy and Gini impurity clear.
@dharmeshsingh3440 · 2 years ago
Thank you, sir, please keep making videos... 😀 I don't think there are many ML videos on YouTube that explain concepts in such great detail.
@UGCNETCSE · 2 years ago
You are doing great... nice video.
@UGCNETCSE · 2 years ago
Nice videos, Shriharsha.
@UGCNETCSE · 2 years ago
Nice, Hari.
@madhavaraomettu1159 · 2 years ago
Thank you so much, sir... I have been waiting for this topic for a long time.