Training: Time Complexity: O(d * n log n); Space Complexity: O(nodes/rules). Inference: Time Complexity: O(depth); Space Complexity: O(nodes/rules)
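The O(depth) inference cost comes from the fact that predicting a sample walks exactly one root-to-leaf path. A minimal sketch, assuming a hypothetical `Node` structure (not from the video):

```python
# Minimal decision-tree inference sketch: prediction follows one
# root-to-leaf path, so the cost is O(depth) comparisons.
class Node:
    def __init__(self, feature=None, threshold=None,
                 left=None, right=None, value=None):
        self.feature = feature      # feature index tested at internal nodes
        self.threshold = threshold  # split threshold
        self.left = left            # subtree for x[feature] <= threshold
        self.right = right          # subtree for x[feature] > threshold
        self.value = value          # predicted class at leaf nodes

def predict(node, x):
    # Walk a single path from root to leaf.
    while node.value is None:
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.value

# Tiny hand-built tree: split on feature 0 at threshold 0.5.
tree = Node(feature=0, threshold=0.5,
            left=Node(value="A"), right=Node(value="B"))
print(predict(tree, [0.2]))  # A
print(predict(tree, [0.9]))  # B
```

The space complexity O(nodes/rules) is just the storage for the tree itself; inference adds only O(1) extra state.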
Comments: 8
@frankl1 · 8 months ago
This is only the time complexity needed to sort the features; it has to be repeated for each node in the tree.
@hamzawi2752 · 1 month ago
Assuming the worst case is a balanced decision tree, which has log(n) levels, where n is the number of samples: the time complexity of inference is measured by the depth of the tree, which equals log(n). Therefore: single-sample inference is O(log(n)), and inference for m samples is O(m * log(n)).
@panchajanyanaralasetty7370 · 8 months ago
O(n log n * d) has to be multiplied by the number of splits in the tree, right?
@srisaisubramanyamdavanam9912 · 2 months ago
No, that is not a good way. Instead, you sort the features once at the start and reuse the sorted order throughout the training process. That is more efficient.
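The presorting idea above can be sketched as follows (a hypothetical illustration using NumPy, not the video's implementation): each feature column is sorted once up front for O(d * n log n) total, and every node reuses those orders instead of re-sorting.

```python
import numpy as np

# Presort each feature column once: O(d * n log n) total.
# Each node can then scan candidate splits in sorted order
# without paying an extra O(n log n) sort per node.
rng = np.random.default_rng(0)
X = rng.random((8, 3))  # n=8 samples, d=3 features (toy data)

sorted_idx = np.argsort(X, axis=0)  # one argsort per feature column

# Example: visit candidate split points for feature 0 in sorted order.
for i in sorted_idx[:, 0]:
    pass  # evaluate the split at X[i, 0] here

# Sanity check: column 0 really is ascending under these indices.
col0 = X[sorted_idx[:, 0], 0]
assert np.all(np.diff(col0) >= 0)
```

With presorting, the per-node work becomes a linear scan over the sorted indices rather than a fresh sort.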
@frankl1 · 1 year ago
The training time complexity in the video should be multiplied by the number of nodes in the tree, which is n in the worst case. So the final training time complexity should be O(n^2 log n * d).
@jinxscript · 8 months ago
Isn't it n (log n)^2?
@srisaisubramanyamdavanam9912 · 2 months ago
We define the time complexity for a single split, not for the whole process.
@hamzawi2752 · 1 month ago
@@jinxscript You are right. It should be O(n * d * log(n)^2).