Partial Dependence Plots (PDPs) and Individual Conditional Expectation (ICE) Plots | Intuition and Math

1,801 views

A Data Odyssey
1 day ago

Comments: 12
@adataodyssey 6 months ago
🚀 Free Course 🚀 Signup here: mailchi.mp/40909011987b/signup
XAI course: adataodyssey.com/courses/xai-with-python/
SHAP course: adataodyssey.com/courses/shap-with-python/
@josephbolton8092 25 days ago
This was so great
@adataodyssey 25 days ago
Thanks Joseph :)
@myselfandpesit 2 months ago
Clear explanation. Thanks!
@adataodyssey 2 months ago
No problem :)
@makefly3305 5 months ago
Hi, I have a question about 5:45: which pattern in the plot tells you that "km_driven" is less evenly distributed and skewed to the left? 😄
@adataodyssey 5 months ago
I'm looking at the bars on the x-axis. This is known as a "rug plot". 10% of the dataset falls before the first bar, 20% before the second bar and so on... You can see that the bars are shifted towards the left. This means that most of the dataset has a lower km_driven value. I hope that makes sense?
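[Editor's note] The decile ticks described above can be reproduced numerically. This is a minimal sketch with a hypothetical right-skewed stand-in for the km_driven feature (the real data is not shown in the thread); the tick positions are just the feature's percentiles.

```python
import numpy as np

# Hypothetical stand-in for km_driven: a right-skewed sample
# (most cars have low mileage, a few have very high mileage).
rng = np.random.default_rng(0)
km_driven = rng.exponential(scale=50_000, size=1_000)

# The rug-plot ticks mark the deciles: 10% of the data falls before
# the first tick, 20% before the second, and so on.
deciles = np.percentile(km_driven, np.arange(10, 100, 10))
print(deciles)

# For skewed data the ticks bunch up on the left: the spread of the
# lower deciles is much smaller than the spread of the upper ones.
print(deciles[4] - deciles[0] < deciles[8] - deciles[4])
```

With this sample, the gap between the 10th and 50th percentiles comes out far smaller than the gap between the 50th and 90th, which is exactly the "bars shifted towards the left" pattern described in the reply.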
@IsmaelSilva-po2xb 4 months ago
Writing a subsection on ICE in my master's dissertation
@adataodyssey 4 months ago
Great Ismael! I hope this video helped :)
@TheCsePower 6 months ago
Would be nice if the PDP had some kind of confidence interval that varied with the feature value.
@adataodyssey 6 months ago
That's a good idea! You might be able to use the standard deviation of the predictions around each point. It's related to the ICE plot: a point would have a larger std where the individual lines don't all follow the same trend.
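[Editor's note] The idea in this reply follows from the fact that the PDP is the pointwise mean of the ICE curves, so the pointwise std of those same curves gives a band that widens where the lines disagree. A minimal sketch with synthetic ICE values (an assumed (n_samples, n_grid) array; in practice it could come from sklearn.inspection.partial_dependence with kind="individual"):

```python
import numpy as np

rng = np.random.default_rng(1)
grid = np.linspace(0, 1, 20)

# Fake ICE curves: half of the individuals trend upward along the grid,
# half stay flat, so disagreement between lines grows with the feature value.
ice = np.where(rng.random((100, 1)) < 0.5, grid, 0.0) \
      + rng.normal(0, 0.05, (100, 20))

pdp = ice.mean(axis=0)   # the usual partial dependence curve
band = ice.std(axis=0)   # pointwise std: wider where ICE lines diverge

# The band is narrow where the lines agree (left end of the grid)
# and wide where they split apart (right end).
print(band[0] < band[-1])
```

Plotting pdp with a pdp ± band envelope (e.g. via matplotlib's fill_between) would give exactly the feature-value-dependent interval the comment asks about.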