4 Significant Limitations of SHAP

10,961 views

A Data Odyssey

A day ago

SHAP is the most powerful Python package for understanding and debugging your machine learning models. Yet, it still has its limitations. Understanding these is critical to avoid incorrect conclusions when using the package. We explore the 4 most significant limitations of SHAP: issues with the package, feature dependencies, causal inference and human error.
*NOTE*: You will now get the XAI course for free if you sign up (not the SHAP course)
SHAP course: adataodyssey.c...
XAI course: adataodyssey.c...
Newsletter signup: mailchi.mp/409...
Read the companion article (no-paywall link): towardsdatasci...
Medium: / conorosullyds
Twitter: / conorosullyds
Mastodon: sigmoid.social...
Website: adataodyssey.com/

Comments: 32
@adataodyssey 7 months ago
*NOTE*: You will now get the XAI course for free if you sign up (not the SHAP course)
SHAP course: adataodyssey.com/courses/shap-with-python/
XAI course: adataodyssey.com/courses/xai-with-python/
Newsletter signup: mailchi.mp/40909011987b/signup
@saremish 10 months ago
I really enjoyed such a deep discussion about the clear distinction between correlation and causation!
@adataodyssey 10 months ago
Thanks Sarem! A very important concept when it comes to XAI. I am definitely guilty of jumping to causality conclusions without enough evidence.
@Hoxle-87 1 year ago
Great video series. Don’t stop making them. Maybe take another app/tool/methodology and break it into parts like you did with SHAP. Very digestible.
@adataodyssey 1 year ago
Thanks! Planning some more videos soon
@zahrabounik3390 18 days ago
WOW! Such an amazing explanation of SHAP! I really enjoyed it. Thank you.
@adataodyssey 16 days ago
No problem Zahra! I'm glad you found it useful
@shrishchandrapandey801 10 months ago
Amazing work, Conor! Keep them coming. These 6 mins have helped clarify so many topics!
@adataodyssey 10 months ago
Great to hear! I’m glad I could help.
@yijunfu6808 1 year ago
Best YouTuber explaining SHAP that I have found!
@adataodyssey 1 year ago
Thank you! I am here to help :)
@jenilsaliya3769 1 year ago
Good explanation of the topic, thank you sir.
@adataodyssey 1 year ago
Thank you Jenil!
@yashnarang3014 7 days ago
Great video!!!
@adataodyssey 5 days ago
Thanks Yash!
@cesarepiscopo2549 9 months ago
AMAZING WORK!
@adataodyssey 9 months ago
I really appreciate that!
@AZ-ph7gg 1 year ago
Great explanation!
@adataodyssey 1 year ago
Thank you :)
@azizjedidi1180 1 year ago
Great video man. Thank you very much.
@adataodyssey 1 year ago
I’m glad you enjoyed it Aziz!
@NA-ug5eq 4 months ago
Amazing video. Thank you so much. I have one question, please: when explaining KernelSHAP, what do you mean by permuting values? What do the grey circles in the graph at 2:28 mean? Does permuting refer to changing the order of the features (this is not clear in the graph at 2:28), or does it refer to replacing some feature values with random values? Thanks in advance for your response.
@adataodyssey 3 months ago
Take a look at the theory videos in this playlist. They should help :) kzbin.info/www/bejne/g4KZl3l6rM-omdE&pp=gAQBiAQB
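A rough sketch of the idea for anyone who wants to see it in code (a conceptual NumPy illustration, not the shap package's internal implementation): "permuting" in KernelSHAP does not mean reordering the features. For each coalition, the features outside the coalition have their values replaced with values drawn from a background dataset.

import numpy as np

rng = np.random.default_rng(0)

x = np.array([5.0, 3.0, 1.0])            # instance being explained
background = rng.normal(size=(100, 3))   # background (reference) dataset

def mask_instance(x, coalition, background, rng):
    # Keep the coalition features from x; fill the remaining features
    # with the values from a randomly chosen background row.
    filled = background[rng.integers(len(background))].copy()
    filled[coalition] = x[coalition]
    return filled

# Coalition {0, 2}: feature 1 is treated as "missing" and takes a background value
print(mask_instance(x, [0, 2], background, rng))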
@sasaglamocak2846 1 year ago
Great video. You mentioned that KernelSHAP suffers from extrapolation if features are correlated, like other permutation-based methods. What about TreeSHAP with, e.g., XGBoost?
@adataodyssey 1 year ago
Hi Sasa, this is a great question. To be honest, I don't completely understand the TreeSHAP algorithm. Looking into some other literature, it seems like TreeSHAP is not affected by correlations in the same way as KernelSHAP. "KernelSHAP ignores feature dependence. ... TreeSHAP solves this problem by explicitly modeling the conditional expected prediction." Then they go on to say "While TreeSHAP solves the problem of extrapolating to unlikely data points, it does so by changing the value function and therefore slightly changes the game. TreeSHAP changes the value function by relying on the conditional expected prediction. With the change in the value function, features that have no influence on the prediction can get a TreeSHAP value different from zero." You can read more here: christophm.github.io/interpretable-ml-book/shap.html
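For anyone who wants to experiment with the difference, here is a minimal sketch, assuming recent versions of the shap and xgboost packages (the feature_perturbation argument and its accepted values may vary between shap versions):

import numpy as np
import shap
import xgboost

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X[:, 0] + 2 * X[:, 1] + rng.normal(scale=0.1, size=500)

model = xgboost.XGBRegressor(n_estimators=100).fit(X, y)

# "interventional" breaks feature dependence by mixing in background rows,
# so it can extrapolate to unlikely points (like KernelSHAP)
interventional = shap.TreeExplainer(model, data=X, feature_perturbation="interventional")

# "tree_path_dependent" uses the trees' cover statistics to approximate
# conditional expectations, avoiding extrapolation but changing the value function
path_dependent = shap.TreeExplainer(model, feature_perturbation="tree_path_dependent")

print(interventional.shap_values(X[:5]))
print(path_dependent.shap_values(X[:5]))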
@sasaglamocak2846 1 year ago
@@adataodyssey great, thanks for the answer
@escolhifazeremcasa 28 days ago
Is there any way to deal with limitation 2, feature dependencies?
@adataodyssey 27 days ago
Often, SHAP will still work even if you have highly correlated features. It is just important to keep in mind that it may have problems when the correlations are strong. In that case, you should confirm the SHAP results using a method that is robust to correlated features, such as ALEs, or with simple data exploration methods.
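As a concrete example of the data-exploration route, here is a small pandas/NumPy sketch that flags strongly correlated feature pairs before you rely on the SHAP values (the synthetic data and the 0.8 threshold are purely illustrative):

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
a = rng.normal(size=1000)
X = pd.DataFrame({
    "a": a,
    "b": a + rng.normal(scale=0.1, size=1000),  # nearly a copy of "a"
    "c": rng.normal(size=1000),
})

corr = X.corr().abs()
mask = np.triu(np.ones(corr.shape, dtype=bool), k=1)  # upper triangle, excluding the diagonal
high_pairs = corr.where(mask).stack()
print(high_pairs[high_pairs > 0.8])  # pairs like ("a", "b") warrant a cross-check with ALEs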
@tschess7 3 months ago
I am confused. You said that machine learning only cares about correlations, not association, but should it be "only cares about correlations, not causation"?
@adataodyssey 3 months ago
Yes, "causation" is correct. Thanks for pointing out the mistake
@mdabubakarchowdhurysunny2846 3 months ago
Can you show some code for LIME?
@adataodyssey 3 months ago
Keep an eye out for the next video on Monday ;)