Local Explainability in Python with SHAP

The document discusses local explainability in AI using SHAP and LIME, highlighting their importance in interpreting model predictions for specific instances. It provides examples of using SHAP for heart disease risk prediction and LIME for admissions prediction, along with visualizations like waterfall plots. Additionally, it covers text and image explainability, demonstrating how LIME can be applied to analyze the impact of words and image features on model outputs.

Local explainability

with SHAP
EXPLAINABLE AI IN PYTHON

Fouad Trad
Machine Learning Engineer
Global vs. local explainability
Global explainability: describes overall model behavior; doesn't explain individual instances.
Local explainability: explains the prediction for a specific data point; crucial for sensitive applications.


Heart disease dataset
age sex chest_pain_type blood_pressure ecg_results thalassemia target
52 1 0 125 1 3 0
53 1 0 140 0 3 0
70 1 0 145 1 3 0
61 1 0 148 1 3 0
62 0 0 138 1 2 0

knn : KNN classifier predicting risk of heart disease

EXPLAINABLE AI IN PYTHON
Local explainability with SHAP
explainer = shap.KernelExplainer(knn.predict_proba, shap.sample(X, 10))

test_instance = X.iloc[0, :]

shap_values = explainer.shap_values(test_instance)

print(shap_values.shape)

(6, 2)

SHAP waterfall plots
Shows how features increase or decrease model's prediction

Creating waterfall plots
shap.waterfall_plot(
    shap.Explanation(
        values=shap_values[:, 1],
        base_values=explainer.expected_value[1],
        data=test_instance,
        feature_names=X.columns
    )
)

Waterfalls for several instances

Let's practice!
Local explainability
with LIME

Fouad Trad
Machine Learning Engineer
LIME → Local Interpretable Model-Agnostic Explanations

Explains predictions of complex models

Works on individual instances

Agnostic to model type

LIME explainers
Tailored to different kinds of data
Generates perturbations around a sample

Sees effect on model's output

Constructs simpler model for explanation

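The perturb, predict, and fit steps above can be sketched with NumPy alone. This is an illustration of the idea, not LIME's actual internals: the black-box model, kernel width, and perturbation scale are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative black box: only feature 0 meaningfully drives the output
def black_box(X):
    return 1.0 / (1.0 + np.exp(-(3.0 * X[:, 0] - 0.1 * X[:, 1])))

instance = np.array([0.5, 0.5])

# 1. Generate perturbations around the sample
perturbed = instance + rng.normal(scale=0.3, size=(500, 2))

# 2. See their effect on the model's output
preds = black_box(perturbed)

# 3. Weight each perturbation by proximity to the instance
dists = np.linalg.norm(perturbed - instance, axis=1)
weights = np.exp(-(dists ** 2) / 0.25)

# 4. Construct a simpler (weighted linear) model as the explanation
A = np.column_stack([perturbed, np.ones(len(perturbed))])
W = np.sqrt(weights)[:, None]
coef, *_ = np.linalg.lstsq(A * W, preds * W[:, 0], rcond=None)

# coef[0] and coef[1] are the local feature attributions
print(coef[:2])
```

Because the surrogate is fit only on points near the instance, its coefficients describe the model's local behavior: here feature 0 should dominate the explanation, mirroring the black box.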
Admissions dataset
GRE Score TOEFL Score University Rating SOP LOR CGPA Chance of Admit Accept
337 118 4 4.5 4.5 9.65 0.92 1
324 107 4 4 4.5 8.87 0.76 1
316 104 3 3 3.5 8 0.72 1
322 110 3 3.5 2.5 8.67 0.8 1
314 103 2 2 3 8.21 0.45 0

regressor : predicts chance of admit

classifier : predicts acceptance

Features in X
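The `regressor` and `classifier` are not defined on the slide; a minimal sketch, assuming ordinary scikit-learn linear models fit on the tabular features (the model choices are assumptions, and the data below is just the five rows shown above):

```python
import pandas as pd
from sklearn.linear_model import LinearRegression, LogisticRegression

# The five sample rows from the slide's table (illustrative only)
df = pd.DataFrame({
    "GRE Score": [337, 324, 316, 322, 314],
    "TOEFL Score": [118, 107, 104, 110, 103],
    "University Rating": [4, 4, 3, 3, 2],
    "SOP": [4.5, 4.0, 3.0, 3.5, 2.0],
    "LOR": [4.5, 4.5, 3.5, 2.5, 3.0],
    "CGPA": [9.65, 8.87, 8.00, 8.67, 8.21],
    "Chance of Admit": [0.92, 0.76, 0.72, 0.80, 0.45],
    "Accept": [1, 1, 1, 1, 0],
})
X = df.drop(columns=["Chance of Admit", "Accept"])

# regressor: predicts chance of admit (continuous target)
regressor = LinearRegression().fit(X, df["Chance of Admit"])

# classifier: predicts acceptance (binary target)
classifier = LogisticRegression().fit(X, df["Accept"])

pred_chance = regressor.predict(X.iloc[[1]])
pred_probs = classifier.predict_proba(X.iloc[[1]])
```

LIME's tabular explainer below only needs `regressor.predict` and `classifier.predict_proba` as callables, so any model exposing those methods would work in their place.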

Creating tabular explainer
Regression:

from lime.lime_tabular import LimeTabularExplainer

instance = X.iloc[1, :]

explainer_reg = LimeTabularExplainer(
    X.values,
    feature_names=X.columns,
    mode='regression'
)

explanation_reg = explainer_reg.explain_instance(
    instance,
    regressor.predict
)

Classification:

explainer_class = LimeTabularExplainer(
    X.values,
    feature_names=X.columns,
    mode='classification'
)

explanation_class = explainer_class.explain_instance(
    instance,
    classifier.predict_proba
)

Visualizing explanation
Regression:     explanation_reg.as_pyplot_figure()
Classification: explanation_class.as_pyplot_figure()

SHAP vs. LIME
SHAP: shap.waterfall_plot(...)
LIME: explanation_class.as_pyplot_figure()

Let's practice!
Text and image
explainability with
LIME

Fouad Trad
Machine Learning Engineer
Text-based models
Process and interpret written language
Example: Sentiment analysis

Black box models

LimeTextExplainer explains such models


Finds how each word impacts prediction

LIME text explainer
from lime.lime_text import LimeTextExplainer

text_instance = "This product has great features but a poor design."

def model_predict(instance):
    ...
    return class_probabilities

explainer = LimeTextExplainer()
exp = explainer.explain_instance(
    text_instance,
    model_predict
)

exp.as_pyplot_figure()
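The `model_predict` stub above must map a list of raw strings to an array of class probabilities, since LIME calls it on many perturbed copies of the text. A minimal sketch using a scikit-learn pipeline (the training sentences and labels are made-up placeholders):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy sentiment data, purely for illustration
texts = ["great product", "poor design", "great features", "poor quality"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

pipeline = make_pipeline(CountVectorizer(), MultinomialNB())
pipeline.fit(texts, labels)

def model_predict(instances):
    # LIME passes a list of perturbed strings;
    # return an (n_instances, n_classes) probability array
    return pipeline.predict_proba(instances)

probs = model_predict(["This product has great features but a poor design."])
# probs has one row and one column per class
```

Any text classifier works here as long as the wrapper keeps this list-of-strings-in, probabilities-out signature.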

Image-based models
Highly complex
Interpret visual data

Example: Food classification

LimeImageExplainer explains such models


Finds which parts of image impact predictions

LIME image explainer
from lime.lime_image import LimeImageExplainer

explainer = LimeImageExplainer()
explanation = explainer.explain_instance(
    image,
    model_predict,
    num_samples=50
)

temp, _ = explanation.get_image_and_mask(
    explanation.top_labels[0],
    hide_rest=True
)

plt.imshow(temp)
Let's practice!
