Front page example (XGBoost)

This notebook reproduces the code from the front page example, using an XGBoost model.


In [1]:
import xgboost
import shap

# train XGBoost model
X,y = shap.datasets.boston()
model = xgboost.XGBRegressor().fit(X, y)

# explain the model's predictions using SHAP values
# (same syntax works for LightGBM, CatBoost, and scikit-learn models)
background = shap.maskers.Independent(X, max_samples=100)
explainer = shap.Explainer(model, background)
shap_values = explainer(X)

# visualize the first prediction's explanation
shap.plots.waterfall(shap_values[0])
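The waterfall plot walks from the model's baseline output up to this prediction, one feature contribution at a time. A minimal numpy sketch of the underlying "local accuracy" property, using a hypothetical linear model and background sample (none of the names below come from the shap API):

```python
import numpy as np

# hypothetical background data and linear model f(x) = w @ x
rng = np.random.default_rng(0)
X_bg = rng.normal(size=(100, 3))
w = np.array([2.0, -1.0, 0.5])

def f(X):
    return X @ w

x = np.array([1.0, 2.0, 3.0])      # instance to explain
base_value = f(X_bg).mean()        # E[f(X)] over the background

# for a linear model with independent features, the exact SHAP value
# of feature i is w_i * (x_i - E[x_i])
phi = w * (x - X_bg.mean(axis=0))

# local accuracy: the contributions carry the baseline to the prediction
assert np.isclose(base_value + phi.sum(), f(x))
```

This is the same bookkeeping the waterfall plot draws: starting from `base_value`, each bar adds one entry of `phi` until the model output is reached.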



In [2]:
# visualize the first prediction's explanation as a bar plot
shap.plots.bar(shap_values[0])



In [3]:
shap.initjs()

# visualize the first prediction's explanation
shap.plots.force(shap_values[0])


Out[3]:
Visualization omitted, Javascript library not loaded!
Have you run `initjs()` in this notebook? If this notebook was from another user you must also trust this notebook (File -> Trust notebook). If you are viewing this notebook on github the Javascript has been stripped for security. If you are using JupyterLab this error is because a JupyterLab extension has not yet been written.

In [4]:
# visualize all of the predictions' explanations at once
shap.plots.force(shap_values)



In [5]:
# plot the effect of a single feature across all samples
shap.plots.scatter(shap_values[:, "RM"], color=shap_values)



In [6]:
# plot the global importance of each feature
shap.plots.bar(shap_values)
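When given the full explanation object, the bar plot ranks features by the mean absolute SHAP value across all samples. A minimal sketch of that aggregation, using a hypothetical (samples x features) matrix of SHAP values rather than the shap API:

```python
import numpy as np

# hypothetical SHAP values: 3 samples, 3 features
shap_matrix = np.array([[ 0.5, -1.2,  0.1],
                        [-0.7,  0.3, -0.2],
                        [ 0.6, -0.9,  0.4]])

# global importance = mean |SHAP value| per feature (column)
global_importance = np.abs(shap_matrix).mean(axis=0)

# features ordered from most to least important
ranking = np.argsort(global_importance)[::-1]
```

Averaging absolute values keeps positive and negative contributions from cancelling, which is why a feature that pushes predictions strongly in both directions still ranks as important.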



In [7]:
# plot the distribution of SHAP values for each feature over all samples
shap.plots.beeswarm(shap_values)