# shap

Here are 129 public repositories matching this topic...

mljar-supervised
moshe-rl commented Nov 30, 2021

When using R2 as the eval metric for a regression task (with 'Explain' mode), the metric values reported in the Leaderboard (in the README.md file) are multiplied by -1.
For instance, the metric value shown for a model in the Leaderboard is -0.41, but clicking the model name leads to its detailed results page, where the R2 value is 0.41.
I've noticed that when one of the R2 metric values in the L

bug, help wanted, good first issue
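Sign flips like this typically come from the common AutoML convention of negating maximization metrics (such as R2) so that a single minimizing optimizer can handle every metric; the display layer then has to flip the sign back. A minimal stdlib-only sketch of that convention (the function and variable names are illustrative, not mljar-supervised's actual API):

```python
def r2_score(y_true, y_pred):
    """Plain R^2 (coefficient of determination), no dependencies."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Metrics where higher is better get negated so the optimizer can minimize.
MAXIMIZED_METRICS = {"r2", "auc"}

def optimization_value(name, y_true, y_pred):
    score = r2_score(y_true, y_pred)
    return -score if name in MAXIMIZED_METRICS else score

def display_value(name, opt_value):
    # Forgetting this flip when writing a leaderboard produces exactly
    # the symptom in the issue: -0.41 shown instead of 0.41.
    return -opt_value if name in MAXIMIZED_METRICS else opt_value

y_true = [1.0, 2.0, 3.0, 4.0]
y_pred = [1.1, 1.9, 3.2, 3.8]
opt = optimization_value("r2", y_true, y_pred)  # negative internally
shown = display_value("r2", opt)                # positive again for reporting
```

Under this convention the internal value and the reported value always differ only in sign for maximization metrics, which matches the -0.41 / 0.41 pair described above.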
oegedijk commented May 5, 2021

Recently dtreeviz has added support for lightgbm models: https://github.com/parrt/dtreeviz/

So it would be cool to add to explainerdashboard the same tree visualization that already exists for RandomForest, ExtraTrees, and XGBoost models.

If someone wants to pitch in to help extract individual predictions of decision trees inside lightgbm boosters and then get them in shape to be used by the

help wanted, good first issue
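For anyone picking this up: LightGBM can serialize its ensemble with `Booster.dump_model()`, which returns the trees as nested dicts under `tree_info` / `tree_structure`, and walking one of those dicts is enough to reproduce a single tree's prediction. A stdlib-only sketch over a hand-written dict in that shape (the sample tree values here are invented for illustration):

```python
# A minimal tree in the shape produced by LightGBM's Booster.dump_model()
# (one entry of model["tree_info"]; split/leaf values are invented).
tree = {
    "tree_structure": {
        "split_feature": 0,
        "threshold": 2.5,
        "decision_type": "<=",
        "left_child": {"leaf_value": -0.3},
        "right_child": {
            "split_feature": 1,
            "threshold": 1.0,
            "decision_type": "<=",
            "left_child": {"leaf_value": 0.1},
            "right_child": {"leaf_value": 0.7},
        },
    }
}

def predict_single_tree(node, x):
    """Follow splits until a leaf; mirrors dump_model()'s node layout."""
    while "leaf_value" not in node:
        feature, threshold = node["split_feature"], node["threshold"]
        node = node["left_child"] if x[feature] <= threshold else node["right_child"]
    return node["leaf_value"]

# x[0] = 3.0 > 2.5 goes right; x[1] = 0.5 <= 1.0 then goes left.
leaf = predict_single_tree(tree["tree_structure"], [3.0, 0.5])
```

Extracting each per-tree structure this way is one plausible starting point for getting LightGBM trees into the shape the dashboard's existing RandomForest/XGBoost visualizations expect.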

This repository introduces different Explainable AI approaches and demonstrates how they can be implemented with PyTorch and torchvision. The approaches covered are Class Activation Mappings, LIME, and SHapley Additive exPlanations (SHAP).

  • Updated Sep 5, 2019
  • Jupyter Notebook
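Of the approaches named in that description, Shapley values are the easiest to demonstrate without a deep-learning stack: for a model with only a few features, exact Shapley values can be computed by averaging each feature's marginal contribution over all subsets of the remaining features. A stdlib-only sketch with a toy linear model (the model, input, and baseline are invented for illustration):

```python
from itertools import combinations
from math import factorial

def shapley_values(model, x, baseline):
    """Exact Shapley values by subset enumeration (feasible for few features).

    Features outside the coalition S are replaced by their baseline value,
    a common simplification of the conditional expectation used in SHAP.
    """
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            for S in combinations(others, size):
                # Classic Shapley weight: |S|! * (n - |S| - 1)! / n!
                weight = factorial(size) * factorial(n - size - 1) / factorial(n)
                with_i = [x[j] if j in S or j == i else baseline[j] for j in range(n)]
                without_i = [x[j] if j in S else baseline[j] for j in range(n)]
                phi[i] += weight * (model(with_i) - model(without_i))
    return phi

# Toy linear model: each Shapley value is then exactly w_j * (x_j - baseline_j).
weights = [2.0, -1.0, 0.5]
model = lambda v: sum(w * f for w, f in zip(weights, v))
x, baseline = [1.0, 2.0, 4.0], [0.0, 0.0, 0.0]
phi = shapley_values(model, x, baseline)
# Efficiency property: the contributions sum to model(x) - model(baseline).
```

The exponential subset enumeration is only practical for a handful of features; the shap library's estimators (e.g. kernel- and tree-based explainers) exist precisely to approximate or shortcut this computation at scale.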
