MindXLib is an open-source toolkit of explainable AI (XAI) algorithms from the Data Decision Team at Alibaba DAMO Academy's Decision Intelligence Lab.
You can install MindXLib using pip:

```shell
pip install mindxlib
```
```python
import xgboost
import shap  # used only to retrieve the adult dataset
from mindxlib import ShapExplainer

# Load the adult census dataset
X, y = shap.datasets.adult()

# Train an XGBoost classifier
model = xgboost.XGBClassifier()
model.fit(X, y)

# Initialize a Tree SHAP explainer
explainer = ShapExplainer(model, method="tree")

# Generate explanations for the first 1000 samples,
# using the full dataset as the baseline
explanation = explainer.explain(X[:1000], baseline=X, mode="origin")

# Show a scatter plot for the Age feature
explanation.show('scatter', feature='Age')
```

```python
import numpy as np
from mindxlib import SSRL
from mindxlib.data import tic_tac_toe

# Load the tic-tac-toe dataset
X, y = tic_tac_toe()

# Initialize and fit SSRL; all features are categorical
explainer = SSRL(cc=10, lambda_1=1, distorted_step=10,
                 categorical_features=X.columns.tolist())
explainer.fit(X, y)

# Show the learned rules
explainer.show()

# Make predictions and compute training accuracy
predictions = explainer.predict(X)
acc = np.sum(predictions.values == y.values) / y.shape[0]
print(f'Training accuracy: {acc:.2f}')
```
Example output:

```
IF 1==o AND 4==o AND 7==o, THEN negative
ELIF 3==o AND 4==o AND 5==o, THEN negative
ELIF 0==o AND 1==o AND 2==o, THEN negative
ELIF 6==o AND 7==o AND 8==o, THEN negative
ELIF 0==o AND 3==o AND 6==o, THEN negative
ELIF 2==o AND 5==o AND 8==o, THEN negative
ELIF 0!=x AND 4!=x AND 8!=x, THEN negative
ELIF 2!=x AND 4!=x AND 6!=x, THEN negative
ELSE positive
Training accuracy: 0.98
```

The algorithm package currently supports the following models:
- RuleSet - Rule-based classifier using submodular optimization that supports binary classification
- RuleSetImb - Rule-based classifier optimized for imbalanced data that supports binary classification
- Diver - Rule discovery through combinatorial optimization that supports binary classification
- DrillUp - Pattern detection algorithm for discriminative rules that supports binary classification
- SSRL (Scalable Sparse Rule Lists) - Efficient decision rule list learning that supports multi-class classification
- SHAP - SHapley Additive exPlanations for model interpretation, providing customizable baselines and a user-friendly interface
- LIME - Local Interpretable Model-agnostic Explanations
- IG (Integrated Gradients) - Path attribution method for deep learning models
- GAM - Generalized Additive Models with shape functions
- FDTemp - Functional Decomposition Temperature method for explaining temporal black-box models
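For intuition about how a learned rule list like SSRL's makes predictions, the rules from the tic-tac-toe example output can be evaluated in plain Python. This is a hand-transcribed sketch for illustration, not part of the mindxlib API; it assumes a board is a length-9 sequence over `'x'`, `'o'`, and `'b'` (blank), indexed 0-8.

```python
def predict(board):
    """Apply the rule list top-down; the first matching rule wins."""
    rules = [
        # o completes a line -> negative
        (lambda b: b[1] == 'o' and b[4] == 'o' and b[7] == 'o', 'negative'),
        (lambda b: b[3] == 'o' and b[4] == 'o' and b[5] == 'o', 'negative'),
        (lambda b: b[0] == 'o' and b[1] == 'o' and b[2] == 'o', 'negative'),
        (lambda b: b[6] == 'o' and b[7] == 'o' and b[8] == 'o', 'negative'),
        (lambda b: b[0] == 'o' and b[3] == 'o' and b[6] == 'o', 'negative'),
        (lambda b: b[2] == 'o' and b[5] == 'o' and b[8] == 'o', 'negative'),
        # x fails to complete either diagonal -> negative
        (lambda b: b[0] != 'x' and b[4] != 'x' and b[8] != 'x', 'negative'),
        (lambda b: b[2] != 'x' and b[4] != 'x' and b[6] != 'x', 'negative'),
    ]
    for condition, label in rules:
        if condition(board):
            return label
    return 'positive'  # the ELSE branch

print(predict('xobbxbbox'))  # x holds the main diagonal -> positive
print(predict('bobxoxbob'))  # o holds the middle column -> negative
```

The ELSE branch fires only when no rule matches, which is what makes the ordered IF/ELIF structure directly readable as a decision procedure.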
These algorithms are described in the following publications:

- Efficient Decision Rule List Learning via Unified Sequence Submodular Optimization
- SLIM: a Scalable Light-weight Root Cause Analysis for Imbalanced Data in Microservice
- Interactive Generalized Additive Models for Electricity Load Forecasting
- Learning Interpretable Decision Rule Sets: A Submodular Optimization Approach
- Explain temporal black-box models via functional decomposition
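The SHAP explainer attributes a prediction to individual features relative to a baseline. As a self-contained illustration of the underlying idea (plain NumPy, independent of mindxlib), the sketch below computes exact Shapley values by brute force, replacing the features outside each coalition with their baseline values. For a linear model this is known to reduce to `w_i * (x_i - b_i)`, which the loop reproduces.

```python
import itertools
import math
import numpy as np

def shapley_values(f, x, baseline):
    """Exact Shapley attributions of f(x) relative to a baseline.

    Features inside a coalition take their value from x; features
    outside it are replaced by the corresponding baseline value.
    Exponential in the number of features, so only for tiny inputs.
    """
    n = len(x)
    phi = np.zeros(n)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for subset in itertools.combinations(others, k):
                # Standard Shapley weight |S|! (n - |S| - 1)! / n!
                weight = (math.factorial(k) * math.factorial(n - k - 1)
                          / math.factorial(n))
                z_with = baseline.copy()
                z_with[list(subset) + [i]] = x[list(subset) + [i]]
                z_without = baseline.copy()
                z_without[list(subset)] = x[list(subset)]
                phi[i] += weight * (f(z_with) - f(z_without))
    return phi

# Linear model: the attribution of feature i should be w_i * (x_i - b_i)
w = np.array([2.0, -1.0, 0.5])
f = lambda z: float(w @ z)
x = np.array([1.0, 3.0, -2.0])
b = np.array([0.0, 1.0, 0.0])
phi = shapley_values(f, x, b)
print(phi)  # ≈ [ 2. -2. -1.]
```

The attributions also sum to `f(x) - f(baseline)`, the efficiency property that makes Shapley-based explanations additive.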