A collection of research materials on explainable AI/ML
Updated Mar 21, 2025
- Generate Diverse Counterfactual Explanations for any machine learning model.
- Optimal binning: monotonic binning with constraints. Supports batch and stream optimal binning, scorecard modelling, and counterfactual explanations.
- CARLA: a Python library to benchmark algorithmic recourse and counterfactual explanation algorithms.
- An open-source library for the interpretability of time series classifiers.
- A package for counterfactual explanations and algorithmic recourse in Julia.
- Lists of papers on causality and on how causal techniques are used to enhance deep-learning-era computer vision.
- Model-agnostic counterfactual explanations.
- Meaningfully debugging model mistakes with conceptual counterfactual explanations (ICML 2022).
- A collection of counterfactual explanation algorithms.
- CEML: Counterfactuals for Explaining Machine Learning models, a Python toolbox.
- Official code for the CVPR 2023 paper "Adversarial Counterfactual Visual Explanations".
- A list of research papers on explainable machine learning.
- Code to reproduce a paper on probabilistic algorithmic recourse: https://arxiv.org/abs/2006.06831
- Code for the paper "Getting a CLUE: A Method for Explaining Uncertainty Estimates".
- Code accompanying the paper "Preserving Causal Constraints in Counterfactual Explanations for Machine Learning Classifiers".
- Code and data for decision making under strategic behavior (NeurIPS 2020 & Management Science 2024).
- [TPAMI 2025] Generalized Semantic Contrastive Learning via Embedding Side Information for Few-Shot Object Detection.
- Official code for the ACCV 2022 paper "Diffusion Models for Counterfactual Explanations".
- Official implementation of PROMPT-CAM: A Simpler Interpretable Transformer for Fine-Grained Analysis (CVPR 2025).
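To make concrete what the libraries above compute, here is a minimal sketch of the core idea behind a counterfactual explanation: given a trained classifier and an input, find a nearby point whose prediction flips to a desired class. The random hill-climbing search, the `counterfactual` helper, and the toy data are all illustrative assumptions for this sketch, not the API of any repository listed here; real libraries use far more sophisticated objectives (diversity, sparsity, causal and actionability constraints).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy 2-D data: class 1 when x0 + x1 > 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
clf = LogisticRegression().fit(X, y)

def counterfactual(clf, x, target, step=0.1, max_iter=1000, seed=1):
    """Hypothetical helper: random-search for a nearby point of the target class.

    Proposes small Gaussian perturbations and keeps a move only if it raises
    the model's probability for the target class (greedy hill climbing).
    Returns the first point predicted as `target`, or None on failure.
    """
    rng = np.random.default_rng(seed)
    cf = x.copy()
    for _ in range(max_iter):
        if clf.predict(cf[None])[0] == target:
            return cf
        cand = cf + step * rng.normal(size=cf.shape)
        if (clf.predict_proba(cand[None])[0, target]
                > clf.predict_proba(cf[None])[0, target]):
            cf = cand
    return None

x = np.array([-1.0, -1.0])             # model predicts class 0 here
cf = counterfactual(clf, x, target=1)  # nearby point predicted as class 1
```

The returned `cf` is the explanation: "had the input been `cf` instead of `x`, the prediction would have been class 1." Gradient-based methods replace the random search with optimization of a distance-plus-validity loss.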