
Official TensorFlow implementation of the papers:

Explaining the Black-box Smoothly - A Counterfactual Approach

Using Causal Analysis for Conceptual Deep Learning Explanation


Train baseline black-box classification model

  1. Download the StanfordCheXpert and MIMIC-CXR datasets.

  2. Train a classifier. Skip this step if you have a pretrained classifier.

python --config 'configs/Step_1_StanfordCheXpert_Classifier_256.yaml'
  3. Save the output of the trained classifier. Users can choose to save the predictions on the test set of the same dataset or on a different dataset. Set the config appropriately. We used this file to save the predictions on the MIMIC-CXR dataset.
python --config 'configs/Step_1_StanfordCheXpert_Classifier_256.yaml'
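As a sketch of what the save-predictions step above produces, the snippet below stores image ids alongside posterior probabilities. The `predict_fn` stand-in, the file path, and the array names are illustrative assumptions, not the repository's actual code.

```python
import numpy as np

def save_predictions(predict_fn, images, image_ids, out_path):
    """Store (image id, posterior probability) pairs for the later
    explainer data-preparation step."""
    probs = predict_fn(images)
    np.savez(out_path, image_ids=np.asarray(image_ids), probs=probs)

# Toy stand-in for the trained classifier's sigmoid output head.
predict_fn = lambda x: 1.0 / (1.0 + np.exp(-x.sum(axis=1)))
images = np.zeros((2, 4))
save_predictions(predict_fn, images, ["img_0", "img_1"], "/tmp/preds.npz")

loaded = np.load("/tmp/preds.npz")
print(loaded["probs"])  # -> [0.5 0.5]
```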

Generate counterfactual visual explanation

  1. Process the output of the classifier and create the input for the Explanation model by discretizing the posterior probability. We used this file to create the training dataset for the Explainer for Pleural Effusion, Cardiomegaly, and Edema.
  2. Train a Segmentation Network. We train a lung segmentation network on the JSRT dataset.
python --config 'configs/Step_2_JSRT_Segmentation_256.yaml'
  3. Train an Object Detector for pacemaker and hardware. The code is borrowed from: Faster_RCNN_TensorFlow
python --config 'configs/Step_3_MIMIC_OD_Pacemaker.yaml'
  4. Train the explainer model.
python --config 'configs/Step_4_MIMIC_Explainer_256_Pleural_Effusion.yaml'
  5. Explore the trained Explanation model and see qualitative results.


python --config 'configs/Step_4_MIMIC_Explainer_256_Pleural_Effusion.yaml'
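The posterior-probability discretization used to build the explainer's training input can be sketched as follows; the bin count and the function name are illustrative assumptions, not the repository's actual script.

```python
import numpy as np

def discretize_posterior(probs, n_bins=10):
    """Map continuous classifier posteriors in [0, 1] to integer bin
    indices 0..n_bins-1, used as the condition for the explainer."""
    probs = np.asarray(probs, dtype=np.float64)
    return np.clip((probs * n_bins).astype(int), 0, n_bins - 1)

# Example: posteriors for a target finding from the classifier.
posteriors = np.array([0.03, 0.48, 0.95, 1.0])
print(discretize_posterior(posteriors))  # -> [0 4 9 9]
```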


  • Pretrained segmentation network for lung and heart. The segmentation network is trained on the JSRT dataset and hence is not perfect on the MIMIC-CXR dataset. Improving the segmentation network is work in progress.
  • Code to compute Cardio Thoracic Ratio (CTR) using lung + heart segmentation.
  • Code to create an animated Gif to better visualize the results, on a given subject.
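The CTR computation from the lung and heart masks can be sketched with the standard definition (maximal horizontal cardiac width divided by maximal horizontal thoracic width); the toy masks and helper names below are illustrative assumptions.

```python
import numpy as np

def max_width(mask):
    """Widest horizontal extent (in pixels) of a binary mask."""
    cols = np.where(mask.any(axis=0))[0]  # columns containing the structure
    return 0 if cols.size == 0 else cols[-1] - cols[0] + 1

def cardio_thoracic_ratio(heart_mask, lung_mask):
    """CTR = maximal heart width / maximal thoracic (lung) width."""
    return max_width(heart_mask) / max_width(lung_mask)

# Toy 2D binary masks (1 = structure present).
heart = np.zeros((8, 8), dtype=int); heart[3:6, 3:6] = 1  # 3 px wide
lungs = np.zeros((8, 8), dtype=int); lungs[1:7, 1:7] = 1  # 6 px wide
print(cardio_thoracic_ratio(heart, lungs))  # -> 0.5
```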

Pleural Effusion

python --config 'configs/Step_4_MIMIC_Explainer_256_Pleural_Effusion.yaml'


  • Pre-trained object detector for healthy/blunt costophrenic recess.

Generate conceptual explanation

  1. Dissect the trained classification model to identify hidden units that are relevant for a given concept.
python --config 'configs/Step_1_StanfordCheXpert_Classifier_256_temp.yaml'
Rscript --vanilla Step_5_Lasso_Regression.r
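A minimal sketch of the lasso step: concept-relevant hidden units are selected by sparse regression of a concept label on unit activations, with nonzero coefficients marking the relevant units. The repository uses R (`Step_5_Lasso_Regression.r`); this Python/scikit-learn version with synthetic data is only an illustration.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Toy stand-ins: per-image mean activations of 64 hidden units and a
# binary concept label; in the real pipeline these come from the
# dissected classifier and the concept annotations.
n_images, n_units = 200, 64
activations = rng.normal(size=(n_images, n_units))
# In this synthetic data, units 3 and 17 truly drive the concept.
concept = (activations[:, 3] + activations[:, 17] > 0).astype(float)

# Sparse regression: nonzero coefficients mark concept-relevant units.
model = Lasso(alpha=0.05).fit(activations, concept)
relevant_units = np.flatnonzero(model.coef_)
print(relevant_units)
```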

  2. Use this notebook to visualize the activation regions for a given hidden unit.

  3. Use the counterfactual explanation to compute the causal indirect effect associated with a given set of concept-units.
python --config 'configs/Step_4_MIMIC_Explainer_256_Pleural_Effusion.yaml'
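The causal indirect effect of a concept-unit set can be sketched as the change in classifier output when only those units' activations are swapped for their counterfactual values, everything else held fixed. The linear classifier head and activation arrays below are toy assumptions, not the repository's implementation.

```python
import numpy as np

def indirect_effect(f, h, h_cf, unit_ids):
    """Change in classifier output when ONLY the given concept units'
    activations are replaced by their counterfactual values."""
    h_do = h.copy()
    h_do[:, unit_ids] = h_cf[:, unit_ids]
    return float(np.mean(f(h_do) - f(h)))

# Toy classifier head: a linear readout over 4 hidden units.
w = np.array([0.5, 0.0, 1.0, 0.0])
f = lambda h: h @ w

h = np.array([[1.0, 1.0, 1.0, 1.0]])       # factual activations
h_cf = np.zeros((1, 4))                    # counterfactual activations
effect = indirect_effect(f, h, h_cf, [2])  # intervene on unit 2 only
print(effect)  # -> -1.0
```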


@article{singla2021explaining,
  author    = {Sumedha Singla and
               Brian Pollack and
               Stephen Wallace and
               Kayhan Batmanghelich},
  title     = {Explaining the Black-box Smoothly- {A} Counterfactual Approach},
  journal   = {CoRR},
  volume    = {abs/2101.04230},
  year      = {2021},
  url       = {}
}

@article{singla2021causal,
  title     = {Using Causal Analysis for Conceptual Deep Learning Explanation},
  author    = {Singla, Sumedha and Wallace, Stephen and Triantafillou, Sofia and Batmanghelich, Kayhan},
  doi       = {10.1007/978-3-030-87199-4_49},
  volume    = {12903},
  year      = {2021},
  journal   = {Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention},
  pages     = {519--528}
}


Explaining the Black-box Smoothly - A Counterfactual Approach; Using Causal Analysis for Conceptual Deep Learning Explanation






