Repository For the Paper "Interpreting CNNs using Conditional GANs by representing CNNs as Conditional Priors"
The paper introduces a method to interpret why a CNN makes its predictions by training a GAN to understand the CNN. We propose a novel method that trains conditional Generative Adversarial Networks (GANs) to generate visual explanations of Convolutional Neural Networks (CNNs). To comprehend a CNN, the GAN is trained with information on how the CNN processes an image when making predictions. Supplying that information poses two main challenges: how to represent it in a form the GAN can consume, and how to feed that representation to the GAN effectively. To solve these issues, we developed a suitable representation of the CNN by cumulatively averaging intermediate explainability maps. We also propose two methods to feed the representations to the GAN and an effective training strategy. Our approach learned the general aspects of CNNs and was agnostic to datasets and CNN architectures. The study includes both a qualitative and quantitative evaluation of the interpretability maps in comparison with state-of-the-art approaches. Upon interpreting the GAN, we found that the initial and final layers of CNNs are equally crucial for explaining them.
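As a rough illustration of the representation idea (not the repo's exact implementation; the function name and map shapes are hypothetical), cumulatively averaging per-layer explainability maps can be sketched as:

```python
import numpy as np

def cumulative_average_maps(layer_maps):
    """Cumulatively average per-layer explainability maps.

    layer_maps: list of 2-D arrays (H, W), one per CNN layer (e.g. Grad-CAM
    maps upsampled to a common resolution). Entry i of the result is the
    running mean of maps 0..i, so later entries blend early- and late-layer
    evidence into a single conditional prior.
    """
    running_sum = np.zeros_like(layer_maps[0], dtype=np.float64)
    averaged = []
    for i, m in enumerate(layer_maps, start=1):
        running_sum += m
        averaged.append(running_sum / i)
    return averaged

# Hypothetical example: three constant 4x4 maps
maps = [np.full((4, 4), v) for v in (0.2, 0.4, 0.9)]
avgs = cumulative_average_maps(maps)
print(avgs[-1][0, 0])  # ≈ 0.5, the mean of 0.2, 0.4 and 0.9
```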
- Introduced a GAN that understands the general working of CNNs.
- Introduced a method to represent CNN's operations as conditional priors.
- Introduced a method to interpret our proposed GAN.
LSFT-GAN and GSFT-GAN are the two variants of the proposed GAN architecture. They differ by their conditioning methodologies: GSFT (Global-SFT) conditions the GAN using a global condition, while LSFT (Local-SFT) conditions the GAN progressively.
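Both variants build on SFT-style conditioning, where the condition predicts a per-pixel scale and shift that modulate the generator's features. A minimal sketch, assuming a 1x1-conv stand-in for the small networks that would predict the modulation parameters (names and shapes are illustrative, not this repo's API):

```python
import numpy as np

def sft_modulate(features, condition, w_gamma, w_beta):
    """Spatial Feature Transform (SFT) conditioning sketch.

    features:  (C, H, W) generator feature maps
    condition: (K, H, W) conditional prior (e.g. explainability maps)
    w_gamma, w_beta: (C, K) weights of a 1x1 conv predicting the
    per-pixel scale (gamma) and shift (beta).
    """
    # A 1x1 convolution is a channel-mixing matrix multiply at each pixel.
    gamma = np.einsum('ck,khw->chw', w_gamma, condition)
    beta = np.einsum('ck,khw->chw', w_beta, condition)
    return features * (1.0 + gamma) + beta

# With zero weights, the transform reduces to the identity (gamma=0, beta=0):
feat = np.ones((8, 4, 4))
cond = np.random.default_rng(0).random((3, 4, 4))
out = sft_modulate(feat, cond, np.zeros((8, 3)), np.zeros((8, 3)))
print(np.allclose(out, feat))  # True
```

Under this reading, GSFT applies one such transform driven by a single global condition, while LSFT repeats the transform at successive generator scales with locally resized conditions.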
The figure shows sample results on interpreting CNNs trained for classifying the Food-11 and Animals-10 datasets. We interpreted how our GAN explained CNNs based on the relevance of the input conditions. Here we describe how to use this repository for inference. We provide pretrained GAN models, for both LSFT-GAN and GSFT-GAN, trained on classification models for the Animals-10, Food-11 and CIFAR-10 datasets. We also provide a preprocessed .npy file of the Food-11 dataset to use for inference. Please follow the steps below:
Download the .npy file and pre-trained models from Download Drive
Unzip the Models.zip downloaded from the drive into models/. Models.zip contains the pretrained models for LSFT-GAN and GSFT-GAN.
Move the Food_Resnet.npy downloaded from the drive to npys/. Food_Resnet.npy contains the preprocessed inputs required for inference.
Create a new Python environment and install requirements.txt. In the command line:
python -m venv env
env\Scripts\activate.bat
pip install -r requirements.txt
To run inference on LSFT-GAN, run
python lsft_inference.py
To run inference on GSFT-GAN, run
python gsft_inference.py
Open outputs/ to find the folder with the inference results. The inference folder contains three subfolders: inputs, Gan outputs and GradCAM outputs.
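A quick way to check that a run produced all three subfolders (folder names taken from above; point `run_dir` at your actual inference folder inside outputs/):

```python
import os

run_dir = 'outputs'  # adjust to the specific inference folder, e.g. outputs/<run>
# Subfolder names follow the README; a missing folder reports 0 files.
for sub in ('inputs', 'Gan outputs', 'GradCAM outputs'):
    folder = os.path.join(run_dir, sub)
    n = len(os.listdir(folder)) if os.path.isdir(folder) else 0
    print(f'{sub}: {n} files')
```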