Awesome-Class-Activation-Map

An awesome list of papers and tools about class activation map (CAM) techniques.

Class activation maps can be used to interpret the predictions of a convolutional neural network (CNN) by highlighting the image regions that contribute most to the decision for a given class (see the sketch below).

https://paperswithcode.com/method/cam
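
In the original CAM formulation, the feature maps of the last convolutional stage are weighted by the final fully connected layer's weights for the target class; the weighted sum, upsampled to the input resolution, is the class activation map. Below is a minimal sketch of this idea in plain PyTorch, assuming a torchvision `resnet18` (torchvision ≥ 0.13 for the `weights` argument), whose architecture ends in global average pooling followed by a single `fc` layer:

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

model = resnet18(weights="IMAGENET1K_V1").eval()
image = torch.rand(1, 3, 224, 224)  # placeholder for a preprocessed input image

# 1. Capture the feature maps of the last convolutional stage with a forward hook.
feats = {}
model.layer4.register_forward_hook(lambda module, inp, out: feats.update(last=out))

with torch.no_grad():
    logits = model(image)
class_idx = logits.argmax(dim=1).item()

# 2. CAM = channel-wise weighted sum of the feature maps, using the fc weights
#    of the predicted class, then normalized and upsampled to the input size.
fc_weights = model.fc.weight[class_idx]                      # shape (C,)
cam = torch.einsum("c,bchw->bhw", fc_weights, feats["last"]) # shape (1, H', W')
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)     # normalize to [0, 1]
cam = F.interpolate(cam.unsqueeze(1), size=image.shape[-2:],
                    mode="bilinear", align_corners=False)
```

Note that this weighted-sum view only applies directly to architectures ending in global average pooling plus a linear classifier; the gradient-based variants listed below (Grad-CAM and its successors) remove that restriction.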

📚 Paper

2016

  • [CAM] Learning Deep Features for Discriminative Localization | CVPR 2016, Arxiv 2015
  • [Grad-CAM] Grad-CAM: Visual Explanations from Deep Networks via Gradient-based Localization | Arxiv 2016, ICCV 2017, IJCV 2019

2018

  • [Grad-CAM++] Grad-CAM++: Generalized Gradient-Based Visual Explanations for Deep Convolutional Networks | WACV 2018

2019

  • [Smooth Grad-CAM++] Smooth Grad-CAM++: An Enhanced Inference Level Visualization Technique for Deep Convolutional Neural Network Models | Arxiv 2019, Intelligent Systems Conference 2019

2020

  • [Score-CAM] Score-CAM: Score-Weighted Visual Explanations for Convolutional Neural Networks | CVPRW 2020, Arxiv 2019
  • [Ablation-CAM] Ablation-CAM: Visual Explanations for Deep Convolutional Network via Gradient-free Localization | WACV 2020
  • [SS-CAM] SS-CAM: Smoothed Score-CAM for Sharper Visual Feature Localization | Arxiv 2020
  • [IS-CAM] IS-CAM: Integrated Score-CAM for axiomatic-based explanations | Arxiv 2020
  • [XGrad-CAM] Axiom-based Grad-CAM: Towards Accurate Visualization and Explanation of CNNs | Arxiv 2020, BMVC 2020 (Oral) | Code
  • [Eigen-CAM] Eigen-CAM: Class Activation Map using Principal Components | Arxiv 2020, IJCNN 2020

2021

  • [Ablation-CAM++] Ablation-CAM++: Grouped Recursive Visual Explanations for Deep Convolutional Networks | ICIP 2021
  • [Relevance-CAM] Relevance-CAM: Your Model Already Knows Where to Look | CVPR 2021
  • [Group-CAM] Group-CAM: Group Score-Weighted Visual Explanations for Deep Convolutional Networks | Arxiv 2021 | Code
  • [Integrated Grad-CAM] Integrated Grad-CAM: Sensitivity-Aware Visual Explanation of Deep Convolutional Networks via Integrated Gradient-Based Scoring | ICASSP 2021
  • [LFI-CAM] LFI-CAM: Learning Feature Importance for Better Visual Explanation | ICCV 2021
  • [LayerCAM] LayerCAM: Exploring Hierarchical Class Activation Maps for Localization | TIP 2021
  • [CALM] Keep CALM and Improve Visual Feature Attribution | Arxiv 2021, ICCV 2021 | Code

2022

  • [Abs-CAM] Abs-CAM: A Gradient Optimization Interpretable Approach for Explanation of Convolutional Neural Networks | Arxiv 2022, SIViP 2023
  • [Recipro-CAM] Recipro-CAM: Fast gradient-free visual explanations for convolutional neural networks | Arxiv 2022

2023

  • [Opti-CAM] Opti-CAM: Optimizing saliency maps for interpretability | Arxiv 2023

🧰 Tool

  • TorchCAM (class activation explorer): class activation maps for your PyTorch models (CAM, Grad-CAM, Grad-CAM++, Smooth Grad-CAM++, Score-CAM, SS-CAM, IS-CAM, XGrad-CAM, Layer-CAM); a usage sketch follows below.
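
A minimal usage sketch, assuming a recent TorchCAM release where the extractors live in `torchcam.methods` (older versions used `torchcam.cams`); the model and input tensor here are placeholders:

```python
import torch
from torchvision.models import resnet18
from torchcam.methods import SmoothGradCAMpp

model = resnet18(weights="IMAGENET1K_V1").eval()
# The extractor registers hooks on the model, so create it before the forward pass.
cam_extractor = SmoothGradCAMpp(model)

input_tensor = torch.rand(1, 3, 224, 224)  # placeholder for a preprocessed image
out = model(input_tensor)

# Retrieve the CAM for the top-predicted class; the result is a list of maps,
# one per hooked layer.
activation_map = cam_extractor(out.squeeze(0).argmax().item(), out)
```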