Score-CAM: Score-Weighted Visual Explanations for Convolutional Neural Networks

We develop a novel post-hoc visual explanation method called Score-CAM, the first gradient-free CAM-based visualization method, which achieves state-of-the-art visual performance.
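The core idea can be sketched in a few lines of NumPy: upsample each activation map to the input size, normalize it into a soft mask, score the masked input with the model, and combine the maps using softmax-weighted scores. This is a minimal illustration under simplifying assumptions (nearest-neighbor upsampling, a generic `score_fn` standing in for a forward pass on the target class), not the implementation in this repository:

```python
import numpy as np

def score_cam(input_img, activations, score_fn):
    """Simplified, gradient-free Score-CAM sketch.

    input_img   : (H, W) array
    activations : (K, h, w) feature maps from the target conv layer
    score_fn    : callable mapping a masked (H, W) image to a class score
    """
    H, W = input_img.shape
    upsampled, scores = [], []
    for act in activations:
        # Nearest-neighbor upsample the activation map to the input size.
        up = np.kron(act, np.ones((H // act.shape[0], W // act.shape[1])))
        lo, hi = up.min(), up.max()
        # Normalize to [0, 1] so it can act as a soft mask on the input.
        mask = (up - lo) / (hi - lo) if hi > lo else np.zeros_like(up)
        upsampled.append(up)
        # Channel weight: the model's score on the masked input.
        scores.append(score_fn(input_img * mask))
    w = np.array(scores)
    w = np.exp(w - w.max())
    w /= w.sum()  # softmax over channel scores
    # ReLU on the score-weighted combination of activation maps.
    return np.maximum(np.tensordot(w, np.stack(upsampled), axes=1), 0)
```

Because the weights come from forward passes rather than gradients, the method avoids the gradient-saturation and noise issues of Grad-CAM-style approaches.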

Paper: Score-CAM: Score-Weighted Visual Explanations for Convolutional Neural Networks, which appeared at the IEEE CVPR 2020 Workshop on Fair, Data Efficient and Trusted Computer Vision. Our paper has been cited over 400 times!

Demo: You can run an example via Colab


2021.12.16: A great MATLAB implementation from Kenta Itakura.

2021.04.03: A PyTorch implementation jacobgil/pytorch-grad-cam (3.8K Stars).

2020.08.18: A PaddlePaddle implementation from PaddlePaddle/InterpretDL.

2020.07.11: A TensorFlow implementation from keisen/tf-keras-vis.

2020.05.11: A PyTorch implementation from utkuozbulak/pytorch-cnn-visualizations (6.2K Stars).

2020.03.24: Merged into frgfm/torch-cam, a wonderful library that supports multiple CAM-based methods.


If you find this work helpful in your research, please cite it:

@inproceedings{wang2020score,
  title={Score-CAM: Score-weighted visual explanations for convolutional neural networks},
  author={Wang, Haofan and Wang, Zifan and Du, Mengnan and Yang, Fan and Zhang, Zijian and Ding, Sirui and Mardziel, Piotr and Hu, Xia},
  booktitle={Proceedings of the IEEE/CVF conference on computer vision and pattern recognition workshops},
  year={2020}
}


Utils are built on flashtorch; thanks for releasing this great work!


If you have any questions, feel free to open an issue or directly contact me via: