SmoothDiff

Code for the NeurIPS 2025 paper "Smoothed Differentiation Efficiently Mitigates Shattered Gradients in Explanations".

Code

  • The Julia reference implementation of SmoothDiff can be found in the /julia folder, which contains a Julia package called SmoothedDifferentiation.jl.
  • The PyTorch reference implementation of SmoothDiff can be found in the /python folder.

Installation

  1. Install Julia v1.11. On Unix systems, this requires running
    curl -fsSL https://install.julialang.org | sh
  2. Start a Julia REPL session by typing julia in your terminal.
  3. Install DrWatson.jl by typing the following in your Julia REPL:
    ]add DrWatson
  4. Run the experiments and plotting scripts listed below by typing include("path/to/file.jl") in your REPL, replacing the string with the correct path.
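The steps above can be sketched as a single REPL session. This is a minimal sketch only; "path/to/file.jl" is the same placeholder used in step 4, to be replaced with an actual script path.

```julia
# Sketch of the install-and-run workflow described above.
using Pkg
Pkg.add("DrWatson")         # equivalent to `]add DrWatson` in the Pkg REPL mode

# Run an experiment or plotting script (placeholder path, see step 4):
include("path/to/file.jl")
```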

Experiments

We provide all code and virtual environments required to reproduce our experiments and figures.

Running experiments

The following steps need to be run in sequence:

  1. Download the ImageNet dataset by agreeing to its terms of access, then run the script /experiments/heatmaps/save_input.jl to save preprocessed input tensors.
  2. Compute explanations by running /experiments/cluster/run_analyzers/run.jl as well as the $n=10^6$ sample SmoothGrad run in /experiments/cluster/run_analyzers/run_100k.jl.
  3. Compute benchmarks by running /experiments/cluster/run_analyzers/run.jl.
  4. After having computed explanations in step 2, evaluate them using pixel-flipping by running /experiments/cluster/pixelflipping/run.jl.
  5. The raw quantitative results from Appendix F can be found in /experiments/cluster/quantus-eval/results and are computed by running /experiments/cluster/quantus-eval/run.sh.
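The sequence above can be sketched as a Julia REPL session using the script paths listed. This is a sketch only: it assumes the REPL is started from the repository root and that the ImageNet inputs and DrWatson environment have already been set up as described under Installation.

```julia
# Run the experiment pipeline in sequence (paths as listed above):
include("experiments/heatmaps/save_input.jl")             # 1. save preprocessed input tensors
include("experiments/cluster/run_analyzers/run.jl")       # 2./3. explanations and benchmarks
include("experiments/cluster/run_analyzers/run_100k.jl")  # 2. large-sample SmoothGrad run
include("experiments/cluster/pixelflipping/run.jl")       # 4. pixel-flipping evaluation

# 5. The Quantus evaluation runs as a shell script, outside the Julia REPL:
#    $ bash experiments/cluster/quantus-eval/run.sh
```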

Reproducing figures

After computing explanations and pixel-flipping results, the plots and tables can be reproduced by running the respective file:
