👋 Code for the paper: "Look at the Variance! Efficient Black-box Explanations with Sobol-based Sensitivity Analysis" (NeurIPS 2021)

fel-thomas/Sobol-Attribution-Method

👋 Sobol Attribution Method (NeurIPS 2021)

This repository contains code for the paper:

Look at the Variance! Efficient Black-box Explanations with Sobol-based Sensitivity Analysis, Thomas Fel*, Rémi Cadène*, Mathieu Chalvidal, Matthieu Cord, David Vigouroux & Thomas Serre. NeurIPS 2021, [arXiv].

The code is implemented for both PyTorch and TensorFlow, with a notebook for each: notebook PyTorch, notebook TensorFlow.
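At the core of the method are total-order Sobol indices, which measure the share of a black-box output's variance attributable to each input factor. As a minimal, hedged sketch (using NumPy with the standard Saltelli sampling scheme and Jansen's estimator, not the repository's actual API):

```python
import numpy as np

def sobol_total_indices(f, d, n=10_000, seed=0):
    """Estimate total-order Sobol indices of a black-box f: (n, d) -> (n,)
    via the Saltelli sampling scheme and Jansen's estimator."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))      # first independent sample matrix
    B = rng.random((n, d))      # second independent sample matrix
    fA = f(A)
    var = fA.var()
    T = np.empty(d)
    for i in range(d):
        AB_i = A.copy()
        AB_i[:, i] = B[:, i]    # A with column i swapped in from B
        # Jansen estimator: variance share attributable to input i
        T[i] = 0.5 * np.mean((fA - f(AB_i)) ** 2) / var
    return T

# Toy check: the output depends only on the first input, so its total
# index should be close to 1 and the other indices close to 0.
T = sobol_total_indices(lambda x: x[:, 0], d=3)
```

In the paper, the inputs are perturbation masks over image regions and `f` is the model's class score on the masked image, so each index quantifies how much a region (alone and through interactions) drives the prediction.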

@inproceedings{fel2021sobol,
      title={Look at the Variance! Efficient Black-box Explanations with Sobol-based Sensitivity Analysis}, 
      author={Thomas Fel and Remi Cadene and Mathieu Chalvidal and Matthieu Cord and David Vigouroux and Thomas Serre},
      year={2021},
      booktitle={Advances in Neural Information Processing Systems (NeurIPS)}
}

Other attribution methods

The code for the metrics and the other attribution methods used in the paper comes from the Xplique toolbox.

Authors
