ICML 2020: the paper, supplementary material, and BibTeX are available at http://proceedings.mlr.press/v119/sastry20a.html
The code is written in Python 3 with PyTorch 1.1.
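A quick sanity check of the environment; the expected version numbers below are simply the ones stated above:

```python
import sys
import torch

# Expect Python 3.x and PyTorch 1.1.x, as noted above.
print("Python:", sys.version_info.major, sys.version_info.minor)
print("PyTorch:", torch.__version__)
```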
(Please refer to this repository for results of the Baseline/ODIN/Mahalanobis methods on dataset pairs not presented in the Mahalanobis paper.)
We used the out-of-distribution datasets presented in odin-pytorch.
We used the pre-trained neural networks open-sourced by Mahalanobis and odin-pytorch. The DenseNets trained on CIFAR-10 and CIFAR-100 are from ODIN; the remaining networks are from Mahalanobis.
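A minimal sketch of loading one of these checkpoints. The file name and directory below are assumptions (the actual paths are created by setup.sh, described further down), and we assume the checkpoints were pickled as full models so that `torch.load` returns the network directly:

```python
import torch

# Hypothetical path; actual filenames depend on what setup.sh downloads
# (DenseNets for CIFAR-10/CIFAR-100 from ODIN, the remaining networks
# from the Mahalanobis release).
CHECKPOINT = "pre_trained/densenet_cifar10.pth"

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = torch.load(CHECKPOINT, map_location=device)  # assumes a pickled full model
model.eval()
```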
For experiments on OE-trained networks, we used the pre-trained networks open-sourced by OE.
Running setup.sh downloads the out-of-distribution datasets and the pre-trained models.
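Once setup.sh has run, an out-of-distribution set can be wrapped in a standard DataLoader. This is only a sketch: the directory name and the normalization constants are assumptions and should be matched to the actual download layout and to the statistics used to train the chosen network:

```python
import torch
from torchvision import datasets, transforms

# Hypothetical folder produced by setup.sh; ImageFolder expects at least one
# class subdirectory inside this root.
OOD_ROOT = "data/iSUN"

transform = transforms.Compose([
    transforms.ToTensor(),
    # CIFAR-10 statistics, used here only as an example.
    transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2470, 0.2435, 0.2616)),
])

ood_loader = torch.utils.data.DataLoader(
    datasets.ImageFolder(OOD_ROOT, transform=transform),
    batch_size=128,
    shuffle=False,
)
```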