An implementation of the Residual Flow algorithm for out-of-distribution (OOD) detection [arXiv]. Some code was adapted from deep_Mahalanobis_detector and RealNVP.
E. Zisselman, A. Tamar. "Deep Residual Flow for Out of Distribution Detection". CVPR 2020.
@InProceedings{Zisselman_2020_CVPR,
author = {Zisselman, Ev and Tamar, Aviv},
title = {Deep Residual Flow for Out of Distribution Detection},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2020}
}
Tested on Ubuntu Linux 18.04.4 with Python 3.7; requires the following dependencies:
- PyTorch: Requires 1 GPU with CUDA 10.2 support.
- scipy
- scikit-learn
The datasets from odin-pytorch are to be placed in ./data/.
The pre-trained neural networks are from deep_Mahalanobis_detector:
- DenseNet trained on CIFAR-10, CIFAR-100 and SVHN.
- ResNet trained on CIFAR-10, CIFAR-100 and SVHN.
To be placed in ./pre_trained/.
We provide six pre-trained residual flow networks for OOD detection, for ResNet and DenseNet:
- DenseNet trained on CIFAR-10, CIFAR-100 and SVHN:
  Residual Flow for DenseNet-CIFAR-10 / Residual Flow for DenseNet-CIFAR-100 / Residual Flow for DenseNet-SVHN
- ResNet trained on CIFAR-10, CIFAR-100 and SVHN:
  Residual Flow for ResNet-CIFAR-10 / Residual Flow for ResNet-CIFAR-100 / Residual Flow for ResNet-SVHN
To be placed in ./output/.
Example usage of residual flow targeting ResNet trained on CIFAR-10.
Settings: 1x GPU (index 0)
# extract feature activations from classification network
python Residual_flow_prepare.py --dataset cifar10 --net_type resnet --gpu 0
Place the pre-trained residual flow networks (ResNet, CIFAR-10) in ./output/
or train the networks using the following:
Note: each layer is trained individually using the flag --layer n, where n is the layer index in [0..N].
# (optional - you may use the pre-trained networks above)
# Residual Flow training - trained per target network layer [0..N]
# where N = 3 for DenseNet and N = 4 for ResNet
python Residual_flow_train.py --num_iter 2000 --net_type resnet --dataset cifar10 --layer 0 --gpu 0
python Residual_flow_test_processing.py --net_type resnet --dataset cifar10
# (optional) comparison with Mahalanobis detector
python OOD_Generate_Mahalanobis.py --dataset cifar10 --net_type resnet --gpu 0
python OOD_Regression_Residual_flow.py --net_type resnet
# (optional) comparison with Mahalanobis detector
python OOD_Regression_Mahalanobis.py --net_type resnet
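The steps above can be chained into a single script. The sketch below combines the commands from this README into one end-to-end run for ResNet on CIFAR-10; the RUN=echo dry-run wrapper and the pipeline function name are additions of this sketch, not part of the repo. By default it only prints each command; set RUN="" to execute for real (requires the repo, the datasets in ./data/ and a GPU).

```shell
# Sketch: end-to-end OOD pipeline for ResNet / CIFAR-10.
# RUN=echo (the default here) prints each step; RUN="" executes it.
RUN="${RUN:-echo}"

pipeline() {
    # extract feature activations from the classification network
    $RUN python Residual_flow_prepare.py --dataset cifar10 --net_type resnet --gpu 0
    # each residual flow layer is trained separately; N = 4 for ResNet
    for layer in 0 1 2 3 4; do
        $RUN python Residual_flow_train.py --num_iter 2000 --net_type resnet --dataset cifar10 --layer "$layer" --gpu 0
    done
    $RUN python Residual_flow_test_processing.py --net_type resnet --dataset cifar10
    $RUN python OOD_Regression_Residual_flow.py --net_type resnet
}

pipeline
```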
# generate the adversarial samples
python ADV_Samples_FGSM.py
# extract feature activations from classification network
python Residual_flow_prepare.py --dataset cifar10 --net_type resnet --gpu 0 --validation_src FGSM
Place the pre-trained residual flow networks (ResNet, CIFAR-10) in ./output/
or train the networks using the following:
Note: each layer is trained individually using the flag --layer n, where n is the layer index in [0..N].
# (optional - you may use the pre-trained networks above)
# Residual Flow training - trained per target network layer [0..N]
# where N = 3 for DenseNet and N = 4 for ResNet
python Residual_flow_train.py --num_iter 2000 --net_type resnet --dataset cifar10 --layer 0 --gpu 0
python Residual_flow_test_processing.py --net_type resnet --dataset cifar10 --validation_src FGSM
# (optional) comparison with Mahalanobis detector
python ADV_Generate_Mahalanobis.py --dataset cifar10 --net_type resnet --gpu 0
python OOD_Regression_Residual_flow_FGSM_validation.py --net_type resnet
# (optional) comparison with Mahalanobis detector
python OOD_Regression_Mahalanobis_FGSM_validation.py --net_type resnet
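As with the standard pipeline, the FGSM-validated run can be scripted end to end. This sketch strings together the commands above; the RUN=echo dry-run wrapper and the function name are additions of this sketch. By default it only prints each command; set RUN="" to execute for real.

```shell
# Sketch: FGSM-validated OOD pipeline for ResNet / CIFAR-10.
# RUN=echo (the default here) prints each step; RUN="" executes it.
RUN="${RUN:-echo}"

fgsm_pipeline() {
    # generate the adversarial samples
    $RUN python ADV_Samples_FGSM.py
    # extract feature activations, using FGSM samples as the validation source
    $RUN python Residual_flow_prepare.py --dataset cifar10 --net_type resnet --gpu 0 --validation_src FGSM
    # each residual flow layer is trained separately; N = 4 for ResNet
    for layer in 0 1 2 3 4; do
        $RUN python Residual_flow_train.py --num_iter 2000 --net_type resnet --dataset cifar10 --layer "$layer" --gpu 0
    done
    $RUN python Residual_flow_test_processing.py --net_type resnet --dataset cifar10 --validation_src FGSM
    $RUN python OOD_Regression_Residual_flow_FGSM_validation.py --net_type resnet
}

fgsm_pipeline
```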