GS-WGAN

This repository contains the implementation for GS-WGAN: A Gradient-Sanitized Approach for Learning Differentially Private Generators (NeurIPS 2020).

Contact: Dingfan Chen (dingfan.chen@cispa.de)

Requirements

The environment can be set up using Anaconda with the following commands:

conda create --name gswgan-pytorch python=3.6
conda activate gswgan-pytorch
conda install pytorch=1.2.0 
conda install torchvision -c pytorch
pip install -r requirements.txt

Training

Step 1. To warm-start the discriminators:

cd source
sh pretrain.sh
  • To parallelize the warm-starting, adjust the 'meta_start' argument and run the script in multiple processes simultaneously.
  • Alternatively, you can download the pre-trained models using the links below.

Step 2. To train the differentially private generator:

cd source
python main.py -data 'mnist' -name 'ResNet_default' -ldir '../results/mnist/pretrain/ResNet_default'
  • Please refer to source/config.py (or execute python main.py -h) for the complete list of arguments.

  • The default setting requires ~22 GB of GPU memory. If it does not fit in the memory of a single GPU, allocate multiple GPUs by specifying the '-ngpus' argument.
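
For reference, the core idea behind this step is that the gradient flowing from the discriminator back into the generator is sanitized (clipped and perturbed with Gaussian noise) before it can influence the generator's parameters. Below is a minimal, illustrative PyTorch sketch of such a sanitization hook; it is not the repository's exact code, and the hook name, clip bound, and noise multiplier are placeholder choices.

    # Illustrative sketch only: clip each sample's gradient to an L2 bound and add
    # Gaussian noise before it propagates into the generator. CLIP_BOUND and
    # NOISE_MULT are placeholder values, not the repository's settings.
    import torch

    CLIP_BOUND = 1.0    # per-sample L2 clipping bound C (placeholder)
    NOISE_MULT = 1.07   # noise multiplier sigma (placeholder)

    def dp_sanitize_hook(grad):
        """Clip each sample's gradient to norm <= C, then add N(0, (sigma*C)^2) noise."""
        batch_size = grad.shape[0]
        flat = grad.reshape(batch_size, -1)
        grad_norm = flat.norm(2, dim=1, keepdim=True)
        clip_coef = (CLIP_BOUND / (grad_norm + 1e-10)).clamp(max=1.0)
        flat = flat * clip_coef
        flat = flat + torch.randn_like(flat) * NOISE_MULT * CLIP_BOUND
        return flat.reshape_as(grad)

    # Usage sketch inside the generator update: register the hook on the fake
    # images so that d(loss)/d(fake) is sanitized during backward().
    #   fake = netG(z)
    #   fake.register_hook(dp_sanitize_hook)
    #   loss_g = -netD(fake).mean()
    #   loss_g.backward()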

Evaluation

Privacy

  • To compute the privacy cost:
    cd evaluation
    python privacy_analysis.py -data 'mnist' -name 'ResNet_default'
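
For intuition, this kind of accounting composes the Rényi differential privacy (RDP) of the Gaussian mechanism over all generator updates and then converts the result to an (ε, δ) guarantee. The snippet below is a simplified sketch of that computation; the repository's privacy_analysis.py additionally accounts for subsampling, and the sigma, step count, and delta values here are placeholders.

    # Simplified accounting sketch (placeholder values): compose Gaussian-mechanism
    # RDP over all iterations, then convert to (epsilon, delta).
    import math

    def gaussian_rdp(alpha, sigma):
        """RDP of the Gaussian mechanism with noise multiplier sigma at order alpha."""
        return alpha / (2.0 * sigma ** 2)

    def rdp_to_dp(n_steps, sigma, delta, alphas):
        """Smallest epsilon over the candidate RDP orders after n_steps of composition."""
        return min(n_steps * gaussian_rdp(a, sigma) + math.log(1.0 / delta) / (a - 1.0)
                   for a in alphas)

    if __name__ == "__main__":
        sigma, n_steps, delta = 1.07, 20000, 1e-5          # placeholder values
        alphas = [1 + x / 10.0 for x in range(1, 100)] + list(range(12, 64))
        print("epsilon =", rdp_to_dp(n_steps, sigma, delta, alphas))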
    

Utility

  • To evaluate downstream sklearn classifiers:
    The evaluation results will be saved to '#dirname(gen_data.npz)#/eval/sklearn' by default.

    cd evaluation
    python eval_sklearn.py --gen_data './../results/mnist/main/ResNet_default/gen_data.npz' -data 'mnist'
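
Conceptually, this step trains classifiers on the generated samples and measures their accuracy on real test data. A minimal sketch of that procedure follows; the .npz key names ('data_x', 'data_y') and the pixel scaling are assumptions, so check eval_sklearn.py for the actual interface.

    # Sketch of train-on-synthetic / test-on-real evaluation. The npz keys
    # 'data_x' and 'data_y' are assumptions; adjust to match the generated file.
    import numpy as np
    from sklearn.datasets import fetch_openml
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    gen = np.load("../results/mnist/main/ResNet_default/gen_data.npz")
    x_syn = gen["data_x"].reshape(len(gen["data_x"]), -1)   # flatten images to vectors
    y_syn = gen["data_y"].ravel().astype(int)

    # Real MNIST test split (last 10k samples of the OpenML dump); the [0, 1]
    # scaling here may need adjusting to match the generated data.
    mnist = fetch_openml("mnist_784", version=1, as_frame=False)
    x_real = mnist.data[60000:] / 255.0
    y_real = mnist.target[60000:].astype(int)

    clf = LogisticRegression(max_iter=1000)
    clf.fit(x_syn, y_syn)
    print("accuracy on real test data:", accuracy_score(y_real, clf.predict(x_real)))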
    
  • To evaluate downstream CNN classifiers:
    The evaluation results will be saved to '#dirname(gen_data.npz)#/eval/cnn' by default:

    cd evaluation
    python eval_cnn.py --gen_data './../results/mnist/main/ResNet_default/gen_data.npz' -data 'mnist'
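
The CNN evaluation follows the same train-on-synthetic, test-on-real protocol. The sketch below uses a small illustrative network rather than the repository's architecture, and again assumes the .npz key names.

    # Compact sketch with a small illustrative CNN (not the repository's model);
    # the npz keys 'data_x' and 'data_y' are assumptions.
    import numpy as np
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader, TensorDataset
    from torchvision import datasets, transforms

    gen = np.load("../results/mnist/main/ResNet_default/gen_data.npz")
    x = torch.tensor(gen["data_x"], dtype=torch.float32).reshape(-1, 1, 28, 28)
    y = torch.tensor(gen["data_y"], dtype=torch.long).reshape(-1)
    train_loader = DataLoader(TensorDataset(x, y), batch_size=128, shuffle=True)

    test_set = datasets.MNIST("../data", train=False, download=True,
                              transform=transforms.ToTensor())
    test_loader = DataLoader(test_set, batch_size=256)

    model = nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(), nn.Linear(32 * 7 * 7, 10))
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(5):                     # train on the synthetic samples
        for xb, yb in train_loader:
            opt.zero_grad()
            loss_fn(model(xb), yb).backward()
            opt.step()

    model.eval()                               # accuracy on the real test set
    with torch.no_grad():
        correct = sum((model(xb).argmax(1) == yb).sum().item() for xb, yb in test_loader)
    print("accuracy on real test data:", correct / len(test_set))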
    
  • To evaluate the MNIST Inception Score:

    cd evaluation
    
    1. Train a classifier on real data. The model will be saved to 'evaluation/models/' by default:
      python train_mnist_inception_score.py -data 'mnist'
      
    2. Load the pre-trained classifier and evaluate IS. The evaluation result will be saved to '#dirname(gen_data.npz)#/eval/IS/' by default:
      python eval_mnist_inception_score.py -data 'mnist' --gen_data './../results/mnist/main/ResNet_default/gen_data.npz'
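
The metric itself is the standard Inception-Score formula, IS = exp(E_x[KL(p(y|x) || p(y))]), computed from the classifier's softmax outputs on the generated samples. A sketch of that computation (with random placeholder probabilities) is given below.

    # IS = exp( E_x [ KL( p(y|x) || p(y) ) ] ), averaged over splits as usual.
    # `probs` stands in for the (N, 10) softmax outputs of the MNIST classifier
    # evaluated on the generated samples.
    import numpy as np

    def inception_score(probs, n_splits=10, eps=1e-12):
        scores = []
        for chunk in np.array_split(probs, n_splits):
            p_y = chunk.mean(axis=0, keepdims=True)               # marginal p(y)
            kl = (chunk * (np.log(chunk + eps) - np.log(p_y + eps))).sum(axis=1)
            scores.append(np.exp(kl.mean()))
        return float(np.mean(scores)), float(np.std(scores))

    # Example with random placeholder probabilities:
    print(inception_score(np.random.dirichlet(np.ones(10), size=5000)))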
      
  • To evaluate the FID (requires installing TensorFlow):
    The evaluation results will be saved to '#dirname(gen_data.npz)#/eval/FID/' by default.

    cd evaluation
    python eval_fid.py -data 'mnist' --gen_data './../results/mnist/main/ResNet_default/gen_data.npz' 
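
The FID compares Gaussian fits of the real and generated feature distributions: FID = ||mu_r - mu_g||^2 + Tr(C_r + C_g - 2 (C_r C_g)^{1/2}). The sketch below computes this distance given two feature arrays; extracting the features with an Inception-style network (which the repository does via TensorFlow) is omitted here.

    # FID between two feature sets, from their empirical means and covariances.
    import numpy as np
    from scipy import linalg

    def frechet_distance(feat_real, feat_gen):
        mu_r, mu_g = feat_real.mean(axis=0), feat_gen.mean(axis=0)
        cov_r = np.cov(feat_real, rowvar=False)
        cov_g = np.cov(feat_gen, rowvar=False)
        covmean, _ = linalg.sqrtm(cov_r @ cov_g, disp=False)   # matrix square root
        covmean = covmean.real                                 # drop numerical imaginary parts
        return float(np.sum((mu_r - mu_g) ** 2)
                     + np.trace(cov_r + cov_g - 2.0 * covmean))

    # Example with random placeholder features:
    print(frechet_distance(np.random.randn(1000, 64), np.random.randn(1000, 64)))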
    

Pre-trained Models

Pre-trained model checkpoints can be downloaded using the links below. The discriminators are obtained after the warm-starting step (step 1), while the generators are obtained after the DP training step (step 2). The pre-trained models are stored as .pth files and the corresponding training configurations are stored in params.pkl and params.txt.

Dataset          Generator   Discriminators
MNIST            link        link
Fashion-MNIST    link        link

Citation

@inproceedings{chen20gswgan,
title = {GS-WGAN: A Gradient-Sanitized Approach for Learning Differentially Private Generators},
author = {Dingfan Chen and Tribhuvanesh Orekondy and Mario Fritz},
year = {2020},
date = {2020-12-06},
booktitle = {Neural Information Processing Systems (NeurIPS)},
pubstate = {published},
tppubtype = {inproceedings}
}

Acknowledgements

Our implementation uses the source code from the following repositories:
