Residual Flows for Invertible Generative Modeling [arxiv]

Building on the use of Invertible Residual Networks in generative modeling, we propose:

  • Unbiased estimation of the log-density of samples.
  • Memory-efficient reformulation of the gradients.
  • LipSwish activation function.

As a result, Residual Flows scale to much larger networks and datasets.
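The LipSwish activation mentioned above is described in the paper as Swish rescaled so that its Lipschitz constant stays at most 1, which invertible residual blocks require. A minimal PyTorch sketch, assuming a learnable β kept positive via softplus (the repo's exact implementation may differ):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LipSwish(nn.Module):
    """Swish divided by 1.1 so the activation is at most 1-Lipschitz.

    Sketch based on the paper's description, not the repo's exact code.
    beta is learnable; softplus keeps the effective beta positive.
    """
    def __init__(self):
        super().__init__()
        # softplus(0.5414) ~= 1.0, so training starts near plain Swish
        self.beta = nn.Parameter(torch.tensor(0.5414))

    def forward(self, x):
        return x * torch.sigmoid(F.softplus(self.beta) * x) / 1.1
```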


Requirements

  • PyTorch 1.0+
  • Python 3.6+



Preprocessing

ImageNet:

  1. Follow instructions in preprocessing/create_imagenet_benchmark_datasets.
  2. Convert .npy files to .pth using preprocessing/convert_to_pth.
  3. Place in data/imagenet32 and data/imagenet64.

CelebAHQ 64x64 5bit:

  1. Download from
  2. Convert .npy files to .pth using preprocessing/convert_to_pth.
  3. Place in data/celebahq64_5bit.

CelebAHQ 256x256:

# Download Glow's preprocessed dataset.
tar -C data/celebahq -xvf celeb-tfr.tar
python extract_celeba_from_tfrecords.py

Density Estimation Experiments


MNIST:

python train_img.py --data mnist --imagesize 28 --actnorm True --wd 0 --save experiments/mnist


CIFAR10:

python train_img.py --data cifar10 --actnorm True --save experiments/cifar10

ImageNet 32x32:

python train_img.py --data imagenet32 --actnorm True --nblocks 32-32-32 --save experiments/imagenet32

ImageNet 64x64:

python train_img.py --data imagenet64 --imagesize 64 --actnorm True --nblocks 32-32-32 --factor-out True --squeeze-first True --save experiments/imagenet64

CelebAHQ 256x256:

python train_img.py --data celebahq --imagesize 256 --nbits 5 --actnorm True --act elu --batchsize 8 --update-freq 5 --n-exact-terms 8 --fc-end False --factor-out True --squeeze-first True --nblocks 16-16-16-16-16-16 --save experiments/celebahq256

Pretrained Models

Model checkpoints can be downloaded from releases.

Use the argument --resume [checkpt.pth] to evaluate or sample from the model.

Each checkpoint contains two sets of parameters, one from training and one containing the exponential moving average (EMA) accumulated over the course of training. Scripts will automatically use the EMA parameters for evaluation and sampling.
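A minimal sketch of how an EMA copy of the parameters is typically accumulated during training; the class, decay value, and `shadow` naming are illustrative, not the repo's exact code:

```python
import torch

class EMA:
    """Keeps an exponential moving average of a model's parameters."""

    def __init__(self, model, decay=0.999):
        self.decay = decay
        # Shadow copy that the EMA is accumulated into.
        self.shadow = {k: v.detach().clone() for k, v in model.state_dict().items()}

    @torch.no_grad()
    def update(self, model):
        # shadow <- decay * shadow + (1 - decay) * current parameters
        for k, v in model.state_dict().items():
            self.shadow[k].mul_(self.decay).add_(v, alpha=1 - self.decay)
```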


BibTeX:

@inproceedings{chen2019residualflows,
  title={Residual Flows for Invertible Generative Modeling},
  author={Chen, Ricky T. Q. and Behrmann, Jens and Duvenaud, David and Jacobsen, J{\"{o}}rn{-}Henrik},
  booktitle={Advances in Neural Information Processing Systems},
  year={2019}
}