

This repository is the PyTorch implementation of the paper:

Normalizing Flows with Multi-Scale Autoregressive Priors (CVPR 2020)

Apratim Bhattacharyya*, Shweta Mahajan*, Mario Fritz, Bernt Schiele, Stefan Roth

* Authors contributed equally.

Getting started

This code has been developed with Python 3.5, PyTorch 1.0.0, and CUDA 9.0.

  1. Please run the included script to check that all required packages are installed.
  2. The datasets used in this project are the following:
    • MNIST (included in torchvision.datasets)
    • CIFAR10 (included in torchvision.datasets)
    • ImageNet (available here)


The important keyword arguments for the training script are:

  • dataset_name : Name of the dataset in {mnist,cifar10,imagenet_32,imagenet_64} (lowercase string).
  • data_root : Path to the location of the dataset; see the code for default values.
  • coupling : Type of split coupling to use in the mAR-SCF model. Possible values are {affine,mixlogcdf} (lowercase string).
  • batch_size : Recommended values are 128 for affine couplings and 64 for mixlogcdf couplings.
  • L : Number of levels in the mAR-SCF model (3 for MNIST, CIFAR10 and ImageNet 32x32; 4 for ImageNet 64x64).
  • K : Number of couplings per level.
  • C : Number of channels per coupling.
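
A minimal argument parser mirroring the flags above could look like the following sketch; the flag names and choices come from the list above, but the default values here are assumptions, not the repo's actual defaults:

```python
import argparse

# Sketch of a parser for the documented training flags; defaults are
# assumptions for illustration, not the repo's actual defaults.
parser = argparse.ArgumentParser(description="mAR-SCF training (sketch)")
parser.add_argument("--dataset_name",
                    choices=["mnist", "cifar10", "imagenet_32", "imagenet_64"],
                    default="cifar10")
parser.add_argument("--data_root", type=str, default="./data")
parser.add_argument("--coupling", choices=["affine", "mixlogcdf"], default="affine")
parser.add_argument("--batch_size", type=int, default=128)  # 64 recommended for mixlogcdf
parser.add_argument("--L", type=int, default=3)   # levels: 3 for 32x32 data, 4 for ImageNet 64x64
parser.add_argument("--K", type=int, default=4)   # couplings per level
parser.add_argument("--C", type=int, default=96)  # channels per coupling

args = parser.parse_args(["--dataset_name", "cifar10",
                          "--coupling", "mixlogcdf", "--batch_size", "64"])
print(args.coupling, args.batch_size)  # mixlogcdf 64
```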

Example usage to train a model on CIFAR10 with MixLogCDF couplings:

python --dataset_name cifar10 --coupling mixlogcdf --batch_size 64 --K 4 --C 96

Note: the number of GPUs used can be controlled with the environment variable CUDA_VISIBLE_DEVICES; training defaults to the CPU if no CUDA devices are available.
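
CUDA_VISIBLE_DEVICES restricts which GPUs PyTorch can see. A small, hypothetical helper (not part of this repo) illustrates how the variable is interpreted:

```python
import os

def visible_gpu_count(env=os.environ):
    """Hypothetical helper: number of GPUs exposed via CUDA_VISIBLE_DEVICES.

    Returns None when the variable is unset (all installed GPUs remain
    visible); an empty value hides every GPU, so training falls back to CPU.
    """
    value = env.get("CUDA_VISIBLE_DEVICES")
    if value is None:
        return None
    return len([d for d in value.split(",") if d.strip()])

print(visible_gpu_count({"CUDA_VISIBLE_DEVICES": "0,1"}))  # 2 GPUs visible
print(visible_gpu_count({"CUDA_VISIBLE_DEVICES": ""}))     # 0 -> CPU fallback
```

For example, prefixing the training command with CUDA_VISIBLE_DEVICES=0 restricts it to the first GPU.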

Generation and Validation

Samples and test results in bits/dim can be obtained with the flag --from_checkpoint. Checkpoints should be stored in the ./checkpoints folder; generated samples are written to the ./samples folder.
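
Bits/dim normalizes the negative log-likelihood by the number of pixel dimensions; a quick sketch of the standard conversion (not code from this repo):

```python
import math

def bits_per_dim(nll_nats, channels, height, width):
    """Convert a per-image negative log-likelihood in nats to bits/dim."""
    num_dims = channels * height * width
    return nll_nats / (math.log(2) * num_dims)

# A CIFAR10 image has 3 * 32 * 32 = 3072 dimensions.
nll = 3.24 * 3072 * math.log(2)  # NLL corresponding to 3.24 bits/dim
print(round(bits_per_dim(nll, 3, 32, 32), 2))  # 3.24
```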

Note that checkpoint (and sample) files follow a common naming format.


Checkpoints can be obtained here. Please note that the CIFAR10 checkpoints with MixLogCDF couplings available here have been trained for longer than reported in the paper, which improves the results. For example, the checkpoint with 256 channels and MixLogCDF couplings has been trained for ~1000 epochs, leading to a test NLL of 3.22 bits/dim and an FID of 33.6 (vs. 3.24 bits/dim and an FID of 41.9 at ~400 epochs, as reported in the paper for a fair comparison to Residual Flows).


The code for the affine couplings is based on the glow-pytorch repo, the MixLogCDF couplings on the flowplusplus repo, and the convolutional LSTM on the pytorch_convolutional_rnn repo.
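
For context, an affine (split) coupling of the kind glow-pytorch implements splits the input into two halves, passes one half through unchanged, and scales and shifts the other half conditioned on it. A scalar sketch with toy stand-ins for the model's scale/shift networks:

```python
import math

# Toy stand-ins for the learned log-scale and shift networks; in the real
# model these are neural networks conditioned on x1.
def s(x1): return 0.5 * x1
def t(x1): return x1 - 1.0

def couple(x1, x2):
    """Forward affine coupling: x1 unchanged, x2 scaled and shifted."""
    return x1, x2 * math.exp(s(x1)) + t(x1)

def uncouple(y1, y2):
    """Exact inverse: recovered because y1 == x1 passes through untouched."""
    return y1, (y2 - t(y1)) * math.exp(-s(y1))

x1, x2 = 0.3, -1.2
y1, y2 = couple(x1, x2)
r1, r2 = uncouple(y1, y2)
print(abs(r1 - x1) < 1e-12 and abs(r2 - x2) < 1e-12)  # True: exactly invertible
```

The invertibility shown here is what makes the exact likelihood (and hence bits/dim) of flow models tractable.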


@inproceedings{bhattacharyya2020normalizing,
    title = {Normalizing Flows with Multi-scale Autoregressive Priors},
    author = {Apratim Bhattacharyya and Shweta Mahajan and Mario Fritz and Bernt Schiele and Stefan Roth},
    booktitle = {IEEE Conference on Computer Vision and Pattern Recognition},
    year = {2020},
}







