
flow-VAE

A PyTorch implementation of the training procedure of [1], augmented with normalizing flows to enrich the family of approximate posteriors. The use of normalizing flows in a variational inference framework was first proposed in [2].
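
Concretely, following [2], an initial sample z_0 drawn from the encoder's Gaussian posterior q_0(z|x) is pushed through a chain of K invertible maps f_1, ..., f_K, and the log-density of the resulting code z_K follows from the change-of-variables formula

    ln q_K(z_K) = ln q_0(z_0) - sum_{k=1..K} ln |det ∂f_k/∂z_{k-1}|

The log-det-Jacobian terms enter the variational objective directly, which is why flows with cheap (or unit) Jacobian determinants are the ones used in practice.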

Implementation Details

This implementation supports training on four datasets: MNIST, Fashion-MNIST, SVHN and CIFAR-10. For each dataset, only the training split is used for learning the distribution; labels are ignored. Raw data (except for MNIST) undergoes dequantization, random horizontal flipping and a logit transformation; a sketch of this preprocessing is given below. Adam with default parameters is used for optimization.
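
For concreteness, here is a minimal sketch of that preprocessing (dequantization followed by a logit transform). The function name, the `alpha` constant and the exact squeezing formula are illustrative assumptions, not necessarily what the scripts in this repository do:

```python
import torch

def dequantize_and_logit(x, alpha=0.05):
    """Map pixel values in [0, 1] to logit space (illustrative sketch).

    Dequantization adds uniform noise so that discrete pixel intensities
    become continuous; the logit transform then maps the bounded values
    onto the whole real line.
    """
    # dequantize: recover integer levels, add U(0, 1) noise, rescale to [0, 1)
    x = (x * 255.0 + torch.rand_like(x)) / 256.0
    # squeeze into (alpha, 1 - alpha) so the logit never hits +/- infinity
    x = alpha + (1.0 - 2.0 * alpha) * x
    # logit transform
    return torch.log(x) - torch.log1p(-x)
```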

Four types of flows are implemented: two general normalizing flows and two volume-preserving flows.

  • Planar flow and radial flow are the general normalizing flows proposed in [2] (a minimal sketch of a planar flow step is given after this list).
  • Householder flow is a volume-preserving flow proposed in [3].
  • NICE is another volume-preserving flow proposed in [4]. Although characterized as a volume-preserving flow, NICE can be augmented by a scaling operation, which is included as the last step of the flow in this implementation.
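
As an illustration of how a single flow step is typically written, below is a minimal sketch of a planar flow from [2], including its log-det-Jacobian term. The class name, the initialization and the omission of the invertibility constraint on u are simplifications; the repository's own modules may differ:

```python
import torch
import torch.nn as nn

class PlanarFlow(nn.Module):
    """One planar flow step, f(z) = z + u * tanh(w^T z + b), from [2] (sketch)."""

    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.01)
        self.w = nn.Parameter(torch.randn(dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):
        # z: (batch, dim)
        lin = z @ self.w + self.b                            # (batch,)
        f_z = z + self.u * torch.tanh(lin).unsqueeze(1)      # (batch, dim)
        # psi(z) = tanh'(w^T z + b) * w;  log|det df/dz| = log|1 + u^T psi(z)|
        psi = (1.0 - torch.tanh(lin) ** 2).unsqueeze(1) * self.w
        log_det = torch.log(torch.abs(1.0 + psi @ self.u) + 1e-8)
        # note: for guaranteed invertibility, u should be reparametrized so that
        # u^T w >= -1 (omitted here for brevity)
        return f_z, log_det
```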

Three variants of the variational autoencoder (VAE) with normalizing flows are implemented. Their specifications and potential issues are summarized below, followed by a sketch of the training objective they share.

  • VAE with dynamic flows (see dynamic_flow_vae.py). Flow parameters are produced by the encoder, so they vary with the input. This is the most principled (and most flexible) way of parametrizing the approximate posterior and is the approach suggested by [2]. Unfortunately, it is largely impractical given the potentially huge number of flow parameters, is extremely susceptible to numerical instability, and yields little, if any, improvement in the objective value.

  • VAE with static flows (see static_flow_vae.py). Flow parameters are learned directly as model parameters, so they are fixed across inputs. This significantly constrains the richness of the approximate posteriors, but it is more practical and alleviates (though does not fully resolve) the numerical issues seen with dynamic flows. I also observed moderate improvement in the objective value under certain circumstances, particularly with radial flows. However, sampling becomes problematic since most latent codes lie outside the most probable region under the standard normal prior (see the results below).

  • Convolutional VAE with static flows (see static_flow_conv_vae.py). Convolutional layers are used in place of dense layers in the encoder and the decoder.
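
To show where the flow enters training, here is a minimal sketch of the per-batch objective that these variants maximize. The function signature and argument names are placeholders, and the Bernoulli reconstruction term assumes MNIST-style data in [0, 1]; the actual scripts may compute this differently:

```python
import math
import torch
import torch.nn.functional as F

def flow_elbo(x, x_logits, mu, log_var, z0, zk, sum_log_det):
    """ELBO for a VAE whose posterior is a Gaussian pushed through K flow steps:
    E_q[ log p(x|z_K) + log p(z_K) - log q_0(z_0) + sum_k log|det J_k| ] (sketch).
    """
    # reconstruction: Bernoulli log-likelihood of the (flattened) input
    log_px = -F.binary_cross_entropy_with_logits(
        x_logits, x, reduction="none").sum(dim=1)
    # standard-normal prior evaluated at the transformed code z_K
    log_p_zk = -0.5 * (zk ** 2 + math.log(2.0 * math.pi)).sum(dim=1)
    # initial Gaussian posterior q_0 = N(mu, diag(exp(log_var))) evaluated at z_0
    log_q_z0 = -0.5 * (log_var + (z0 - mu) ** 2 / log_var.exp()
                       + math.log(2.0 * math.pi)).sum(dim=1)
    # each flow step contributes its log|det Jacobian| through sum_log_det
    return (log_px + log_p_zk - log_q_z0 + sum_log_det).mean()
```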

Based on these experiments, I concluded that enriching the approximate posteriors of a VAE with normalizing flows is theoretically appealing but can easily cause more trouble than benefit in practice. Flow-based generative models may be a better option if the goal is to improve data likelihood. You are welcome to check out my implementation of NICE, realNVP and Glow (forthcoming).

Results

I only show results on MNIST; the models use dense layers and static flows. The observations carry over to the other datasets.

MNIST

For each model, the figures show generated samples and a visualization of the latent space:

  • model without flow: samples (after 10000 iterations); latent space
  • model with planar flows (16 steps): samples (after 10000 iterations; note the highly homogeneous samples); latent space (note the inflation in the region of high uncertainty)
  • model with radial flows (16 steps): samples (after 10000 iterations; note the highly homogeneous samples); latent space (note the inflation in the region of high uncertainty)
  • model with householder flows (16 steps): samples (after 10000 iterations); latent space (note the rotation)
  • model with NICE flows (16 steps): samples (after 10000 iterations); latent space (note the expansion of the support)

Training

The code runs on a single GPU and has been tested with

  • Python 3.7.2
  • torch 1.0.0
  • numpy 1.15.4

Examples:

python dynamic_flow_vae.py --dataset=mnist --batch_size=128 --flow=radial --length=16  
python static_flow_vae.py --dataset=mnist --batch_size=128 --flow=radial --length=16  
python static_flow_conv_vae.py --dataset=mnist --batch_size=128 --flow=radial --length=16  

References

[1] Diederik P. Kingma, Max Welling. Auto-Encoding Variational Bayes. ICLR 2014.
[2] Danilo Jimenez Rezende, Shakir Mohamed. Variational Inference with Normalizing Flows. ICML 2015.
[3] Jakub M. Tomczak, Max Welling. Improving Variational Auto-Encoders using Householder Flow. NIPS 2016 workshop.
[4] Laurent Dinh, David Krueger, Yoshua Bengio. NICE: Non-linear Independent Components Estimation. ICLR 2015 workshop.
