This repo contains the implementation for Gaussianization Flows.
Iterative Gaussianization is a fixed-point iteration procedure that can transform any continuous random vector into a Gaussian one. Based on iterative Gaussianization, we propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation. We demonstrate that these models, named Gaussianization flows, are universal approximators for continuous probability distributions under some regularity conditions. Because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation. Experimentally, we show that Gaussianization flows achieve better or comparable performance on several tabular datasets compared to other efficiently invertible flow models such as Real NVP, Glow and FFJORD. In particular, Gaussianization flows are easier to initialize, demonstrate better robustness with respect to different transformations of the training data, and generalize better on small training sets.
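For intuition, below is a minimal, self-contained sketch of one iterative-Gaussianization (RBIG-style) step: each dimension is pushed through its empirical CDF and then the inverse Gaussian CDF, and the result is rotated by a random orthogonal matrix. The NumPy/SciPy code and all names in it are illustrative assumptions for exposition only, not this repo's PyTorch implementation.

```python
# Illustrative sketch only: a plain-NumPy/SciPy version of one rotation-based
# iterative Gaussianization (RBIG-style) step. Function names and the toy data
# are assumptions for exposition, not part of this repository.
import numpy as np
from scipy.stats import norm

def marginal_gaussianize(x, eps=1e-6):
    """Map each dimension to roughly N(0, 1) via empirical CDF + inverse Gaussian CDF."""
    n = x.shape[0]
    ranks = np.argsort(np.argsort(x, axis=0), axis=0) + 1   # per-column ranks in 1..n
    u = ranks / (n + 1)                                      # empirical CDF values in (0, 1)
    return norm.ppf(np.clip(u, eps, 1 - eps))                # inverse standard-normal CDF

def iterative_gaussianization_step(x, rng):
    """One fixed-point iteration: marginal Gaussianization followed by a rotation."""
    z = marginal_gaussianize(x)
    q, _ = np.linalg.qr(rng.normal(size=(x.shape[1], x.shape[1])))  # random orthogonal matrix
    return z @ q

rng = np.random.default_rng(0)
x = rng.normal(size=(2000, 2)) ** 3        # toy non-Gaussian data
for _ in range(30):                        # repeating the step drives x toward N(0, I)
    x = iterative_gaussianization_step(x, rng)
```

Roughly speaking, Gaussianization flows replace the fixed empirical-CDF and random-rotation steps above with learnable, invertible counterparts so that the whole transformation can be trained end to end.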
Dependencies:

- PyTorch
- seaborn
To run RBIG experiments, simply run `python rbig.py`.
To download tabular datasets, follow the instructions here.
To run the tabular experiments, run `python tabular_experiment.py --multidim_kernel --usehouseholder` and specify the dataset and training settings with the flags `--total_datapoints`, `--process_size`, `--dataset`, `--layer`, `--epoch`, `--lr`, and `--batch_size`.
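For example, an invocation might look like `python tabular_experiment.py --multidim_kernel --usehouseholder --dataset POWER --layer 5 --epoch 100 --lr 1e-3 --batch_size 2000`; the dataset name and hyperparameter values here are placeholders for illustration, not recommended settings.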