Differentiable Annealed Importance Sampling Minimizes The Symmetrized Kullback-Leibler Divergence Between Initial and Target Distribution (ICML 2024)
This is the official GitHub repository for our work "Differentiable Annealed Importance Sampling Minimizes The Symmetrized Kullback-Leibler Divergence Between Initial and Target Distribution", in which we investigate the initial distribution of differentiable annealed importance sampling (DAIS) for inference.
Differentiable annealed importance sampling (DAIS), proposed by Geffner & Domke (2021) and Zhang et al. (2021), allows optimizing over the initial distribution of AIS. In this paper, we show that, in the limit of many transitions, DAIS minimizes the symmetrized Kullback-Leibler divergence between the initial and target distribution. Thus, DAIS can be seen as a form of variational inference (VI) as its initial distribution is a parametric fit to an intractable target distribution. We empirically evaluate the usefulness of the initial distribution as a variational distribution on synthetic and real-world data, observing that it often provides more accurate uncertainty estimates than VI (optimizing the reverse KL divergence), importance weighted VI, and Markovian score climbing (optimizing the forward KL divergence).
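Here, the symmetrized Kullback-Leibler divergence between the parametric initial distribution (denoted $q_\theta$ below for illustration) and the target distribution $p$ is the sum of the reverse and forward KL divergences,

$$ D_{\mathrm{SKL}}(q_\theta, p) = D_{\mathrm{KL}}(q_\theta \,\|\, p) + D_{\mathrm{KL}}(p \,\|\, q_\theta), $$

which is why DAIS can be contrasted with methods that optimize only one of the two directions (VI and importance weighted VI for the reverse KL, Markovian score climbing for the forward KL).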
The code base is a heavily extended and rewritten version of the DAIS code base by Zhang et al. (2021), which was taken from their OpenReview submission. The baseline for Markovian score climbing (Naesseth et al., 2020) was taken from the original repository.
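For readers who want a rough picture of the mechanism before reading the code, the following is a minimal, self-contained PyTorch sketch of the general idea (an illustration only, not this repository's implementation): a parametric Gaussian initial distribution is annealed towards an unnormalized target along a geometric path, the transition kernels are left uncorrected so that the annealed importance sampling objective stays differentiable, and that objective is maximized with respect to the parameters of the initial distribution. The transitions below are simple uncorrected Langevin steps rather than the momentum-based transitions used in DAIS, and all names (log_target, dais_bound) are invented for the example.

```python
# Minimal illustrative sketch (not this repository's code): DAIS-style optimization of a
# Gaussian initial distribution by differentiating through an annealed importance
# sampling estimate.
import torch


def log_target(x):
    # Unnormalized log density of a toy target distribution.
    return -0.5 * ((x - 2.0) ** 2).sum(-1)


def dais_bound(mu, log_sigma, n_transitions=16, n_particles=8, step_size=0.05):
    q0 = torch.distributions.Normal(mu, log_sigma.exp())
    x = q0.rsample((n_particles,))  # reparameterized, so gradients flow to mu / sigma
    betas = torch.linspace(0.0, 1.0, n_transitions + 1)

    def log_pi(x, beta):
        # Geometric annealing path between the initial distribution and the target.
        return (1.0 - beta) * q0.log_prob(x).sum(-1) + beta * log_target(x)

    log_w = torch.zeros(n_particles)
    for t in range(1, n_transitions + 1):
        # Incremental AIS importance weight.
        log_w = log_w + log_pi(x, betas[t]) - log_pi(x, betas[t - 1])
        # One *uncorrected* Langevin step targeting pi_{beta_t}; omitting the
        # accept/reject correction keeps the whole chain differentiable.
        grad = torch.autograd.grad(log_pi(x, betas[t]).sum(), x, create_graph=True)[0]
        x = x + step_size * grad + (2.0 * step_size) ** 0.5 * torch.randn_like(x)
    return log_w.mean()  # AIS-style objective, maximized w.r.t. the initial distribution


# Fit the initial distribution by maximizing the differentiable objective.
mu = torch.zeros(2, requires_grad=True)
log_sigma = torch.zeros(2, requires_grad=True)
optimizer = torch.optim.Adam([mu, log_sigma], lr=1e-2)
for _ in range(500):
    loss = -dais_bound(mu, log_sigma)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```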
- tested with Python 3.10 (a lower version might work as well);
- dependencies can be found in requirements.txt and can be installed as shown below.
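Assuming a standard Python environment, the dependencies can be installed, for example, via

pip install -r requirements.txt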
Example training command for a logistic regression baseline:
python train_logistic_regression.py \
--batch_size 128 \
--n_particles 16 \
--n_transitions 16 \
--scaled_M \
--max_iterations 10_000 \
--lr 1e-3 \
--dataset ionosphere \
--data_path <path to dataset csv> \
--path_to_save <path to save model to>
Distributed under the MIT License. See LICENSE.MIT for more information.
If you would like to cite our paper, please use the following BibTeX:
@inproceedings{zenn2024differentiable,
  title = {Differentiable Annealed Importance Sampling Minimizes The Symmetrized {K}ullback-{L}eibler Divergence Between Initial and Target Distribution},
  author = {Zenn, Johannes and Bamler, Robert},
  booktitle = {Forty-first International Conference on Machine Learning (ICML)},
  year = {2024},
  url = {https://openreview.net/pdf?id=rvaN2P1rvC}
}