This repository contains code illustrating the application of the algorithms in Kuntz et al. (2023) and reproducing the results in the paper. For the toy hierarchical model (Section 2), Bayesian logistic regression (Section 3.1), and Bayesian neural network (Section 3.2) examples, we use JAX; the source code is in the `jax` folder. For the generator network example (Section 3.3), we use PyTorch; the source code is in the `torch` folder.
In either case, the code can be run on Google Colab by clicking the links below, or locally on your machine (see the README.md file in the respective folder for instructions on how to do so).
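To give a flavour of what the code implements, here is a minimal, self-contained sketch of a particle gradient descent update on a toy model similar to the paper's hierarchical example. All names and the model itself (`y ~ N(x, 1)`, `x ~ N(theta, 1)`) are illustrative assumptions for this sketch, not code taken from the repository:

```python
import jax
import jax.numpy as jnp

def log_p(theta, x, y):
    # Joint log-density log p_theta(x, y), up to an additive constant,
    # for the toy model y ~ N(x, 1), x ~ N(theta, 1).
    return -0.5 * (y - x) ** 2 - 0.5 * (x - theta) ** 2

# Per-particle gradients w.r.t. the latent x and the parameter theta.
grad_x = jax.vmap(jax.grad(log_p, argnums=1), in_axes=(None, 0, None))
grad_theta = jax.vmap(jax.grad(log_p, argnums=0), in_axes=(None, 0, None))

def pgd(y, n_particles=50, n_steps=500, h=0.1, seed=0):
    key = jax.random.PRNGKey(seed)
    key, sub = jax.random.split(key)
    x = jax.random.normal(sub, (n_particles,))  # particle cloud
    theta = 0.0
    for _ in range(n_steps):
        key, sub = jax.random.split(key)
        noise = jax.random.normal(sub, (n_particles,))
        # Langevin-type move for the particles...
        x = x + h * grad_x(theta, x, y) + jnp.sqrt(2 * h) * noise
        # ...interleaved with a gradient step for theta,
        # averaging the gradient over the particle cloud.
        theta = theta + h * jnp.mean(grad_theta(theta, x, y))
    return theta

# The marginal likelihood here is N(theta, 2), so the maximum
# likelihood estimate is theta* = y; pgd(2.0) should end up near 2.
print(pgd(2.0))
```

For the full algorithms, models, and experiments, see the notebooks and source folders above.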
Update (24/04/2023): See here for new TensorFlow implementations of the generator networks.
The notebooks can be accessed by clicking the links below and logging into a Google Account.
| Example | Link |
|---|---|
| Toy hierarchical model | |
| Bayesian logistic regression | |
| Bayesian neural network | |
| Generator network (MNIST) | |
| Generator network (CelebA) | |
If you find the code useful for your research, please consider citing our paper:
```bibtex
@InProceedings{Kuntz2023,
  title     = {Particle algorithms for maximum likelihood training of latent variable models},
  author    = {Kuntz, Juan and Lim, Jen Ning and Johansen, Adam M.},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {5134--5180},
  year      = {2023},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  url       = {https://proceedings.mlr.press/v206/kuntz23a.html},
}
```
This work is made available under the MIT License; see the LICENSE file for details.