louissharrock/Coin-SVGD

Coin Sampling: Gradient-Based Bayesian Inference without Learning Rates

ICML 2023

Description

This repository contains code implementing the coin sampling algorithms described in Sharrock and Nemeth (2023). The core implementations of the algorithms (e.g., Coin SVGD) can be found in main.py.
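To illustrate the idea, here is a minimal, self-contained sketch of a Coin SVGD step: the SVGD update direction is treated as a coin-betting "outcome", and each particle's position is set by a Krichevsky-Trofimov-style betting rule rather than a learning rate. This is a simplified illustration of the general construction, not a copy of main.py; the fixed RBF bandwidth, the initialisation of the running gradient bound, and the Gaussian usage example are assumptions for the sketch.

```python
import numpy as np

def svgd_direction(x, grad_logp, h=1.0):
    """SVGD update direction with an RBF kernel of (assumed) bandwidth h."""
    n = x.shape[0]
    diffs = x[:, None, :] - x[None, :, :]             # (n, n, d): diffs[i, j] = x_i - x_j
    K = np.exp(-np.sum(diffs**2, axis=-1) / (2 * h))  # (n, n) kernel matrix
    drift = K @ grad_logp(x)                          # attraction towards high density
    repulse = np.sum(K[:, :, None] * diffs, axis=1) / h  # pairwise repulsion term
    return (drift + repulse) / n

def coin_svgd(x0, grad_logp, n_iter=1000):
    """Learning-rate-free SVGD via coin betting (illustrative sketch)."""
    x = x0.copy()
    c_sum = np.zeros_like(x0)            # running sum of betting outcomes
    reward = np.zeros(x0.shape[0])       # per-particle accumulated reward
    L = np.full(x0.shape[0], 1e-12)      # running max outcome norm (init assumed)
    for t in range(1, n_iter + 1):
        c = svgd_direction(x, grad_logp)             # SVGD direction as coin outcome
        L = np.maximum(L, np.linalg.norm(c, axis=1))
        reward = np.maximum(reward + np.sum(c * (x - x0), axis=1), 0.0)
        c_sum += c
        # Betting update: stake a fraction of current wealth (L + reward),
        # with no step size to tune.
        x = x0 + c_sum * ((L + reward) / (L * t))[:, None]
    return x

# Example: sample from a standard Gaussian, starting far from the target.
rng = np.random.default_rng(0)
samples = coin_svgd(rng.normal(5.0, 1.0, size=(50, 1)), lambda x: -x)
```

The notebooks below use the repository's own implementations; the sketch above is only meant to convey why no learning rate appears in the update.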

Example Usage

For examples of how to use the code for the various models considered in our paper, see the notebooks below.

  • toy_svgd.ipynb: Toy examples.
  • bayes_ica.ipynb: Bayesian independent component analysis.
  • bayes_lr.ipynb: Bayesian logistic regression.
  • bayes_nn.ipynb: Bayesian neural network.
  • bayes_pmf.ipynb: Bayesian probabilistic matrix factorisation.

Citation

If you find the code in this repository useful for your own research, please consider citing our paper:

@InProceedings{Sharrock2023,
  title     = {Coin Sampling: Gradient-Based Bayesian Inference without Learning Rates},
  author    = {Sharrock, Louis and Nemeth, Christopher},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  year      = {2023},
  address   = {Honolulu, Hawaii},
}

Acknowledgements

Our implementations of Coin SVGD, Coin LAWGD, and Coin KSDD are based on existing implementations of SVGD, LAWGD, and KSDD. We gratefully acknowledge the authors of the following papers for their open-source code:

  • Q. Liu and D. Wang. Stein Variational Gradient Descent (SVGD): A General Purpose Bayesian Inference Algorithm. NeurIPS, 2016. [Paper] | [Code].
  • S. Chewi, T. Le Gouic, C. Lu, T. Maunu, P. Rigollet. SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence. NeurIPS, 2020. [Paper] | [Code].
  • A. Korba, P.-C. Aubin-Frankowski, S. Majewski, P. Ablin. Kernel Stein Discrepancy Descent. ICML, 2021. [Paper] | [Code].

We did not contribute any of the datasets used in our experiments. Please get in touch if there are any conflicts of interest or other issues with hosting these datasets here.
