viabel: Variational Inference Approximation Bounds that are Efficient and Lightweight

Description

This package computes bounds on the errors of the mean, standard deviation, and variance estimates produced by a continuous approximation to a (possibly unnormalized) distribution. A canonical application is a variational approximation to a Bayesian posterior distribution. In particular, using samples from the approximation Q and evaluations of the (possibly unnormalized) log densities of Q and the target distribution P, the package provides functionality to compute bounds on the following quantities (a sketch of assembling these inputs follows the list):

  • the α-divergence between P and Q
  • the p-Wasserstein distance between P and Q
  • the differences between the means, standard deviations, and variances of P and Q
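Concretely, the only inputs needed are samples from Q and the pointwise log densities of Q and P at those samples. Below is a minimal sketch of assembling these inputs; the Gaussian approximation and Student-t target are purely illustrative choices, not part of the package.

import numpy as np
from scipy import stats

# Illustrative setup: a Gaussian approximation Q to a heavier-tailed target P
q = stats.norm(loc=0.0, scale=1.0)
p = stats.t(df=5)

# Samples from Q and the log densities of Q and P at those samples;
# an unnormalized log density for P works as well
samples = q.rvs(size=10000, random_state=0)
log_q = q.logpdf(samples)
log_p = p.logpdf(samples)

# The log importance weights log p(x) - log q(x) are the basic quantity
# from which the divergence, Wasserstein, and moment bounds are built
log_weights = log_p - log_q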

If you use this package, please cite:

Practical posterior error bounds from variational objectives. Jonathan H. Huggins, Mikołaj Kasprzak, Trevor Campbell, Tamara Broderick. arXiv:1910.04102 [stat.ML], 2019.

Compilation and testing

After cloning the repository, testing and installation are straightforward. To test the package:

nosetests tests/

To install:

pip install .

Usage

🚧🚧 Under Construction 🚧🚧
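Until the documentation is ready, the sketch below shows one plausible shape for the workflow, continuing the example from the Description. The all_bounds entry point and its signature are assumptions made for illustration, not documented API; consult the viabel source for the actual interface.

from viabel import all_bounds  # assumed entry point; check the package source

# Hypothetical call: given the log weights and samples assembled above,
# compute bounds on the divergences, Wasserstein distance, and moment errors
bounds = all_bounds(log_weights, samples=samples[:, np.newaxis])
print(bounds)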
