
GaussianVariationalInference.jl

Deterministic variational inference in Julia.

Project Status: WIP – Initial development is in progress, but there has not yet been a stable, usable release suitable for the public.

What is this?

A Julia package for approximating a posterior distribution with a full-covariance Gaussian distribution by optimising a variational lower bound [1]. A mean-field approximation is planned for the near future. We recommend this package for problems with a relatively small number of parameters, roughly 2 to 20. The main focus of this package is to provide a method for approximating a target posterior with a Gaussian that does not require tuning learning rates (step sizes) and converges reliably.

Basic usage

To install this package, please switch to package mode in the Julia REPL and add GaussianVariationalInference.
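For example, assuming the package is available in the General registry as the instruction above suggests, either of the following installs it (the second form uses the standard Pkg API and can be run from a script):

# In the REPL, press ] to enter package mode, then:
# pkg> add GaussianVariationalInference

# Equivalently, using the Pkg API:
using Pkg
Pkg.add("GaussianVariationalInference")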

The package is fairly easy to use. Currently, the only function of interest to the user is VI. At a minimum, the user needs to provide a function that evaluates the joint log-likelihood, i.e. the log of the unnormalised target posterior, as in the sketch below.
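As an illustration, a user-supplied log-density for a simple two-parameter problem might look like the following sketch; the function name mylogp and the standard-normal target are made up for this example and are not part of the package:

# Hypothetical unnormalised log-posterior: a standard Gaussian in two dimensions.
mylogp(θ) = -0.5 * sum(abs2, θ)

# mylogp could then be passed to VI in place of the example problem below.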

Consider approximating the following target density:

using GaussianVariationalInference

logp = exampleproblem1() # target log-posterior density to approximate
x₀ = randn(2)            # random initial mean for approximating Gaussian
q, logev = VI(logp, x₀, S = 100, iterations = 10_000, show_every = 50)

# Plot target posterior, not log-posterior!
using Plots # must be independently installed.
x = -3:0.02:3
contour(x, x, map(x -> exp(logp(collect(x))), Iterators.product(x, x))', fill=true, c=:blues)

# Plot Gaussian approximation on top using red colour
contour!(x, x, map(x -> pdf(q,(collect(x))), Iterators.product(x, x))', color="red", alpha=0.2)

A plot similar to the one below should appear. The blue filled contours correspond to the exponentiated logp, and the red contours correspond to the produced Gaussian approximation q.
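If the returned approximation q behaves like a Distributions.jl multivariate normal, which the call pdf(q, ·) above suggests but the documentation should confirm, it can be inspected and sampled in the usual way:

using Distributions

mean(q)                  # mean vector of the Gaussian approximation
cov(q)                   # full covariance matrix of the approximation
samples = rand(q, 1000)  # 1000 draws from q, one sample per column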

[Figure: target posterior (blue filled contours) overlaid with the Gaussian approximation q (red contours)]

For further information, please consult the documentation.

Should you use the software for academic purposes, please consider citing the relevant paper [1].

Related packages

  • AdvancedVI.jl: A library for variational Bayesian inference in Julia.
  • DynamicHMC.jl: Implementation of robust dynamic Hamiltonian Monte Carlo methods in Julia.

Footnotes

  1. Approximate Variational Inference Based on a Finite Sample of Gaussian Latent Variables, [Arxiv].