Code for NIPS 2015 "Gradient-Free Hamiltonian Monte Carlo with Efficient Kernel Exponential Families"


Kernel Hamiltonian Monte Carlo


Code for NIPS 2015 "Gradient-Free Hamiltonian Monte Carlo with Efficient Kernel Exponential Families".

This package implements the kernel HMC part of the paper. It depends heavily on the kernel exponential family package, which contains all of the gradient estimation code.

My blog post about KMC.

An IPython notebook featuring KMC lite's ability to move in previously unexplored regions.
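The core idea behind KMC is to replace the intractable gradient of the target's log-density with one estimated from the Markov chain's history. The sketch below is not this package's API but a minimal, self-contained illustration of the finite-dimensional estimator style used by KMC Finite: a kernel exponential family model built from random Fourier features, fitted in closed form via score matching. All names (`fit_score_matching`, `grad_log_density`) are invented for this illustration.

```python
import numpy as np

def fit_score_matching(X, W, u, lmbda=1e-3):
    """Closed-form score matching for log q(x) = theta^T phi(x),
    where phi_j(x) = sqrt(2/m) * cos(w_j . x + u_j) are random Fourier features.
    Minimizes (1/2) theta^T C theta + b^T theta, giving theta = -(C + lmbda I)^{-1} b."""
    n, d = X.shape
    m = W.shape[0]
    proj = X @ W.T + u  # (n, m) array of w_j . x_i + u_j
    C = np.zeros((m, m))
    b = np.zeros(m)
    for dd in range(d):
        # d/dx_d phi_j(x) = -sqrt(2/m) sin(w_j . x + u_j) w_{jd}
        G = -np.sqrt(2.0 / m) * np.sin(proj) * W[:, dd]
        C += G.T @ G / n
        # d^2/dx_d^2 phi_j(x) = -sqrt(2/m) cos(w_j . x + u_j) w_{jd}^2
        b += (-np.sqrt(2.0 / m) * np.cos(proj) * W[:, dd] ** 2).mean(axis=0)
    return np.linalg.solve(C + lmbda * np.eye(m), -b)

def grad_log_density(x, theta, W, u):
    """Estimated gradient of the log-density at a single point x."""
    m = W.shape[0]
    proj = x @ W.T + u
    return -np.sqrt(2.0 / m) * ((theta * np.sin(proj)) @ W)
```

Fitted to samples from a standard Gaussian, the estimated gradient points back toward the mode, approximating the true score `-x`; in KMC this surrogate gradient drives the Hamiltonian dynamics in place of the unavailable exact one.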

Install dependencies:

pip install -r https://raw.githubusercontent.com/karlnapf/kernel_hmc/master/requirements.txt

Optional dependencies are:

  • cholupdate for efficient low-rank updates of Cholesky factors of covariance matrices. This speeds up Adaptive-Metropolis and KMC Finite from cubic to quadratic cost, see the paper.
  • Shogun-toolbox for the Gaussian Process marginal-posterior-over-hyperparameters example, where it is used to compute unbiased estimates of the marginal likelihood via approximate inference and importance sampling.
  • theano for the Banana example, where gradients are computed via automatic differentiation.
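The Banana target (Haario et al.'s twisted Gaussian) also has a simple closed form, so its gradient can be written down directly if theano is unavailable. The sketch below is an illustration, not this package's code; the parameter values are the commonly used bananicity b = 0.03 and variance V = 100, which may differ from the example's settings.

```python
import numpy as np

def log_banana(x, b=0.03, V=100.0):
    """Unnormalized 2D banana log-density: N(0, diag(V, 1)) warped along a parabola."""
    y2 = x[1] + b * (x[0] ** 2 - V)
    return -0.5 * (x[0] ** 2 / V + y2 ** 2)

def grad_log_banana(x, b=0.03, V=100.0):
    """Analytic gradient of log_banana, via the chain rule through y2."""
    y2 = x[1] + b * (x[0] ** 2 - V)
    return np.array([
        -x[0] / V - y2 * 2.0 * b * x[0],
        -y2,
    ])
```

A quick central finite-difference check confirms the analytic gradient matches the log-density.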

Install kernel_hmc:

pip install git+https://github.com/karlnapf/kernel_hmc.git

A list of examples can be found here. For example, run

python -m kernel_hmc.examples.demo_trajectories
python -m kernel_hmc.examples.demo_mcmc_kmc_static
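Once a gradient surrogate is available, the sampler itself is plain HMC: leapfrog integration of Hamiltonian dynamics followed by a Metropolis-Hastings correction. The sketch below is generic textbook HMC with a pluggable gradient oracle, not this package's implementation; `hmc_step` and `leapfrog` are names invented for the illustration.

```python
import numpy as np

def leapfrog(q, p, grad_log_target, eps, num_steps):
    """Leapfrog-integrate Hamiltonian dynamics for num_steps steps of size eps.
    grad_log_target may be the exact gradient or a kernel-based estimate."""
    q, p = q.copy(), p.copy()
    p += 0.5 * eps * grad_log_target(q)           # initial half-step for momentum
    for _ in range(num_steps - 1):
        q += eps * p                               # full position step
        p += eps * grad_log_target(q)              # full momentum step
    q += eps * p
    p += 0.5 * eps * grad_log_target(q)           # final half-step for momentum
    return q, p

def hmc_step(q, log_target, grad_log_target, eps=0.1, num_steps=20, rng=None):
    """One HMC transition: simulate dynamics, then accept/reject on the joint energy."""
    rng = rng or np.random.default_rng()
    p0 = rng.standard_normal(q.shape)
    q_new, p_new = leapfrog(q, p0, grad_log_target, eps, num_steps)
    h_old = -log_target(q) + 0.5 * p0 @ p0
    h_new = -log_target(q_new) + 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < h_old - h_new:
        return q_new, True
    return q, False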