Code release for the ICLR paper

# Stein Gradient Estimator

Thank you for your interest in our paper:

Yingzhen Li and Richard E. Turner

Gradient Estimators for Implicit Models

International Conference on Learning Representations (ICLR), 2018

Roughly speaking, whenever you need to compute d log p(x)/dx you can use our method. Applications include: variational inference, maximum-entropy estimation, gradient-based MCMC, entropy regularisation (e.g. for GANs), and more...

Please consider citing the paper if you use any of this material in your research.

Contributions: Yingzhen derived the estimator and implemented all experiments. Rich provided advice on the paper writing. Other people who provided comments on the manuscript are acknowledged in the paper.

## Experiments

I've included three experiments to demonstrate the broad applicability of the gradient estimator, in the folders began, hamiltonian and meta_learning. Each folder contains the corresponding code, accompanied by its own README.md with further details.

Note that the kernel-induced Hamiltonian flow code depends on an additional package; see https://github.com/karlnapf/kernel_hmc

## Citing the paper (bib)

```
@inproceedings{li2018gradient,
  title = {Gradient Estimators for Implicit Models},
  author = {Li, Yingzhen and Turner, Richard E.},
  booktitle = {International Conference on Learning Representations},
  year = {2018}
}
```