Code release for the ICLR paper

Stein Gradient Estimator

Thank you for your interest in our paper:

Yingzhen Li and Richard E. Turner

Gradient Estimators for Implicit Models

International Conference on Learning Representations (ICLR), 2018

Roughly speaking, whenever you need to compute dlogp(x)/dx you can use our method. Applications include: variational inference, maximum entropy, gradient-based MCMC, entropy regularisation (e.g. for GANs), and more...
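As a rough illustration of the idea, here is a minimal NumPy sketch (not code from this repo; the function name, defaults, and RBF/median-heuristic choices are our own assumptions) of a Stein-type gradient estimator: given samples from an implicit model, it estimates dlogp(x)/dx as G = -(K + eta*I)^{-1} <grad, K> with an RBF kernel:

```python
import numpy as np

def stein_gradient_estimator(x, eta=0.01, sigma=None):
    # x: (N, D) samples from the (implicit) distribution.
    # Returns an (N, D) estimate of d log p(x_i) / dx_i at each sample.
    N, D = x.shape
    sq = np.sum(x * x, axis=1)
    dist2 = sq[:, None] + sq[None, :] - 2.0 * x @ x.T  # pairwise squared distances
    if sigma is None:
        # median heuristic for the RBF bandwidth (a common default, our choice here)
        sigma = np.sqrt(0.5 * np.median(dist2[dist2 > 0]))
    K = np.exp(-dist2 / (2.0 * sigma ** 2))
    # <grad, K>_{i,d} = sum_j dK(x_j, x_i)/dx_{j,d}
    #                 = (x_{i,d} * sum_j K_{ji} - sum_j K_{ji} x_{j,d}) / sigma^2
    grad_K = (K.sum(axis=0)[:, None] * x - K @ x) / sigma ** 2
    # Stein estimate: G = -(K + eta*I)^{-1} <grad, K>; eta regularises the inverse
    return -np.linalg.solve(K + eta * np.eye(N), grad_K)

# Sanity check: for x ~ N(0, I) the true score is d log p(x)/dx = -x,
# so the estimate should correlate strongly with -x.
rng = np.random.default_rng(0)
x = rng.standard_normal((200, 2))
g = stein_gradient_estimator(x)
```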

Please consider citing the paper if any of this material is used in your research.

Contributions: Yingzhen derived the estimator and implemented all experiments. Rich provided advice on paper writing. Other people who provided comments on the manuscript are acknowledged in the paper.


I've got three experiments to demonstrate the wide application of the gradient estimator. In the folders you can find the corresponding code, each accompanied by a README with further details.

Also, the kernel-induced Hamiltonian flow code depends on another package; see the hamiltonian folder for details.

Citing the paper (bib)

@inproceedings{li2018gradient,
  title = {Gradient Estimators for Implicit Models},
  author = {Li, Yingzhen and Turner, Richard E.},
  booktitle = {International Conference on Learning Representations},
  year = {2018}
}