Code for ICML 2019 paper on "Fast and Simple Natural-Gradient Variational Inference with Mixture of Exponential-family Approximations"

yorkerlin/VB-MixEF


For an efficient mixture-of-Gaussians (MoG) implementation using vectorization (about 20 times faster than this repo), please use this new repo, which is based on our ICML 2020 paper.


  • To-do list:
    • Add a poster and a technical report on the gradient identities used in the paper (to appear at the ICML workshop on Stein's method). [done]
    • Add a poster for the main paper. [done]
    • [skewness] Add a Matlab implementation of the toy example using the skew Gaussian and the exponentially modified Gaussian. [done]
    • [multi-modality] Add a Matlab implementation of the toy example using a MoG; the implementation is based on this repo. [done]
    • [heavy tails] Add a Matlab implementation of Bayesian logistic regression (BLR) using the t-distribution and the symmetric normal inverse Gaussian.
    • Add a Python implementation of the Vadam extensions.
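The toy examples above fit a mixture-of-Gaussians variational approximation. As an illustrative sketch only (this is not code from this repo; the function name `mog_logpdf` and its argument layout are hypothetical), evaluating the log-density of such a mixture with a numerically stable log-sum-exp could look like this:

```python
import numpy as np

def mog_logpdf(x, weights, means, covs):
    """Log-density of q(x) = sum_c weights[c] * N(x; means[c], covs[c]).

    x: (D,) point; weights: (K,); means: (K, D); covs: (K, D, D).
    """
    K, D = means.shape
    log_comps = np.empty(K)
    for c in range(K):
        diff = x - means[c]
        # log |Sigma_c| via slogdet avoids under/overflow of the determinant
        _, logdet = np.linalg.slogdet(covs[c])
        quad = diff @ np.linalg.solve(covs[c], diff)
        log_comps[c] = (np.log(weights[c])
                        - 0.5 * (D * np.log(2 * np.pi) + logdet + quad))
    # log-sum-exp over components for numerical stability
    m = log_comps.max()
    return m + np.log(np.exp(log_comps - m).sum())
```

The paper's natural-gradient updates act on the natural parameters of each mixture component; a density evaluation like the one above is only the basic building block shared by such implementations.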

