noisy K-FAC

The main contributors to this repository are Guodong Zhang and Shengyang Sun.

Update

A new repo was released with implementations of noisy K-FAC and noisy EK-FAC.

Introduction

This repository contains the code to reproduce the classification results from the paper Noisy Natural Gradient as Variational Inference (paper, video). For the RL code, see VIME-NNG.

Noisy natural gradient: variational inference can be instantiated as natural gradient with adaptive weight noise. By further approximating the full Fisher matrix with K-FAC, we obtain noisy K-FAC, a surprisingly simple variational training algorithm for Bayesian neural networks. Noisy K-FAC not only improves classification accuracy but also gives well-calibrated predictions.

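As a rough illustration of the adaptive-weight-noise idea, the sketch below runs one fully-connected layer on a toy regression task with NumPy: weights are sampled from a matrix-variate Gaussian whose row/column covariances come from the (damped, inverted) Kronecker factors, and the mean is updated with a Kronecker-preconditioned natural-gradient step. This is not the repository's implementation; every name and hyperparameter here (A, S, damping, kl_weight, the toy loss) is an assumption made for the example.

```python
import numpy as np

# Illustrative sketch only -- NOT the repository's implementation.
rng = np.random.default_rng(0)
d_in, d_out, batch = 4, 3, 8
damping, decay, lr = 1e-2, 0.95, 0.1
kl_weight = 0.1                               # scales the posterior covariance

W_true = rng.standard_normal((d_in, d_out))   # target weights of a toy task
mu = np.zeros((d_in, d_out))                  # variational mean of the weights
A = np.eye(d_in)                              # Kronecker factor: input covariance
S = np.eye(d_out)                             # Kronecker factor: output-grad covariance

def sample_weights(mu, A, S):
    """Sample W from a matrix-variate Gaussian whose row/column covariances
    are the damped inverse Kronecker factors, scaled by kl_weight."""
    L_row = np.linalg.cholesky(np.linalg.inv(A + damping * np.eye(d_in)))
    L_col = np.linalg.cholesky(np.linalg.inv(S + damping * np.eye(d_out)))
    noise = rng.standard_normal((d_in, d_out))
    return mu + np.sqrt(kl_weight) * L_row @ noise @ L_col.T

for _ in range(300):
    x = rng.standard_normal((batch, d_in))    # layer inputs
    W = sample_weights(mu, A, S)              # one posterior sample of the weights
    g_y = (x @ W - x @ W_true) / batch        # grad of 0.5*||xW - xW_true||^2 wrt outputs
    g_W = x.T @ g_y                           # gradient wrt the weights
    # K-FAC statistics: exponential moving averages of the two factors.
    A = decay * A + (1 - decay) * (x.T @ x / batch)
    S = decay * S + (1 - decay) * batch * (g_y.T @ g_y)
    # Natural-gradient step: precondition by both inverse Kronecker factors.
    nat = np.linalg.solve(A + damping * np.eye(d_in), g_W)
    nat = np.linalg.solve(S + damping * np.eye(d_out), nat.T).T
    mu -= lr * nat
```

Despite the weight-sampling noise, the mean converges to a neighborhood of the toy task's solution; the residual fluctuation is exactly the adaptive weight noise the paper interprets as a variational posterior.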
Currently, the implementation of convolution with multiple weight samples (which is very useful for Bayesian neural networks) is messy and slow; we plan to implement a new TensorFlow operation after NIPS.

Citation

To cite this work, please use

@article{zhang2017noisy,
  title={Noisy Natural Gradient as Variational Inference},
  author={Zhang, Guodong and Sun, Shengyang and Duvenaud, David and Grosse, Roger},
  journal={arXiv preprint arXiv:1712.02390},
  year={2017}
}

Dependencies

This project uses Python 3.5.2. Before running the code, you have to install the required dependencies.

Example

python main.py --config configs/kfac_plain.json

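The exact fields of configs/kfac_plain.json depend on the repository; as a hypothetical sketch, a training script like main.py typically loads such a JSON config along these lines (the keys below are invented for illustration, not the repository's actual schema):

```python
import json
import tempfile

# Hypothetical example keys -- NOT the actual schema of configs/kfac_plain.json.
example = {"dataset": "cifar10", "batch_size": 128, "learning_rate": 1e-3}

# Write the example config to a temporary file, then load it back the way
# a training script might load the path passed via --config.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(example, f)
    path = f.name

with open(path) as f:
    config = json.load(f)

print(config["dataset"])  # prints: cifar10
```
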
TensorBoard Visualization

This implementation supports TensorBoard visualization. All you have to do is launch TensorBoard from your experiment directory, located in experiments/.

tensorboard --logdir=experiments/cifar10/noisy-kfac/summaries