# LCBNN

This repository contains code used in the experiments of our paper: "Loss-calibrated approximate inference in Bayesian neural networks" by Adam D. Cobb, Stephen J. Roberts and Yarin Gal.

## Abstract

Current approaches in approximate inference for Bayesian neural networks minimise the Kullback-Leibler divergence to approximate the true posterior over the weights. However, this approximation is without knowledge of the final application, and therefore cannot guarantee optimal predictions for a given task. To overcome the challenge of making more suitable task-specific approximations, we introduce a new *loss-calibrated* evidence lower bound for Bayesian neural networks in the context of supervised learning. By introducing a lower bound that depends on the utility function, we ensure that our approximation achieves higher utility than traditional methods for applications that have asymmetric utility functions. Through an illustrative medical example and a separate limited capacity experiment, we demonstrate the superior performance of our new loss-calibrated model in the presence of noisy labels. Furthermore, we show the scalability of our method to real-world applications for per-pixel semantic segmentation on an autonomous driving data set.
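
To make the general idea concrete, here is a minimal, hypothetical sketch (NumPy only, and not the exact bound or the code in this repository) of how a utility-dependent term can be combined with the usual ELBO pieces: an asymmetric utility matrix rewards approximate posteriors whose predictions lead to high-utility decisions for the task at hand. All values below (the utility matrix, predictive probabilities and ELBO terms) are placeholder numbers chosen purely for illustration.

```python
import numpy as np

# Hypothetical asymmetric utility over (true class, chosen class); e.g. in a
# medical setting, missing a "diseased" case (row 1, column 0) earns no
# utility, while a false alarm (row 0, column 1) is only mildly penalised.
utility = np.array([[1.0, 0.3],
                    [0.0, 1.0]])

def log_expected_utility(pred_probs, decisions, utility):
    """Mean log expected utility of the chosen decisions.

    pred_probs: (N, C) predictive probabilities, e.g. averaged over
                stochastic forward passes through the network.
    decisions:  (N,) index of the action chosen for each input.
    """
    # Expected utility of decision d_n under the predictive distribution:
    # sum_c pred_probs[n, c] * utility[c, d_n]
    exp_util = (pred_probs * utility[:, decisions].T).sum(axis=1)
    return np.log(exp_util + 1e-12).mean()

# Placeholder ELBO pieces and predictive probabilities (illustration only).
pred_probs = np.array([[0.7, 0.3],
                       [0.2, 0.8]])           # (N=2, C=2)
decisions = pred_probs.argmax(axis=1)         # plug-in decisions
expected_log_lik, kl_term = -0.4, 0.1

# Standard ELBO plus a utility-dependent term: maximising such an objective
# trades off fitting the data against making decisions with high utility.
loss_calibrated_objective = (
    expected_log_lik - kl_term
    + log_expected_utility(pred_probs, decisions, utility)
)
print(loss_calibrated_objective)
```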

## Example

*(Figure: pedestrian gain example.)*

## Getting Started

### Requirements

### Data

The data for the SegNet experiment comes from the repository: https://github.com/alexgkendall/SegNet-Tutorial.git
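One way to fetch that repository is sketched below (a minimal Python sketch, assuming `git` is available on the system path; the target directory name simply mirrors the repository name).

```python
import subprocess

# Clone the SegNet-Tutorial repository, which hosts the driving-scene
# segmentation data referenced above, into the current working directory.
subprocess.run(
    ["git", "clone", "https://github.com/alexgkendall/SegNet-Tutorial.git"],
    check=True,  # raise an error if the clone fails
)
```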

## Contact Information

Adam Cobb: acobb@robots.ox.ac.uk
