# LCBNN

This repository contains code used in the experiments of our paper: "Loss-calibrated approximate inference in Bayesian neural networks" by Adam D. Cobb, Stephen J. Roberts and Yarin Gal.

## Abstract

Current approaches in approximate inference for Bayesian neural networks minimise the Kullback-Leibler divergence to approximate the true posterior over the weights. However, this approximation is without knowledge of the final application, and therefore cannot guarantee optimal predictions for a given task. To overcome the challenge of making more suitable task-specific approximations, we introduce a new *loss-calibrated* evidence lower bound for Bayesian neural networks in the context of supervised learning. By introducing a lower bound that depends on the utility function, we ensure that our approximation achieves higher utility than traditional methods for applications that have asymmetric utility functions. Through an illustrative medical example and a separate limited-capacity experiment, we demonstrate the superior performance of our new loss-calibrated model in the presence of noisy labels. Furthermore, we show the scalability of our method to real-world applications for per-pixel semantic segmentation on an autonomous driving data set.
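
To give a flavour of the method without opening the paper: standard variational inference maximises the ELBO alone, whereas the loss-calibrated bound adds a term that depends on the chosen utility function. The sketch below is illustrative only, with simplified notation; the exact bound and its derivation are in the paper.

```latex
% Standard evidence lower bound (ELBO) maximised by variational inference
% in a BNN with weights \omega, prior p(\omega) and approximate
% posterior q(\omega):
\[
\mathcal{L}_{\mathrm{VI}}(q) =
    \sum_{i} \mathbb{E}_{q(\omega)}\!\left[ \log p(y_i \mid x_i, \omega) \right]
    - \mathrm{KL}\!\left( q(\omega) \,\middle\|\, p(\omega) \right)
\]
% Schematic loss-calibrated objective: the ELBO plus a utility-dependent
% term that favours approximate posteriors whose decisions h(x_i) score
% highly under the utility u(h, y). This is a sketch, not the paper's
% exact equation.
\[
\mathcal{L}_{\mathrm{LC}}(q, h) =
    \mathcal{L}_{\mathrm{VI}}(q)
    + \sum_{i} \log \mathbb{E}_{q(\omega)}\!\left[
        \mathbb{E}_{p(y \mid x_i, \omega)}\!\left[ u\!\left( h(x_i), y \right) \right]
      \right]
\]
```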

## Example

*(Figure: `pedestrian_gain`)*

## Getting Started

### Requirements

### Data

The data for the SegNet experiment comes from the repository: https://github.com/alexgkendall/SegNet-Tutorial.git
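
A minimal way to fetch it, assuming `git` is available (the CamVid images and labels used in the tutorial should be under the `CamVid/` directory of that repository):

```bash
# Clone the SegNet tutorial repository, which bundles the CamVid data
git clone https://github.com/alexgkendall/SegNet-Tutorial.git
```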

## Contact Information

Adam Cobb: acobb@robots.ox.ac.uk
