MOEGPLIB: A Library for Mixtures of GP Experts
==============================================
MOEGPLIB contains the code for the CoRL paper:
Trust your robots! Predictive uncertainty estimation of neural networks with scalable Gaussian Processes.
The intended features are the following.
1. Implementations of Mixtures of Gaussian Process Experts with the Neural Tangent Kernel.
2. Implementations of MC-dropout [1].
3. Implementations of approximate uncertainty propagation with MC-dropout [2].
4. Implementations of the Laplace approximation [3-4].
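
To give a feel for the second item, MC-dropout keeps dropout active at prediction time and summarizes several stochastic forward passes. The following is a minimal NumPy sketch on a toy fixed-weight network; all names, sizes, and the dropout rate are hypothetical illustrations, not MOEGPLIB's API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer regressor with fixed random weights
# (hypothetical; MOEGPLIB's MC-dropout operates on its PyTorch networks).
W1 = rng.standard_normal((1, 64))
W2 = rng.standard_normal((64, 1))

def stochastic_forward(x, p=0.5):
    """One forward pass with dropout kept ACTIVE at prediction time."""
    h = np.tanh(x @ W1)
    mask = rng.random(h.shape) > p   # Bernoulli(1 - p) dropout mask
    h = h * mask / (1.0 - p)         # inverted-dropout rescaling
    return h @ W2

x = np.linspace(-3.0, 3.0, 10).reshape(-1, 1)
T = 200                              # number of Monte Carlo samples
samples = np.stack([stochastic_forward(x) for _ in range(T)])
pred_mean = samples.mean(axis=0)     # predictive mean
pred_var = samples.var(axis=0)       # predictive variance (epistemic part)
```

The sample mean and variance over the `T` passes serve as the predictive mean and uncertainty estimate.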

Some of these methods are used as baselines, while others are the results of this project.
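
The Neural Tangent Kernel in feature 1 can be computed empirically as the Gram matrix of the network's parameter Jacobians, K = J Jᵀ. Below is a minimal hand-rolled sketch on a toy one-hidden-layer network; every name here is a hypothetical illustration, not the library's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network f(x) = w2 . tanh(x * W1); we derive the Jacobian by hand
# (hypothetical illustration, not MOEGPLIB's NTK code).
W1 = rng.standard_normal((1, 16)) * 0.5
w2 = rng.standard_normal(16) * 0.5

def forward_and_jacobian(x):
    """Return f(x) and df/dtheta for all parameters (W1, w2)."""
    h = np.tanh(x * W1[0])           # hidden activations, shape (16,)
    f = h @ w2
    dh = 1.0 - h**2                  # tanh derivative
    dW1 = w2 * dh * x                # df/dW1
    dw2 = h                          # df/dw2
    return f, np.concatenate([dW1, dw2])

X = np.linspace(-2.0, 2.0, 5)
J = np.stack([forward_and_jacobian(x)[1] for x in X])   # (5, 32)
ntk = J @ J.T                        # empirical NTK Gram matrix
```

The resulting matrix is symmetric positive semi-definite, so it can serve as a GP kernel evaluated at the network's current parameters.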
Motivation: Deep Neural Networks (DNNs) are wonderful tools for building predictors from large amounts of data, but they are notorious for delivering poor uncertainty estimates. Gaussian Processes (GPs), on the other hand, are theoretically elegant and known to provide well-calibrated uncertainties. However, GPs do not scale well to big data-sets and generally lack the predictive power of DNNs. We started this project to obtain a tool that has the best of both worlds. GP regression is analytically tractable, which makes its predictions just a series of matrix multiplications. We also note that popular models such as Bayesian Neural Networks and deep ensembles are hard to deploy on a robot, as they require combining multiple DNN predictions.
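
Concretely, with kernel matrices K (train-train), K* (test-train), and K** (test-test), the exact GP posterior is mean = K* (K + σ²I)⁻¹ y and covariance = K** − K* (K + σ²I)⁻¹ K*ᵀ. A minimal NumPy sketch on toy data (all names hypothetical):

```python
import numpy as np

def rbf(a, b, ell=1.0, sf=1.0):
    """Squared-exponential kernel between row-wise point sets a and b."""
    d = a[:, None, :] - b[None, :, :]
    return sf**2 * np.exp(-0.5 * np.sum(d**2, axis=-1) / ell**2)

# Toy 1-D regression data (hypothetical example, not the Snelson set).
rng = np.random.default_rng(0)
X = np.linspace(-3.0, 3.0, 20).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(20)
Xs = np.linspace(-4.0, 4.0, 50).reshape(-1, 1)

noise = 0.1**2
K = rbf(X, X) + noise * np.eye(len(X))   # K + sigma^2 I
Ks = rbf(Xs, X)
Kss = rbf(Xs, Xs)

# Cholesky-based solves: the prediction is a series of matrix products.
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
post_mean = Ks @ alpha                   # predictive mean
v = np.linalg.solve(L, Ks.T)
post_cov = Kss - v.T @ v                 # predictive covariance
```

The Cholesky factorization is the standard numerically stable way to apply (K + σ²I)⁻¹ without forming the inverse explicitly.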
Important Note: We are currently updating the code alongside another iteration of the method development. The plan is to have a major release by Fall 2022. Hence, this repository is under construction.
Installation Guide
==================
Installation using conda environments
-------------------------------------
To install the repository, navigate to the root directory and run:
$ conda env create -f environment.yml
To activate the newly created environment, run:
$ conda activate moegp
To check that the environment is set up correctly, run:
$ conda env list
The output should match the environment.yml file.
Due to hardware and driver incompatibilities, it might be necessary to install GPyTorch and PyTorch separately. Make sure to install the correct versions with a supported NVIDIA driver.
Manual installation
-------------------
An alternative is to install the required packages manually.
$ pip/conda install numpy scipy torchvision tqdm psutil scikit-learn gputil zarr pandas pyscaffold
$ pip install torch   # or: conda install pytorch
$ pip install gpytorch
$ python -m pip install 'git+https://github.com/facebookresearch/detectron2.git'
To generate figures, install the following additional dependencies:
$ pip/conda install matplotlib seaborn statsmodels colorcet
Getting Started
===============
To install the code as a Python package go to the root directory and run:
$ python setup.py develop
We set up this project using PyScaffold.
The Sphinx toolchain is already set up for the code documentation.
To access the HTML documentation:
$ python setup.py docs
$ cd docs/
$ make html
Then, inside the build/html directory, there will be a file index.html, which can be opened with any browser.
To access the PDF documentation:
$ cd docs/
$ make latexpdf
$ cd ../build/sphinx/latex/
$ make
This should generate a PDF file called user_guide.pdf.
Minimal Guide for Developers
============================
Overview of directory structure
.
+-- docs/
+-- src/
| +-- curvature/
| +-- moegplib/
| +-- baselines/
| +-- clustering/
| +-- datasets/
| +-- lightnni/
| +-- moegp/
| +-- networks/
| +-- utils/
+-- tools/
| +-- snelson/
| +-- trainer/
Overview of important directories
- src/moegplib/baselines: Utility code for baselines
- src/moegplib/datasets: Data loader implementations
- src/moegplib/networks: Neural network models
- src/moegplib/curvature/: The code of the Laplace baselines submodule (only available after a recursive pull)
- tools/: Tools to train and reproduce the results. There is one directory for each class of experiments.
On Reproducing the Results
==========================
Snelson Experiments
The commands to reproduce the Snelson experiments can be found below. They are to be executed from the root directory.
$ python tools/patchgp.py --ckp_dir PATH_TO_NETWORK_CHECKPOINTS --data_dir PATH_TO_DATA
$ python tools/localgp.py --ckp_dir PATH_TO_NETWORK_CHECKPOINTS --data_dir PATH_TO_DATA
patchgp.py runs the Snelson experiment with the patchwork prior, and localgp.py produces the results for pure local GPs.
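
For intuition only: one generic way to combine several local GP experts (not necessarily the combination rule used in this library or the paper) is moment matching of the resulting Gaussian mixture, where gating weights blend the experts' means and variances. All numbers below are hypothetical:

```python
import numpy as np

# Two hypothetical local GP experts evaluated at two test points.
mu = np.array([[0.1, 0.9], [0.2, 1.1]])     # (experts, test points) means
var = np.array([[0.05, 0.2], [0.4, 0.1]])   # per-expert predictive variances
w = np.array([[0.8, 0.3], [0.2, 0.7]])      # gating weights; columns sum to 1

# Moment-matched mixture: mean and variance of the Gaussian mixture.
mix_mean = (w * mu).sum(axis=0)
mix_var = (w * (var + mu**2)).sum(axis=0) - mix_mean**2
```

The mixture variance is the weighted expert variance plus the spread of the expert means, so disagreement between experts inflates the reported uncertainty.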
Further Reading
===============
We recommend the literature cited above for further reading.