# AugmentedGaussianProcesses.jl

A Gaussian process package based on data augmentation, sparsity and natural gradients.

AugmentedGaussianProcesses.jl (previously OMGP) is a Julia package in development for data-augmented sparse Gaussian processes. It contains a collection of models for Gaussian and non-Gaussian likelihoods, which are transformed via data augmentation into conditionally conjugate likelihoods, allowing for extremely fast inference via block coordinate updates.

## Package models

### Two GP classification likelihoods

### Three GP regression likelihoods

*(regression plot)*

### More models in development

- MultiClass: a multiclass classifier relying on a modified version of softmax
- Poisson: for point process estimation
- Heteroscedastic: non-stationary noise
- Probit: a classifier with a Bernoulli likelihood and the probit link
- Online: allows all algorithms to also work online
- Numerical solving: allows a more general class of likelihoods by solving the updates numerically (like GPflow)

## Install the package

The package requires Julia 1.0 or higher. In the Julia REPL, press `]` and type `add AugmentedGaussianProcesses`; this installs the package and all its dependencies.
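Equivalently, the installation can be scripted with Julia's standard `Pkg` API instead of the REPL's `]` mode:

```julia
using Pkg
Pkg.add("AugmentedGaussianProcesses")  # installs the package and all its dependencies
```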

## Use the package

Complete documentation is currently being written; for now, here is a very basic example, where `X_train` is an `N × D` matrix (`N` training points in `D` dimensions) and `Y_train` is a vector of outputs (or a matrix for independent multi-outputs).

```julia
using AugmentedGaussianProcesses
model = SVGP(X_train, Y_train, RBFKernel(1.0), LogisticLikelihood(), AnalyticSVI(100), 64)
train!(model, iterations=100)
Y_predic = predict_y(model, X_test)      # for getting the labels directly
Y_predic_prob = proba_y(model, X_test)   # for getting the likelihood of predicting class 1
```
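For regression, a hypothetical sketch along the same lines, assuming a full (non-sparse) `VGP` model with a `GaussianLikelihood` and analytic variational inference (`AnalyticVI`) — names assumed from the same API family as the example above, so check the documentation before relying on them:

```julia
using AugmentedGaussianProcesses
# Assumed API, mirroring the SVGP classification example: a variational GP
# regressor with a Gaussian likelihood, trained by analytic variational inference.
model = VGP(X_train, Y_train, RBFKernel(1.0), GaussianLikelihood(), AnalyticVI())
train!(model, iterations=50)
Y_mean = predict_y(model, X_test)  # posterior predictive mean at the test points
```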

Both documentation and examples are available.

## References

Check out my website for more news.

"Gaussian Processes for Machine Learning" by Carl Edward Rasmussen and Christopher K. I. Williams

ECML '17: "Bayesian Nonlinear Support Vector Machines for Big Data" by Florian Wenzel, Théo Galy-Fajou, Matthäus Deutsch and Marius Kloft. https://arxiv.org/abs/1707.05532

AAAI '19: "Efficient Gaussian Process Classification Using Pólya-Gamma Variables" by Florian Wenzel, Théo Galy-Fajou, Christian Donner, Marius Kloft and Manfred Opper. https://arxiv.org/abs/1802.06383

UAI '13: "Gaussian Processes for Big Data" by James Hensman, Nicolò Fusi and Neil D. Lawrence. https://arxiv.org/abs/1309.6835

JMLR '11: "Robust Gaussian Process Regression with a Student-t Likelihood" by Pasi Jylänki, Jarno Vanhatalo and Aki Vehtari. http://www.jmlr.org/papers/v12/jylanki11a.html
