VariationalApproxExample

An example of variational approximation for Gaussian process classification. To run this code, download it into a directory and run the following in MATLAB:

>> addpath(genpath(pwd))
>> example

We generate synthetic binary classification data (yi, Xi): we assume a GP prior with zero mean and a linear kernel, sample latent function values from it, and draw the binary labels through a logit likelihood.
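The data-generation step above can be sketched as follows. This is a Python/NumPy analogue of the repo's MATLAB code, not the actual implementation; the sizes and the jitter value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem size (not taken from the repo).
n, d = 50, 3
X = rng.standard_normal((n, d))

# Linear kernel: K = X X^T, the covariance of the zero-mean GP prior.
# A small jitter on the diagonal keeps K positive definite.
K = X @ X.T + 1e-6 * np.eye(n)

# Sample latent function values from the GP prior N(0, K).
f = rng.multivariate_normal(np.zeros(n), K)

# Logit likelihood: p(y=1 | f) = exp(f) / (1 + exp(f)), with y in {0, 1}.
p = 1.0 / (1.0 + np.exp(-f))
y = (rng.random(n) < p).astype(int)
```

The labels are binary because each yi is a Bernoulli draw with success probability given by the logistic sigmoid of the latent fi.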

As a simple example of variational approximation, we fit an approximate Gaussian posterior N(m,V) with the restriction that the diagonal of V is 1. Our goal is to find m and V.

We will use the KL method of Kuss and Rasmussen, 2005 and solve the following optimization problem to find m:

max_m f(m) = -(m - mu)' Omega (m - mu)/2 + sum_i fi(yi, mi, 1)

where Omega is the inverse of the GP prior covariance matrix, mu is the prior mean, and fi(yi, mi, 1) = E[log p(yi|xi)] with the expectation taken over N(xi|mi, 1), for the logit likelihood p(yi|xi) = exp(yi*xi)/(1+exp(xi)).
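The expectation fi has no closed form for the logit likelihood, but it can be approximated numerically. A minimal Python sketch using Gauss-Hermite quadrature (one possible choice; the repo's actual approximation follows the paper cited below):

```python
import numpy as np

def fi(y, m, nodes, weights):
    """Approximate E[log p(y|x)] under N(x|m, 1) for the logit likelihood,
    using Gauss-Hermite quadrature: E[h(x)] ~ sum_k w_k h(m + sqrt(2) t_k) / sqrt(pi)."""
    x = m + np.sqrt(2.0) * nodes                # quadrature points for N(m, 1)
    loglik = y * x - np.logaddexp(0.0, x)       # log p(y|x), numerically stable
    return np.dot(weights, loglik) / np.sqrt(np.pi)

nodes, weights = np.polynomial.hermite.hermgauss(30)
val = fi(1, 0.0, nodes, weights)   # E[log sigmoid(x)] for x ~ N(0, 1)
```

Here `np.logaddexp(0.0, x)` computes log(1 + exp(x)) without overflow, so `y*x - log(1+exp(x))` matches the logit log-likelihood for y in {0, 1}.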

We use L-BFGS as implemented in minFunc by Mark Schmidt: http://www.di.ens.fr/~mschmidt/Software/minFunc.html
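As an analogue of the MATLAB minFunc call, the whole optimization can be sketched in Python with SciPy's L-BFGS-B solver (minimizing -f(m)). The toy data below is illustrative, not the repo's; gradients are left to finite differences for brevity.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy problem (illustrative sizes and values only).
n = 20
X = rng.standard_normal((n, 2))
K = X @ X.T + 1e-3 * np.eye(n)   # linear-kernel GP covariance with jitter
Omega = np.linalg.inv(K)         # precision matrix in the objective
mu = np.zeros(n)                 # zero-mean GP prior
y = rng.integers(0, 2, size=n)   # binary labels in {0, 1}

nodes, weights = np.polynomial.hermite.hermgauss(20)

def neg_f(m):
    # Negative of f(m) = -(m-mu)' Omega (m-mu)/2 + sum_i E[log p(yi|xi)],
    # with the expectations computed by Gauss-Hermite quadrature.
    diff = m - mu
    quad = 0.5 * diff @ Omega @ diff
    x = m[:, None] + np.sqrt(2.0) * nodes           # (n, 20) quadrature points
    loglik = y[:, None] * x - np.logaddexp(0.0, x)  # log p(yi|x) at each node
    ell = (loglik @ weights) / np.sqrt(np.pi)       # E[log p(yi|xi)] per i
    return quad - ell.sum()

res = minimize(neg_f, np.zeros(n), method="L-BFGS-B")
m_opt = res.x   # variational mean m
```

The objective is smooth and concave in m (its negative is convex), so a quasi-Newton method like L-BFGS converges reliably here.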

The implementation of fi for the logit likelihood is based on the following ICML paper (also see its Appendix).
