Code repo for "Function-Space Distributions over Kernels"

Functional Kernel Learning (FKL)

This repository contains a GPyTorch implementation of functional kernel learning (FKL) from the paper,

Function-Space Distributions over Kernels

by Gregory Benton, Wesley Maddox, Jayson Salkey, Julio Albinati, and Andrew Gordon Wilson.

Please cite our work if you find it useful:

@inproceedings{benton2019function,
        title = {Function-space {Distributions} over {Kernels}},
        language = {en},
        booktitle = {Advances in {Neural} {Information} {Processing} {Systems}},
        author = {Benton, Greg and Salkey, Jayson and Maddox, Wesley and Albinati, Julio and Wilson, Andrew Gordon},
        year = {2019},
        pages = {8},
}


Functional kernel learning (FKL) extends standard Gaussian process regression: the data are modelled with a standard GP regression set-up, while the kernel itself is modelled non-parametrically. To do so, FKL uses Bochner's Theorem to parameterize the kernel as a deterministic function of its spectral density, and models the spectral density as a latent Gaussian process. Inference then alternates between elliptical slice sampling updates of the latent GP and gradient-based updates of the GP regression hyper-parameters.
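The Bochner's-Theorem map described above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the repo's code: the function names, the frequency grid, and the example density are all my choices. On a discrete grid of frequencies, the kernel is a deterministic function of the (non-negative) spectral density values:

```python
import numpy as np

# Hedged sketch: by Bochner's theorem, a stationary kernel is the
# Fourier transform of a non-negative spectral density S(omega).
# Discretizing frequency turns the kernel into a deterministic
# function of the density values -- the map FKL exploits.
def kernel_from_spectral_density(tau, omega, log_density):
    S = np.exp(log_density)          # the latent GP models log S(omega)
    W = S / S.sum()                  # discrete spectral weights
    return np.cos(np.outer(tau, omega)) @ W   # k(tau), normalized so k(0) = 1

omega = np.linspace(0.0, 8.0, 200)           # frequency grid
tau = np.linspace(0.0, 3.0, 50)              # lags at which to evaluate k
log_S = -0.5 * ((omega - 2.0) / 0.5) ** 2    # a Gaussian bump at omega = 2
k = kernel_from_spectral_density(tau, omega, log_S)
```

With the density concentrated near omega = 2, the resulting kernel is a decaying cosine oscillating at roughly that frequency.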

(Figures: draws from the FKL prior in function space and kernel space, and from the FKL posterior in function space and kernel space.)
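The elliptical-slice-sampling half of the alternating scheme is a standard algorithm (Murray et al., 2010); a generic single-step version can be sketched as follows. This is my sketch, not the repo's implementation, and the toy likelihood at the bottom is purely illustrative:

```python
import numpy as np

# Hedged sketch of one elliptical slice sampling step (Murray et al., 2010).
# f is the current latent draw, nu a fresh draw from the N(0, Sigma) prior,
# and log_lik the log-likelihood of the latent function.
def elliptical_slice_step(f, nu, log_lik, rng):
    log_y = log_lik(f) + np.log(rng.random())       # slice threshold
    theta = rng.uniform(0.0, 2.0 * np.pi)
    lo, hi = theta - 2.0 * np.pi, theta
    while True:
        # proposals stay on the ellipse defined by f and nu,
        # so they always remain valid draws under the GP prior
        f_new = f * np.cos(theta) + nu * np.sin(theta)
        if log_lik(f_new) > log_y:
            return f_new                            # accept
        # otherwise shrink the bracket toward theta = 0 and retry
        if theta < 0.0:
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)

rng = np.random.default_rng(0)
f = np.zeros(5)
nu = rng.standard_normal(5)                         # prior draw (Sigma = I)
log_lik = lambda g: -0.5 * np.sum((g - 1.0) ** 2)   # toy Gaussian likelihood
f_next = elliptical_slice_step(f, nu, log_lik, rng)
```

Because shrinking the bracket always brings theta back toward the current state, the step terminates with probability one and needs no step-size tuning, which is why it pairs well with the gradient updates on the GP hyper-parameters.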


To install the package, run python setup.py develop. Dependencies are listed in requirements.txt: broadly, recent versions of PyTorch (>=1.0.0) and GPyTorch (>=0.3.2), plus standard scipy/numpy builds.

Please note that the codebase uses a GPU if it finds one, and defaults to double precision everywhere (even on the GPU).
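Those two conventions amount to a short bit of set-up code; a sketch of the equivalent configuration (my code, not the repo's):

```python
import torch

# Sketch of the conventions described above: double precision as the
# global default, and the GPU when one is present.
torch.set_default_dtype(torch.float64)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
x = torch.linspace(0.0, 1.0, 100, device=device)  # float64 via the default dtype
```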

One Dimensional Regression

This is in the exps/ directory.

python --data=SM --iters=1 --ess_iters=200 --nx=200 --omega_max=2 --optim_iters=8 #spectral mixture
python --data=sinc --iters=5 --ess_iters=22 --optim_iters=5 --omega_max=1.3 --nx=100 --mean=LogRBF --nomg=75 #sinc
python --data=QP --iters=5 --ess_iters=100 --nx=150 --omega_max=5 --period=1.7 --optim_iters=10 #quasi-periodic
python --iters=5 --ess_iters=100 --optim_iters=10 --omega_max=8 #airline

Multi-Dimensional Regression (with Product Kernels)

Multi-dimensional regression tasks can be found in the exps_multi_input_dim/ folder.

To replicate our experiments, please run the script provided there, which will run on all datasets in Table 1.
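For reference, the product-kernel construction multiplies one stationary 1-D kernel per input dimension. The sketch below uses an RBF as a stand-in for each learned spectral kernel; the function names and hyper-parameters are my choices, not the repo's:

```python
import numpy as np

# Hedged sketch of a product kernel over input dimensions:
# k(x, z) = prod_d k_d(x_d, z_d), each k_d a stationary 1-D kernel.
def rbf_1d(a, b, lengthscale):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / lengthscale) ** 2)

def product_kernel(X, Z, lengthscales):
    K = np.ones((X.shape[0], Z.shape[0]))
    for d, ell in enumerate(lengthscales):
        K *= rbf_1d(X[:, d], Z[:, d], ell)   # multiply in dimension d
    return K

X = np.random.default_rng(0).random((6, 3))  # 6 points in 3 dimensions
K = product_kernel(X, X, [0.5, 1.0, 2.0])
```

Since each factor is a valid kernel, the product is positive semi-definite, and each dimension gets its own independently learned spectral density.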

Multi-Task Extrapolation

This is found in the prcp-testing/ and fx/ folders.

The large-scale precipitation dataset can be found at: (hopefully anonymous). This is a pre-processed version. Drop it into the prcp-testing/data-management/ folder and then run the pre-processing script before training.

Training command for Precipitation Data

python --iters=10 --ess_iters=10 --optim_iters=20 --save=TRUE #if saving models

Note that this will save all of the plots to: plots/run108_0523_final/

Training command for FX dataset

python --dataset=fx


This implementation relies on PyTorch and GPyTorch for automatic differentiation and the modelling set-up.

We additionally compared to standard GPyTorch GP models (see example).
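The exact-GP baseline can be written out in a few lines of plain NumPy rather than GPyTorch; this is an illustrative sketch of such a baseline, with the kernel and hyper-parameters being my choices:

```python
import numpy as np

# Hedged sketch of a standard exact-GP regression baseline:
# fixed RBF kernel, Cholesky-based posterior mean and variance.
def rbf(A, B, ell=0.2):
    return np.exp(-0.5 * ((A[:, None] - B[None, :]) / ell) ** 2)

def gp_posterior(x, y, xs, noise_var=1e-2):
    K = rbf(x, x) + noise_var * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf(x, xs)
    mean = Ks.T @ alpha                 # posterior mean at test inputs
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v ** 2, axis=0)  # posterior variance (k(0) = 1)
    return mean, var

x = np.linspace(0.0, 1.0, 20)
y = np.sin(2.0 * np.pi * x)
xs = np.linspace(0.0, 1.0, 50)
mean, var = gp_posterior(x, y, xs)
```

The contrast with FKL is that here the kernel form is fixed up front, whereas FKL infers a posterior over kernels.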

Finally, the bnse file contains a clone of Felipe Tobar's Bayesian nonparametric spectral estimation code from here.
