Adding sparse finite GP type for integration with Turing #134

Closed

ElOceanografo opened this issue Sep 29, 2020 · 3 comments

Comments

@ElOceanografo
Contributor

A continuation and focusing of this discussion on Discourse. Defining the following struct and methods allows a sparse GP to be used in a Turing model:

using Distributions, Random, Stheno

# Wrap two FiniteGPs: the process at the observation inputs and the
# process at the inducing-point inputs.
struct SparseFiniteGP{T1<:Stheno.FiniteGP, T2<:Stheno.FiniteGP} <: Distributions.AbstractMvNormal
    fobs::T1
    finducing::T2
end

# Evaluate the ELBO rather than the exact log marginal likelihood, so Turing
# sees a cheap lower-bound surrogate for logpdf.
function Stheno.logpdf(f::SparseFiniteGP, y::AbstractVector{<:Real})
    return elbo(f.fobs, y, f.finducing)
end

Base.length(f::SparseFiniteGP) = length(f.fobs)

# Draw a sample from the dense observation process, writing into x in place.
function Distributions._rand!(rng::Random.AbstractRNG, f::SparseFiniteGP, x::AbstractVector{<:Real})
    return x .= rand(rng, f.fobs)
end

An example model is shown below. Using the SparseFiniteGP speeds up inference dramatically for a simulated dataset with ~1000 observations.

@model function example_model_sparse(x, z, xu)
    f = GP(Matern32(), GPC())
    # Observation process at x, inducing process at xu, both with jitter 1e-6.
    fsparse = SparseFiniteGP(f(x, 1e-6), f(xu, 1e-6))
    y ~ fsparse
    λ = exp.(y)  # log link: the latent GP sets the Poisson rate
    z .~ Poisson.(λ)
end

chn_sparse = sample(example_model_sparse(x, z, xu), NUTS(), 200)
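
For reference, the inputs might be simulated along these lines (hypothetical; the actual x, z, and xu used for the timing above aren't shown in this issue, and this sketch assumes Stheno and Distributions are loaded):

using Distributions, Random
Random.seed!(1)
x = sort(10 .* rand(1000))             # ~1000 observation locations
xu = collect(range(0, 10; length=20))  # a small set of inducing-point locations
ftrue = GP(Matern32(), GPC())
z = rand.(Poisson.(exp.(rand(ftrue(x, 1e-6)))))  # Poisson counts driven by a latent GP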

Some questions to discuss while I put together a PR...

  • Any thoughts on the interface or structure of the types?
  • Besides rand and logpdf, what other methods does this new type need?
  • Where should this code go? In src/abstract_gp.jl?
  • What tests are needed? In particular, what comparisons should there be between the sparse and dense GPs?
@willtebbutt
Member

Thanks for opening this.

Any thoughts on the interface or structure of the types?

The above looks really good; I prefer this design to the one I proposed on Discourse.

Besides rand and logpdf, what other methods does this new type need?

Yes: there are a few methods, such as mean and cov, that should simply fall back to fobs. Basically, look for the methods in src/abstract_gp.jl that accept FiniteGPs and implement them for SparseFiniteGP :)
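
A minimal sketch of such fallbacks, assuming mean and cov for FiniteGPs are the Statistics functions extended in src/abstract_gp.jl:

using Statistics

# Delegate moment computations to the dense FiniteGP over the observations.
Statistics.mean(f::SparseFiniteGP) = mean(f.fobs)
Statistics.cov(f::SparseFiniteGP) = cov(f.fobs)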

Where should this code go? In src/abstract_gp.jl?

We'll have quite a lot of approximate-inference stuff once this has been added, so maybe it makes sense to create a new file called src/approximate_inference.jl or something, put this there, and move the elbo-related stuff from src/abstract_gp.jl into that file?

What tests are needed--in particular, what comparisons should there be between the sparse and dense GPs?

I would just check that mean, cov, and rand (with identical seeds) produce the same thing in both cases, and that elbo with regular FiniteGPs and logpdf with SparseFiniteGPs produce the same answer.
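
A sketch of those checks, assuming hypothetical test inputs x, y, and xu (none of these names come from the package):

using Test, Random

f = GP(Matern32(), GPC())
fobs = f(x, 1e-6)
finducing = f(xu, 1e-6)
fsparse = SparseFiniteGP(fobs, finducing)

# Moments and seeded samples should match the dense FiniteGP exactly.
@test mean(fsparse) == mean(fobs)
@test cov(fsparse) == cov(fobs)
@test rand(MersenneTwister(123), fsparse) == rand(MersenneTwister(123), fobs)

# logpdf of the sparse GP is defined as the ELBO of the dense one.
@test logpdf(fsparse, y) ≈ elbo(fobs, y, finducing)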

@ElOceanografo
Contributor Author

Ok, that all sounds good. PR on the way.

@ElOceanografo
Contributor Author

Closed by #136!
