ENH: Adding INLA to PyMC step 1: get a Laplace Approximation #6992

Open
theorashid opened this issue Nov 6, 2023 · 2 comments
Labels: feature request, hackathon (Suitable for hackathon)

Comments

@theorashid (Contributor)


Context for the issue:

cc: @bwengals @athowes @junpenglao

Previous closed LA/INLA issues: #4847 and #3242.

There are three steps to getting something R-INLA-ish in PyMC:

  1. Laplace approximation
  2. embedded (marginal) Laplace approximation, with other inference on the hyperparameters (probably HMC, as the Stan crew looked into; R-INLA uses numerical integration and TMB uses empirical Bayes)
  3. sparse Cholesky in pytensor to get R-INLA speed (see Dan Simpson's blog; maybe pymc can interface with CHOLMOD)

The first step is getting a Laplace approximation. This is great for models like certain GLMs or spline models, where many of the conditional posteriors are close to Gaussian. It could be bundled into the ADVI interface, as numpyro does. It looks like this PR got fairly close in pymc3.
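For concreteness, a minimal sketch of what step 1 amounts to on a toy model, assuming the `pm.find_MAP` and `pm.find_hessian` helpers as PyMC currently exposes them (the model and variable names here are purely illustrative, not a proposed API):

```python
import numpy as np
import pymc as pm

# toy data: Gaussian observations with unknown mean (illustrative only)
rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=0.5, size=100)

with pm.Model() as model:
    mu = pm.Normal("mu", 0.0, 10.0)
    pm.Normal("obs", mu=mu, sigma=0.5, observed=y)

    # mode of the (unnormalised) posterior
    map_point = pm.find_MAP()

    # Hessian of the negative log posterior at the mode;
    # its inverse is the covariance of the Gaussian (Laplace) approximation
    H = pm.find_hessian(map_point, vars=[mu])

cov = np.linalg.inv(H)
print(f"Laplace approximation: N({float(map_point['mu']):.3f}, {cov[0, 0]:.5f})")
```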

Hopefully it won't be too difficult for someone who knows the PyMC ADVI interface well. It's pretty fresh though, so it should probably go into pymc-experimental first. If anyone wants to attempt parts 2 or 3, that should definitely live in pymc-experimental; the sketch below illustrates the core computation of part 2.
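A hedged sketch of that core computation, in plain NumPy/SciPy rather than any PyMC API: for fixed hyperparameters, optimise over the latent field and use a Laplace approximation to get an approximate log marginal likelihood, which an outer HMC sampler or optimiser would then target. The function name is made up for illustration, and a real implementation would get exact Hessians from pytensor rather than relying on BFGS's estimate:

```python
import numpy as np
from scipy.optimize import minimize

def laplace_log_marginal(neg_log_joint, x0):
    """Approximate log p(y | theta) by a Laplace approximation over the
    latent field x:  -f(x_hat) + (n/2) log(2 pi) - (1/2) log|H|,
    where f = neg_log_joint, x_hat its minimiser, and H the Hessian at x_hat."""
    res = minimize(neg_log_joint, x0)            # inner optimisation over x
    H_inv = np.atleast_2d(res.hess_inv)          # BFGS returns an inverse-Hessian estimate
    n = np.atleast_1d(res.x).size
    _, logdet_H_inv = np.linalg.slogdet(H_inv)   # log|H^{-1}| = -log|H|
    return -res.fun + 0.5 * n * np.log(2 * np.pi) + 0.5 * logdet_H_inv

# sanity check: for f(x) = 0.5 * ||x||^2 the Gaussian integral is exact,
# so this should print (n/2) log(2 pi) ~= 1.8379 for n = 2
print(laplace_log_marginal(lambda x: 0.5 * np.sum(x**2), np.zeros(2)))
```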

Some resources:

@bwengals (Contributor) commented Nov 7, 2023

I'm really interested in the LA + HMC option in particular; it seems like the best “fit” within the rest of PyMC. It would be amazing to be able to specify which parts of the model you want to approximate with LA. Though a full INLA-clone type project would be super cool too. Would INLA benefit from autodiff from pytensor (I don't know if it currently relies on AD)?

There are plenty of nuances with the HMC approach.

RE the sparse stuff, this project adds a banded-matrix Cholesky and other related ops with gradients to TensorFlow; that might be another place to start?
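For reference, interfacing with CHOLMOD from Python already works through the scikit-sparse bindings (my suggestion here, not something established in this thread); the missing piece for step 3 would be wrapping something like this as a pytensor Op with gradients:

```python
import numpy as np
import scipy.sparse as sp
from sksparse.cholmod import cholesky  # scikit-sparse, wraps SuiteSparse/CHOLMOD

# sparse SPD precision matrix, e.g. a random-walk (GMRF) prior plus a nugget
n = 1000
Q = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
Q = Q + 0.1 * sp.eye(n, format="csc")

factor = cholesky(Q)       # sparse Cholesky with a fill-reducing permutation
x = factor(np.ones(n))     # solve Q x = b
logdet = factor.logdet()   # log-determinant, the term a Laplace approximation needs
print(x[:3], logdet)
```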

@athowes commented Nov 15, 2023

“the LA + HMC option in particular”

If useful to see, the tmbstan R package (described in this paper) implements the HMC-with-LA option via the function tmbstan::tmbstan(..., laplace = TRUE).

“don't know if it currently relies on AD”

R-INLA currently doesn't use AD. See this discussion thread on the R-INLA user Google group.
