
Implement blocked Gibbs sampling for Bayesian inference in linear Gaussian SSM using JSL #677

Closed
murphyk opened this issue Feb 16, 2022 · 10 comments

murphyk commented Feb 16, 2022

Implement blocked Gibbs sampling for Bayesian inference in a linear Gaussian SSM.
In the "E step", use the JAX SSM library (JSL) for forwards-filtering, backwards-sampling (FFBS) of the latent state trajectory.
In the "M step", sample the parameters from their posteriors, assuming conjugate priors.

Some details can be found in this paper:
A. Wills, T. B. Schön, F. Lindsten, and B. Ninness, “Estimation of Linear Systems using a Gibbs Sampler,” IFAC Proceedings Volumes, vol. 45, no. 16, pp. 203–208, Jul. 2012, doi: 10.3182/20120711-3-be-2027.00297. [Online]. Available: https://linkinghub.elsevier.com/retrieve/pii/S1474667015379520
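
A minimal sketch of the intended sampler structure (the `ffbs_sample` and `sample_params_posterior` helpers are placeholders, not actual JSL functions):

```python
import jax

def blocked_gibbs(key, observations, init_params, num_iters=1000):
    """Alternate between sampling the latent states and the parameters."""
    params = init_params
    param_samples = []
    for _ in range(num_iters):
        key, key_states, key_params = jax.random.split(key, 3)
        # "E step": draw a full state trajectory by forwards-filtering,
        # backwards-sampling, conditioned on the current parameters.
        states = ffbs_sample(key_states, params, observations)               # placeholder
        # "M step": draw new parameters from their conjugate conditional
        # posteriors given the sampled trajectory and the observations.
        params = sample_params_posterior(key_params, states, observations)   # placeholder
        param_samples.append(params)
    return param_samples
```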

@calvintanama

Hi, can I work on this one?

murphyk commented Mar 10, 2022

Hi @calvintanama. Sure, please feel free to work on this.
See Sec. 12.3.7 of vol. 2 for some details.
Note: you need to solve #678 first!

murphyk commented Mar 10, 2022

See the PySSM library for relevant code.
Please try to replicate some of their examples.

@calvintanama

Hi @murphyk. Since you mentioned that I need to solve #678 first, should I make separate PRs for #678 and this issue, and work on #678 first?

murphyk commented Mar 11, 2022 via email

murphyk commented Mar 11, 2022

Concretely, the goal should be to replicate the functionality and examples of this paper:

C. Strickland, R. Burdett, K. Mengersen, and R. Denham, “PySSM: A Python Module for Bayesian Inference of Linear Gaussian State Space Models,” J. Stat. Softw., vol. 57, pp. 1–37, Apr. 2014, doi: 10.18637/jss.v057.i06. [Online]. Available: https://www.jstatsoft.org/article/view/v057i06. [Accessed: Mar. 11, 2022]

murphyk commented Mar 16, 2022

See also

@misc{helske2021bssm,
  title         = {bssm: Bayesian Inference of Non-linear and Non-Gaussian State Space Models in R},
  author        = {Jouni Helske and Matti Vihola},
  year          = {2021},
  url           = {https://arxiv.org/abs/2101.08492},
  eprint        = {2101.08492},
  archivePrefix = {arXiv},
  primaryClass  = {stat.CO}
}

@calvintanama

Hi @murphyk, thanks for the literature! For issue #678 I tried to implement the Durbin-Koopman sampler from [1] (https://www.jstor.org/stable/4140605), using the modification of Jarociński [2] (https://www.sciencedirect.com/science/article/pii/S0167947315001176?via%3Dihub#tb000005). I assumed that the matrix R_t in [1] is the identity matrix, because the current LDS implementation does not require specifying one.
I tried to test my implementation against the LGSSM from TensorFlow Probability, as you suggested, but somehow I cannot construct an LGSSM object equivalent to the LDS (the .sample() method returns different numbers). Here is my code: https://colab.research.google.com/drive/1NgA8-7qDviN-ID-PtWqL0jIPjNKBdhXQ?usp=sharing
My questions:

  1. Am I going in the expected direction?
  2. How should I handle the PRNG? (See the sketch after this list.)
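
A minimal sketch of the usual JAX PRNG pattern, assuming a hypothetical `simulation_smoother(key, params, observations)` helper:

```python
import jax

def draw_trajectories(key, params, observations, num_draws):
    # Split the top-level key once per draw so that repeated runs with the
    # same key are reproducible and the draws are mutually independent.
    keys = jax.random.split(key, num_draws)
    # `simulation_smoother` is a placeholder for the Durbin-Koopman sampler.
    return [simulation_smoother(k, params, observations) for k in keys]
```

As for the comparison with TFP: exact `.sample()` draws generally will not match across libraries even with the same seed, since the underlying bit generators and sampling order differ; comparing log-likelihoods or sample moments is a more robust equivalence check.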

@xinglong-li

I've completed issue #678 and I'd like to continue work on this series of tasks ;)

murphyk commented Apr 22, 2022

Hi @xinglong-li. Yes, please go for it. If you use conjugate priors, it should be easy to do (as you know :)
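
For illustration, a minimal sketch (not JSL code) of one such conjugate update: sampling the dynamics noise covariance Q from its inverse-Wishart conditional, given a sampled state trajectory and a fixed dynamics matrix A, under an assumed IW(nu0, Psi0) prior:

```python
import numpy as np
from scipy.stats import invwishart

def sample_Q(states, A, nu0, Psi0, rng):
    """One blocked-Gibbs update for Q | states, A with an IW(nu0, Psi0) prior.

    states: (T, D) array of sampled latents x_1, ..., x_T.
    """
    resid = states[1:] - states[:-1] @ A.T      # x_{t+1} - A x_t, shape (T-1, D)
    df = nu0 + resid.shape[0]                   # posterior degrees of freedom
    scale = Psi0 + resid.T @ resid              # posterior scale matrix
    return invwishart(df=df, scale=scale).rvs(random_state=rng)
```

The (A, Q) and (C, R) blocks can be updated jointly with matrix-normal-inverse-Wishart priors in the same spirit, as in the Wills et al. paper above.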
