
[WIP] Implement elliptical slice sampler #1755

Merged 17 commits into pymc-devs:master on Feb 17, 2017

Conversation

jonathanhfriedman
Contributor

Here's the beginning of an elliptical slice sampler implementation (#1643), along with a quick demonstration in demo-ess.ipynb that it works on the very simple problem of a Gaussian prior and Gaussian likelihood (the "Gaussian regression" example in the original paper). I was hoping that some more experienced people could provide advice.

Currently you pass the prior covariance to the sampler and it samples from a zero-mean multivariate normal directly, but it might be nicer/cleaner to pass in the Cholesky decomposition and sample by multiplying it with a standard multivariate normal draw. Maybe we should support either option. I'd also still like to work on supporting priors with nonzero means.
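For concreteness, here is a minimal NumPy sketch (not the PR's code) of the two options being discussed: drawing the auxiliary variable directly from N(0, Sigma), versus factoring Sigma = L L^T once and multiplying the Cholesky factor with a standard normal draw. Both produce the same distribution; the second avoids re-factorizing the covariance on every draw.

```python
import numpy as np

Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])        # example prior covariance

# Option 1: sample the auxiliary variable nu ~ N(0, Sigma) directly;
# np.random.multivariate_normal factorizes Sigma internally on each call.
nu_direct = np.random.multivariate_normal(np.zeros(2), Sigma)

# Option 2: factor once, then reuse L for every draw: nu = L z, z ~ N(0, I).
L = np.linalg.cholesky(Sigma)
nu_chol = L.dot(np.random.randn(2))
```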

@jonathanhfriedman changed the title from "WIP: Implement elliptical slice sampler" to "[WIP] Implement elliptical slice sampler" on Feb 5, 2017
@springcoil
Contributor

springcoil commented Feb 5, 2017 via email

```
    model : PyMC Model
        Optional model for sampling step. Defaults to None (taken from context).
    """
    default_blocked = False
```
Member
As this operates on the full joint distribution, you need to set this to True. Otherwise, a separate ESS step will be used for each RV.
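To make the reviewer's point concrete, here is a tiny, self-contained illustration (not PyMC3's actual machinery; the class names are made up) of how a default_blocked flag typically decides whether one step instance updates all variables jointly or each variable gets its own step:

```python
class ToyStep:
    default_blocked = False            # what the diff above currently has

    def __init__(self, variables, blocked=None):
        blocked = self.default_blocked if blocked is None else blocked
        # blocked=True: one step updates the whole vector of variables jointly;
        # blocked=False: every variable is given its own independent step.
        self.groups = [list(variables)] if blocked else [[v] for v in variables]

class ToyESS(ToyStep):
    default_blocked = True             # the change the reviewer is asking for

print(ToyStep(["f1", "f2"]).groups)    # [['f1'], ['f2']]
print(ToyESS(["f1", "f2"]).groups)     # [['f1', 'f2']]
```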

@jonathanhfriedman
Contributor Author

Thanks for the feedback @springcoil and @twiecki. As I understand it, the selling point of elliptical slice sampling is that it is about as fast as vanilla slice sampling, mixes well no matter the prior covariance, and requires no parameter tuning. The downside is that it requires a multivariate normal prior.
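For readers who haven't seen the method, here is a minimal NumPy sketch of a single ESS transition, following the algorithm in the Murray, Adams & MacKay (2010) paper referenced in the docstring below; it is illustrative only and not the code in this PR.

```python
import numpy as np

def ess_step(f, chol, log_lik, rng=np.random):
    """One elliptical slice sampling transition.

    f       : current state, assumed to have a zero-mean Gaussian prior
    chol    : Cholesky factor L of the prior covariance (Sigma = L L^T)
    log_lik : function returning the log-likelihood of a state
    """
    f = np.asarray(f)

    # Auxiliary draw from the prior defines the ellipse through f.
    nu = chol.dot(rng.standard_normal(len(f)))

    # Log-likelihood threshold that defines the slice.
    log_y = log_lik(f) + np.log(rng.uniform())

    # Initial proposal angle and shrinking bracket.
    theta = rng.uniform(0.0, 2.0 * np.pi)
    theta_min, theta_max = theta - 2.0 * np.pi, theta

    while True:
        f_prop = f * np.cos(theta) + nu * np.sin(theta)
        if log_lik(f_prop) > log_y:
            return f_prop              # accepted: new state on the slice
        # Shrink the bracket towards theta = 0 and try again.
        if theta < 0.0:
            theta_min = theta
        else:
            theta_max = theta
        theta = rng.uniform(theta_min, theta_max)
```

The only model-specific inputs are the prior's Cholesky factor and a log-likelihood function, which is why the method needs no step-size tuning.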

@fonnesbeck
Member

This also allows for the sampling of multivariate (or vector-valued) nodes, does it not? Our slice sampler only works well with scalar variables.

@jonathanhfriedman
Contributor Author

Yep, but only if they have normal priors.

```
    .. [1] I. Murray, R. P. Adams, and D. J. C. MacKay. "Elliptical Slice
       Sampling", The Proceedings of the 13th International Conference on
       Artificial Intelligence and Statistics (AISTATS), JMLR W&CP
       9:541–548, 2010.
```
Member

that's a sneaky em-dash -- can't be included in python2 source code!

Contributor Author

Wow, good catch!

@twiecki
Member

twiecki commented Feb 10, 2017

@jonathanhfriedman Can we make the GP example a bit more salient (more variability in y-coordinates)?

@jonathanhfriedman
Contributor Author

jonathanhfriedman commented Feb 10, 2017

@twiecki Definitely. I'm also working on having it sample the noise, length scale, and vertical function scale to make it a little less contrived and more comparable to the existing GP regression example.

@twiecki
Member

twiecki commented Feb 11, 2017

@jonathanhfriedman Hm, would we not expect the true length-scale to be within the posterior region?

@jonathanhfriedman
Contributor Author

For that example, there's not really a true length scale since the data isn't actually generated from a GP. I guess that doesn't make it particularly instructive, though...

Anyway, I redid the example so that it now just focuses on sampling from the posterior given a prior and likelihood and doesn't cover fitting covariance kernel parameters. Maybe best to keep things simple and focus on what ESS does well.
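For orientation, here is a rough sketch of what such a notebook example might look like with the new step method. The `pm.EllipticalSlice` name and its `prior_cov` keyword follow the description earlier in this thread, so treat the exact API as an assumption rather than the final interface.

```python
import numpy as np
import pymc3 as pm

# Toy data and a squared-exponential prior covariance over 30 inputs.
X = np.linspace(0, 1, 30)[:, None]
y = np.sin(6 * X.ravel()) + 0.1 * np.random.randn(30)
K = np.exp(-0.5 * ((X - X.T) / 0.2) ** 2) + 1e-6 * np.eye(30)

with pm.Model():
    # Latent function values with a zero-mean multivariate normal prior.
    f = pm.MvNormal('f', mu=np.zeros(30), cov=K, shape=30)
    pm.Normal('y_obs', mu=f, sd=0.1, observed=y)

    # Assumed interface: pass the prior covariance to the ESS step.
    step = pm.EllipticalSlice(vars=[f], prior_cov=K)
    trace = pm.sample(2000, step=step)
```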

@twiecki
Member

twiecki commented Feb 16, 2017

@jonathanhfriedman OK, sounds good. And can you confirm that NUTS produces the same results on these models?

@jonathanhfriedman
Contributor Author

@twiecki NUTS and ESS produce the same results with fewer (10) data points, but with 30 points, all the other samplers seem to either not get started at all or fail to sample from the whole distribution.

@twiecki
Member

twiecki commented Feb 16, 2017

@jonathanhfriedman Pretty good advertisement for ESS then :). I resolved a conflict but it seems you still need to rebase.

@jonathanhfriedman
Contributor Author

@twiecki I think that should fix all the conflicts. Shall I squash the commits or would you prefer to squash them when you merge?

@twiecki merged commit bfe1cbe into pymc-devs:master on Feb 17, 2017
@twiecki
Member

twiecki commented Feb 17, 2017

@jonathanhfriedman Great contribution, thanks!
