
[Docs] Sampling function from posterior for Thompson sampling? #757

Closed
wangell opened this issue Jun 26, 2019 · 7 comments

Comments

@wangell

wangell commented Jun 26, 2019

📚 Documentation/Examples

I'm trying to implement Thompson sampling. Is there an implemented way to sample a function from the model posterior without specifying inputs, i.e. f = model.sample(); f(X)?

Similar to this https://math.stackexchange.com/questions/1218718/how-do-we-sample-from-a-gaussian-process

@KeAWang
Collaborator

KeAWang commented Jun 26, 2019

You can't draw samples of random functions from the posterior unless you specify the points at which you want to evaluate those random functions.

To sample from a GP's posterior evaluated at points test_x, do

```python
# Set the model and likelihood into posterior (eval) mode
model.eval()
likelihood.eval()

with torch.no_grad():
    # Posterior predictive distribution at the test points
    preds = likelihood(model(test_x))
    samples = preds.sample()
```

where test_x is N x D if you're not using batched GPs. preds is a multivariate Gaussian distribution corresponding to your GP posterior at those points.
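For intuition, here is roughly what that computation does, sketched in plain NumPy under an assumed RBF kernel (the helper names `rbf_kernel` and `posterior_sample` are illustrative, not part of gpytorch): condition on training data, form the posterior mean and covariance at the test points, and draw joint Gaussian samples.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between rows of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def posterior_sample(X, y, Xs, noise=1e-2, n_samples=3, seed=0):
    """Draw joint samples of f(Xs) from the GP posterior given (X, y)."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))      # train covariance
    Ks = rbf_kernel(X, Xs)                             # train/test cross-covariance
    Kss = rbf_kernel(Xs, Xs)                           # test covariance
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha                                # posterior mean at Xs
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v                                # posterior covariance at Xs
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(mean, cov + 1e-6 * np.eye(len(Xs)),
                                   size=n_samples)

X = np.linspace(0, 1, 5)[:, None]
y = np.sin(2 * np.pi * X).ravel()
Xs = np.linspace(0, 1, 50)[:, None]
samples = posterior_sample(X, y, Xs)
print(samples.shape)  # (3, 50): three joint posterior draws at 50 test points
```

Each row of `samples` is one joint draw of the random function evaluated at `Xs`, which is exactly what `preds.sample()` returns (with gpytorch handling the linear algebra efficiently).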

@jacobrgardner
Member

Very small addendum to @KeAWang's answer: if you need to backpropagate through the samples for any reason, be sure to use rsample instead of sample.
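To illustrate the difference, here is a minimal example using `torch.distributions` directly (standing in for a gpytorch posterior; the tensors here are made up): `rsample` uses the reparameterization trick, so gradients flow back through the draw, while `sample` detaches the result.

```python
import torch

# Hypothetical posterior parameters that depend on a learnable input;
# in practice these would come from likelihood(model(test_x)).
x = torch.tensor([0.5, -1.0], requires_grad=True)
mean = x * 2.0
cov = torch.eye(2) * 0.1
dist = torch.distributions.MultivariateNormal(mean, cov)

# rsample draws mean + L @ eps, so gradients propagate to x
s = dist.rsample()
s.sum().backward()
print(x.grad)  # tensor([2., 2.]): d(sum(mean + L @ eps))/dx
```

With `dist.sample()` instead, the draw has no `grad_fn` and calling `backward()` on it raises an error.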

@eytan

eytan commented Jun 27, 2019

Are you trying to implement TS for bandits or BayesOpt? If it's for BayesOpt, we have some code lying around internally which we can add to BoTorch if you'd like, @wangell

@jacobrgardner
Member

I'm going to close this unless someone mentions they are still having trouble with this.

@wangell
Author

wangell commented Jul 1, 2019

@eytan That would be awesome

@DavidWalz

@eytan I'd be very interested in your TS approach for BayesOpt. I only know of the approximate spectral sampling approach for stationary kernels.
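For reference, a minimal NumPy sketch of that spectral-sampling idea for an RBF kernel, via random Fourier features plus Bayesian linear regression over the feature weights (the helper `spectral_function_sample` is a made-up name, not a library API). It yields a deterministic function sample that can be evaluated anywhere, which is the `f = model.sample(); f(X)` interface asked about above.

```python
import numpy as np

def spectral_function_sample(X, y, lengthscale=0.5, noise=1e-2,
                             n_features=200, seed=0):
    """Draw one approximate posterior function sample from an RBF-kernel GP
    using random Fourier features + a Bayesian linear-regression posterior."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies from the RBF spectral density, random phases
    W = rng.normal(0.0, 1.0 / lengthscale, size=(n_features, d))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)

    def phi(Z):
        return np.sqrt(2.0 / n_features) * np.cos(Z @ W.T + b)

    Phi = phi(X)
    A = Phi.T @ Phi + noise * np.eye(n_features)
    mean_w = np.linalg.solve(A, Phi.T @ y)      # posterior mean of weights
    cov_w = noise * np.linalg.inv(A)            # posterior covariance of weights
    w = rng.multivariate_normal(mean_w, cov_w)  # one weight draw
    # The closure is a deterministic function: evaluate it wherever you like
    return lambda Z: phi(Z) @ w

X = np.random.default_rng(1).uniform(0, 1, size=(8, 1))
y = np.sin(2 * np.pi * X).ravel()
f = spectral_function_sample(X, y)
vals = f(np.array([[0.25], [0.75]]))
print(vals.shape)  # (2,)
```

The approximation quality depends on `n_features`, and the approach is limited to stationary kernels, as noted above.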

@Balandat
Collaborator

Balandat commented Dec 4, 2019

@DavidWalz Spectral sampling would work, what we've been doing mostly is just a discrete version that's based on drawing joint samples on a (large) discretization of the domain. That works pretty well in reasonably small dimensions and is very fast given how gpytorch exploits fast predictive variances & batched computation. Here is an old PR for this that I hope to clean up some time soon: pytorch/botorch#218
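A minimal NumPy sketch of that discretized approach, assuming an RBF kernel (all names here are illustrative, not BoTorch API): jointly sample the posterior on a grid of candidate points and take the argmax as the next query point.

```python
import numpy as np

def rbf(A, B, ls=0.2):
    """Squared-exponential kernel matrix between rows of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / ls**2)

def thompson_step(X, y, candidates, noise=1e-2, seed=0):
    """One discretized Thompson-sampling step: draw a joint GP posterior
    sample on the candidate grid and return the maximizing location."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, candidates)
    Kss = rbf(candidates, candidates)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v
    rng = np.random.default_rng(seed)
    draw = rng.multivariate_normal(mean, cov + 1e-6 * np.eye(len(candidates)))
    return candidates[np.argmax(draw)]

X = np.array([[0.1], [0.5], [0.9]])
y = np.array([0.2, 1.0, 0.1])          # observed peak near x = 0.5
grid = np.linspace(0, 1, 201)[:, None]
x_next = thompson_step(X, y, grid)
print(x_next)  # a point in [0, 1], biased toward high-value / uncertain regions
```

The joint draw over the whole grid is the expensive part, which is where gpytorch's fast predictive variances and batched computation help.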


6 participants