[Docs] Sampling function from posterior for Thompson sampling? #757
Comments
You can't draw samples of random functions from the posterior unless you specify the points at which you want to evaluate those random functions. To sample from a GP's posterior evaluated at points `X`, call the model on `X` to get the posterior multivariate normal and then call `.sample()` on it.
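Concretely, here is a minimal numpy sketch of the idea (toy RBF kernel and data, manual linear algebra rather than gpytorch's own API): a "function sample" is a finite-dimensional draw from the posterior evaluated at the chosen points.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(a, b, ls=0.5):
    # toy squared-exponential kernel (lengthscale is made up)
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

X_train = np.array([-1.0, 0.0, 1.0])      # illustrative data
y_train = np.sin(X_train)
X_test = np.linspace(-2, 2, 50)           # points the sample is evaluated at
jitter = 1e-6

# exact GP posterior at X_test, conditioned on (X_train, y_train)
K_inv = np.linalg.inv(rbf(X_train, X_train) + jitter * np.eye(3))
K_s = rbf(X_train, X_test)
mean = K_s.T @ K_inv @ y_train
cov = rbf(X_test, X_test) - K_s.T @ K_inv @ K_s

# one random function from the posterior, represented by its values at X_test
f_sample = rng.multivariate_normal(mean, cov + jitter * np.eye(len(X_test)))
```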
Very small addendum to @KeAWang's answer: if you need to backpropagate through the samples for any reason, be sure to use `rsample()` rather than `sample()`, since `rsample()` draws reparameterized samples that gradients can flow through.
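For illustration, a small example with plain `torch.distributions` (the same `rsample()` mechanism that gpytorch's posterior distributions expose); the toy covariance parameter is made up:

```python
import torch

# toy learnable parameter shaping the covariance (illustrative only)
log_scale = torch.tensor(0.0, requires_grad=True)
mean = torch.zeros(3)
cov = torch.exp(log_scale) * torch.eye(3)
dist = torch.distributions.MultivariateNormal(mean, cov)

f = dist.rsample()           # reparameterized draw: gradients flow through it
loss = (f ** 2).sum()
loss.backward()              # populates log_scale.grad; .sample() would error here
```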
Are you trying to implement TS for bandits or BayesOpt? If it's for BayesOpt we have some code lying around internally which we can add to BoTorch if you'd like, @wangell
I'm going to close this unless someone mentions they are still having trouble with it.
@eytan That would be awesome!
@eytan I'd be very interested in your TS approach for BayesOpt. I only know of the approximate spectral sampling approach for stationary kernels.
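For reference, a rough numpy sketch of that spectral approach (random Fourier features for an RBF kernel): a spectral sample turns the GP into Bayesian linear regression in feature space, and one posterior weight draw yields an entire function that can be evaluated anywhere. The feature count, lengthscale, and toy data below are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 200                       # number of random Fourier features (illustrative)
ls, noise = 0.5, 1e-2         # RBF lengthscale and observation noise (made up)

# spectral sample of the RBF kernel: omega ~ N(0, 1/ls^2), b ~ U(0, 2*pi)
omega = rng.normal(0.0, 1.0 / ls, size=m)
b = rng.uniform(0.0, 2 * np.pi, size=m)

def phi(x):
    # random Fourier feature map approximating the stationary kernel
    return np.sqrt(2.0 / m) * np.cos(np.outer(x, omega) + b)

X = np.linspace(-1, 1, 20)    # toy observations
y = np.sin(3 * X)

# Bayesian linear regression in feature space (unit Gaussian prior on weights)
Phi = phi(X)
A = Phi.T @ Phi / noise + np.eye(m)
mean_w = np.linalg.solve(A, Phi.T @ y) / noise
L = np.linalg.cholesky(np.linalg.inv(A))
theta = mean_w + L @ rng.normal(size=m)   # one posterior weight sample

f = lambda x: phi(x) @ theta              # an entire random function f(.)
X_grid = np.linspace(-1, 1, 500)
x_next = X_grid[np.argmax(f(X_grid))]     # Thompson-sampling candidate
```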
@DavidWalz Spectral sampling would work; what we've been doing mostly is just a discrete version based on drawing joint samples on a (large) discretization of the domain. That works pretty well in reasonably small dimensions and is very fast, given how gpytorch exploits fast predictive variances and batched computation. Here is an old PR for this that I hope to clean up some time soon: pytorch/botorch#218
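That discretized Thompson-sampling loop can be sketched in plain numpy as follows (toy objective, grid size, and kernel are made up; gpytorch would replace the manual linear algebra): each round draws one joint posterior sample over the grid and queries its argmax.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf(a, b, ls=0.3):
    # toy squared-exponential kernel (lengthscale is made up)
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def objective(x):
    # hypothetical black-box function we want to maximize
    return np.sin(5 * x) * (1 - x ** 2)

grid = np.linspace(-1, 1, 200)            # discretization of the domain
X_obs = np.array([0.0])
y_obs = objective(X_obs)
noise = 1e-4

for _ in range(10):
    K = rbf(X_obs, X_obs) + noise * np.eye(len(X_obs))
    K_s = rbf(X_obs, grid)
    K_inv = np.linalg.inv(K)
    mean = K_s.T @ K_inv @ y_obs
    cov = rbf(grid, grid) - K_s.T @ K_inv @ K_s
    # one joint posterior sample over the whole grid ...
    f = rng.multivariate_normal(mean, cov + 1e-8 * np.eye(len(grid)))
    # ... whose argmax is the next query point
    x_next = grid[np.argmax(f)]
    X_obs = np.append(X_obs, x_next)
    y_obs = np.append(y_obs, objective(x_next))
```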
📚 Documentation/Examples
I'm trying to implement Thompson sampling. Is there a built-in way to sample a function from the model posterior without providing an input, i.e. `f = model.sample(); f(X)`?
Similar to this https://math.stackexchange.com/questions/1218718/how-do-we-sample-from-a-gaussian-process