
Extend Posterior API to support torch distributions & overhaul MCSampler API (#1486) #193

Closed

Conversation

saitcakmak
Contributor

Summary:
X-link: pytorch/botorch#1486

The main goal here is to broadly support non-Gaussian posteriors.

  • Adds a generic `TorchPosterior` that wraps a torch `Distribution`. It defines the few properties we commonly expect and delegates to the `distribution` for the rest.
  • For a unified plotting API, shifts away from mean & variance to a quantile function. Most torch distributions implement the inverse CDF, which is used as the quantile; for others, the user should implement it at either the distribution or the posterior level.
  • Hands the burden of base-sample handling off from the posterior to the samplers. Using a dispatcher-based `get_sampler` method, we can support SAA with mixed posteriors without having to shuffle base samples in a `PosteriorList`, as long as all base distributions have a corresponding sampler and support base samples.
  • Adds `ListSampler` for sampling from a `PosteriorList`.
  • Adds `ForkedRNGSampler` and `StochasticSampler` for sampling from posteriors without base samples.
  • Adds `rsample_from_base_samples` for sampling with `base_samples` / with a `sampler`.
  • Absorbs `FullyBayesianPosteriorList` into `PosteriorList`.
  • For MC acqfs, introduces `get_posterior_samples` for sampling from the posterior with base samples / a sampler. If a sampler was not specified, this constructs the appropriate sampler for the posterior using `get_sampler`, eliminating the need to construct a sampler in `__init__`, which we used to do under the assumption of Gaussian posteriors.
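The posterior-wraps-a-distribution idea above can be sketched in a few lines. This is an illustrative stand-in only, not BoTorch's actual implementation: the class name is hypothetical and the stdlib `NormalDist` plays the role of a torch `Distribution` with an inverse CDF.

```python
from statistics import NormalDist

class DistributionPosterior:
    """Illustrative TorchPosterior-style wrapper (hypothetical name).

    Defines the few properties commonly expected of a posterior and
    delegates everything else to the wrapped distribution.
    """

    def __init__(self, distribution: NormalDist) -> None:
        self.distribution = distribution

    @property
    def mean(self) -> float:
        # Commonly expected property, delegated to the distribution.
        return self.distribution.mean

    def quantile(self, q: float) -> float:
        # Plotting goes through the quantile (inverse CDF) rather than
        # mean & variance, so non-Gaussian posteriors plot the same way.
        return self.distribution.inv_cdf(q)

posterior = DistributionPosterior(NormalDist(mu=0.0, sigma=1.0))
print(round(posterior.quantile(0.975), 3))  # ≈ 1.96
```

A distribution without an analytic inverse CDF would override `quantile` at the distribution or posterior level, as the bullet above describes.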

TODOs:

  • Relax the Gaussian assumption in acquisition functions & utilities. Some of this might be addressed in a follow-up diff.
  • Update the website, docs & tutorials to clear up remaining Gaussian assumptions and introduce the new, relaxed API. Likely a follow-up diff.
  • Some more listed in T134364907.
  • Test fixes and new unit tests.
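For intuition, the dispatcher-based `get_sampler` mentioned above can be sketched as a type registry that maps posterior types to sampler classes, so each posterior in a mixed `PosteriorList` can get an appropriate sampler. All names below are hypothetical, not BoTorch's actual classes:

```python
# Registry from posterior type to sampler class (illustrative sketch).
_SAMPLER_REGISTRY: dict[type, type] = {}

def register_sampler(posterior_cls: type):
    """Decorator registering a sampler class for a posterior type."""
    def decorator(sampler_cls: type) -> type:
        _SAMPLER_REGISTRY[posterior_cls] = sampler_cls
        return sampler_cls
    return decorator

def get_sampler(posterior, sample_shape):
    # Walk the MRO so subclasses fall back to a parent's sampler.
    for cls in type(posterior).__mro__:
        if cls in _SAMPLER_REGISTRY:
            return _SAMPLER_REGISTRY[cls](sample_shape)
    raise NotImplementedError(f"No sampler for {type(posterior).__name__}")

class GaussianPosterior:  # hypothetical posterior type
    pass

@register_sampler(GaussianPosterior)
class NormalSampler:  # hypothetical sampler type
    def __init__(self, sample_shape):
        self.sample_shape = sample_shape

sampler = get_sampler(GaussianPosterior(), sample_shape=(16,))
print(type(sampler).__name__)  # NormalSampler
```

Dispatching on the posterior type is what lets MC acquisition functions defer sampler construction until the posterior is known, instead of assuming a Gaussian posterior in `__init__`.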

Other notables:

  • See D39760855 for usage of `TorchDistribution` in SkewGP.
  • `TransformedPosterior` could serve as the fallback option for derived posteriors.
  • MC samplers no longer support `resample` or `collapse_batch_dims(=False)`. These use cases can be handled by i) not using base samples, or ii) using `torch.fork_rng` and sampling without base samples under it; samplers are only meant to support SAA. Introduces `ForkedRNGSampler` and `StochasticSampler` as convenience samplers for these use cases.
  • Introduces `batch_range_override` for the sampler to support edge cases where we may want to override `posterior.batch_range` (needed in `qMultiStepLookahead`).
  • Removes the unused sampling utilities `construct_base_samples(_from_posterior)`, which assume a Gaussian posterior.
  • Moves the main logic of the `_set_sampler` method of `CachedCholesky` subclasses to an `_update_base_samples` method on the samplers, and simplifies these classes a bit more.
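The `ForkedRNGSampler` idea, drawing reproducible samples under a forked RNG stream in the spirit of `torch.fork_rng`, can be sketched in plain Python. This is an illustrative analogue using the stdlib `random` module; the helper names are hypothetical:

```python
import random
from contextlib import contextmanager

@contextmanager
def fork_rng(seed: int):
    """Roughly mimics torch.fork_rng: run a seeded RNG stream without
    disturbing the global RNG state (illustrative sketch)."""
    state = random.getstate()
    random.seed(seed)
    try:
        yield
    finally:
        random.setstate(state)

def forked_samples(seed: int, n: int) -> list[float]:
    # A ForkedRNGSampler-style draw: deterministic given the seed, so
    # repeated evaluations see the same samples (as SAA requires),
    # while the surrounding RNG stream is left untouched.
    with fork_rng(seed):
        return [random.random() for _ in range(n)]

assert forked_samples(0, 3) == forked_samples(0, 3)  # reproducible
```

A `StochasticSampler`, by contrast, would simply draw fresh samples from the posterior each call, with no base samples and no seeding.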

Reviewed By: Balandat

Differential Revision: D39759489

@facebook-github-bot added the CLA Signed and fb-exported labels Nov 11, 2022
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D39759489

saitcakmak added a commit to saitcakmak/aepsych that referenced this pull request Nov 16, 2022

saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 16, 2022
saitcakmak added a commit to saitcakmak/Ax that referenced this pull request Nov 16, 2022
saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 16, 2022
saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 16, 2022
saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 16, 2022
saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 17, 2022

saitcakmak added a commit to saitcakmak/Ax that referenced this pull request Nov 17, 2022
saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 17, 2022
…ler API (pytorch#1254)

saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 17, 2022
…ler API (pytorch#1254)

facebook-github-bot pushed a commit to pytorch/botorch that referenced this pull request Nov 18, 2022
…ler API (#1254)

facebook-github-bot pushed a commit to facebook/Ax that referenced this pull request Nov 18, 2022
…ler API (#1254)

@saitcakmak saitcakmak deleted the export-D39759489 branch November 18, 2022 02:37