
Fix FixedSingleSampleModel dtype/device conversion #1254

Closed
wants to merge 1 commit

Conversation

@ItsMrLin (Contributor) commented Jun 7, 2022

Summary: Sometimes `model.train_inputs[0]` is a tuple instead of a tensor. Instead of assuming the structure of the model's class members, just cast `X` on the fly in `forward`. This should not add any runtime cost when the device and dtype already align.

Reviewed By: Balandat

Differential Revision: D36976244

fbshipit-source-id: db51af596d3ac4ef27c719e2d38a47a046c503a2
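The pattern the summary describes can be sketched as follows. This is an illustrative stand-in, not botorch's actual `FixedSingleSampleModel` code: the class and buffer names here are hypothetical. The point is that `Tensor.to` is a no-op when dtype and device already match, so casting inside `forward` is free on the happy path.

```python
# Hypothetical sketch of the fix: rather than reading dtype/device off
# model.train_inputs[0] (which may be a tuple, not a tensor), cast the
# input inside forward(). Tensor.to is a no-op when dtype/device already
# match, so aligned inputs incur no extra copy.
import torch
from torch import Tensor, nn


class FixedSingleSampleModelSketch(nn.Module):
    """Illustrative stand-in; not botorch's real class."""

    def __init__(self, w: Tensor) -> None:
        super().__init__()
        self.register_buffer("w", w)

    def forward(self, X: Tensor) -> Tensor:
        # Cast on the fly; free when X already matches self.w.
        X = X.to(dtype=self.w.dtype, device=self.w.device)
        return X * self.w


model = FixedSingleSampleModelSketch(torch.ones(3, dtype=torch.float64))
out = model(torch.ones(3, dtype=torch.float32))  # float32 input is cast
print(out.dtype)  # torch.float64
```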
@facebook-github-bot added the `CLA Signed` and `fb-exported` labels Jun 7, 2022
@facebook-github-bot (Contributor) commented
This pull request was exported from Phabricator. Differential Revision: D36976244

@codecov bot commented Jun 7, 2022

Codecov Report

Merging #1254 (574895d) into main (1ed307c) will not change coverage.
The diff coverage is 100.00%.

❗ Current head 574895d differs from the pull request's most recent head 7cfd0bb. Consider uploading reports for commit 7cfd0bb to get more accurate results.

@@            Coverage Diff            @@
##              main     #1254   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files          122       122           
  Lines        10589     10587    -2     
=========================================
- Hits         10589     10587    -2     
Impacted Files Coverage Δ
botorch/models/deterministic.py 100.00% <100.00%> (ø)

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 1ed307c...7cfd0bb.

saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 16, 2022
…ler API (pytorch#1254)

Summary:
X-link: facebook/Ax#1254

X-link: facebookresearch/aepsych#193

Pull Request resolved: pytorch#1486

The main goal here is to broadly support non-Gaussian posteriors.
- Adds a generic `TorchPosterior` which wraps a Torch `Distribution`. This defines a few properties that we commonly expect, and delegates to the `distribution` for the rest.
- For a unified plotting API, this shifts away from mean & variance to a quantile function. Most torch distributions implement the inverse CDF, which is used as the quantile function. For others, the user should implement it at either the distribution or posterior level.
- Hands off the burden of base-sample handling from the posterior to the samplers. Using a dispatcher-based `get_sampler` method, we can support SAA with mixed posteriors without having to shuffle base samples in a `PosteriorList`, as long as all base distributions have a corresponding sampler and support base samples.
- Adds `ListSampler` for sampling from `PosteriorList`.
- Adds `ForkedRNGSampler` and `StochasticSampler` for sampling from posteriors without base samples.
- Adds `rsample_from_base_samples` for sampling with `base_samples` / with a `sampler`.
- Absorbs `FullyBayesianPosteriorList` into `PosteriorList`.
- For MC acqfs, introduces a `get_posterior_samples` for sampling from the posterior with base samples / a sampler. If a sampler was not specified, this constructs the appropriate sampler for the posterior using `get_sampler`, eliminating the need to construct a sampler in `__init__`, which we used to do under the assumption of Gaussian posteriors.
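The `TorchPosterior` idea in the bullets above can be illustrated with a minimal sketch. The wrapper class and method names below are hypothetical stand-ins, not botorch's actual API; the sketch only shows the delegation pattern and the use of `icdf` as the quantile function.

```python
# Hypothetical sketch: a thin posterior wrapper delegates to a
# torch.distributions.Distribution and exposes its inverse CDF (icdf)
# as the quantile function. Illustrative names, not botorch's real API.
import torch
from torch.distributions import Normal


class TorchPosteriorSketch:
    def __init__(self, distribution) -> None:
        self.distribution = distribution

    @property
    def mean(self) -> torch.Tensor:
        # A commonly expected property, delegated to the distribution.
        return self.distribution.mean

    def rsample(self, sample_shape=torch.Size()) -> torch.Tensor:
        return self.distribution.rsample(sample_shape)

    def quantile(self, q: torch.Tensor) -> torch.Tensor:
        # Most torch distributions implement icdf; distributions without
        # it would need a custom implementation at the distribution or
        # posterior level, as the commit message notes.
        return self.distribution.icdf(q)


posterior = TorchPosteriorSketch(Normal(torch.tensor(0.0), torch.tensor(1.0)))
median = posterior.quantile(torch.tensor(0.5))
print(median.item())  # 0.0 — the median of a standard normal
```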

TODOs:
- Relax the Gaussian assumption in acquisition functions & utilities. Some of this might be addressed in a follow-up diff.
- Updates to website / docs & tutorials to clear up some of the Gaussian assumption, introduce the new relaxed API. Likely a follow-up diff.
- Some more listed in T134364907
- Test fixes and new unit tests

Other notables:
- See D39760855 for usage of TorchDistribution in SkewGP.
- TransformedPosterior could serve as the fallback option for derived posteriors.
- MC samplers no longer support `resample` or `collapse_batch_dims(=False)`. These use cases can be handled by (i) not using base samples, or (ii) wrapping the sampling in `torch.fork_rng` and sampling without base samples there. Samplers are only meant to support SAA. Introduces `ForkedRNGSampler` and `StochasticSampler` as convenience samplers for these use cases.
- Introduces `batch_range_override` for the sampler to support edge cases where we may want to override `posterior.batch_range` (needed in `qMultiStepLookahead`).
- Removes the unused sampling utilities `construct_base_samples(_from_posterior)`, which assumed a Gaussian posterior.
- Moves the main logic of `_set_sampler` method of CachedCholesky subclasses to a `_update_base_samples` method on samplers, and simplifies these classes a bit more.
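The `torch.fork_rng` approach mentioned above can be sketched as follows. The helper function here is hypothetical, not botorch's `ForkedRNGSampler`; it only demonstrates how forking the RNG with a fixed seed yields reproducible draws without carrying base samples around.

```python
# Hypothetical sketch of the ForkedRNGSampler idea: sample inside
# torch.random.fork_rng with a fixed seed so repeated calls return
# identical draws, without storing base samples. fork_rng restores the
# global RNG state on exit, so the caller's random stream is untouched.
import torch
from torch.distributions import Normal


def forked_rng_sample(distribution, sample_shape, seed: int) -> torch.Tensor:
    with torch.random.fork_rng():
        torch.manual_seed(seed)
        return distribution.sample(sample_shape)


dist = Normal(torch.zeros(2), torch.ones(2))
a = forked_rng_sample(dist, torch.Size([4]), seed=0)
b = forked_rng_sample(dist, torch.Size([4]), seed=0)
print(torch.equal(a, b))  # True — same seed yields identical samples
```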

Reviewed By: Balandat

Differential Revision: D39759489

fbshipit-source-id: a851029dde668c954ce6104c0155baf718dfa860
saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 16, 2022
…ler API (pytorch#1254)

saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 16, 2022
…ler API (pytorch#1254)

saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 16, 2022
…ler API (pytorch#1254)

saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 17, 2022
…ler API (pytorch#1254)

saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 17, 2022
…ler API (pytorch#1254)

saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 17, 2022
…ler API (pytorch#1254)

facebook-github-bot pushed a commit that referenced this pull request Nov 18, 2022
…ler API (#1254)
