Problem when Sobol falls back to `HitAndRunPolytopeSampler` #2373
Comments
Thanks for flagging this; it does seem like a bug. I'll take a closer look.
OK, so the issue is that the hit-and-run sampling fallback in Ax (see here) currently neither applies the normalization that you mentioned nor, more importantly, applies thinning to the samples, which results in much more correlated samples. Basically, we need to update the Ax logic to call the botorch `HitAndRunPolytopeSampler` directly.
Thanks for looking into this. Sounds right; may as well call it directly since it seems to be working fine. When you say thinning, what exactly does that mean? Sorry, not 100% familiar with all the jargon yet.
Thinning (removing all but every `n_thinning`-th sample) reduces the autocorrelation between successive draws of the MCMC chain; consecutive hit-and-run steps are highly correlated, so without thinning the samples don't cover the polytope evenly.
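For intuition, here is a minimal sketch of what thinning does; the array is a stand-in for illustration, not actual Ax/botorch internals:

```python
import torch

# Stand-in for 4200 consecutive hit-and-run draws from the polytope.
chain = torch.rand(4200, 3)

n_burnin, n_thinning = 200, 20

# Drop the burn-in, then keep only every 20th draw; consecutive draws
# are highly correlated, but draws 20 steps apart are much less so.
samples = chain[n_burnin::n_thinning]  # 200 nearly independent samples
```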
Got it, thanks.
Summary: This set of changes does the following:

* adds an `n_thinning` argument to `sample_polytope` and `HitAndRunPolytopeSampler`; changes the defaults for the `HitAndRunPolytopeSampler` args to `n_burnin=200` and `n_thinning=20`
* changes `HitAndRunPolytopeSampler` to take the `seed` arg in its constructor and removes the arg from the `draw()` method (the method on the base class is adjusted accordingly). The resulting behavior is that if a `HitAndRunPolytopeSampler` is instantiated with the same args and seed, the sequence of `draw()`s will be deterministic. `DelaunayPolytopeSampler` is stateless and so retains its existing behavior.
* normalizes the (inequality and equality) constraints in `HitAndRunPolytopeSampler` to avoid the same issue as #1225. If `bounds` are not provided, emits a warning that this cannot be performed (doing so would require vertex enumeration of the constraint polytope, which is NP-hard and too costly)
* introduces `normalize_dense_linear_constraints` to normalize constraints given in dense format to the unit cube
* removes `normalize_linear_constraint`; `normalize_sparse_linear_constraints` is to be used instead
* simplifies some of the testing code

Note: This change is in preparation for fixing facebook/Ax#2373

Pull Request resolved: #2358

Test Plan: Ran a stress test to make sure this doesn't cause flaky tests: https://www.internalfb.com/intern/testinfra/testconsole/testrun/3940649908470083/

Reviewed By: saitcakmak
Differential Revision: D58068753
Pulled By: Balandat
fbshipit-source-id: 9a75c547a3493e393cd7e724edd984318b76e1f4
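A quick sketch of the post-change usage described in this summary; the toy constraint here is made up for illustration, and the exact signature should be taken from the PR itself:

```python
import torch
from botorch.utils.sampling import HitAndRunPolytopeSampler

# Toy polytope in the unit square: x1 + x2 <= 1.5, i.e. A @ x <= b.
A = torch.tensor([[1.0, 1.0]])
b = torch.tensor([[1.5]])
bounds = torch.tensor([[0.0, 0.0], [1.0, 1.0]])

# `seed` now lives in the constructor; n_burnin=200 and n_thinning=20
# are the new defaults, spelled out here for clarity.
sampler = HitAndRunPolytopeSampler(
    inequality_constraints=(A, b),
    bounds=bounds,
    n_burnin=200,
    n_thinning=20,
    seed=1234,
)
first = sampler.draw(n=100)  # `draw()` no longer takes a seed

# Re-instantiating with identical args and seed reproduces the draws.
second = HitAndRunPolytopeSampler(
    inequality_constraints=(A, b),
    bounds=bounds,
    n_burnin=200,
    n_thinning=20,
    seed=1234,
).draw(n=100)
assert torch.equal(first, second)
```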
Summary: Addresses the issues reported in #2373 by using the updated `HitAndRunPolytopeSampler` from pytorch/botorch#2358 (the botorch change will need to land in a nightly release before tests can pass).

Pull Request resolved: #2492
Reviewed By: saitcakmak
Differential Revision: D58070591
Pulled By: Balandat
fbshipit-source-id: f578674bf476ff677314cf4f71283c7ded660943
This has been fixed in #2492.
I've been playing around to try to understand the effect of linear inequality constraints on quasi-random sampling, such as Sobol, and I'm seeing some issues where the samples don't seem to uniformly fill the polytope defined by the linear constraints (in addition to the usual box constraints) when `Sobol` is allowed to fall back to `HitAndRunPolytopeSampler`. I am wondering if this is related to pytorch/botorch#1225, which was fixed a while back; perhaps the code path that Ax follows to call this sampler is not passing through the proper normalization steps that were implemented?

Here's what I mean. First, generate a 100-sample Sobol sequence without falling back to the polytope sampler; to make this possible, I'll crank up the max number of rejection-sampling tries to `1e5`:
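Roughly like this (a minimal sketch, since the original snippet is not shown; the option names `fallback_to_sample_polytope` and `max_rs_draws` come from this issue, but exactly where they're plumbed through, `model_kwargs` vs. `model_gen_options`, is a guess and may differ):

```python
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.service.ax_client import AxClient

# Sobol only, with the polytope fallback disabled and a generous
# rejection-sampling budget so all draws come from rejection sampling.
gs = GenerationStrategy(
    steps=[
        GenerationStep(
            model=Models.SOBOL,
            num_trials=-1,
            model_kwargs={"fallback_to_sample_polytope": False},
            model_gen_kwargs={"model_gen_options": {"max_rs_draws": int(1e5)}},
        )
    ]
)

ax_client = AxClient(generation_strategy=gs)
ax_client.create_experiment(
    name="sobol_polytope_check",
    parameters=[
        {"name": "x1", "type": "range", "bounds": [0.0, 1.0]},
        {"name": "x2", "type": "range", "bounds": [0.0, 1.0]},
    ],
    parameter_constraints=["x1 + x2 <= 0.5"],
)

# Generate 100 Sobol points inside the constrained region.
points = [ax_client.get_next_trial()[0] for _ in range(100)]
```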
Producing all 100 generator runs is slow (15 seconds on my computer), probably due to a tiny acceptance fraction during rejection sampling, but the result looks exactly as we'd expect.
If I repeat the experiment above, but set `"fallback_to_sample_polytope": True` and `"max_rs_draws": 0` to force falling back to polytope sampling, the code runs noticeably faster (4 seconds), but the results look quite biased towards the corners.

As a sanity check, I called this sampler directly and got much more reasonable results, although still not as clean-looking as what you get (very slowly) through rejection sampling:
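Along these lines (a minimal sketch of the direct call, using a toy constraint matching the setup above; at the time of this issue `draw()` also accepted a `seed`, omitted here):

```python
import torch
from botorch.utils.sampling import HitAndRunPolytopeSampler

# Same toy polytope: x1 + x2 <= 0.5 inside the unit square, with
# inequality constraints given as (A, b) such that A @ x <= b.
A = torch.tensor([[1.0, 1.0]])
b = torch.tensor([[0.5]])
bounds = torch.tensor([[0.0, 0.0], [1.0, 1.0]])

sampler = HitAndRunPolytopeSampler(
    inequality_constraints=(A, b),
    bounds=bounds,
    n_burnin=200,  # let the chain forget its starting point
)
samples = sampler.draw(n=100)  # (100, 2) tensor of points in the polytope
```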
I'm still pretty new to this, so I could easily be doing something wrong; feedback would be welcome!