Fix hang in VonMises rejection sampling for small values of concentration #114498
julian-urban wants to merge 5 commits into pytorch:main
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/114498
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures
As of commit 8d068d3 with merge base b27565a
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@pytorchbot label "topic: not user facing"
fritzo left a comment
Thanks for fixing this! Could we move your logic to three `@lazy_property`s, to keep the other code paths entirely single precision?
@fritzo Thanks for picking it up, please check my latest commit. I followed your suggestion of implementing the logic as `@lazy_property`s. I'm not sure if I was supposed to resolve the above conversation already; this is my first PR and I'm not familiar with the proper etiquette. Apologies.
fritzo left a comment
Looks great @julian-urban! It might be worth adding a line comment in .sample() explaining why sampling is performed using double precision, but feel free to leave as is.
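To illustrate why such a comment is useful, here is my own reproduction of the single-precision failure mode (an assumption based on the Best-Fisher sampler the fix targets, not code taken from the PR): for small `concentration`, `1 + 4 * kappa**2` rounds to exactly `1.0` in float32, the intermediate constant `rho` cancels to zero, and the downstream proposal constant degenerates, which is consistent with the rejection loop never accepting.

```python
import torch


def best_fisher_rho(kappa: torch.Tensor) -> torch.Tensor:
    # Intermediate constant of the Best-Fisher von Mises proposal,
    # computed in whatever dtype kappa carries.
    tau = 1 + torch.sqrt(1 + 4 * kappa ** 2)
    return (tau - torch.sqrt(2 * tau)) / (2 * kappa)


rho32 = best_fisher_rho(torch.tensor(1e-7, dtype=torch.float32))
rho64 = best_fisher_rho(torch.tensor(1e-7, dtype=torch.float64))
print(rho32)  # tensor(0.) -- 1 + 4*kappa**2 rounds to 1.0 in float32
print(rho64)  # roughly kappa / 2, small but nonzero in float64
```

With `rho == 0`, the proposal constant `(1 + rho**2) / (2 * rho)` is infinite, so the sampler has no valid proposal to accept from; doing the arithmetic in double precision keeps `rho` nonzero.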
Looks like a legit lint error
@fritzo whoopsie. The test passes now and I added a clarifying comment to .sample().
@ezyang could you please merge this?
@pytorchmergebot merge
Merge started
Your change will be merged once all checks pass (ETA 0-4 Hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team
Fix hang in VonMises rejection sampling for small values of concentration (pytorch#114498)

Fixes pytorch#88443

Forces the internal `dtype` of `torch.distributions.von_mises.VonMises` to be `torch.double` and mirrors the numpy implementation of the second-order Taylor expansion for `concentration < 1e-5`. Samples and log probs are returned with the `dtype` of argument `loc`.

In principle one could also use masking in the rejection sampler to return uniformly distributed numbers for `concentration < 1e-8`, as in numpy. This may be slightly more efficient, but isn't required to solve the hanging issue.

Pull Request resolved: pytorch#114498
Approved by: https://github.com/fritzo
cc @fritzo @neerajprad @alicanb @nikitaved
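The small-concentration branch described above can be sketched as follows. This is a hedged illustration mirroring the numpy approach, not the actual PyTorch internals; `proposal_s` and the constant names are made up for the example. Below `1e-5` the exact Best-Fisher proposal constant is replaced by its second-order Taylor expansion `s = 1/kappa + kappa`, which sidesteps the catastrophic cancellation in the exact formula.

```python
import torch


def proposal_s(kappa: torch.Tensor) -> torch.Tensor:
    """Illustrative proposal constant with numpy-style small-kappa guard."""
    kappa = kappa.double()  # sampler math in float64, as in the fix
    # Exact Best-Fisher constant.
    tau = 1 + (1 + 4 * kappa ** 2).sqrt()
    rho = (tau - (2 * tau).sqrt()) / (2 * kappa)
    exact = (1 + rho ** 2) / (2 * rho)
    # Second-order Taylor expansion of the same constant around kappa = 0.
    taylor = 1 / kappa + kappa
    # Both branches are evaluated; torch.where masks per element.
    return torch.where(kappa < 1e-5, taylor, exact)


k = torch.tensor(1e-3)
print(proposal_s(k))       # exact branch, roughly 1/kappa + kappa ~ 1000.001
print(1 / k.double() + k)  # Taylor value agrees closely at this kappa
```

At `kappa = 1e-3` the two branches agree to high precision, which is why the crossover at `1e-5` is safe; below it, only the Taylor form stays numerically stable.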