Add note to torch docs for sinh/cosh #49413
Conversation
💊 CI failures summary and remediations (Dr. CI): As of commit f9a0aa4, there are no CI failures. This comment was automatically generated by Dr. CI and has been revised 7 times.
Codecov Report

```
@@            Coverage Diff            @@
##           master   #49413     +/-  ##
=========================================
  Coverage   80.56%   80.56%
=========================================
  Files        1875     1875
  Lines      202701   202701
=========================================
+ Hits       163309   163312       +3
+ Misses      39392    39389       -3
```
Cool! In the future we should think about styling this section as something like "Implementation Details."
@soulitzer has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
@soulitzer merged this pull request in 399b07a.
Summary: Addresses pytorch#48641. Documents the behavior of sinh and cosh in the edge cases:

```
>>> b = torch.full((15,), 89, dtype=torch.float32)
>>> torch.sinh(b)
tensor([2.2448e+38, 2.2448e+38, 2.2448e+38, 2.2448e+38, 2.2448e+38,
        2.2448e+38, 2.2448e+38, 2.2448e+38, 2.2448e+38, 2.2448e+38,
        2.2448e+38, 2.2448e+38, 2.2448e+38, 2.2448e+38, 2.2448e+38])
>>> b = torch.full((16,), 89, dtype=torch.float32)
>>> torch.sinh(b)
tensor([inf, inf, inf, inf, inf, inf, inf, inf, inf, inf, inf, inf, inf, inf,
        inf, inf])
>>> b = torch.full((17,), 89, dtype=torch.float32)
>>> torch.sinh(b)
tensor([       inf,        inf,        inf,        inf,        inf,
               inf,        inf,        inf,        inf,        inf,
               inf,        inf,        inf,        inf,        inf,
               inf, 2.2448e+38])
>>> b = torch.full((32,), 89, dtype=torch.float32)[::2]
>>> torch.sinh(b)
tensor([2.2448e+38, 2.2448e+38, 2.2448e+38, 2.2448e+38, 2.2448e+38,
        2.2448e+38, 2.2448e+38, 2.2448e+38, 2.2448e+38, 2.2448e+38,
        2.2448e+38, 2.2448e+38, 2.2448e+38, 2.2448e+38, 2.2448e+38,
        2.2448e+38])
```

See https://sleef.org/purec.xhtml

Pull Request resolved: pytorch#49413
Reviewed By: ezyang
Differential Revision: D25587932
Pulled By: soulitzer
fbshipit-source-id: 6db75c45786f4b95f82459d0ce5efa37ec0774f0
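The mixed `inf`/finite results in the repro are consistent with an intermediate-overflow explanation (an inference from the SLEEF link above, not stated explicitly in this thread): sinh(89) ≈ 2.2448e38 is itself representable in float32 (whose maximum is ≈ 3.4028e38), but a naive evaluation of (e^x − e^(−x))/2 overflows at the intermediate exp(89) ≈ 4.4897e38, yielding `inf`. A minimal double-precision sketch of the arithmetic, using only the standard library:

```python
import math

x = 89.0
f32_max = 3.4028235e38  # largest finite float32 value

# Double-precision reference: sinh(89) = (e^89 - e^-89) / 2 ≈ 2.2448e38.
ref = math.sinh(x)
print(f"{ref:.4e}")  # 2.2448e+38

# The final result fits comfortably in float32...
assert ref < f32_max

# ...but the intermediate exp(89) ≈ 4.4897e38 does not, so a float32
# implementation that computes exp(x) directly produces inf, while an
# implementation that avoids the intermediate overflow (as the SLEEF
# vectorized kernels appear to) returns the finite value.
assert math.exp(x) > f32_max
```

Which code path runs depends on whether the tensor's size and layout let the vectorized kernel handle the elements, which is why a 15-element contiguous tensor, a 16-element tensor, and a strided view all behave differently above.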