[numpy] torch.exp{2, m1}: promote integer inputs to float #48926
Conversation
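This PR makes torch.exp2 and torch.expm1 promote integer inputs to a floating-point dtype, matching NumPy's behavior. A minimal sketch of the intended behavior after the change (assuming, as is standard for PyTorch unary float ops, that integer inputs promote to the default float dtype, torch.float32):

```python
import torch
import numpy as np

t = torch.arange(4)                  # integer (int64) input
print(torch.exp2(t).dtype)           # torch.float32: promoted instead of erroring
print(torch.expm1(t).dtype)          # torch.float32
print(np.exp2(np.arange(4)).dtype)   # float64: the NumPy promotion being mirrored
```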
Reason for skipping bfloat16:

```python
>>> import torch
>>> torch.tensor(6.75, dtype=torch.bfloat16)
tensor(6.7500, dtype=torch.bfloat16)
>>> t = torch.tensor(6.75, dtype=torch.bfloat16)
>>> torch.expm1(t)
tensor(852., dtype=torch.bfloat16)
>>> torch.expm1(t.to(torch.float32))
tensor(853.0588)
```

For other input values the discrepancy relative to the test tolerance is even larger.
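For context, the gap shown above is actually small in ULP terms; the skip exists because other inputs drift further. A quick back-of-the-envelope check (standard IEEE-style reasoning, not code from this PR):

```python
import math

# bfloat16 stores 7 mantissa bits, so the spacing between adjacent
# representable values near x is 2 ** (floor(log2(x)) - 7).
x = 853.0588
ulp = 2.0 ** (math.floor(math.log2(x)) - 7)
print(ulp)  # 4.0 -> 852.0 is within one bfloat16 ULP of the float32 result
```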
This makes sense. I would really like to start considering ways we can work around NumPy's lack of bfloat16 support. Maybe we can cast the values to float32, then cast them to bfloat16, then nextafter them in the direction of the PyTorch result?
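A rough sketch of that idea (hypothetical helper, not from this PR; assumes torch.nextafter accepts bfloat16 on the build under test):

```python
import torch

def bfloat16_reference(np_result_fp32: torch.Tensor,
                       torch_result: torch.Tensor) -> torch.Tensor:
    # Round the float32 NumPy reference down to bfloat16 ...
    ref = np_result_fp32.to(torch.bfloat16)
    # ... then step one ULP toward the PyTorch result (assumes bfloat16
    # support in torch.nextafter). nextafter returns `other` when the
    # inputs are equal, so exact matches pass through unchanged.
    return torch.nextafter(ref, torch_result)
```

A test could then require exact equality between the PyTorch result and this reference, implicitly tolerating at most a one-ULP rounding difference.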
Awesome! Thanks @kshitij12345!
@mruberry has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
I think that makes sense since both have the same range. Right now the catch is that …
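(bfloat16 and float32 share the same 8-bit exponent field, so their representable ranges nearly coincide, which is what "same range" refers to; a quick check:)

```python
import torch

print(torch.finfo(torch.bfloat16).max)  # ~3.39e+38
print(torch.finfo(torch.float32).max)   # ~3.40e+38
```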
Reference: #42515