
torch.reciprocal: promote integer inputs to float #49102

Closed · wants to merge 13 commits

Conversation

soulitzer (Contributor):

Fixes #49091
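For context, the behavior this PR introduces: integer inputs to `torch.reciprocal` are promoted to the default float dtype before inversion. A minimal pure-Python sketch of that promotion rule (the `reciprocal` function here is illustrative, not PyTorch's implementation):

```python
def reciprocal(values):
    # Sketch of the int->float promotion rule: each integer is promoted
    # to a float before taking 1/x, so the reciprocal of 2 is 0.5 rather
    # than a truncated integer 0.
    return [1.0 / float(v) for v in values]

print(reciprocal([1, 2, 4]))  # [1.0, 0.5, 0.25]
```

With this promotion, `torch.reciprocal` on an integer tensor returns a floating-point tensor instead of erroring out.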

dr-ci bot commented Dec 9, 2020:

💊 CI failures summary and remediations

As of commit 4d9b345 (more details on the Dr. CI page):


  • 1/2 failures possibly* introduced in this PR
    • 1/1 non-CircleCI failure(s)
  • 1/2 broken upstream at merge base 6568572 on Dec 18 from 1:48am to 9:50am

1 job timed out:

  • pytorch_linux_bionic_py3_8_gcc9_coverage_test2

🚧 1 fixed upstream failure:

These were probably caused by upstream breakages that were already fixed.

Please rebase on the viable/strict branch.

If your commit is older than viable/strict, run these commands:

git fetch https://github.com/pytorch/pytorch viable/strict
git rebase FETCH_HEAD

Check out the recency history of this "viable master" tracking branch.


This comment was automatically generated by Dr. CI.

This comment has been revised 44 times.

@kshitij12345 mentioned this pull request on Dec 10, 2020.
@soulitzer changed the title from "Add support for integers and add OpInfo" to "torch.reciprocal: promote integer inputs to float" on Dec 11, 2020.
@soulitzer marked this pull request as ready for review on December 11, 2020, 00:18.
mruberry (Collaborator) left a comment:

Hey @soulitzer. Overall this looks very good. I made some inline comments.

I think this PR should also update reciprocal's documentation to clarify the integer behavior and note that it differs from NumPy's reciprocal. This should probably be done in a note.
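For reference, NumPy's `np.reciprocal` on an integer array keeps the integer dtype and performs integer arithmetic, so every magnitude greater than 1 truncates to 0; that is the contrast the documentation note would call out. A small sketch of the difference:

```python
import numpy as np

# NumPy keeps the integer dtype, so 1/2 and 1/4 truncate to 0 ...
ints = np.reciprocal(np.array([1, 2, 4]))      # values: 1, 0, 0
# ... while a float input gives the true reciprocals.
floats = np.reciprocal(np.array([1.0, 2.0, 4.0]))  # values: 1.0, 0.5, 0.25
print(ints, floats)
```

After this PR, `torch.reciprocal` on the integer tensor would instead return the float results.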

soulitzer (Contributor, Author):

Reason for disabling test_reference_numerics for bfloat16:

>>> torch.tensor(-0.0009994506803359116).reciprocal()
tensor(-1000.5496)
>>> torch.tensor(-0.0009994506803359116).reciprocal().to(torch.bfloat16)
tensor(-1000., dtype=torch.bfloat16)  # torch.bfloat16 cannot represent -1000.5496
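The rounding above can be reproduced without torch: bfloat16 is a float32 with the mantissa cut to 7 bits, so near 1000 the representable values are 4 apart. A pure-Python sketch (`to_bfloat16` is a hypothetical helper, not a PyTorch API) of round-to-nearest-even truncation:

```python
import struct

def to_bfloat16(x: float) -> float:
    # bfloat16 keeps only the top 16 bits of the float32 encoding;
    # round the discarded low 16 bits to nearest, ties to even.
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    bias = 0x7FFF + ((bits >> 16) & 1)
    top = (bits + bias) >> 16
    return struct.unpack("<f", struct.pack("<I", top << 16))[0]

# Near 1000 the bfloat16 spacing is 4, so -1000.5496 rounds to -1000.0.
print(to_bfloat16(-1000.5496))  # -1000.0
```

This is why the reference-numerics comparison against a float32 result cannot match exactly in bfloat16.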

mruberry (Collaborator):

Test failures are unrelated.

mruberry (Collaborator) left a comment:

Nice work, @soulitzer! It's great that reciprocal is finally being tested with zero, and we have better undefined behavior annotations. It's also nice that we're improving over NumPy's behavior by defining reciprocal for integers sensibly.

facebook-github-bot (Contributor) left a comment:

@soulitzer has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

facebook-github-bot (Contributor):

@soulitzer merged this pull request in 5ab9593.

hwangdeyu pushed a commit to hwangdeyu/pytorch that referenced this pull request Jan 6, 2021
Summary:
Fixes pytorch#49091

Pull Request resolved: pytorch#49102

Reviewed By: VitalyFedyunin

Differential Revision: D25639541

Pulled By: soulitzer

fbshipit-source-id: 1dd360bd7b77f106d606143d8d3961610bac8cb7
Development

Successfully merging this pull request may close these issues:

Feature Request: support int->float type promotion for torch.reciprocal

3 participants