Add complex support for torch.nn.L1Loss #46640

Closed
wants to merge 1 commit

Conversation

@anjali411 (Contributor) commented Oct 21, 2020

Stack from ghstack:

TODO:

  1. update l1_loss_backward
  2. possibly update doc
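
For context, a minimal usage sketch of what complex support for `torch.nn.L1Loss` means (this example is mine, not from the PR, and assumes a build where this change has landed): the elementwise difference is complex, its absolute value is the complex modulus, so the loss itself is real-valued.

```python
import torch

loss_fn = torch.nn.L1Loss()
input = torch.tensor([1 + 2j, 3 - 1j])
target = torch.tensor([0 + 0j, 3 + 0j])
loss = loss_fn(input, target)   # mean of |input - target|
print(loss)                     # (sqrt(5) + 1) / 2 ≈ 1.618, real-valued
```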

anjali411 added a commit that referenced this pull request Oct 21, 2020
ghstack-source-id: d3a1d45b86cff29790b5aeb0b590cb3b54f49a5b
Pull Request resolved: #46640
@anjali411 anjali411 added module: nn Related to torch.nn module: complex Related to complex number support in PyTorch labels Oct 21, 2020
dr-ci bot commented Oct 21, 2020

💊 CI failures summary and remediations

As of commit 326a8d7 (more details on the Dr. CI page):


Commit 326a8d7 was recently pushed. Waiting for builds...


This comment was automatically generated by Dr. CI.

  return apply_loss_reduction(loss, reduction);
}

Tensor& l1_loss_out(Tensor& result, const Tensor& input, const Tensor& target, int64_t reduction) {
  auto diff = input.sub(target);
Collaborator commented:

This is interesting. In the previous code `input - target` is out of place if `Reduction::None`, but in place otherwise. With this change it will always be out of place.

I'm not sure why that distinction existed previously, but it seems like this change could preserve the in-place subtraction in the else clause?
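
To make the distinction concrete, here is a Python analogue (my sketch, not the ATen kernel; `scratch` is a name I made up): an out-of-place subtraction allocates fresh storage for the difference, while the `out=` form writes into an existing buffer, which only pays off when the difference is consumed immediately by a reduction.

```python
import torch

input = torch.randn(4)
target = torch.randn(4)

# Out of place: a new tensor is allocated to hold the difference.
diff = input.sub(target)

# In place into an existing buffer: no fresh allocation. Safe only
# because the reduction consumes the buffer immediately afterwards.
scratch = torch.empty_like(input)
torch.sub(input, target, out=scratch)
loss = scratch.abs_().mean()
```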

@@ -7056,11 +7056,12 @@ def test_pointwise_loss_broadcast(self):
# https://github.com/pytorch/pytorch/issues/27692 reports
# that l1_loss gets a wrong result for big batch size
def test_l1_loss_correct(self):
Collaborator commented:

Is there no other L1 loss test that should be updated?

This is making me think we should (eventually) create `test_losses.py`.
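
For reference, here is a sketch of the kind of complex-dtype test the comment is asking about; the test name and shapes are my invention, patterned on `test_l1_loss_correct` above, and nothing here is from the PR itself. Against a zero target, the expected L1 loss is simply the mean complex modulus of the input.

```python
import torch

# Hypothetical complex counterpart of test_l1_loss_correct.
def test_l1_loss_correct_complex(self):
    for N in (1, 2, 10):
        input = torch.rand(N, 3, 1024, 1024, dtype=torch.cfloat)
        self.assertEqual(
            torch.nn.L1Loss()(input, torch.zeros_like(input)),
            input.abs().mean())
```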

facebook-github-bot pushed a commit that referenced this pull request Jan 15, 2021
Summary:
Building on top of the work of anjali411 (#46640)

Things added in this PR:
1. Modify backward and double-backward formulas
2. Add complex support for `new module tests` and criterion tests (and add complex tests for L1)
3. Modify some existing tests to support complex

Pull Request resolved: #49912

Reviewed By: zhangguanheng66

Differential Revision: D25853036

Pulled By: soulitzer

fbshipit-source-id: df619f1b71c450ab2818eb17804e0c55990aa8ad
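
As a quick illustration of what the reworked backward formula makes possible (my sketch, assuming a PyTorch build that contains #49912): for a complex difference z, the subgradient of |z| used by autograd generalizes sign(z) to sgn(z) = z/|z|, and `gradcheck` can verify the formula numerically.

```python
import torch
import torch.nn.functional as F

# Sketch (assumes complex L1 backward support from #49912). gradcheck
# needs double precision, hence cdouble.
input = torch.randn(3, dtype=torch.cdouble, requires_grad=True)
target = torch.randn(3, dtype=torch.cdouble)
assert torch.autograd.gradcheck(
    lambda x: F.l1_loss(x, target), (input,))
```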
@github-actions bot commented:

Looks like this PR hasn't been updated in a while so we're going to go ahead and mark this as Stale.
Feel free to remove the Stale label if you feel this was a mistake.
If you are unable to remove the Stale label please contact a maintainer in order to do so.
If you want the bot to never mark this PR stale again, add the no-stale label.
Stale pull requests will automatically be closed after 30 days of inactivity.

@github-actions github-actions bot added the Stale label Apr 13, 2022
@anjali411 anjali411 closed this Apr 21, 2022
@facebook-github-bot facebook-github-bot deleted the gh/anjali411/68/head branch May 22, 2022 14:17