[inductor] Added smooth_l1_loss refs #102077

Closed
wants to merge 3 commits from inductor-refs-smooth_l1_loss

Conversation

vfdev-5 (Collaborator) commented May 23, 2023

pytorch-bot (bot) commented May 23, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/102077

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit c6ade94:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

lezcano (Collaborator) left a comment

Just a minor point, otherwise this LGTM. Approved contingent on that and the tests passing.

Review thread: torch/_refs/nn/functional/__init__.py (resolved)
Review thread: test/test_decomp.py (outdated, resolved)
@vfdev-5 vfdev-5 force-pushed the inductor-refs-smooth_l1_loss branch from 827078e to 46b3cb8 on May 23, 2023 14:50
@vfdev-5 vfdev-5 marked this pull request as ready for review May 23, 2023 19:39
reduction = _get_string_reduction_arg(size_average=size_average, reduce=reduce)
_check_reduction_value(reduction)

if beta == 0.0:
Collaborator:

why do you need this conditional?

Collaborator (Author):

Python nn functional API does the same:

pytorch/torch/nn/functional.py, lines 3242 to 3245 in 76af221:

if beta == 0.0:
    return torch._C._nn.l1_loss(expanded_input, expanded_target, _Reduction.get_enum(reduction))
else:
    return torch._C._nn.smooth_l1_loss(expanded_input, expanded_target, _Reduction.get_enum(reduction), beta)
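
For reference, smooth L1 loss is defined piecewise per element:

    0.5 * (input - target)**2 / beta    if |input - target| < beta
    |input - target| - 0.5 * beta       otherwise

The quadratic branch divides by beta, so dispatching straight to l1_loss when beta == 0 avoids a division by zero and matches the beta -> 0 limit, which is exactly L1 loss.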

Collaborator:

I don't think we have enough logic to optimise this out in practice, as it would mean that we need to prove that (input - target).abs() is not negative. The conditional is alright for now.

Collaborator:

wdym? .abs() is non-negative. The functional API does this due to some numeric discrepancies in backward; that doesn't apply here.

Collaborator:

Ok, my point simply comes from a perf perspective, where we would be computing both branches of the where and just using one, but probably LLVM should be able to catch the < 0 after .abs() and optimise it out.

That being said, I still think that keeping this closer to core is better, as we could think of eventually registering this operation and simply differentiating through it to get its backward. This beta==0 specialisation would make sure that this works in that case, as it does in master.
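
To make the discussion concrete, below is a minimal standalone sketch of the kind of decomposition being talked about; the function name, signature, and reduction handling are illustrative assumptions, not the exact code added by this PR:

import torch

def smooth_l1_loss_sketch(input, target, reduction="mean", beta=1.0):
    # Illustrative sketch only: mirrors the eager functional API's beta == 0 special case.
    if beta == 0.0:
        # Fall back to plain L1 loss; this sidesteps the 0.5 * diff**2 / beta branch,
        # which would otherwise divide by zero.
        return torch.nn.functional.l1_loss(input, target, reduction=reduction)
    diff = torch.abs(input - target)
    # torch.where computes both branches elementwise and selects between them;
    # abs() guarantees diff >= 0, so only the comparison against beta decides the branch.
    loss = torch.where(diff < beta, 0.5 * diff * diff / beta, diff - 0.5 * beta)
    if reduction == "mean":
        return loss.mean()
    if reduction == "sum":
        return loss.sum()
    return loss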

Review thread: torch/_refs/nn/functional/__init__.py (outdated, resolved)
@vfdev-5 vfdev-5 force-pushed the inductor-refs-smooth_l1_loss branch from 46b3cb8 to c6ade94 on May 24, 2023 08:48
vfdev-5 (Collaborator, Author) commented May 24, 2023

@pytorchbot merge

pytorch-bot (bot) added the ciflow/trunk label (Trigger trunk jobs on your pull request) on May 24, 2023
pytorchmergebot (Collaborator):

Merge failed

Reason: This PR needs a release notes: label
If your changes are user facing and intended to be a part of release notes, please use a label starting with release notes:.

If not, please add the topic: not user facing label.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "topic: not user facing"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.


vfdev-5 (Collaborator, Author) commented May 24, 2023

@pytorchbot merge

pytorchmergebot (Collaborator):

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@vfdev-5 vfdev-5 deleted the inductor-refs-smooth_l1_loss branch May 24, 2023 15:27