Make binary_cross_entropy_with_logits composite compliant #75929
Conversation
💊 CI failures summary and remediations: as of commit a50229c (more details on the Dr. CI page), 1 failure was not recognized by patterns. The following CI failures may be due to changes from the PR.
This comment was automatically generated by Dr. CI. Please report bugs/suggestions to the (internal) Dr. CI Users group.
It is a bit worrying that the tests don't catch the fact that weights are ignored here :(
Force-pushed from 770a730 to 79d7bf2.
That is a very good question! cc @zou3519 ?
It depends on which ones. You can only do in-place ops with Tensors that are guaranteed to be of the same "type".
Composite compliance applies to CompositeImplicitAutograd PyTorch operations, or to a function (in Python or C++) that calls PyTorch operations (a "composite"). binary_cross_entropy_with_logits's backward formula is not a PyTorch operator; it's a C++ function that calls PyTorch operations. The backward formula is the thing that is not composite compliant.
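To illustrate the point being discussed, here is a minimal stdlib-only sketch (not the actual PyTorch implementation; the `Tensor` and `LoggingTensor` classes are hypothetical stand-ins) of why an in-place op inside a composite formula can silently drop a tensor subclass, while the out-of-place version can honor it:

```python
# Hypothetical sketch: in-place ops force the result to be type(self),
# so a subclass appearing on the *other* operand is silently lost.

class Tensor:
    def __init__(self, data):
        self.data = list(data)

    def mul(self, other):
        # Out-of-place: the result type can honor a subclass on either side.
        if issubclass(type(other), type(self)):
            result_cls = type(other)
        else:
            result_cls = type(self)
        return result_cls(a * b for a, b in zip(self.data, other.data))

    def mul_(self, other):
        # In-place: mutates self's storage, so the result must be
        # type(self); subclass information on `other` is dropped.
        self.data = [a * b for a, b in zip(self.data, other.data)]
        return self

class LoggingTensor(Tensor):
    """Hypothetical subclass standing in for e.g. a __torch_dispatch__ wrapper."""

plain = Tensor([1.0, 2.0])
sub = LoggingTensor([3.0, 4.0])

out_of_place = plain.mul(sub)            # result is a LoggingTensor
in_place = Tensor([1.0, 2.0]).mul_(sub)  # result stays a plain Tensor
```

This is the hazard a composite backward formula has to avoid: if it uses `mul_`-style ops on plain intermediates, a subclass flowing in through the grad output never propagates to the result.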
Should I change the forward formula back to what it was, with mostly in-place operations that aren't switched on subclass tensors, and keep the backward update?
@zou3519 So the fixes need to exist only in
This got me thinking, and I found I had written a terrible sampling function, which I think I corrected, allowing me to remove a few more skips.
Force-pushed from f4dcd26 to 2204344.
Yes, sorry for the delayed reply.
Yes... I thought it was there but apparently it's not
Force-pushed from 2204344 to 92b0386.
Fix for composite compliance LGTM. Since we're modifying the OpInfo, I have some questions about the semantics of the operator.
lint error trying the original version
Add back some skips
Force-pushed from e450519 to febb26f.
Just rebased onto master, and this skip is now failing for performing in-place ops on a tensor subclass. I am going to add the skip back and not update the forward pass; see the conversation above.
Thank you!
@pytorchbot merge this please
Hey @drisspg. |
Summary: Fixes one of the tests related to issue #75680. Removes the skip for the composite compliance test_backward and updates the implementation of binary_cross_entropy_with_logits so that the correct in-place and out-of-place operators are used depending on whether tensor subclasses are present.
Pull Request resolved: #75929
Approved by: https://github.com/zou3519
Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/368430036eefd7a4a440678286b7fbc3762122b9
Reviewed By: osalpekar
Differential Revision: D35971230
Pulled By: drisspg
fbshipit-source-id: 9d6e0164ffe1f4254da0a708e0b9999d8dc16388
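For background on the operator itself, the loss that binary_cross_entropy_with_logits computes can be sketched in pure Python. This is a simplified, scalar illustration of the standard numerically stable formulation (it ignores the `weight`/`pos_weight` arguments and reduction modes, and is not the PyTorch source):

```python
import math

def bce_with_logits(x, z):
    """Numerically stable binary cross-entropy of logit x against target z.

    Mathematically equal to -z*log(sigmoid(x)) - (1-z)*log(1 - sigmoid(x)),
    but avoids overflow for large |x| by using the identity
    loss = max(x, 0) - x*z + log(1 + exp(-|x|)).
    Simplified sketch: no weight/pos_weight, no reduction.
    """
    return max(x, 0.0) - x * z + math.log1p(math.exp(-abs(x)))

def naive_bce(x, z):
    """Naive formulation, fine for moderate logits but unstable for large |x|."""
    s = 1.0 / (1.0 + math.exp(-x))
    return -z * math.log(s) - (1.0 - z) * math.log(1.0 - s)

# The two agree at moderate logits; the stable form also works
# where the naive one would overflow or take log(0).
stable = bce_with_logits(2.0, 1.0)
naive = naive_bce(2.0, 1.0)
large = bce_with_logits(1000.0, 0.0)  # naive_bce(1000.0, 0.0) would fail
```

The stable form is why the operator is preferred over composing `sigmoid` with `binary_cross_entropy` by hand.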