Conversation

@y0ast (Contributor) commented Feb 22, 2021

Fixes #1674

This is a proposed solution to the issue linked above. I am not 100% sure of the implications of this change.

However, if it is deemed acceptable, I hope it can be part of a 0.4.4 bugfix release.
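
For illustration, here is a minimal sketch of what the change means for a `loss_fn` used with the `Loss` metric (a simplified sketch of the behavior, not the actual metric internals):

```python
import torch
import torch.nn.functional as F

def loss_fn(y_pred, y):
    # A loss that needs the autograd graph, e.g. to compute a gradient penalty.
    return F.mse_loss(y_pred, y)

y_pred = torch.randn(4, 1, requires_grad=True)
y = torch.randn(4, 1)

# Before this PR: Loss detached the inputs, so loss_fn never saw the graph.
assert not loss_fn(y_pred.detach(), y.detach()).requires_grad

# After this PR: loss_fn receives graph-connected tensors; only the returned
# value is detached when the metric accumulates it.
assert loss_fn(y_pred, y).requires_grad
```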

@vfdev-5 (Collaborator) left a comment

Looks good, thanks @y0ast!

@vfdev-5 vfdev-5 changed the title Move detach outside of loss function computation [BC-breaking] Move detach outside of loss function computation Feb 22, 2021
@vfdev-5 vfdev-5 requested a review from sdesrozis February 22, 2021 16:30
@sdesrozis (Contributor) left a comment

@y0ast Thank you!

@vfdev-5 vfdev-5 merged commit 7b4da22 into pytorch:master Feb 23, 2021
@vfdev-5 (Collaborator) commented Feb 23, 2021

@y0ast could you please send a follow-up PR that emulates your DUQ evaluator example, so that we can verify this fix? Thanks

@y0ast (Contributor, Author) commented Feb 23, 2021

@vfdev-5 do you mean in the form of a test? Or something else?

@vfdev-5 (Collaborator) commented Feb 23, 2021

@y0ast yes, exactly: a test which verifies that grads can still flow even when using Loss. Sorry, I didn't mention that clearly.
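
Something along these lines, perhaps (a rough sketch only; the test name and assertions are a suggestion, and it assumes the default `output_transform` with an `(y_pred, y)` tuple):

```python
import torch
import torch.nn.functional as F
from ignite.metrics import Loss

def test_loss_does_not_detach_inputs():
    seen = {}

    def loss_fn(y_pred, y):
        # Record whether the metric passed us graph-connected tensors.
        seen["requires_grad"] = y_pred.requires_grad
        return F.mse_loss(y_pred, y)

    metric = Loss(loss_fn)
    y_pred = torch.randn(4, 1, requires_grad=True)
    y = torch.randn(4, 1)
    metric.update((y_pred, y))

    assert seen["requires_grad"]    # grads can still flow through loss_fn
    assert metric.compute() >= 0.0  # and the metric still accumulates a value
```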

vfdev-5 pushed a commit that referenced this pull request Mar 1, 2021
Fixes #1674

Co-authored-by: Sylvain Desroziers <sylvain.desroziers@gmail.com>
Successfully merging this pull request may close these issues: Loss automatically detaching inputs breaks some workflows (#1674)