
Fix elu backward operation for negative alpha #49272

Closed
wants to merge 10 commits

Conversation

@H-Huang (Member) commented Dec 12, 2020

Fixes #47671

Test:

import torch

x = torch.tensor([-2, -1, 0, 1, 2], dtype=torch.float32, requires_grad=True)
y = torch.nn.functional.elu_(x.clone(), alpha=-2)  # in-place forward with negative alpha
grads = torch.ones_like(y)  # ones_like already returns a tensor; no torch.tensor() wrapper needed
y.backward(grads)

# Raises:
# RuntimeError: In-place elu backward calculation is triggered with a negative slope which is not supported.
# This is caused by calling in-place forward function with a negative slope, please call out-of-place
# version instead.

@H-Huang marked this pull request as draft December 12, 2020 00:26
@H-Huang force-pushed the EluBackward branch 2 times, most recently from 0e7975a to fa19fa0, December 14, 2020 14:54
@H-Huang marked this pull request as ready for review December 14, 2020 15:04
@albanD (Collaborator) commented Dec 14, 2020

Hi,

Thanks for sending a patch.
The forward computation is correct, right, even when alpha is negative?
And the backward would work as well, as long as the forward was not in-place and we use the input properly?
So I think doing something similar to leaky relu here would let users keep using negative alpha, as long as they don't use it in-place.
You can update the backward formula to use the boolean flag set in the formula, so that it only errors out in the unsupported case and lets all the others work!
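
For context, here is a minimal Python sketch (illustrative only, not the PR's actual code) of the two ways the ELU gradient can be computed: from the saved input, which works for any alpha, or from the saved output, which breaks for negative alpha because negative inputs then produce positive outputs.

import torch

def elu_grad_from_input(grad_out, x, alpha):
    # d/dx elu(x) = 1 for x > 0, alpha * exp(x) for x <= 0.
    # The input's sign is known here, so any alpha is fine.
    return grad_out * torch.where(x > 0, torch.ones_like(x), alpha * x.exp())

def elu_grad_from_output(grad_out, result, alpha):
    # For x <= 0, result = alpha * (exp(x) - 1), hence alpha * exp(x) = result + alpha.
    # Inferring "x > 0" from "result > 0" is only valid for alpha >= 0; with
    # alpha < 0, negative inputs yield positive outputs and the wrong branch is taken.
    return grad_out * torch.where(result > 0, torch.ones_like(result), result + alpha)

For example, with alpha = -2 and x = -1, the input-based formula gives -2 * exp(-1) ≈ -0.74, while the output-based one sees result ≈ 1.26 > 0 and wrongly returns a gradient of 1.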

@H-Huang (Member, Author) commented Dec 15, 2020

Hi @albanD, thank you for the feedback. According to the example in #47671, we are hitting an issue in the backward calculation when alpha is negative, even when the forward was out-of-place. So I updated the PR to check whether alpha is negative OR the forward was done in-place, and if so to prevent the backward calculation. Is this understanding correct?

In the leaky relu case, it looks like it only checks whether the slope is negative AND the forward was in-place, which is slightly different.

@albanD (Collaborator) commented Dec 15, 2020

> we are hitting an issue in the backward calculation when alpha is negative, even when the forward was out-of-place

I think this only happens because the formula here uses result instead of self. If you use self there, the out-of-place version would work for negative alpha.
But as mentioned in the other PR, this would make the in-place version (which by default uses the same formula from derivatives.yaml if none is provided) use extra memory, which we don't want. Hence the explicit in-place formula that sets the flag indicating it is passing result instead of self (and thus allows raising a nice error in the one case we cannot support).
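
A rough sketch of the dispatch described above (hypothetical Python with illustrative names; the actual change lives in derivatives.yaml and the ATen backward): the backward receives a boolean flag saying whether it was handed the result (in-place forward, input overwritten) or the original input, and errors out only in the one unsupported combination.

import torch

def elu_backward(grad_out, alpha, is_result, self_or_result):
    if is_result:
        # In-place forward: only the output survives, and its sign cannot tell
        # us which branch of elu was taken when alpha is negative.
        if alpha < 0:
            raise RuntimeError(
                "In-place elu backward calculation is triggered with a negative "
                "slope which is not supported.")
        result = self_or_result
        return grad_out * torch.where(result > 0, torch.ones_like(result),
                                      result + alpha)
    # Out-of-place forward: the original input is available, so any alpha works.
    x = self_or_result
    return grad_out * torch.where(x > 0, torch.ones_like(x), alpha * x.exp())

This mirrors the leaky relu behavior noted earlier: the error fires only when the slope is negative AND the forward was in-place.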

@H-Huang changed the title from "Add check for negative alphas for elu" to "Disable in-place elu backward calculation with negative alpha" Dec 15, 2020
@H-Huang (Member, Author) commented Dec 15, 2020

I see, thank you for the helpful explanations @albanD! Updated the PR accordingly.

@facebook-github-bot (Contributor) left a comment

@H-Huang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@albanD (Collaborator) commented Dec 16, 2020

Let me know when CI is green and this is ready for a final review :)

@H-Huang (Member, Author) commented Dec 22, 2020

@albanD Can you take a look whenever you get the chance?

Two review threads on tools/autograd/derivatives.yaml (outdated, resolved)
@H-Huang force-pushed the EluBackward branch 3 times, most recently from 211afc3 to 7115db6, January 8, 2021 23:44
@H-Huang changed the title from "Disable in-place elu backward calculation with negative alpha" to "Fix elu backward operation for negative alpha" Jan 9, 2021
@albanD (Collaborator) left a comment

LGTM!

@facebook-github-bot (Contributor) commented

@H-Huang merged this pull request in ec51b67.

Successfully merging this pull request may close these issues: A problem about elu_backward (#47671)

4 participants