A problem about elu_backward #47671
Labels
high priority
module: autograd
Related to torch.autograd, and the autograd engine in general
module: bc-breaking
Related to a BC-breaking change
triaged
This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
🐛 Bug
Hi, when running the elu_backward operator, I have some confusion about the result when the param alpha is negative. I am not sure whether there is a bug here.
To Reproduce
Steps to reproduce the behavior:
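The concrete steps were not preserved here, so below is a minimal sketch that should reproduce the confusion. The value alpha = -1.0 and the comparison against the analytic derivative alpha * exp(x) are my assumptions, not the original steps:

```python
import torch

# ELU with a negative alpha (assumed value); for x < 0 the analytic
# derivative of ELU is alpha * exp(x)
alpha = -1.0
x = torch.linspace(-5.0, -0.1, 10, requires_grad=True)

y = torch.nn.functional.elu(x, alpha=alpha)
y.backward(torch.ones_like(x))

expected = alpha * torch.exp(x.detach())  # analytic gradient for x < 0
print("autograd grad:", x.grad)
print("analytic grad:", expected)
```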
Expected behavior
When the param alpha is negative and the input x is less than zero, the elu_backward result should lie between alpha and 0 if we set the incoming grads to ones. In other words, the result should be alpha*exp(x) in this case.
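For reference, a short derivation of that expectation, assuming the standard ELU definition with scale = 1:

```latex
\mathrm{ELU}(x) =
\begin{cases}
x, & x > 0 \\
\alpha \left( e^{x} - 1 \right), & x \le 0
\end{cases}
\qquad \Longrightarrow \qquad
\frac{d}{dx}\,\mathrm{ELU}(x) =
\begin{cases}
1, & x > 0 \\
\alpha\, e^{x}, & x \le 0
\end{cases}
```

For x < 0 we have e^x in (0, 1), so alpha * e^x lies strictly between alpha and 0 whenever alpha < 0. Note also the identity alpha * e^x = ELU(x) + alpha, which matters below because a backward pass that only saves the output can use it to recover the gradient.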
Actual behavior
Environment
How you installed PyTorch (conda, pip, source): pip3
Additional context
Torch CPU code implementation link.
Torch GPU code implementation link.
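As far as I can tell from the linked implementations, when the backward is computed from the saved output (the in-place / is_result path) it identifies the negative-input region by checking the sign of the output. This is only my reading of the code, so the sketch below is a hypothetical reconstruction in Python showing why that sign check would go wrong when alpha is negative:

```python
import torch

def elu_backward_from_output_sketch(grad, output, alpha):
    # Hypothetical reconstruction of an output-based backward:
    # for x < 0 the ELU output is alpha*(exp(x)-1), so the gradient
    # alpha*exp(x) equals output + alpha, and the negative-input region
    # is (for alpha > 0) identified by output <= 0.
    return torch.where(output <= 0, grad * (output + alpha), grad)

alpha = -1.0
x = torch.tensor([-2.0, -1.0, -0.5])
output = alpha * (torch.exp(x) - 1)  # ELU output for x < 0

# With alpha < 0, the output for negative inputs is *positive*, so the
# `output <= 0` test routes these elements to the identity branch.
print(elu_backward_from_output_sketch(torch.ones_like(x), output, alpha))
print(alpha * torch.exp(x))          # the analytic gradient
```

If that reading is right, it would explain getting a gradient of 1 where alpha*exp(x) was expected.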
cc @ezyang @gchanan @zou3519 @bdhirsh @albanD @gqchen @pearu @nikitaved