Commit
fix a bug in LeakyReLU (#48265)
Summary:
The scale buffer needs to be a scalar (a 0-dimensional tensor); otherwise the following error is raised: "RuntimeError: Cannot input a tensor of dimension other than 0 as a scalar argument"

Pull Request resolved: #48265

Test Plan: Tested locally and the error disappeared.

Reviewed By: zhizhengwu

Differential Revision: D25105423

Pulled By: jerryzh168

fbshipit-source-id: 2a0df24cf7e40278a950bffe6e0a9552f99da1d1
Zhi-Zheng Wu authored and facebook-github-bot committed Nov 20, 2020
1 parent 998c4ca commit 7828a22
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions torch/nn/quantized/modules/activation.py
@@ -97,8 +97,8 @@ class LeakyReLU(torch.nn.LeakyReLU):
"""
def __init__(self, scale: float, zero_point: int, negative_slope: float = 1e-2, inplace: bool = False):
super().__init__(negative_slope, inplace)
self.register_buffer('scale', torch.tensor([scale]))
self.register_buffer('zero_point', torch.tensor([zero_point]))
self.register_buffer('scale', torch.tensor(scale))
self.register_buffer('zero_point', torch.tensor(zero_point))

def forward(self, input):
return torch.ops.quantized.leaky_relu(
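The distinction behind this fix: `torch.tensor([scale])` wraps the value in a list and produces a 1-dimensional tensor of shape `(1,)`, while `torch.tensor(scale)` produces a 0-dimensional tensor, which is what operators expecting a scalar argument accept. A minimal sketch of the difference (the variable names here are illustrative, not from the patch):

```python
import torch

scale = 0.1

# Pre-fix form: wrapping the float in a list yields a 1-dim tensor of shape (1,).
t_wrapped = torch.tensor([scale])

# Post-fix form: passing the bare float yields a 0-dim (scalar) tensor.
t_scalar = torch.tensor(scale)

print(t_wrapped.dim(), t_wrapped.shape)  # 1 torch.Size([1])
print(t_scalar.dim(), t_scalar.shape)    # 0 torch.Size([])
```

Passing the 1-dimensional form where a scalar is required is what triggered the "Cannot input a tensor of dimension other than 0 as a scalar argument" error described in the summary.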
