
[Bug] [PyTorch] BatchNorm gives incorrect output with eps=-1.0 #12590

@crawlingcub

Description


The BatchNorm operator in TVM's PyTorch frontend fails when eps is negative. I modified the test_forward_batchnorm test in tests/python/frontend/pytorch/test_forward.py by adding a negative eps, and the test failed. Is this expected? Does the batchnorm implementation ignore the negative eps value?

Modified Test:

def test_forward_batchnorm():
    """test_forward_batchnorm"""

    def init_weight(m):
        torch.nn.init.normal_(m.weight, 0, 0.01)
        torch.nn.init.normal_(m.bias)

    inp_2d = torch.rand((1, 16, 10, 10))
    inp_3d = torch.rand((1, 16, 10, 10, 10))

    for bn, inp in [(torch.nn.BatchNorm2d(16, eps=-1.0), inp_2d), (torch.nn.BatchNorm3d(16, eps=-1.0), inp_3d)]:
        init_weight(bn.eval())
        verify_model(bn.eval(), input_data=inp)

Output:


        for bn, inp in [(torch.nn.BatchNorm2d(16, eps=-1.0), inp_2d), (torch.nn.BatchNorm3d(16, eps=-1.0), inp_3d)]:
            init_weight(bn.eval())
>           verify_model(bn.eval(), input_data=inp)

tests/python/frontend/pytorch/test_forward.py:1317:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/python/frontend/pytorch/test_forward.py:204: in verify_model
    tvm.testing.assert_allclose(baseline_output, output, rtol=rtol, atol=atol)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

actual = array([[[[ inf,  inf,  inf, ...,  inf,  inf,  inf],
         [ inf,  inf,  inf, ...,  inf,  inf,  inf],
         [ inf...       [ inf,  inf,  inf, ...,  inf,  inf,  inf],
         [ inf,  inf,  inf, ...,  inf,  inf,  inf]]]], dtype=float32)
desired = array([[[[nan, nan, nan, ..., nan, nan, nan],
         [nan, nan, nan, ..., nan, nan, nan],
         [nan, nan, nan, ....an, nan],
         [nan, nan, nan, ..., nan, nan, nan],
         [nan, nan, nan, ..., nan, nan, nan]]]], dtype=float32)
rtol = 1e-05, atol = 1e-05

    def assert_allclose(actual, desired, rtol=1e-7, atol=1e-7):
        """Version of np.testing.assert_allclose with `atol` and `rtol` fields set
        in reasonable defaults.

        Arguments `actual` and `desired` are not interchangeable, since the function
        compares the `abs(actual-desired)` with `atol+rtol*abs(desired)`.  Since we
        often allow `desired` to be close to zero, we generally want non-zero `atol`.
        """
        actual = np.asanyarray(actual)
        desired = np.asanyarray(desired)
        np.testing.assert_allclose(actual.shape, desired.shape)
>       np.testing.assert_allclose(actual, desired, rtol=rtol, atol=atol, verbose=True)
E       AssertionError:
E       Not equal to tolerance rtol=1e-05, atol=1e-05
E
E       x and y nan location mismatch:
E        x: array([[[[ inf,  inf,  inf, ...,  inf,  inf,  inf],
E                [ inf,  inf,  inf, ...,  inf,  inf,  inf],
E                [ inf,  inf,  inf, ...,  inf,  inf,  inf],...
E        y: array([[[[nan, nan, nan, ..., nan, nan, nan],
E                [nan, nan, nan, ..., nan, nan, nan],
E                [nan, nan, nan, ..., nan, nan, nan],...
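For reference, the nan/inf mismatch is likely not a tolerance issue but a symptom of a negative eps making the normalization ill-defined. A minimal NumPy sketch of the denominator in (x - mean) / sqrt(var + eps) (an illustration with hypothetical variances, not TVM's or PyTorch's actual kernel):

```python
import numpy as np

# Illustration only: BatchNorm normalizes as (x - mean) / sqrt(var + eps).
# With eps = -1.0, var + eps can be negative (sqrt -> nan) or exactly
# zero (1/sqrt -> inf), so implementations that factor the math
# differently can disagree (nan vs inf), matching the failure above.
eps = -1.0
var = np.array([0.5, 1.0, 2.0], dtype=np.float32)  # hypothetical variances

with np.errstate(invalid="ignore", divide="ignore"):
    denom = np.sqrt(var + eps)  # [nan, 0., 1.]
    invstd = 1.0 / denom        # [nan, inf, 1.]
```

In eval mode BatchNorm's running_var is initialized to 1.0, so var + eps is exactly 0.0 in this test; whether that surfaces as inf or nan then depends on how a given kernel folds the running mean and scale together (e.g. 0.0 * inf is nan).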

Expected output:

The test should pass.

Environment:

torch 1.8.0+cu111
torchvision 0.9.0+cu111
TVM version: latest
Python 3.7.12
