
BatchNorm layer failed with width==height==1 #14

Closed
tensor-tang opened this issue Dec 26, 2016 · 5 comments
Labels
bug A confirmed library bug

Comments

@tensor-tang
Contributor

tensor-tang commented Dec 26, 2016

Found that the gtest for the BN layer fails when manually setting h==w==1.

Line 406 of tests/gtests/test_batch_normalization.cpp:

INST_TEST_CASE(Simple_NCHW,
    PARAMS(nchw, nchw, 2, 10, 1, 1, EPS)
);
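For context, a naive reference forward pass for batch normalization over an NCHW buffer looks roughly like the sketch below. This is illustrative only (the function name and the omission of scale/shift are my simplifications, not the test's actual reference code); the point is that with H == W == 1 the per-channel statistics reduce to statistics over the N dimension alone.

```cpp
// Illustrative sketch, not the library's test code.
#include <cmath>
#include <cstdio>
#include <vector>

// y[n,c,h,w] = (x[n,c,h,w] - mean[c]) / sqrt(var[c] + eps)
void bnorm_ref_nchw(const std::vector<float> &x, std::vector<float> &y,
                    int N, int C, int H, int W, float eps) {
    const int sp = H * W;
    for (int c = 0; c < C; ++c) {
        // Mean and variance are computed per channel over N*H*W elements;
        // with H == W == 1 that is just N elements per channel.
        float mean = 0.f, var = 0.f;
        for (int n = 0; n < N; ++n)
            for (int s = 0; s < sp; ++s)
                mean += x[(n * C + c) * sp + s];
        mean /= (float)(N * sp);
        for (int n = 0; n < N; ++n)
            for (int s = 0; s < sp; ++s) {
                float d = x[(n * C + c) * sp + s] - mean;
                var += d * d;
            }
        var /= (float)(N * sp);
        for (int n = 0; n < N; ++n)
            for (int s = 0; s < sp; ++s)
                y[(n * C + c) * sp + s] =
                    (x[(n * C + c) * sp + s] - mean) / std::sqrt(var + eps);
    }
}

int main() {
    // The failing configuration from the test case above: N=2, C=10, H=W=1.
    const int N = 2, C = 10, H = 1, W = 1;
    std::vector<float> x(N * C * H * W), y(x.size());
    for (size_t i = 0; i < x.size(); ++i) x[i] = 0.1f * (float)i;
    bnorm_ref_nchw(x, y, N, C, H, W, 1e-5f);
    std::printf("y[0] = %g\n", y[0]);
    return 0;
}
```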
@rsdubtso added the bug (A confirmed library bug) label Dec 26, 2016
@rsdubtso

rsdubtso commented Dec 26, 2016

Thanks. Reproduced. Will try to get this fixed this week.

@tensor-tang
Contributor Author

Thanks.

@rsdubtso

rsdubtso commented Jan 6, 2017

Sorry for the delay. I haven't committed a fix yet, but I have some news. This looks like a test issue: the bnorm kernel precomputes some values as an optimization, while the reference computation in the test does not. Also, depending on the compiler, the test may or may not use FMA instructions. I'm looking into a way to change the test to allow such behavior without relaxing the accuracy requirements too much.
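To illustrate the kind of discrepancy described above (a hypothetical sketch, not code from the library or the test): precomputing the inverse standard deviation once and multiplying involves a different sequence of roundings than dividing each element by sqrt(var + eps), and a compiler that contracts a*b + c into a single FMA changes the rounding in the last bits again.

```cpp
// Hypothetical sketch of why an optimized kernel and a naive reference
// can disagree in the last ULPs even though both are "correct".
#include <cmath>
#include <cstdio>

int main() {
    const float eps  = 1e-5f;
    const float mean = 0.3333333f;
    const float var  = 0.0123456f;
    const float x    = 1.0000001f;

    // Reference-style computation: divide by sqrt(var + eps) per element.
    float ref = (x - mean) / std::sqrt(var + eps);

    // Kernel-style computation: precompute the inverse standard deviation
    // once, then multiply. The extra rounding of the reciprocal (and any
    // FMA contraction the compiler applies elsewhere) can shift the result
    // by a few ULPs relative to the reference.
    float inv_std = 1.0f / std::sqrt(var + eps);
    float opt = (x - mean) * inv_std;

    std::printf("ref = %.9g\nopt = %.9g\ndiff = %g\n", ref, opt, ref - opt);
    return 0;
}
```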

@tensor-tang
Contributor Author

tensor-tang commented Jan 6, 2017

I computed this iw==ih==1 case with ATLAS externally, but got a different result with the same input data, while all the other cases match.

@rsdubtso

Not that I'm proud of this, but I had to resort to relaxing the precision requirements... I've added some 1x1 cases which now pass. Closing... (note to self: close this via a commit message next time)
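For reference, a relaxed element-wise check of this kind typically compares the error relative to the magnitude of the expected value. The sketch below is hypothetical; the threshold value and exact formula are assumptions, not the test's actual tolerance.

```cpp
// Hypothetical relaxed comparison, not the test's actual check.
#include <cassert>
#include <cmath>

static void check_close(float ref, float got, float threshold) {
    // Error relative to the reference magnitude, with an absolute
    // fallback near zero so tiny reference values do not blow up.
    float denom = std::fmax(std::fabs(ref), 1.0f);
    assert(std::fabs(ref - got) / denom <= threshold);
}

int main() {
    // Passes under a relaxed threshold even though the values differ
    // in the last bits, as in the 1x1 bnorm cases.
    check_close(1.2345678f, 1.2345680f, 1e-5f);
    return 0;
}
```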
