port test_accimage_xxx in test_transforms to pytest #3985
Conversation
test/test_transforms.py (Outdated)
assert np.abs((expected_output - output).mean()) < 1e-3
assert (expected_output - output).var() < 1e-5
# note the high absolute tolerance
self.assertTrue(np.allclose(output.numpy(), expected_output.numpy(), atol=5e-2))
I think this should be torch.testing.assert_close(output.numpy(), expected_output.numpy())
Maybe you'll need to adjust atol and rtol too.
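For context, a minimal, self-contained sketch of the suggested replacement (the tensors below are stand-ins for the accimage and PIL outputs in the real test, not the actual code from this PR):

```python
import torch

output = torch.rand(3, 4, 4)
expected_output = output.clone()

# assert_close raises with a detailed mismatch report instead of a bare assert,
# and uses dtype-dependent default tolerances unless rtol/atol are given.
torch.testing.assert_close(output.numpy(), expected_output.numpy())
```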
I've made the changes as suggested, but since I don't have accimage support, these tests are skipped on my local machine and I'm not sure how to verify the atol and rtol values. Any ideas?
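For reference, accimage-dependent tests in a pytest port are typically guarded with a skip marker along these lines (a sketch assuming an `accimage` import guard; not necessarily the exact decorator used in torchvision's test suite):

```python
import pytest

try:
    import accimage
except ImportError:
    accimage = None


@pytest.mark.skipif(accimage is None, reason="accimage not available")
def test_accimage_to_tensor():
    # test body runs only when accimage is installed
    ...
```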
You did well using numpy's default for rtol. Looks like the CI doesn't even run those tests anyway... that's not good lol.
Exactly! I had the same doubt, and I checked that CI was also skipping them, so I used the numpy default rtol and atol as before.
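For concreteness, the resulting call might look roughly like this (the rtol/atol values below are assumptions based on the discussion, not necessarily what was merged):

```python
import torch

output = torch.rand(3, 4, 4)
expected_output = output + 1e-3  # small perturbation, well within the tolerances

torch.testing.assert_close(
    output.numpy(),
    expected_output.numpy(),
    rtol=1e-5,  # numpy.allclose's default rtol (assumed, per the discussion)
    atol=5e-2,  # high atol carried over from the old np.allclose check (assumed)
)
```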
Thanks @AnirudhDagar !!
Reviewed By: fmassa
Differential Revision: D29097728
fbshipit-source-id: 14f243179f6617fc0ede63ee5bb68f94954a62b9
Closed #3982 in favour of this PR. This fixes Group B in #3945.
Extremely sorry for the confusion @NicolasHug.