cov: add tests for ignite.contrib.engines #1808
Conversation
@ydcjeff Thank you!
@ydcjeff Windows tests are KO. Could you have a look?
@sdesrozis Windows tests are failing on:
=========================== short test summary info ===========================
FAILED tests/ignite/contrib/metrics/test_average_precision.py::test_binary_input_N
FAILED tests/ignite/contrib/metrics/test_average_precision.py::test_multilabel_input_N
The error is that the metric computes nan:
>       assert average_precision_score(np_y, np_y_pred) == pytest.approx(res)
E       assert nan == nan ± ???
E       +nan
E       -nan ± ???
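(A minimal sketch, not the actual ignite test, of why the report reads "assert nan == nan ± ???": pytest.approx does not treat nan as equal to nan unless nan_ok=True is passed, so any metric that ends up computing nan fails such an assert.)

```python
import math
import pytest

# nan is never equal to itself, and pytest.approx keeps that behaviour
# unless nan_ok=True is requested explicitly.
res = float("nan")
assert math.isnan(res)

print(res == res)                              # False: nan never equals itself
print(res == pytest.approx(res))               # False as well, hence the failed assert
print(res == pytest.approx(res, nan_ok=True))  # True once nan is explicitly allowed
```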
@KickItLikeShika have you seen this?
@vfdev-5 This never happened before.
@vfdev-5 I don't think it's related to the metric computation itself; please restart the CI to see more details.
First, you can check out this PR locally and try to reproduce the issue if you have a Win32 machine. I think it is related: with the new tests we have altered the random state, and it generates corner-case samples...
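(A minimal sketch of the "altered random state" point; the seed and sizes below are arbitrary. Under a fixed seed, what torch.randint returns depends on how many draws preceded it, so newly added tests can shift the samples seen by later tests.)

```python
import torch

torch.manual_seed(12)
a = torch.randint(0, 2, size=(10,))   # draw taken directly after seeding

torch.manual_seed(12)
_ = torch.randint(0, 2, size=(5,))    # extra draw, e.g. from a newly added test
b = torch.randint(0, 2, size=(10,))   # the "same" draw now sees a different generator state

print(torch.equal(a, b))              # typically False: the extra call shifted the stream
```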
@vfdev-5 I'm using Linux; I will investigate more and see what exactly the problem is.
@fco-dv could you help track this issue on Windows? Thanks!
Thanks for the update @ydcjeff
I've checked out the PR; those tests pass on my side:
@fco-dv try the complete test suite; maybe the random seed is not the same between the reduced and full runs.
@vfdev-5 I've launched the tests with
@fco-dv Sorry for that, but could you try reproducing the CI, maybe in a Docker container? IMO it would be useful to have this Dockerfile in the repo to ease tracking CI errors on Windows (I suppose).
@KickItLikeShika could you please inspect this failure on Windows with
@vfdev-5 @KickItLikeShika Could this be the failing case?
>>> y_pred = torch.randint(0, 2, size=(10,)).long()
>>> n_iters = 16
>>> idx
13
>>> y_pred[idx : idx + batch_size]
tensor([], dtype=torch.int64)
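(A hedged reconstruction of the suspected corner case; batch_size and the seed below are hypothetical, only the tensor size and the offset match the snippet above. Slicing past the end of the data returns an empty tensor, so that iteration would update the metric with a batch containing no samples.)

```python
import torch

torch.manual_seed(0)                   # hypothetical seed, just for reproducibility
y_pred = torch.randint(0, 2, size=(10,)).long()

batch_size = 2                         # hypothetical; the real test derives it differently
idx = 13                               # offset past the end of the 10-element tensor, as above

batch = y_pred[idx : idx + batch_size]
print(batch)                           # tensor([], dtype=torch.int64)
print(batch.numel())                   # 0 -> an empty update for that iteration
```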
Thanks a lot @ydcjeff! @KickItLikeShika can you make a pass over all contrib tests and fix the issues we already discussed in some of your previous PRs?
@vfdev-5 I will do that.
Thanks a lot @ydcjeff!
Part of #1790
Description: Add tests for ignite.contrib.engines

Check list: