
Conversation

@ydcjeff
Contributor

@ydcjeff ydcjeff commented Mar 17, 2021

Part of #1790

Description: Add tests for ignite.contrib.engines

Check list:

  • New tests are added (if a new feature is added)
  • New doc strings: description and/or example code are in RST format
  • Documentation is updated (if required)

@github-actions github-actions bot added the module: handlers Core Handlers module label Mar 17, 2021
Contributor

@sdesrozis sdesrozis left a comment

@ydcjeff Thank you!

@sdesrozis
Contributor

@ydcjeff Windows tests are failing. Could you have a look?

@ydcjeff
Contributor Author

ydcjeff commented Mar 18, 2021

@sdesrozis Windows tests are failing on test_average_precision:

=========================== short test summary info ===========================
FAILED tests/ignite/contrib/metrics/test_average_precision.py::test_binary_input_N
FAILED tests/ignite/contrib/metrics/test_average_precision.py::test_multilabel_input_N

The error is that the metric computes nan:

>       assert average_precision_score(np_y, np_y_pred) == pytest.approx(res)
E       assert nan == nan ± ???
E         +nan
E         -nan ± ???
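For context, IEEE 754 NaN never compares equal to anything, including itself, and `pytest.approx` keeps that behavior by default — which is exactly the mismatch the traceback shows. A minimal sketch:

```python
import pytest

nan = float("nan")

# IEEE 754: NaN is not equal to anything, including itself.
assert nan != nan

# pytest.approx preserves that by default, producing the failure above...
assert not (nan == pytest.approx(nan))

# ...unless NaN is explicitly allowed to match NaN.
assert nan == pytest.approx(nan, nan_ok=True)
```

So once `average_precision_score` returns nan, the equality check can never pass without `nan_ok=True` — the real question is why the metric produces nan at all.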

@vfdev-5
Collaborator

vfdev-5 commented Mar 19, 2021

> @sdesrozis Windows tests are failing on test_average_precision:
>
> =========================== short test summary info ===========================
> FAILED tests/ignite/contrib/metrics/test_average_precision.py::test_binary_input_N
> FAILED tests/ignite/contrib/metrics/test_average_precision.py::test_multilabel_input_N
>
> The error is that the metric computes nan:
>
>     >       assert average_precision_score(np_y, np_y_pred) == pytest.approx(res)
>     E       assert nan == nan ± ???
>     E         +nan
>     E         -nan ± ???

@KickItLikeShika have you seen this?

@KickItLikeShika
Contributor

@vfdev-5 This has never happened before.

@KickItLikeShika
Contributor

KickItLikeShika commented Mar 19, 2021

@vfdev-5 I don't think it's related to the metric computation itself; please restart the CI to see more details.

@vfdev-5
Collaborator

vfdev-5 commented Mar 19, 2021

> @vfdev-5 I don't think it's related to the metric computation itself; please restart the CI to see more details.

First, you can check out this PR locally and try to reproduce the issue if you have a Win32 machine. I think it is related: the new tests alter the random state, which generates corner-case samples...
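The "altered random state" hypothesis can be illustrated with a small sketch (plain `random` here, but the same holds for any shared global RNG such as torch's): with a single shared seed, every extra draw made by a newly added test shifts the values that later tests see.

```python
import random

# Baseline: three draws from a fixed seed.
random.seed(0)
baseline = [random.random() for _ in range(3)]

# Same seed, but a newly added test consumes one extra draw first.
random.seed(0)
_ = random.random()  # the extra draw
shifted = [random.random() for _ in range(3)]

# Later tests now see different "random" samples...
assert baseline != shifted
# ...because the stream is simply offset by one draw.
assert baseline[1] == shifted[0]
```

That offset is enough to push a previously benign test into a degenerate sample (e.g. all-zero labels) that makes the metric return nan.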

@KickItLikeShika
Contributor

@vfdev-5 I'm using Linux; I will investigate further and see what exactly the problem is.

@sdesrozis
Contributor

@fco-dv could you help track down this issue on Windows? Thanks!

Collaborator

@vfdev-5 vfdev-5 left a comment

Thanks for the update @ydcjeff

@fco-dv
Contributor

fco-dv commented Mar 25, 2021

> @fco-dv could you help track down this issue on Windows? Thanks!

I've checked out the PR; those tests pass on my side:

(ignite-dev) D:\git\ignite>pytest  tests/ignite/contrib/metrics/test_average_precision.py -vvv
================================================= test session starts =================================================
platform win32 -- Python 3.7.7, pytest-5.4.1, py-1.8.1, pluggy-0.13.1 -- d:\soft\miniconda3\envs\ignite-dev\python.exe
cachedir: .pytest_cache
rootdir: D:\git\ignite, inifile: setup.cfg
plugins: cov-2.8.1, forked-1.1.3, xdist-1.31.0
collected 15 items

tests/ignite/contrib/metrics/test_average_precision.py::test_no_sklearn PASSED                                   [  6%]
tests/ignite/contrib/metrics/test_average_precision.py::test_no_update PASSED                                    [ 13%]
tests/ignite/contrib/metrics/test_average_precision.py::test_input_types PASSED                                  [ 20%]
tests/ignite/contrib/metrics/test_average_precision.py::test_check_shape PASSED                                  [ 26%]
tests/ignite/contrib/metrics/test_average_precision.py::test_binary_input_N PASSED                               [ 33%]
tests/ignite/contrib/metrics/test_average_precision.py::test_multilabel_input_N PASSED                           [ 40%]
tests/ignite/contrib/metrics/test_average_precision.py::test_integration_binary_input_with_output_transform PASSED [ 46%]
tests/ignite/contrib/metrics/test_average_precision.py::test_integration_multilabel_input_with_output_transform PASSED [ 53%]
tests/ignite/contrib/metrics/test_average_precision.py::test_distrib_gpu SKIPPED                                 [ 60%]
tests/ignite/contrib/metrics/test_average_precision.py::test_distrib_cpu SKIPPED                                 [ 66%]
tests/ignite/contrib/metrics/test_average_precision.py::test_distrib_hvd SKIPPED                                 [ 73%]
tests/ignite/contrib/metrics/test_average_precision.py::test_multinode_distrib_cpu SKIPPED                       [ 80%]
tests/ignite/contrib/metrics/test_average_precision.py::test_multinode_distrib_gpu SKIPPED                       [ 86%]
tests/ignite/contrib/metrics/test_average_precision.py::test_distrib_single_device_xla SKIPPED                   [ 93%]
tests/ignite/contrib/metrics/test_average_precision.py::test_distrib_xla_nprocs SKIPPED                          [100%]

============================================ 8 passed, 7 skipped in 1.87s =============================================

@vfdev-5
Collaborator

vfdev-5 commented Mar 25, 2021

@fco-dv try the complete test suite; maybe the random seed is not the same between reduced and full runs.

@fco-dv
Contributor

fco-dv commented Mar 25, 2021

@vfdev-5 I've launched the tests with `pytest tests -vvv` and they passed; however, I'm not sure I'm reproducing the exact command the CI runs:

tests/ignite/contrib/metrics/test_average_precision.py::test_no_update PASSED                                    [ 22%]
tests/ignite/contrib/metrics/test_average_precision.py::test_input_types PASSED                                  [ 22%]
tests/ignite/contrib/metrics/test_average_precision.py::test_check_shape PASSED                                  [ 22%]
tests/ignite/contrib/metrics/test_average_precision.py::test_binary_input_N PASSED                               [ 22%]
tests/ignite/contrib/metrics/test_average_precision.py::test_multilabel_input_N PASSED                           [ 22%]
tests/ignite/contrib/metrics/test_average_precision.py::test_integration_binary_input_with_output_transform PASSED [ 22%]
tests/ignite/contrib/metrics/test_average_precision.py::test_integration_multilabel_input_with_output_transform PASSED [ 22%]

@sdesrozis
Contributor

@fco-dv Sorry about that, but could you try reproducing the CI, maybe in a Docker container? IMO it would be useful to have this Dockerfile in the repo to ease tracking CI errors on Windows.

@vfdev-5 vfdev-5 mentioned this pull request Apr 8, 2021
@vfdev-5
Collaborator

vfdev-5 commented Apr 11, 2021

@KickItLikeShika could you please inspect this failure on Windows in test_average_precision.py? Thanks

@ydcjeff
Contributor Author

ydcjeff commented Apr 11, 2021

@vfdev-5 @KickItLikeShika Could this be the failing case?

>>> y_pred = torch.randint(0, 2, size=(10,)).long()
>>> n_iters = 16
>>> idx
13
>>> y_pred[idx : idx + batch_size]
tensor([], dtype=torch.int64)
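That would explain the nan: slicing past the end of a sequence does not raise, it silently yields an empty result, so those iterations contribute no data to the metric. A plain-Python sketch of the same semantics (PyTorch tensors slice the same way, as the snippet above shows):

```python
# Stand-in for a length-10 tensor of binary predictions.
y_pred = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
batch_size = 1
idx = 13  # past the end of the data, as in the snippet above

# Out-of-range slicing raises no error; it just yields nothing,
# so the metric would receive an empty batch for this iteration.
batch = y_pred[idx : idx + batch_size]
assert batch == []
```

If the test's iteration count and batch size don't cover the data length exactly, some updates see empty batches and the downstream sklearn call can end up computing on degenerate input.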

@vfdev-5
Collaborator

vfdev-5 commented Apr 11, 2021

Thanks a lot @ydcjeff! @KickItLikeShika, can you make a pass over all the contrib tests and fix the issues we already discussed in some of your previous PRs?

@KickItLikeShika
Contributor

> Thanks a lot @ydcjeff! @KickItLikeShika, can you make a pass over all the contrib tests and fix the issues we already discussed in some of your previous PRs?

@vfdev-5 I will do that

@ydcjeff ydcjeff requested a review from vfdev-5 April 11, 2021 10:51
Collaborator

@vfdev-5 vfdev-5 left a comment

Thanks a lot @ydcjeff!

@vfdev-5 vfdev-5 merged commit 23d963c into pytorch:master Apr 11, 2021
@ydcjeff ydcjeff deleted the common-cov branch April 11, 2021 11:18