Split tests into core/tasks #391

Closed · tcompa opened this issue Jun 6, 2023 · 4 comments · Fixed by #425
@tcompa
Collaborator

tcompa commented Jun 6, 2023

After #378, we should also run the tests in two batches: the ones for the core functions, and the ones for the whole package that also include tasks. This will protect us from issues like "I added a cellpose dependency in the core library". Not the most urgent thing to do, but it's always good to improve the tooling of the repo/package.

Note that pytest does offer test labels (custom markers), which we could also use to run the same split locally, but we'll still need to properly set up the GitHub action so that (see the sketch after this list):

  1. It installs the core part
  2. It runs the core tests
  3. It installs the tasks extra
  4. It runs the tasks tests
  5. It produces a single, combined coverage report
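
As a rough illustration, those five steps could map onto shell commands like the following (a minimal sketch; the tests/core and tests/tasks paths and the coverage-append strategy are assumptions, not the actual setup):

  # Hypothetical single-job version of the split; paths and flags are illustrative.
  poetry install                                      # 1. install the core part only
  poetry run coverage run -m pytest tests/core        # 2. run the core tests
  poetry install -E fractal-tasks                     # 3. add the tasks extra
  poetry run coverage run -a -m pytest tests/tasks    # 4. run the tasks tests, appending coverage data
  poetry run coverage report                          # 5. emit one combined coverage report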
@tcompa
Collaborator Author

tcompa commented Jun 6, 2023

For the record, this is the command that runs even without the package extras installed:

poetry run pytest -x -v \
  --ignore tests/test_unit_napari_workflows_wrapper.py \
  --ignore tests/test_unit_parse_yokogawa_metadata.py \
  --ignore tests/test_workflows_napari_workflows \
  --ignore tests/test_workflows_cellpose_segmentation.py \
  --ignore tests/test_workflows_napari_workflows.py \
  --ignore tests/test_import.py \
  --ignore tests/test_valid_args_schemas.py \
  --ignore tests/test_valid_task_interface.py
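
With the pytest markers mentioned above, the same selection could be expressed far more compactly; a minimal sketch, assuming the tasks-related tests were tagged with a hypothetical "tasks" marker registered in the pytest configuration:

  # Deselect every test tagged with the (hypothetical) "tasks" marker:
  poetry run pytest -x -v -m "not tasks"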

@tcompa
Collaborator Author

tcompa commented Jun 9, 2023

Side comment: splitting the tests into core+tasks will also make it clear that some of our core modules require much more thorough test coverage; e.g. I just realized that lib_masked_loading.py has very poor coverage.

For a task, it can be complex to create and maintain test data for all branches, but for the core modules I think we should aim for very high coverage (especially since we plan to have other packages depend on the core part).

@jluethi
Collaborator

jluethi commented Jun 9, 2023

Agreed!

I also just realized how complex this can become for the napari workflows task when looking at the different 2D & 3D branches; those are hard to test exhaustively. But we should aim for high test coverage of our library functions.

@tcompa
Collaborator Author

tcompa commented Jun 13, 2023

#425 implements a first version of this test split.

What is covered:

  • Fail fast if one of the core (i.e. not task-related) tests fails, for instance because the core library uses a dependency that is only listed in the fractal-tasks extra
  • Run the core-library and tasks tests in parallel (not very relevant at the moment, since the tasks tests take much longer, but it could become relevant if we either trim down the tasks tests or make the core-library tests much more time-consuming)
  • Also run the core-library tests on Python 3.11 (ref. Python 3.11 support / llvmlite version #392)
  • Keep providing a comprehensive coverage report for both the main package and the tasks subpackage (see the sketch after this list)
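
On the combined report: one plausible way to merge coverage data from two parallel jobs is coverage's own combine step; a minimal sketch, assuming each job keeps its data file distinct (e.g. via coverage's parallel mode) and a final job gathers both:

  # Merge the per-batch .coverage.* data files, then report once:
  poetry run coverage combine
  poetry run coverage report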

Out of scope, for now:

tcompa added a commit that referenced this issue Jun 13, 2023
tcompa added the testing label Oct 16, 2023