Add Test pass for build artifacts #1027

Open

reinecke opened this issue Jul 28, 2021 · 1 comment

reinecke (Collaborator) commented Jul 28, 2021

Feature Request

With GitHub Actions producing wheels for delivery to PyPI, it would be helpful to have a separate stage of our build process that validates those exact artifacts before they become candidates for PyPI delivery.

Description

At a minimum, smoke tests could run something simple like the following (a minimal sketch appears after the list):

  1. Import otio
  2. Create an example timeline
  3. Write it to disk
  4. Read the timeline back from disk
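
A minimal version of that smoke test might look like the sketch below. The timeline contents and frame rate are arbitrary choices for illustration; only the import and round trip matter:

```python
# Sketch of a wheel smoke test: import the installed package, build a small
# timeline, write it to disk, and read it back. This fails loudly if the
# compiled extension or the adapters are broken in the packaged artifact.
import os
import tempfile

import opentimelineio as otio


def test_wheel_smoke():
    timeline = otio.schema.Timeline(name="smoke_test")
    track = otio.schema.Track(name="V1")
    track.append(
        otio.schema.Clip(
            name="clip1",
            source_range=otio.opentime.TimeRange(
                start_time=otio.opentime.RationalTime(0, 24),
                duration=otio.opentime.RationalTime(48, 24),
            ),
        )
    )
    timeline.tracks.append(track)

    with tempfile.TemporaryDirectory() as tmp_dir:
        path = os.path.join(tmp_dir, "smoke.otio")
        otio.adapters.write_to_file(timeline, path)
        result = otio.adapters.read_from_file(path)

    assert result.name == "smoke_test"
    assert len(result.tracks[0]) == 1
```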

The testing should be run on target platforms called out as supported.

In a perfect world, the full unittest suite would be run against the wheels that are being delivered.

The main objective of this testing should be to catch packaging issues like the following (a quick inspection sketch for the last of these follows the list):

  • Not statically linking libopentimelineio or libopentime, or missing the .so files for them in the package (whichever method of including those libraries is chosen)
  • Architecture mismatches in the binary parts of the distribution
  • Missing _otio module in the package distribution
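
The missing-module failure in particular could be caught without even installing the wheel. The sketch below is one assumed approach, not something prescribed by this issue: it scans the wheel archive for a compiled _otio extension, where the `_otio.` filename prefix is an assumption based on standard CPython extension-module naming.

```python
# Hypothetical pre-install check: scan a built wheel for the compiled _otio
# extension module. The "_otio." prefix (e.g. _otio.cpython-39-x86_64.so,
# or _otio.pyd on Windows) is assumed from CPython extension naming.
import sys
import zipfile


def wheel_contains_otio_extension(wheel_path):
    with zipfile.ZipFile(wheel_path) as wheel:
        return any(
            name.rsplit("/", 1)[-1].startswith("_otio.")
            for name in wheel.namelist()
        )


if __name__ == "__main__":
    wheel_path = sys.argv[1]
    if not wheel_contains_otio_extension(wheel_path):
        raise SystemExit(f"{wheel_path}: no compiled _otio module found")
    print(f"{wheel_path}: OK")
```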

Context

Over the course of improving our build system setup, the issues listed above have been the hallmark problems that pop up when iterating on our somewhat brittle combination of CMake configs and setup.py. A sanity check that this class of issue hasn't made it into our artifacts will help catch problems close to where they are introduced and increase our confidence when iterating on the build setup.

Some things to consider:

  • In testing, it is important to fail as fast as possible. I think we'd favor relying a bit more on the build-time unittests in exchange for faster iteration on build failures.
  • At the time of writing, 17 wheels are being built. Running tests against every one of them may not be worth the effort; it may be best to choose a representative set of platform/Python-version combos that maximizes coverage while minimizing the number of test-machine variants.
JeanChristopheMorinPerso (Member) commented Aug 7, 2021

I like where you are going with this. Here are some more thoughts:

General testing good practices:

  • A wheel should first be built and then used for running the tests. This exercises the whole process: building the wheel, installing it, and running the tests against the installed wheel.
  • If we go wild, we could even say we need to first build an sdist, then build a wheel from that sdist, to test the entire process and make sure our sdists are valid and working as expected (sketched after this list). (Side note: when tox is used in a project, it's tox that makes sure an sdist is built, then used to build a wheel, and then the wheel is installed and tested. But I know tox was removed from OTIO a couple of months ago.) One might argue that we already run check-manifest in CI and that it already verifies which files are included in the sdists. That is true, but it doesn't test creating a wheel from the sdist or running the tests against that wheel.
  • Care must be taken to make sure tests don't use any code from the source files (which means we'd need to run the tests from another folder, like a tmp folder). (Edit: OTIO uses a src folder, which means we should be safe and can run the tests from the root.)
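
A rough sketch of that sdist-first flow is below. It assumes the project uses the `build`, `pip`, and `pytest` tools and a `tests/` directory; those tool choices are assumptions for illustration, not something this issue specifies:

```python
# Hypothetical CI step: build an sdist, build the wheel from that sdist rather
# than from the source tree, install the wheel, and run the tests against it.
import glob
import subprocess
import sys


def run(*cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)


# 1. Build only the sdist from the source tree.
run(sys.executable, "-m", "build", "--sdist", "--outdir", "dist/")
sdist = glob.glob("dist/*.tar.gz")[0]

# 2. Build the wheel from the sdist, so files missing from the sdist
#    surface here rather than at install time.
run(sys.executable, "-m", "pip", "wheel", "--no-deps",
    "--wheel-dir", "dist/", sdist)
wheel = glob.glob("dist/*.whl")[0]

# 3. Install the wheel and run the test suite against the installed package.
run(sys.executable, "-m", "pip", "install", "--force-reinstall", wheel)
run(sys.executable, "-m", "pytest", "tests/")
```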

General improvements to be made for CI:

I have other things in mind, but I think they would diverge a little from this issue.
