Build wheels with numpy 2.0. #1252

Merged

joaander merged 14 commits into main from numpy-2.0 on May 31, 2024
Conversation

@joaander (Member) commented May 23, 2024

Description

Support numpy 2.0 in PyPI packages. This requires dropping builds for Python 3.8.

Motivation and Context

Allow users to pip install freud now and remain compatible when they upgrade to numpy 2.0 in the future.

How Has This Been Tested?

CI tests.

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds or improves functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation improvement (updates to user guides, docstrings, or developer docs)

Checklist:

  • I have read the CONTRIBUTING document.
  • My code follows the code style of this project.
  • I have updated the documentation (if relevant).
  • I have added tests that cover my changes (if relevant).
  • All new and existing tests passed.
  • I have updated the credits.
  • I have updated the Changelog.

@joaander joaander force-pushed the numpy-2.0 branch 7 times, most recently from dd0295e to aa406c6 on May 23, 2024 19:40
Test wheels with the oldest supported numpy.
@joaander (Member Author) commented:

Wheels are now building. The next step is to refactor run_tests to use reproducible builds with conda-lock (https://github.com/conda/conda-lock) and setup-micromamba (https://github.com/mamba-org/setup-micromamba). We will need a base environment file with no pinnings, then generate one lock file for each supported version of Python.

There is no need to test the oldest dependencies in run_tests, as build_wheels takes care of that. Dependabot does not support conda lock files, so we will need to manually update the lock files to test against the latest dependencies. We can do this as part of a release checklist, or write an action that creates pull requests.
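Roughly, the idea for generating the lock files (a hedged sketch with illustrative file names and a sed-based Python pin, not the exact configuration added in this PR):

    # Sketch: one explicit lock file per supported Python version, generated
    # from a single unpinned base environment file. File names, platform, and
    # the sed-based python pin are illustrative assumptions.
    for py in 3.9 3.10 3.11 3.12; do
        sed "s/^  - python$/  - python=${py}/" environment.yml > "env-py${py}.yml"
        conda-lock lock \
            --file "env-py${py}.yml" \
            --platform linux-64 \
            --kind explicit \
            --filename-template "requirements-py${py}-{platform}.lock"
    done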

@joaander joaander force-pushed the numpy-2.0 branch 9 times, most recently from 2c16f12 to 9ac58eb on May 28, 2024 15:31
@joaander joaander marked this pull request as ready for review May 28, 2024 18:50
@joaander joaander requested review from a team as code owners May 28, 2024 18:50
@joaander joaander requested review from tommy-waltmann and removed request for a team May 28, 2024 18:50
@joaander (Member Author) commented:

I don't know why the TestPyPI upload is failing. skip-existing is set to true, and yet it says:
Only one sdist may be uploaded per release.
https://github.com/glotzerlab/freud/actions/runs/9273644916/job/25514650308?pr=1252#step:6:176

Along with support for numpy 2.0 in the PyPI builds, this PR refactors much of freud's CI to use modern practices (lock files, uv, micromamba). To support the numpy 2.0 ABI, I had to drop support for Python 3.8.

The PyPI builds test against the oldest requirements, so I modified the run_tests jobs to use only the latest. Dependabot does not understand conda environments, so update-lockfiles.sh is something we will need to run manually from time to time. This solution is more stable than installing many pip packages into a conda environment (which often leads to ABI incompatibilities).
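For illustration, the consumption side could look roughly like this (a hedged sketch; the actual workflow uses the setup-micromamba action, and the file and environment names here are hypothetical):

    # Sketch: create the test environment from a pre-generated lock file and
    # run the test suite inside it. Names are illustrative only.
    micromamba create --yes --name test-env --file requirements-py3.12-linux-64.lock
    micromamba run --name test-env python -m pytest tests/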

Co-authored-by: Domagoj Fijan <50439291+DomFijan@users.noreply.github.com>
@DomFijan (Contributor) commented:

After the last commit CI failed to build the wheels for 3.10:
https://github.com/glotzerlab/freud/actions/runs/9275726786/attempts/1

ERROR: THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS FILE. If you have updated the package versions, please update the hashes. Otherwise, examine the package contents carefully; someone may have tampered with them.
    unknown package:
        Expected sha256 173308efba2270dcd61cd45a30dfded6ec0085b4b6eb33b5eb11ab443005e088
             Got        561bcec54164de855d0eacfb20de51984415e2f30195272da112c6849fdcc056

The CI error occurs when testing the wheel, after it has already been built, during the environment setup for running the tests. From the log it can be seen that two different versions of numpy are downloaded and used, 1.22.4 and 1.26.4 (it's unclear whether that is the problem, but it might be). During the step where the wheel is installed, pip assumes that freud's requirement is numpy>=1.19 and simply installs the latest version it finds (1.26.4):

pip install /private/var/folders/yv/tw23g49x7kb1jh_s6mf3zvyh0000gn/T/cibw-run-n3t8vd53/cp310-macosx_x86_64/repaired_wheel/freud_analysis-3.0.0-cp310-cp310-macosx_10_14_x86_64.whl
Processing /private/var/folders/yv/tw23g49x7kb1jh_s6mf3zvyh0000gn/T/cibw-run-n3t8vd53/cp310-macosx_x86_64/repaired_wheel/freud_analysis-3.0.0-cp310-cp310-macosx_10_14_x86_64.whl
  Collecting numpy>=1.19.0 (from freud-analysis==3.0.0)
    Downloading numpy-1.26.4-cp310-cp310-macosx_10_9_x86_64.whl.metadata (61 kB)
       ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.1/61.1 kB 7.4 MB/s eta 0:00:00
  Collecting rowan>=1.2.1 (from freud-analysis==3.0.0)
    Downloading rowan-1.3.0.post1-py2.py3-none-any.whl.metadata (7.5 kB)
  Collecting scipy>=1.7.3 (from freud-analysis==3.0.0)
    Downloading scipy-1.13.1-cp310-cp310-macosx_10_9_x86_64.whl.metadata (60 kB)
       ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 60.6/60.6 kB 5.5 MB/s eta 0:00:00
  Downloading numpy-1.26.4-cp310-cp310-macosx_10_9_x86_64.whl (20.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 20.6/20.6 MB 31.3 MB/s eta 0:00:00
  Downloading rowan-1.3.0.post1-py2.py3-none-any.whl (28 kB)
  Downloading scipy-1.13.1-cp310-cp310-macosx_10_9_x86_64.whl (39.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 39.3/39.3 MB 15.0 MB/s eta 0:00:00
  Installing collected packages: numpy, scipy, rowan, freud-analysis
  Successfully installed freud-analysis-3.0.0 numpy-1.26.4 rowan-1.3.0.post1 scipy-1.13.1

Later down the line, when the other testing requirements are installed, the correct version of numpy (1.22.4) is used.
Could the workflow be updated to force the correct numpy version for all parts of the "build wheels" step?

Rerunning the CI seems to have "fixed" that, though it is entirely unclear why that helped. Uploading to TestPyPI still fails.

@joaander (Member Author) commented:

After the last commit CI failed to build the wheels for 3.10: https://github.com/glotzerlab/freud/actions/runs/9275726786/attempts/1

ERROR: THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS FILE. If you have updated the package versions, please update the hashes. Otherwise, examine the package contents carefully; someone may have tampered with them.
    unknown package:
        Expected sha256 173308efba2270dcd61cd45a30dfded6ec0085b4b6eb33b5eb11ab443005e088
             Got        561bcec54164de855d0eacfb20de51984415e2f30195272da112c6849fdcc056

This comes from pip install virtualenv -c /Users/runner/work/_temp/cibw/lib/python3.12/site-packages/cibuildwheel/resources/constraints-python310.txt. It is a sign that the hash cibuildwheel put in constraints-python310.txt was invalid. Either there was a glitch on the system that produced an invalid hash, an unchecked network error resulted in a bad file, or PyPI decided to serve up a different file. In any case, the error was completely outside freud's and cibuildwheel's control, so rerunning CI is the only option.
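If it recurs, one way to spot-check that claim would be to download the package pip complained about and hash it locally (a hypothetical manual check, not part of the workflow):

    # Sketch: fetch the wheel and compare its sha256 against the hash pinned
    # in cibuildwheel's constraints file. Paths are illustrative.
    pip download virtualenv --no-deps --dest /tmp/hash-check
    sha256sum /tmp/hash-check/virtualenv-*.whl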

The CI error occurs when testing the wheel, after it has already been built, during the environment setup for running the tests. From the log it can be seen that two different versions of numpy are downloaded and used, 1.22.4 and 1.26.4 (it's unclear whether that is the problem, but it might be). During the step where the wheel is installed, pip assumes that freud's requirement is numpy>=1.19 and simply installs the latest version it finds (1.26.4):

Later down the line, when the other testing requirements are installed, the correct version of numpy (1.22.4) is used. Could the workflow be updated to force the correct numpy version for all parts of the "build wheels" step?

Yes, this is the expected behavior. When testing, cibuildwheel installs freud into an empty environment:

pip install /private/var/folders/yv/tw23g49x7kb1jh_s6mf3zvyh0000gn/T/cibw-run-og1a9f5c/cp310-macosx_x86_64/repaired_wheel/freud_analysis-3.0.0-cp310-cp310-macosx_10_14_x86_64.whl

which will pull in the latest matching dependencies. numpy>=1.19 is correct here, per numpy's guidance on updating to the 2.0 ABI:

NumPy 1.25.0 supports Python 3.9 and higher and NumPy 1.19 is the first version to support Python 3.9. Thus, we guarantee that, when using defaults, NumPy 1.25 will expose a C-API compatible with NumPy 1.19. (the exact version is set within NumPy-internal header files).

The next command cibuildwheel issues is

pip install pytest==8.2.1 sympy==1.10 numpy==1.23.2 scipy==1.9.2 gsd==2.7.0 matplotlib==3.6.0

which downgrades packages to the lowest supported versions for testing. As the comment in build_wheels.yml states, I attempted to use CIBW_BEFORE_TEST to install the oldest versions before cibuildwheel installs the wheel. In that case, I found that the subsequent pip install /path/to/freud*.whl step upgraded numpy back to the latest version.
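For context, the two approaches expressed as cibuildwheel environment variables (a hedged sketch; the exact settings and pins in build_wheels.yml may differ):

    # Attempted: install the oldest versions before the wheel. The later
    # pip install of the freud wheel resolves numpy>=1.19 again and upgrades
    # it, so the pin does not stick.
    export CIBW_BEFORE_TEST="python -m pip install numpy==1.23.2 scipy==1.9.2"

    # What the log above reflects: pinned test requirements are installed
    # after the wheel, downgrading numpy to the oldest version for testing.
    export CIBW_TEST_REQUIRES="pytest==8.2.1 sympy==1.10 numpy==1.23.2 scipy==1.9.2 gsd==2.7.0 matplotlib==3.6.0"
    export CIBW_TEST_COMMAND="python -m pytest {project}/tests"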

@joaander (Member Author) commented:

Regarding the failing sdist upload: the gh-action-pypi-publish developers consider this not a bug. They advise appending a commit-specific suffix to the version so that every commit uploads a new set of wheels.
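For reference, skip-existing maps to twine's --skip-existing flag, which silently skips files that already exist on the index. A hypothetical manual equivalent of the upload step (not the command the workflow runs, which goes through gh-action-pypi-publish):

    # Sketch only: manual twine upload to TestPyPI with skip-existing behavior.
    twine upload --skip-existing --repository-url https://test.pypi.org/legacy/ dist/*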

At the same time, the packaging guidelines state:

If your repository has frequent commit activity and every push is uploaded to TestPyPI as described, the project might exceed the PyPI project size limit. The limit could be increased, but a better solution may constitute to use a PyPI-compatible server like pypiserver in the CI for testing purposes.

  1. We don't use TestPyPI to install and test packages locally (one can always manually download the artifacts from a GitHub Actions build).
  2. freud would certainly hit the project size limit if we uploaded builds from every commit.
  3. I think installing pypiserver is overkill.

So, let's remove the TestPyPI upload step from CI. The only purpose it served was to test that our actions syntax was correct before the upload to PyPI occurs on tagging. That has been tested, so it is no longer needed.

The only other improvement we could make here is to adopt the officially blessed trusted publishing system. However, the code we have now is proven and working, and I don't see a need to fix something that isn't broken.

@joaander joaander merged commit 876aaf4 into main May 31, 2024
26 checks passed
@joaander joaander deleted the numpy-2.0 branch May 31, 2024 12:47
@tommy-waltmann tommy-waltmann added this to the v3.1.0 milestone Jun 13, 2024