build-and-inspect-python-package logo
Never upload a faulty Python package to PyPI again.

build-and-inspect-python-package is a GitHub Action that provides the following functionality to Python package maintainers:

Builds your package using PyPA’s build (this works with any PEP 517-compatible build backend, including Hatch, Flit, Setuptools, PDM, or Poetry). SOURCE_DATE_EPOCH is set to the timestamp of the last commit, giving you reproducible builds with meaningful file timestamps.

Uploads the built wheel and the source distribution (SDist) as GitHub Actions artifacts, so you can download and inspect them from the Summary view of a run, or upload them to PyPI automatically once the verification succeeds.

Lints the wheel contents using check-wheel-contents.

Lints the PyPI README using Twine and uploads it as a GitHub Actions artifact for further manual inspection. To level up your PyPI README game, check out hatch-fancy-pypi-readme!

Prints the tree of both SDist and wheel in the CI output, so you don’t have to download the packages if you just want to check their contents.

Prints and uploads the packaging metadata as a GitHub Actions artifact.

Popular Use Cases

Build Once – Use Across Jobs

To increase the fidelity of your tests to what your users will experience, you can build and store your package in a first job, make the remaining jobs depend on it, and – instead of checking out the source tree – download the built packages and run your tests against them. For example, unpack the tests and configuration from the SDist and use tox run --installpkg dist/*.whl ... to run the tests against the built wheel, without access to the package source code.

You can see this technique in action in structlog’s CI.
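
A minimal sketch of this pattern, assuming a tox-based test suite; the Packages artifact name is the action’s default, and the tar invocation is just one way to unpack the tests and configuration from the SDist:

jobs:
  build-package:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: hynek/build-and-inspect-python-package@v2

  test-package:
    needs: build-package
    runs-on: ubuntu-latest
    steps:
      # Fetch the built packages instead of checking out the source tree.
      - uses: actions/download-artifact@v4
        with:
          name: Packages
          path: dist
      # Unpack tests and config from the SDist into the working directory.
      - run: tar xf dist/*.tar.gz --strip-components=1
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: python -m pip install tox
      # Run the test suite against the built wheel, not the source tree.
      - run: tox run --installpkg dist/*.whl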

Automatic Uploading

You can use a workflow that builds your package and – depending on the CI event (push to main, new tag, new release, ...) – uses PyPI’s trusted publisher feature to upload it to Test PyPI¹, PyPI, or both. This way you can continuously check how the package will look on PyPI.

structlog uses this technique too: it uploads every commit on main to Test PyPI and, whenever a GitHub Release is created, also to the real PyPI.
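
A sketch of a release-only upload job using pypa/gh-action-pypi-publish with trusted publishing; it assumes a build job like the check-package example in the Usage section below and a trusted publisher configured on PyPI for this workflow:

jobs:
  # check-package: builds the package with this action and uploads the Packages artifact.

  release:
    needs: check-package
    if: github.event_name == 'release' && github.event.action == 'published'
    runs-on: ubuntu-latest
    permissions:
      # Required for PyPI trusted publishing (OIDC).
      id-token: write
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: Packages
          path: dist
      # Uploads everything in dist/ to PyPI.
      - uses: pypa/gh-action-pypi-publish@release/v1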

Define Python Version Matrix Based On Package Metadata

build-and-inspect-python-package extracts the Python versions your package supports from the trove classifiers in your package’s metadata and offers them as an action output.

That means that you can define your CI matrix based on the Python versions your package supports without duplicating the information between your package configuration and your CI configuration.
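
Concretely, that can look roughly like this; the job and step ids (build-package, baipp) are placeholders:

jobs:
  build-package:
    runs-on: ubuntu-latest
    outputs:
      # Re-export the action's output so other jobs can consume it.
      python-versions: ${{ steps.baipp.outputs.supported_python_classifiers_json_array }}
    steps:
      - uses: actions/checkout@v4
      - uses: hynek/build-and-inspect-python-package@v2
        id: baipp

  tests:
    needs: build-package
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # fromJSON turns the JSON string into a real array for the matrix.
        python-version: ${{ fromJSON(needs.build-package.outputs.python-versions) }}
    steps:
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      # ...run your tests for ${{ matrix.python-version }} here...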

Applications

If you package an application as a Python package, this action is useful to double-check you’re shipping everything you need, including all templates, translation files, et cetera.

Usage

build-and-inspect-python-package only works on Linux runners:

jobs:
  check-package:
    name: Build & inspect our package.
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4
      - uses: hynek/build-and-inspect-python-package@v2

If you’re using a VCS tag-based version extractor like setuptools-scm and need the built package to have the correct version, you must use actions/checkout with fetch-depth: 0 – unless the latest commit is the version tag.
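
That is, the checkout step becomes:

      - uses: actions/checkout@v4
        with:
          # Fetch the full history, including tags, so the version can be derived.
          fetch-depth: 0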

Caution

build-and-inspect-python-package uses actions/upload-artifact for storing the built artifacts that you can download with actions/download-artifact.

Unfortunately, v4 of both is incompatible with previous versions, so you have to make sure that your download-artifact version matches the version that build-and-inspect-python-package uses for uploading.

  • If you’re using download-artifact@v3, you have to use build-and-inspect-python-package@v1.
  • If you’re using download-artifact@v4, you have to use build-and-inspect-python-package@v2.

While build-and-inspect-python-package will build a wheel for you by default, we recommend using cibuildwheel if your package contains compiled extensions.

Inputs

  • path: the location of the Python package to build (optional, default: .).

  • skip-wheel: Whether to skip building the wheel in addition to the source distribution. The only meaningful value is 'true' (note the quotes – GitHub Actions only allows string inputs); everything else is treated as falsy.

    This is useful if you build your wheels using advanced tools like cibuildwheel anyway. (optional, default: 'false').

  • upload-name-suffix: A suffix to append to the artifact names to make them unique for upload-artifact@v4.

    Use this if you want to build multiple packages in one workflow. (optional, default: ''). A combined sketch of all three inputs follows after this list.
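
A combined sketch of all three inputs; the my-package path and the suffix are made-up examples for a workflow that builds several packages and produces its wheels with cibuildwheel:

      - uses: hynek/build-and-inspect-python-package@v2
        with:
          # Build the package that lives in a subdirectory.
          path: my-package
          # Only build the SDist here; the wheels come from cibuildwheel.
          skip-wheel: 'true'
          # Keep the artifact names unique across multiple packages.
          upload-name-suffix: -my-package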

Outputs

  • dist: The location of the built packages.

    See, for example, how argon2-cffi-bindings uses this feature to check that the built wheels don’t break a package that depends on it. A minimal sketch of consuming this output follows after this list.

  • supported_python_classifiers_json_array: A JSON array of Python versions that are supported by the package as defined by the trove classifiers in the package metadata (for example, Programming Language :: Python :: 3.12).

    You can assign this to a matrix strategy key in your CI job (for example, strategy.matrix.python-version) to test against multiple Python versions without duplicating the information. Since GitHub Actions only allows for strings as variables, you have to parse it with fromJSON in your workflow.

    If all this sounds confusing: Check out our supported Pythons CI workflow for a realistic example.

  • supported_python_classifiers_json_job_matrix_value: Same as supported_python_classifiers_json_array, but it’s a mapping with the JSON array bound to the python-version key.

    This is useful if you only want to define a matrix based on Python versions, because then you can just assign this to strategy.matrix.
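
A minimal sketch of using the dist output in a later step of the same job; the step id baipp is a placeholder:

      - uses: hynek/build-and-inspect-python-package@v2
        id: baipp
      # Smoke-test the freshly built wheel by installing it.
      - run: python -m pip install ${{ steps.baipp.outputs.dist }}/*.whl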

Artifacts

After a successful run, you’ll find the following artifacts in the run’s Summary view:

  • Packages: The built packages. Perfect for automated PyPI upload workflows!
  • Package Metadata: the extracted packaging metadata (hint: it’s formatted as an email).
  • PyPI README: the extracted PyPI README, exactly how it would be used by PyPI as your project’s landing page. PEP 621 calls it readme, in classic setuptools it’s long_description.

Job Summaries

To save you from downloading the artifacts just to check their contents, build-and-inspect-python-package creates the following job summaries:

  • SDist contents: A tree of the source distribution.
  • Wheel contents: A tree of the built wheel – if one was built. This output has no timestamps because wheel unpack does not preserve them from the built wheel, leading to confusion.
  • Metadata: A plain-text dump of package metadata (includes the PyPI README).

Examples

Our CI uses all inputs and outputs, if you want to see them in action.

Our supported Pythons CI workflow demonstrates how to use supported_python_classifiers_json_array to set up a matrix of Python versions for your CI jobs without duplicating the information that is already in your packaging metadata.

License

The scripts and documentation in this project are released under the MIT License.

Footnotes

  1. Note, though, that a prerequisite for the Test PyPI workflow is that each of your commits builds with a unique version number. This is easily achievable using tools like setuptools-scm or hatch-vcs, but beyond the scope of this humble README.
