Conversation
```
# This workflow is triggered two ways:
#
# 1. When a tag is created, the workflow will upload the package to
# 1. When a commit is made, the workflow will upload the package to
```
Hmm, I won't be able to do releases in one PR now, right? It'll require me to do two.
Please could you explain your current release process?
https://github.com/python/tzdata/blob/master/docs/maintaining.rst#making-a-release doesn't mention PRs.
That says:
- Push a tag -> publishes to Test PyPI.
- Create a GH release -> publishes to PyPI.
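That two-step flow maps onto workflow triggers roughly like this (a sketch of the pattern, not tzdata's actual workflow file):

```yaml
on:
  push:
    tags:
      - "*"             # pushed tag -> publish to Test PyPI
  release:
    types: [published]  # GH release -> publish to prod PyPI
```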
> Push a tag -> publishes to Test PyPI.
I can push a tag for a commit in a PR, as long as it's from a branch in this repository (which is what the bot does).
```
- uses: hynek/build-and-inspect-python-package@fe0a0fb1925ca263d076ca4f2c13e93a6e92a33e # v2.17.0
```

```
# Publish to Test PyPI on every commit on main.
```
Won't this fail due to duplicate version numbers?
Good point. As mentioned, blurb uses hatch-vcs, so its dev versions look like 2.0.1.dev37 and change with the number of commits since the last tag; duplicate version numbers aren't a problem there.
Will have to rethink this!
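To illustrate why hatch-vcs sidesteps the duplicate-version problem, here's a minimal sketch of its default "guess-next-dev" behaviour: bump the last release component and append the commit distance. This is an illustrative hypothetical helper, not the library's actual code or API.

```python
def guess_next_dev(last_tag: str, distance: int) -> str:
    """Sketch of the 'guess-next-dev' idea: bump the last release
    component of the most recent tag and append the number of commits
    since that tag, so every commit gets a unique dev version."""
    parts = [int(p) for p in last_tag.lstrip("v").split(".")]
    parts[-1] += 1
    return ".".join(map(str, parts)) + f".dev{distance}"

print(guess_next_dev("v2.0.0", 37))  # -> 2.0.1.dev37
```

Because the distance increases with every commit on main, no two uploads to Test PyPI collide.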
I think changing the `if:` guard to pushed tags would work?
Then we can still test the package build works when not publishing. And only publish for tags (Test PyPI) and releases (prod PyPI).
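Something like this sketch of the suggested guard (job names hypothetical): the build job always runs, but the Test PyPI job is gated on a pushed tag.

```yaml
publish-to-test-pypi:
  needs: build
  if: startsWith(github.ref, 'refs/tags/')  # pushed tags only, not branch commits
```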
Co-authored-by: Hugo van Kemenade <1324225+hugovk@users.noreply.github.com>
This follows the https://github.com/python/blurb/blob/main/.github/workflows/release.yml pattern as much as possible; it is very similar to the other PyPI Trusted Publishing workflows we have under https://github.com/python/, which helps ease the maintenance burden.
As before, it publishes to Test PyPI for commits to main, and to prod PyPI when releases are created.
The main difference is we build the artifacts (sdist and wheel) in an isolated job and upload them as GH artifacts; a separate isolated job then downloads them and publishes to the relevant index.
This isolates the installation of build deps from the job that uploads, and helps prevent supply chain attacks.
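The two-job shape looks roughly like this (a sketch: version pins omitted, and the environment/permissions details are assumptions rather than the exact workflow):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Builds the sdist and wheel in isolation and uploads them
      # as a GH artifact named "Packages"
      - uses: hynek/build-and-inspect-python-package@v2

  publish:
    needs: build
    runs-on: ubuntu-latest
    permissions:
      id-token: write  # required for PyPI Trusted Publishing
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: Packages
          path: dist
      - uses: pypa/gh-action-pypi-publish@release/v1
```

The publish job never installs build dependencies, so a compromised build-time package can't touch the upload credentials.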
It will also run when we're not in "publish mode", and verify the artifacts can be built. We also get a nice summary of the packages and their contents. For example:
This also includes extra linting of the artifacts. There were a bunch of "W002: Wheel contains duplicate files" warnings:
I've ignored these, as I think they're inherent to how tzdata is built. In any case, the warnings are pre-existing in the last published wheel:
```
check-wheel-contents --no-config tzdata-2026.2-py2.py3-none-any.whl
```
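If the W002 warnings ever need to be suppressed explicitly rather than waved through, check-wheel-contents can be configured to skip specific checks. A sketch in pyproject.toml, assuming the tool's `ignore` option:

```toml
[tool.check-wheel-contents]
ignore = ["W002"]  # duplicate files are expected in tzdata's wheel
```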