To develop, we suggest using Python virtual environments together with pip. Once the virtual environment is activated and you have SSH keys set up with GitHub, clone the repo from GitHub
git clone git@github.com:scikit-hep/pyhf
and install all necessary packages for development
python -m pip install --upgrade --editable ".[complete]"
Then set up the Git pre-commit hooks by running
pre-commit install
inside of the virtual environment. pre-commit.ci keeps the pre-commit hooks updated over time, so pre-commit will automatically update itself when you run it locally after the hooks have been updated.
A function-scoped fixture called datadir exists for a given test module which will automatically copy files from the associated test module's data directory into a temporary directory for the given test execution. That is, for example, if a test is defined in test_schema.py, then data files located in test_schema/ will be copied to a temporary directory whose path is made available by the datadir fixture. Therefore, one can do:
def test_patchset(datadir):
    data_file = open(datadir.join("test.txt"), encoding="utf-8")
    ...
which will load the copy of test.txt in the temporary directory. This also works for parametrized tests, as the file modifications made are effectively sandboxed.
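For context, a datadir-style fixture can be built from pytest's built-in tmp_path and request fixtures. The following conftest.py-style sketch is illustrative only, not pyhf's actual implementation, and the helper name copy_module_datadir is hypothetical:

```python
import shutil
from pathlib import Path

import pytest


def copy_module_datadir(module_file, destination):
    """Copy the data directory paired with a test module (e.g. test_schema/
    next to test_schema.py) into destination and return destination."""
    source = Path(module_file).with_suffix("")
    if source.is_dir():
        # dirs_exist_ok requires Python 3.8+
        shutil.copytree(source, destination, dirs_exist_ok=True)
    return Path(destination)


@pytest.fixture
def datadir(tmp_path, request):
    # request.module.__file__ is the path of the test module using the fixture
    return copy_module_datadir(request.module.__file__, tmp_path)
```

Note that pyhf's actual fixture exposes a py.path-style object with a .join method, while this sketch returns a pathlib.Path, so paths would be joined with the / operator instead.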
To run the test suite in full, from the top level of the repository run
pytest
More practically, for most local testing you will not want to test the benchmarks, contrib module, or notebooks, so to test the core codebase a developer can run
pytest --ignore tests/benchmarks/ --ignore tests/contrib --ignore tests/test_notebooks.py
To run the visualization tests for the contrib module with the pytest-mpl pytest plugin, run
pytest tests/contrib --mpl --mpl-baseline-path tests/contrib/baseline --mpl-generate-summary html
pyhf's configuration of pytest will automatically run doctest on all the modules when the full test suite is run. To run doctest on an individual module or file, just run pytest on its path. For example, to run doctest on the JAX backend run
pytest src/pyhf/tensor/jax_backend.py
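As a reminder of what doctest checks: it runs the interactive examples embedded in docstrings and compares their printed output against the recorded output. A minimal, standalone illustration (not from the pyhf codebase):

```python
def normalize(values):
    """Scale a sequence of numbers so they sum to one.

    Example:
        >>> normalize([1.0, 1.0, 2.0])
        [0.25, 0.25, 0.5]
    """
    total = sum(values)
    return [v / total for v in values]


if __name__ == "__main__":
    # pytest collects the docstring example automatically when doctest
    # collection is enabled; it can also be checked directly:
    import doctest

    doctest.testmod()
```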
Publishing to TestPyPI and PyPI is automated through the PyPA's PyPI publish GitHub Action and the pyhf bump version GitHub Actions workflow.
As part of the release process a checklist must be completed so that no steps are missed. There is a GitHub Issue template for this that the maintainer in charge of the release should step through, updating it if needed.
A release tag can be created by a maintainer by using the bump version GitHub Actions workflow through workflow dispatch. The maintainer needs to:
- Select the semantic versioning (SemVer) type (major, minor, patch) of the release tag.
- Select whether the release tag is a release candidate.
- Input the SemVer version number of the release tag.
- Select whether to override the SemVer compatibility checks of the previous options (default is to run the checks).
- Select whether a dry run should be performed (default is to perform a dry run, to avoid accidental release tags).
The maintainer should do a dry run first to make sure everything looks reasonable. Once they have done that, they can run the bump version GitHub Actions workflow, which will produce a new tag, bump the version of all files defined in tbump.toml, and then commit and push these changes and the tag back to the main branch.
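For intuition, the major/minor/patch choice in the workflow maps onto standard SemVer arithmetic, sketched below. This is purely illustrative (the real workflow delegates version rewriting to tbump); the bump function is hypothetical:

```python
def bump(version: str, part: str) -> str:
    """Return the next SemVer version string for a major/minor/patch bump."""
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        # A major bump resets minor and patch
        return f"{major + 1}.0.0"
    if part == "minor":
        # A minor bump resets patch
        return f"{major}.{minor + 1}.0"
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part!r}")
```

For example, bump("1.2.3", "minor") gives "1.3.0" and bump("1.2.3", "patch") gives "1.2.4".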
The push of a tag to the repository will trigger a build of an sdist and wheel, and then their deployment to TestPyPI.
pyhf tests packaging and distribution by publishing to TestPyPI in advance of releases. Installation of the latest test release from TestPyPI can be tested by first installing pyhf normally, to ensure all dependencies are installed from PyPI, and then upgrading pyhf to a test release from TestPyPI
python -m pip install pyhf
python -m pip install --upgrade --extra-index-url https://test.pypi.org/simple/ --pre pyhf
Note
This adds TestPyPI as an additional package index to search when installing. PyPI will still be the default package index pip will attempt to install from for all dependencies, but if a package has a more recent release on TestPyPI then the package will be installed from TestPyPI instead. Note that dev releases are considered pre-releases, so 0.1.2 is a "newer" release than 0.1.2.dev3.
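The pre-release ordering described above follows PEP 440. The toy sort key below is illustrative only (real tools like pip use the packaging library) and handles just plain X.Y.Z and X.Y.Z.devN forms, showing why 0.1.2.dev3 sorts before 0.1.2:

```python
def release_key(version):
    """Toy PEP 440-style sort key for X.Y.Z and X.Y.Z.devN version strings."""
    parts = version.split(".")
    if parts[-1].startswith("dev"):
        dev_number = int(parts[-1][len("dev"):])
        # Dev releases order before the final release they lead up to
        return tuple(int(p) for p in parts[:-1]) + (0, dev_number)
    return tuple(int(p) for p in parts) + (1, 0)
```

Sorting ["0.1.2", "0.1.2.dev3"] with this key puts the dev release first, matching pip's treatment of dev releases as pre-releases.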
Once the TestPyPI deployment has been examined, installed, and tested locally by the maintainers final deployment to PyPI can be done by creating a GitHub Release:
- From the pyhf GitHub releases page select the "Draft a new release" button.
- Select the release tag that was just pushed, and set the release title to be the tag (e.g. v1.2.3).
- Use the "Auto-generate release notes" button to generate a skeleton of the release notes, and then augment them with the pre-prepared release notes the release maintainer has written.
- Select "This is a pre-release" if the release is a release candidate.
- Select "Create a discussion for this release" if the release is a stable release.
- Select "Publish release".
Once the release has been published to GitHub, the publishing workflow will build an sdist and wheel, and then deploy them to PyPI.
The .zenodo.json and codemeta.json files have the version number automatically updated through tbump, though their additional metadata should be checked periodically by the dev team (probably every release). The codemeta.json file can be generated automatically from a PyPI install of pyhf using codemetapy
codemetapy --no-extras pyhf > codemeta.json
though the author metadata will still need to be checked and revised by hand. The .zenodo.json file is currently generated by hand, so it is worth using codemeta.json as a guide when editing it.
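Since the version number in both files is managed by tbump, one sanity check that is easy to script is verifying the two files agree after a bump. The helper below is hypothetical (not part of pyhf), and the top-level "version" field names are assumptions based on the Zenodo and CodeMeta schemas:

```python
import json


def versions_match(zenodo_path, codemeta_path):
    """Check that the top-level "version" fields of the two files agree."""
    with open(zenodo_path, encoding="utf-8") as f:
        zenodo = json.load(f)
    with open(codemeta_path, encoding="utf-8") as f:
        codemeta = json.load(f)
    return zenodo.get("version") == codemeta.get("version")
```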