
Supporting missing backends in tests #1454

Open
matthewfeickert opened this issue May 13, 2021 · 5 comments · Fixed by #2340
Labels
feat/enhancement (New feature or request), help wanted (Extra attention is needed / contributions welcome), tests, pytest

Comments

matthewfeickert (Member) commented May 13, 2021

Description

This is motivated by a discussion with @henryiii about why we don't currently support Python 3.9. My current understanding of the problem is that TensorFlow only started supporting Python 3.9 in v2.5.0 (released today, 2021-05-13), while we currently cap TensorFlow to v2.2.x

pyhf/setup.py

Lines 5 to 8 in 6b769fd

'tensorflow': [
'tensorflow~=2.2.1', # TensorFlow minor releases are as volatile as major
'tensorflow-probability~=0.10.1',
],

given the reasons laid out (not neatly) in Issue #293.

@henryiii points out that capping all the other backends at what TensorFlow supports isn't great, as you end up limited by a library you're not even using. To support this, though, we would need to change the way we run the tests, as I don't think (I could be wrong) that it's currently possible to run through the test suite with a backend left uninstalled because it doesn't support a given Python runtime.

@lukasheinrich @kratsg Thoughts here on how to proceed?

matthewfeickert added the feat/enhancement, help wanted, tests, and pytest labels on May 13, 2021
matthewfeickert added this to To do in pyhf development via automation on May 13, 2021
matthewfeickert (Member Author) commented

To support this though we need to change the way that we run the tests as I don't think (could be wrong) that it is possible for us to run through the test suite with a backend not installed due to it not supporting a Python runtime.

Also, yeah, I just tried this and got the expected error at the install stage:

ERROR: Could not find a version that satisfies the requirement tensorflow~=2.2.1; extra == "test" (from pyhf[test]) (from versions: 2.5.0rc0, 2.5.0rc1, 2.5.0rc2, 2.5.0rc3, 2.5.0)
ERROR: No matching distribution found for tensorflow~=2.2.1; extra == "test"

alexander-held (Member) commented Feb 16, 2023

An additional benefit of this would be that developers could run a subset of tests using a significantly more lightweight environment (see #2111 (comment)).

kratsg (Contributor) commented Sep 20, 2023

To support this though we need to change the way that we run the tests as I don't think (could be wrong) that it is possible for us to run through the test suite with a backend not installed due to it not supporting a Python runtime.

One solution is to pass flags to pytest from the command line to enable only the tests allowed for a specific backend. This shouldn't be too hard to code up; a minimal sketch is below.
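
For illustration, a minimal conftest.py sketch of that idea (the --disable-backend option name and the backend marker are hypothetical here, not necessarily what was implemented in PR #2340):

# conftest.py (sketch; option and marker names are hypothetical)
import pytest


def pytest_configure(config):
    # register the marker so pytest does not warn about it
    config.addinivalue_line("markers", "backend(name): test requires the named backend")


def pytest_addoption(parser):
    # e.g. `pytest --disable-backend tensorflow --disable-backend jax`
    parser.addoption(
        "--disable-backend",
        action="append",
        default=[],
        help="backend(s) whose tests should be skipped",
    )


def pytest_collection_modifyitems(config, items):
    disabled = set(config.getoption("--disable-backend"))
    for item in items:
        # skip any test marked e.g. @pytest.mark.backend("tensorflow")
        for marker in item.iter_markers(name="backend"):
            if marker.args and marker.args[0] in disabled:
                item.add_marker(pytest.mark.skip(reason=f"{marker.args[0]} backend disabled"))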

alexander-held (Member) commented

pytest supports this in a pretty straightforward way; see the cabinetry example here: https://github.com/scikit-hep/cabinetry/blob/80b4af15c3c8b9738afe1be5450ac3e9d83a12eb/tests/conftest.py#L419-L438. It adds a --runslow command-line option and a marker/decorator to apply to tests.
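
Following that pattern, a test-side sketch of how a backend-specific test could be marked and then skipped via a command-line flag like the one above (again, marker and flag names are hypothetical):

# sketch of a marked test; the local import avoids failing at collection time
import pytest


@pytest.mark.backend("tensorflow")
def test_simple_tensorflow_sum():
    import tensorflow as tf

    assert float(tf.reduce_sum(tf.constant([1.0, 2.0, 3.0]))) == 6.0

Running `pytest --disable-backend tensorflow` would then skip this test while the rest of the suite runs normally.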

matthewfeickert (Member Author) commented

So it seems that PR #2340 only partially fixed this: it handles the case where the backend to be skipped is still installed. In Issue #2372, while running a test branch with the following configuration

    - name: Install dependencies
      run: |
        python -m pip install --upgrade pip setuptools wheel
        # No TensorFlow for Python 3.12 yet
        python -m pip install --upgrade ".[jax,torch,minuit,xmlio,contrib,shellcomplete,test]"

    - name: List installed Python packages
      run: python -m pip list

    - name: Test with pytest and coverage
      run: |
        coverage run --module pytest --disable-backend tensorflow --ignore tests/contrib --ignore tests/benchmarks --ignore tests/test_notebooks.py

this fails immediately with

...
____________________ ERROR collecting tests/test_tensor.py _____________________
ImportError while importing test module '/Users/runner/work/pyhf/pyhf/tests/test_tensor.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/importlib/__init__.py:90: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/test_tensor.py:5: in <module>
    import tensorflow as tf
E   ModuleNotFoundError: No module named 'tensorflow'
...

as there is no tensorflow release for Python 3.12 yet. So we need to make it so that the import is not even attempted, or, if it is attempted, that the exception is caught and handled (see the sketch below).
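
One generic pytest mechanism for this, sketched below and not necessarily the fix eventually adopted, is pytest.importorskip, which skips the whole test module at collection time when the package cannot be imported:

# at the top of tests/test_tensor.py (sketch), replacing the plain imports
import pytest

# If TensorFlow (or TFP) is not installed, the entire module is skipped
# during collection instead of raising ModuleNotFoundError.
tf = pytest.importorskip("tensorflow")
tfp = pytest.importorskip("tensorflow_probability")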

pyhf development automation moved this from Done to In progress Mar 8, 2024