
tests: build isolation issues with C/C++ ABI dependencies #11778

Draft · wants to merge 1 commit into main
Conversation


@d1saster commented Feb 3, 2023

This commit provides a simple test that demonstrates the issues that resolver-unaware build isolation imposes on packages with C/C++ ABI dependencies.

Cf. #9542 for the corresponding discussion.


d1saster commented Feb 6, 2023

As requested, a short explanation of what the test does, as well as the files the test generates.

  • The base-0.1.0-py2.py3-none-any.whl is a simple wheel file. Nothing special, except that we depend on base.__version__ == "0.1.0" being correctly defined.
  • base-0.2.0-py2.py3-none-any.whl is the same wheel, just with the version bumped.
  • script-0.1.0.tar.gz contains two files, a pyproject.toml and a setup.py.
    pyproject.toml
    [build-system]
    requires = [
        "base>=0.1.0",
        "setuptools>=40.8.0",
        "wheel"
    ]
    build-backend = "setuptools.build_meta"
    setup.py
    import sys
    from setuptools import find_packages, setup
    
    import base
    
    # pip tests generate a bit more code here, but that is unnecessary for this explanation
    
    setup(name='script', version='0.1.0',
        install_requires=["base=="+base.__version__])

The setup.py script pins the specific version of base present at build time; it emulates the behavior of many packages with C/C++ extensions that link against a specific ABI of the base package.

The issue arises when you (loosely speaking) run pip install script base==0.1.0.

Running the test code yields:

Script result: python -m pip install --verbose --no-cache-dir --no-index --find-links /tmp/pytest-of-runner/pytest-1/popen-gw1/test_new_resolver_build_isolat0/workspace/scratch --find-links /home/runner/work/pip/pip/tests/data/common_wheels script base==0.1.0
  return code: 1
-- stderr: --------------------
  Running command pip subprocess to install build dependencies
  Looking in links: /tmp/pytest-of-runner/pytest-1/popen-gw1/test_new_resolver_build_isolat0/workspace/scratch, /home/runner/work/pip/pip/tests/data/common_wheels
  Processing ./base-0.2.0-py2.py3-none-any.whl
  Processing /home/runner/work/pip/pip/tests/data/common_wheels/setuptools-67.1.0-py3-none-any.whl
  Processing /home/runner/work/pip/pip/tests/data/common_wheels/wheel-0.38.4-py3-none-any.whl
  Installing collected packages: base, wheel, setuptools
  Successfully installed base-0.2.0 setuptools-67.1.0 wheel-0.38.4
  Running command Getting requirements to build wheel
  running egg_info
  creating script.egg-info
  writing script.egg-info/PKG-INFO
  writing dependency_links to script.egg-info/dependency_links.txt
  writing requirements to script.egg-info/requires.txt
  writing top-level names to script.egg-info/top_level.txt
  writing manifest file 'script.egg-info/SOURCES.txt'
  reading manifest file 'script.egg-info/SOURCES.txt'
  writing manifest file 'script.egg-info/SOURCES.txt'
  Running command Preparing metadata (pyproject.toml)
  running dist_info
  creating /tmp/pytest-of-runner/pytest-1/popen-gw1/test_new_resolver_build_isolat0/workspace/tmp/pip-modern-metadata-f36rv80p/script.egg-info
  writing /tmp/pytest-of-runner/pytest-1/popen-gw1/test_new_resolver_build_isolat0/workspace/tmp/pip-modern-metadata-f36rv80p/script.egg-info/PKG-INFO
  writing dependency_links to /tmp/pytest-of-runner/pytest-1/popen-gw1/test_new_resolver_build_isolat0/workspace/tmp/pip-modern-metadata-f36rv80p/script.egg-info/dependency_links.txt
  writing requirements to /tmp/pytest-of-runner/pytest-1/popen-gw1/test_new_resolver_build_isolat0/workspace/tmp/pip-modern-metadata-f36rv80p/script.egg-info/requires.txt
  writing top-level names to /tmp/pytest-of-runner/pytest-1/popen-gw1/test_new_resolver_build_isolat0/workspace/tmp/pip-modern-metadata-f36rv80p/script.egg-info/top_level.txt
  writing manifest file '/tmp/pytest-of-runner/pytest-1/popen-gw1/test_new_resolver_build_isolat0/workspace/tmp/pip-modern-metadata-f36rv80p/script.egg-info/SOURCES.txt'
  reading manifest file '/tmp/pytest-of-runner/pytest-1/popen-gw1/test_new_resolver_build_isolat0/workspace/tmp/pip-modern-metadata-f36rv80p/script.egg-info/SOURCES.txt'
  writing manifest file '/tmp/pytest-of-runner/pytest-1/popen-gw1/test_new_resolver_build_isolat0/workspace/tmp/pip-modern-metadata-f36rv80p/script.egg-info/SOURCES.txt'
  creating '/tmp/pytest-of-runner/pytest-1/popen-gw1/test_new_resolver_build_isolat0/workspace/tmp/pip-modern-metadata-f36rv80p/script-0.1.0.dist-info'
ERROR: Cannot install base==0.1.0 and script==0.1.0 because these package versions have conflicting dependencies.

pip installs version 0.2.0 of the base package into the build isolation environment, and the script wheel is built against it. The resulting wheel has a fixed/pinned dependency on base==0.2.0, which leads to the conflict.
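To make the mechanism explicit, here is a minimal sketch (illustrative only, not pip's actual resolver code) of why no version of base can satisfy both requirements:

```python
# The user asks for base==0.1.0 on the command line, while the freshly
# built script wheel, having seen base 0.2.0 in its isolated build
# environment, declares Requires-Dist: base==0.2.0.
user_pin = "0.1.0"      # from `pip install script base==0.1.0`
script_pin = "0.2.0"    # pinned into the wheel's metadata at build time

available_versions = ["0.1.0", "0.2.0"]
satisfying = [v for v in available_versions if v == user_pin and v == script_pin]
assert satisfying == []  # no version of base satisfies both pins
```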

The issues people describe in #9542 are mostly due to this behavior, except that they hit the problem further down the road, at runtime. IMO this is because most packages do not correctly encode the additional, more constrained dependency in their setup.py script (or whatever build backend they use). However, as this test illustrates, fixing that only pushes the issue further upstream, with pip erroring out during installation.

Files:
script-0.1.0.tar.gz
base-0.1.0-py2.py3-none-any.zip
base-0.2.0-py2.py3-none-any.zip
GitHub does not allow uploading files with the .whl suffix, so zip it is.


pfmoore commented Feb 6, 2023

Thanks for the explanation.

Is this not simply a consequence of:

  1. The fact that, because of the way Python's packaging works, dependency information cannot be known in advance, but has to be determined when the candidate is considered (which involves a build step, in the case of an sdist).
  2. Builds can generate metadata non-deterministically (in this case the non-determinism is "what version of base was installed for the build", but it could just as easily be the time of day, or an RNG).
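As a contrived sketch of point 2 (with an invented package name, not taken from the test above), nothing stops a build from deriving its metadata from the build environment or even the clock:

```python
import datetime

# Contrived example: a setup.py could pin a dependency based on the
# build date, so two builds of the same sdist emit different metadata.
def requires_for_build(build_date):
    return ["somedep==1.0"] if build_date.year < 2024 else ["somedep==2.0"]

assert requires_for_build(datetime.date(2023, 2, 6)) == ["somedep==1.0"]
assert requires_for_build(datetime.date(2025, 1, 1)) == ["somedep==2.0"]
```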

This is a pretty fundamental aspect of Python's packaging system, and I don't think there's going to be a straightforward solution. I can think of a number of ways of making a build like that work; for example, something like the following might work:

echo "base==0.1.0" > constraints.txt
PIP_CONSTRAINT=constraints.txt pip install script base

But that's not the point here - people are expecting the unadorned command to "just work", and I don't think that's possible as the setup.py has no knowledge of what other requirements might have been specified to the original pip process.

Ultimately, the problem here is that you cannot have two wheels named script-0.1.0-py3-none-any.whl with different dependency information. Using an sdist is, in a certain sense, a workaround that hides that limitation, but it does so by creating a wheel that is valid in the build environment but not in the target environment.

So I guess what I'm saying is that for most of the examples we've seen there are workarounds (typically, build the sdist with --no-build-isolation and carefully control dependencies, and then install from the generated wheel, or just use --no-build-isolation and accept that you need to have build-time dependencies in your target environment). Those workarounds aren't ideal, and don't really satisfy the case of wanting to publish sdists for users who don't have the expertise to apply that sort of workaround. But they are the best you're likely to get without a more fundamental look at the underlying ideas and data models. And that's something that needs to go to the Packaging discourse, and ultimately to be implemented as a change to the existing standards (or a new standard in its own right). This is a hard problem, and we're getting close to the point where people say "use conda" rather than try to tackle it directly, so it's going to take some effort to fix.

In the interim, I'm not opposed to pip trying to offer extra workarounds for specific cases, but I don't want people to feel that the basic issue is something pip can solve on its own.
