Pip needs a dependency resolver #988

Open
cboylan opened this Issue Jun 11, 2013 · 103 comments

@cboylan
Contributor

cboylan commented Jun 11, 2013

pip's resolver logic is currently like so:

  1. for top-level requirements:

    a. only one specifier is allowed per project, whether or not the specifiers conflict; otherwise a "double requirement" exception is raised
    b. they override sub-dependency requirements.

  2. for sub-dependencies

    a. "first found, wins" (where the order is breadth first)

While 1b is reasonable (and actually a feature of pip currently), 1a and 2a are not. pip should be attempting to resolve conflicts.
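For instance (a hypothetical invocation; the project name is made up), giving two specifiers for the same top-level project currently aborts with the "double requirement" error instead of intersecting them:

pip install "someproject>=1.0" "someproject<2.0"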

NOTE

If 2a is a problem for you now, there is a workaround. Specifically, declare what you want the solution to be in a requirements file, or as a top-level pip install argument, and that will be honored.

E.g. if you're installing myproject, but depproject is not ending up at the correctly resolved version, then specify what the correct answer should be, like this:

pip install myproject "depproject>=1.5,<2.0"

(Quoting the specifier keeps the shell from treating > and < as redirection.)
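Or equivalently, as a hypothetical requirements file (top-level entries override whatever specifiers appear deeper in the dependency tree):

# requirements.txt -- names taken from the example above
myproject
depproject>=1.5,<2.0

pip install -r requirements.txt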


edited by @qwcode

@dstufft

Member

dstufft commented Jun 11, 2013

I added this to 1.5 because I think it's a pretty big wart that should be sorted out.

@qwcode

Contributor

qwcode commented Jun 11, 2013

for now, the "pip solution" is to declare in your requirements file (or as an install argument directly) what specifier you want, and it overrides what happens down in the dependency tree.

i.e. in this case, put "foo>=1.0,<=2.0" in your requirements or add it as an install argument.

Also, pip compiles everything it's going to install into a set first, then installs.
It's not installing everything as it's discovered, and its discovery order is not depth first.
The order of discovery isn't the issue, but rather the lack of doing any resolution as it compiles the set.
Currently, first found wins.
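To illustrate (with made-up package names): suppose myproject depends on both aaa and bbb, where aaa requires foo>=2.0 and bbb requires foo<2.0.

pip install myproject

Breadth-first discovery sees aaa's foo>=2.0 first, so that specifier wins and bbb's foo<2.0 is ignored when the install set is compiled.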

@qwcode

Contributor

qwcode commented Jun 11, 2013

I keep meaning to add a "conflict resolution" section to the Internal logic section of the docs that covers this.
Practically speaking, pip's override logic is pretty nice IMO, and is all most people need most of the time.

@emonty

Contributor

emonty commented Aug 11, 2013

I believe, based on discussions with @dstufft, that this may be related to the problems people will have if they try to upgrade from old distribute to new setuptools alongside other packages. More specifically: if you depend on a package that depends on distribute with no upper bound, and you are currently running a pre-merge distribute, dependency discovery will put distribute into the install list along with the rest of software A's dependencies, and then later add distribute's own dependency, setuptools, to the list. When installation runs, distribute is upgraded first, which removes the setuptools code that distribute provided; the next thing in the list is installed, which is not setuptools, and it fails because setuptools is not installed yet.

@qwcode

Contributor

qwcode commented Aug 11, 2013

@emonty the distribute-to-setuptools upgrade problem you speak of is described in #1064, and also here: http://www.pip-installer.org/en/latest/cookbook.html#importerror-no-module-named-setuptools

But it is not related to the conflict-resolution shortcomings described in this issue.

@dracos

dracos commented Aug 29, 2013

Just had this issue installing a project, on a system that already had python-dateutil 1.4 installed, that depended upon python-dateutil (no specific version) and django-tastypie 0.9.16. Even though django-tastypie has a python-dateutil dependency of >=1.5 (and not 2.0), and the parent has no specific version dependency on python-dateutil, python-dateutil remained at 1.4. This seems pretty fundamentally against what a package installer should be doing in such a circumstance :)

@dstufft

Member

dstufft commented Aug 29, 2013

FWIW I've been experimenting with making a real resolver for pip.

@qwcode

Contributor

qwcode commented Aug 29, 2013

@dracos there is a "solution" right now (short of a new resolver for pip, although that would be nice). Specifically, declare what you want the python-dateutil requirement to be in a requirements file, or as a top-level pip install argument, and that will be honored.

http://www.pip-installer.org/en/latest/cookbook.html#requirements-files
http://www.pip-installer.org/en/latest/logic.html#requirement-specifiers

pip install myproject "python-dateutil>=1.5,<2.0"

@dracos

dracos commented Aug 29, 2013

@qwcode Thanks, I know that and will have to do so, but this means anyone working on the project will have to manually check all dependencies. Say someone upgrades django-tastypie and it now requires a later version of python-dateutil; there's no way to detect this besides manually checking each dependency upgrade/install. Oh well.

@qwcode

Contributor

qwcode commented Aug 29, 2013

@dracos understood, it's a pain point. But I just want others who find this to at least know there is some kind of solution.

@qwcode

Contributor

qwcode commented Aug 29, 2013

@dstufft as you work on a new resolver, keep in mind that top-level requirements (i.e. pip install arguments or requirements file entries) will still have to be considered overrides (or dominant). That's a pip feature. Also, to state the obvious, this change will be "backwards incompatible" in the sense that many installations will turn out differently.

@dracos

dracos commented Aug 31, 2013

I've made a script (basically a patch to pip's prepare_files(), though I didn't want to patch pip itself, since I can't control pip everywhere I use it), available at https://github.com/dracos/check-pip-dependencies, that notes multiple dependency requests for the same package and outputs a list of conflicts, which you can then resolve manually using the methods suggested above.

@benoitbryon

benoitbryon commented Sep 24, 2013

This ticket looks like #174.

@y-p

Contributor

y-p commented Dec 19, 2013

The pydata people at Continuum Analytics have built a packaging toolchain more suited to scientific packaging (that's good, it can get funky). Facing the same issues, they've implemented a dependency resolver using a SAT solver; see here. The paper mentioned there is readable, and they've open-sourced the wrapper package around the (also open) SAT solver engine.

Basically, it's all there.

@Ivoz

Member

Ivoz commented Feb 18, 2014

@y-p the only problem with that is that it relies on C code rather than pure Python. I presume it would be unacceptable to expect that everyone installing pip has a C compiler readily available on their system; thus you would have to provide a compiled pip for every system Python hopes to run on. That is a major ask (anyone compiling and distributing for Python on ARM?).

It might be "all there", but it's in an unusable form for pip's general audience.

@merwok

merwok commented Feb 18, 2014

MIT-licensed pure Python: https://github.com/ActiveState/depgraph
(I just know of it; I've never tested it in the field)

@Ivoz

Member

Ivoz commented Mar 4, 2014

0install used to be written in Python (it is now in OCaml), and they also wrote a pure-Python solver, which can be found in their 1.16 release (in sat.py); AFAIK it was Python 2/3 compatible.

There are also some good notes on SAT solving on their page, and an awesome email.

minisat paper; GRASP paper

@Wilfred

Contributor

Wilfred commented Mar 14, 2014

pkglib has recently released a pure Python dependency solver, FWIW: https://github.com/ahlmss/pkglib/blob/master/pkglib/pkglib/setuptools/dependency.py#L596

@dstufft

Member

dstufft commented Mar 14, 2014

Thanks, I'll take a look at it when I take a look at the enthought one as well.

@pfmoore

Member

pfmoore commented Jun 30, 2014

I've been reading some of the references. One thing I don't see in any of them is how version dependencies are managed; specifically, if package A depends on B (version >=2.0), how is that encoded? Is each version of a package treated as an entirely separate object for resolution purposes? Does anyone have any pointers to references on how this is handled?

@dstufft

Member

dstufft commented Jun 30, 2014

So for a raw SAT solver, yes. Basically a SAT solver lets you solve a boolean equation, ideally efficiently.

So if you have foo>=2.0 you'd look and see that foo has 1.0, 2.0, 2.1, 2.2, and 2.3, and you'd translate that into an equation like foo_2_0 or foo_2_1 or foo_2_2 or foo_2_3 (it's actually more complicated than that, because it also needs a constraint to ensure that only one foo_x_y can be true at a time). The SAT solver will then spit back at you which set of variables should be True/False to satisfy the equation.

The most basic SAT solver can simply set all variables to False, and then randomly try setting a single variable to True (recording which it has already tried) to brute-force the equation. Speed-ups occur when you add other things on top of that, such as backtracking (instead of starting over from the beginning, you undo the last choice and try a different one) and simplifying (finding subsets of the problem that are the same and collapsing them), as well as other techniques.

You can also be smarter about how you choose which variables to try setting to True. In the case above we probably don't want to pick one at random, because if the only version specifier is foo>=2.0 then we'd end up with a random foo each time; instead we'd want to try the highest version of foo first and work backwards, or better yet, try the currently installed version of foo first and only then try the highest versions.

Basically, all the SAT solver papers are just defining better ways of guessing which variables to set, and of computing/storing the results, to speed up what is essentially a brute-force problem.
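To make that encoding concrete, here is a minimal illustrative sketch (not pip's actual code; the foo versions are hypothetical) that brute-forces the boolean formula for foo>=2.0 and prefers the highest satisfying version:

from itertools import product

# one boolean variable per available version of "foo" (hypothetical releases)
versions = ["1.0", "2.0", "2.1", "2.2", "2.3"]
acceptable = {"2.0", "2.1", "2.2", "2.3"}  # the versions matching foo>=2.0

def is_satisfying(assignment):
    chosen = [v for v, picked in zip(versions, assignment) if picked]
    # clause from "foo>=2.0": at least one acceptable version is chosen,
    # plus the extra constraint: at most one foo_x_y may be True at a time
    return any(v in acceptable for v in chosen) and len(chosen) <= 1

# the "most basic" solver: brute-force every True/False assignment
satisfying = [a for a in product([False, True], repeat=len(versions))
              if is_satisfying(a)]

def chosen_version(assignment):
    picked = [v for v, p in zip(versions, assignment) if p]
    return tuple(int(x) for x in picked[0].split(".")) if picked else ()

# prefer the highest version among the satisfying assignments
best = max(satisfying, key=chosen_version)
print(dict(zip(versions, best)))  # {'1.0': False, ..., '2.3': True}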

@Ivoz

Member

Ivoz commented Jun 30, 2014

@pfmoore in addition to dstufft's reply, I would guess that reading my previous link http://0install.net/solver.html would tell you how this is done; it's a great article.

@pfmoore

Member

pfmoore commented Jun 30, 2014

@Ivoz thanks, I'd missed that one.

openstack-gerrit pushed a commit to openstack/oslo.db that referenced this issue Jul 6, 2014

Test for distinct SQLAlchemy major releases
This change presents one way we might include test support
for oslo.db against specific SQLAlchemy major releases, currently
including the 0.7, 0.8, and 0.9 series.  As we will want to
begin including features within oslo.db that target advanced
and in some cases semi-public APIs within SQLAlchemy, it will
be important that we test these features against each major release,
as there may be variances between major revs as well as
version-specific approaches within oslo.

To accomplish this, I was not able to override "deps" alone,
as the SQLAlchemy revision within requirements.txt conflicts
with a hand-entered requirement, and due to pip's lack of
a dependency resolver (see pypa/pip#988
and pypa/pip#56) I instead overrode
"commands".  I don't know that this is the best approach, nor
do I know how the tox.ini file is accommodated by CI servers,
if these CI servers would need their tox invocation altered or
how that works.

This patch may or may not be the way to go, but in any case
I'd like to get input on how we can ensure that more SQLAlchemy-specific
oslo.db features can be tested against multiple SQLAlchemy versions.
Note that even with this change, running the "sqla_07" environment
does in fact produce test failures, see http://paste.openstack.org/show/85263/;
so already oslo.db expects behaviors that are not present in
all SQLAlchemy versions listed in the common requirements.txt.

Change-Id: I4128272ce15b9e576d7b97b1adab4d5027108c7c

openstack-gerrit added a commit to openstack/openstack that referenced this issue Jul 6, 2014

Updated openstack/openstack
Project: openstack/oslo.db  a1fd49fd9b726017de02856ab0e0dfe3751e2394

(Commit message identical to the openstack/oslo.db commit quoted above.)
@piotr-dobrogost

piotr-dobrogost commented Nov 26, 2014

The https://github.com/nvie/pip-tools project has a pip-compile command that deals with resolution of dependencies. At http://nvie.com/posts/better-package-management/ there's this statement:

We’ve created pip-compile to be smart with respect to resolving complex dependency trees

Might be worth finding out which algorithm is used there.
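For reference, the typical pip-tools workflow (file names follow pip-tools' own conventions; the specifiers below are made up) looks roughly like this:

# requirements.in -- loose, human-maintained specifiers
django-tastypie>=0.9.16
python-dateutil

pip-compile requirements.in

pip-compile resolves the tree and writes a fully pinned requirements.txt, which is then installed with pip install -r requirements.txt.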

qwcode changed the title from "Pip needs a real dependency resolver" to "Pip needs a dependency resolver" Feb 3, 2015

achimnol added a commit to lablup/backend.ai-client-py that referenced this issue Jun 2, 2018

RyuzakiKK added a commit to RyuzakiKK/gnome-keysign that referenced this issue Jun 5, 2018

bump the required version of twisted in setup.py
Magic wormhole requires twisted[tls]>=17.5.0, and due to this bug in pip
pypa/pip#988 we need to list it with the same
minimum version.

tsibley added a commit to nextstrain/sacra that referenced this issue Jun 7, 2018

Declare dependencies with minimum rather than single versions
While pip doesn't have fully-resolved dependency calculations¹, conda
does and correctly detects a version incompatibility between these two
exact versions:

    UnsatisfiableError: The following specifications were found to be in conflict:
      - biopython==1.69 -> numpy=1.12
      - numpy==1.14.3

This is a roadblock to installing sacra requirements into a conda
environment using `conda install`.  By declaring minimums, we let the
dependency resolver figure out how to get us at least the versions we
need (presumably for bug fixes or features).

¹ https://pip.pypa.io/en/stable/user_guide/#requirements-files
  pypa/pip#988

zenhack added a commit to zenhack/simp_le that referenced this issue Jun 11, 2018

Fix broken install due to pypa/pip#988
idna 2.7 is out, and we hit the same problem as in #62. At some point
our old workaround was removed; I don't recall when. This is a more
direct workaround than we had before.
@RR2DO2

RR2DO2 commented Jun 12, 2018

As per pradyunsg#1, part of this has landed in pip 10, but we ran into an issue yesterday where a conflict went through unnoticed.

Cryptography and requests were listed in our requirements.

cryptography==2.2.2 depends on idna>=2.1
requests==2.18.4 depends on idna>=2.5,<2.7

Now idna 2.7 was released yesterday, which caused our wheel packaging to result in an incompatible set without warnings.

"pip install" now warns with "requests 2.18.4 has requirement idna<2.7,>=2.5, but you'll have idna 2.7 which is incompatible."

But running "pip wheel" doesn't do any such thing.

I was looking at getting a test case for this, but I'm not sure what the correct expectation should be. The core of the issue can be approximated by the test below; my expectation would be that the >=2.1 specifier gets reduced to the range specified by requests:

# NOTE: imports below are an assumption based on pip 10's internal module layout;
# these are internal paths and have moved in later pip releases.
from pip._internal.req.req_install import InstallRequirement
from pip._internal.req.req_set import RequirementSet


def test_specifier_reduction():
    ge21 = InstallRequirement.from_line("idna>=2.1")
    ge25l27 = InstallRequirement.from_line("idna>=2.5,<2.7")

    req_set = RequirementSet()
    req_set.add_requirement(
        ge21,
        parent_req_name="cryptography"
    )
    req_set.add_requirement(
        ge25l27,
        parent_req_name="requests"
    )
    assert req_set.requirements['idna'].specifier == '>=2.5,<2.7'

I don't know if this is a pip issue or something to do with one of the libraries it depends on. At a bare minimum, the same warning that 'pip install' produces should happen here too. Ideally the process (both install and wheel) should terminate so these compatibility issues can be detected as part of automation pipelines.

Our workaround was to explicitly pin idna==2.6, but in an ideal world pip would auto-reduce the set to a compatible version without requiring that specification.

@pradyunsg

Member

pradyunsg commented Jun 13, 2018

"pip wheel" doesn't do any such thing.

Expanding the warnings to pip wheel (and also pip download) sounds like a reasonable enhancement/feature request. I've gone ahead and filed it as #5497.

Ideally the process (both in install and in wheel) should terminate so these compatibilities issues can be detected as part of automation pipelines.

For situations where you want to ensure that the dependencies are consistent, there's pip check that exits with an exit code of 1 if the dependencies aren't consistent. It uses the same underlying logic as pip install for printing/generating these warnings/errors.

Further, in the next release, pip install's warnings will be limited to only checking graphs of packages directly influenced by the installation run. pip check will continue to check all the packages in the graph.
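As a hedged sketch of wiring that into an automation pipeline (exact warning wording varies between pip versions):

pip install -r requirements.txt
pip check || exit 1  # pip check exits non-zero when installed packages have conflicting requirements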

pip would auto-reduce the set to this compatible number without requiring specification.

That's exactly what this issue is for tracking. :)

@pfmoore

Member

pfmoore commented Jun 13, 2018

pip wheel and pip download can put multiple versions of a package in the target directory (unlike pip install). For many uses of pip wheel and pip download that makes the warnings inaccurate (or at best, misleading). The use case above seems to be a very specific workflow (pip wheel -r requirements.txt into an empty directory, which is then expected to be usable as a consistent install set). That's not how the majority of uses of pip wheel that I have seen work.

yarikoptic added a commit to chaselgrove/datalad-crawler that referenced this issue Jul 28, 2018

openstack-gerrit pushed a commit to openstack-infra/project-config that referenced this issue Aug 13, 2018

Bump ansible for linters
I noticed this when I tried to use Ansible 2.5 "loop:" constructs.

Unfortunately, we can't just rely on bringing in zuul to pull the
right version of ansible.  ansible-lint being uncapped just takes it
over, and we end up with the latest version.  Pin it to the current
zuul requirements and add a note (maybe one day
pypa/pip#988 will get a fix ...)

Change-Id: Iaf8f0f5cdc46df41fec7c436c1179c80bb5c368e