allow overriding/forcing a certain version #215

Open
stefanfoulis opened this issue Oct 9, 2015 · 11 comments
Labels
question User question

Comments

@stefanfoulis

In some cases, dependencies of dependencies declare conflicting version constraints — often just because one package has not relaxed its constraints after a newer version of another package became compatible again.

For these cases it should be possible to override the requested version for a certain package, without any further checking — e.g. by setting an absolute version in requirements.in (possibly with something like an # important! comment).
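A sketch of the syntax this proposes (hypothetical — pip-tools has no such marker today; package names are illustrative):

```
# requirements.in
Django>=2.2        # resolved normally against all constraints
lxml==4.9.3        # important! force this version, bypassing conflicting sub-dependency pins
```

The idea is that the resolver would treat the annotated line as authoritative and only warn about, rather than fail on, constraints it contradicts.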

@offbyone

I see that this is still an ongoing issue, which is unfortunate as I have, yet again, bumped into this. Whenever I find a library that is both a) unmaintained and b) excessively ambitious in its version pinning -- an unfortunately frequent occurrence! -- this trips me up again. Sometimes, I as the consumer of these libraries know that I can break the deadlock between them, and giving us the tools to express precedence or preference in conflicts would really help here.

@mredaelli

Another example: python3-saml, which is now unmaintained, hardcodes a version of lxml that has a security bug.

Would love to override that dependency as a quick fix.

@voxeljorge

I just wanted to chime in that Apache Flink (https://github.com/apache/flink) has this issue in their latest apache-flink SDK. They have pinned protobuf<3.18 due to what appears to be some kind of library incompatibility from last year, but their dev-requirements.txt file currently installs protobuf<3.21, which seems to indicate it works fine with newer versions of the protobuf library. Notably, the gcloud SDK packages all seem to now require protobuf>3.19, which creates a really difficult-to-solve conflict between two packages that have a reasonably high chance of being used together.

@ofey404

ofey404 commented May 11, 2023

+1. Another victim here.

Basically I want to do:

langchain[all]==0.0.165

And I run into a severe conflict on the requests package.


@b-kiiskila

Running into the same issue when trying to use the shippo package; it seems to be a pattern with them. Is there still no workaround for this?

@ippeiukai

ippeiukai commented May 25, 2023

It would be nice if a certain dependency could be explicitly replaced/overridden.

We want to replace psycopg2 with psycopg2-binary of the same version in development (we have layered requirements).
https://www.psycopg.org/docs/install.html#psycopg-vs-psycopg-binary
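A sketch of the layered setup described above, with the replacement the comment is asking for (file names are illustrative, and the replacement line is hypothetical — pip-tools has no such feature today):

```
# requirements.in (production layer)
psycopg2==2.9.9

# dev-requirements.in (development layer)
-c requirements.txt
# hypothetical: swap psycopg2 for the pre-built wheel of the same version
psycopg2-binary==2.9.9
```

Today the resolver would treat psycopg2 and psycopg2-binary as unrelated distributions, so the production pin cannot be replaced this way.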

@pakal

pakal commented Aug 4, 2023

Same issue here, recurrently encountered on numerous projects, with different package management tools.
I'm desperately trying to find a dependency management system which lets me work around the numerous quirks of existing Python package metadata.

Lots of perfectly working but unmaintained packages have abusive "version<X" restrictions; and I developed compatibility layers for Django which make most version requirements useless, but I still can't easily upgrade my locked (sub)dependencies.

Having version overrides like NPM has would be bliss.

PS: Poetry is stuck too on that matter (python-poetry/poetry#697)

@voxeljorge

So there are examples of the problem here but not a lot of proposed solutions. The main solution that I've seen in other threads for this problem is to add some flag that ignores dependencies for a package or for the whole compile process, both of which seem like they could be excessively destructive. In the process of "correctly" ignoring one pinned version you could easily break the rest of the resolution process.

Perhaps a more targeted workaround would be possible: instead of ignoring all version deps for a package, some form of annotation could specifically replace a single version-dependency statement between packages A->B. That way, for any given old package with an overly strict version dependency, the one constraint could be overridden with the least damage to the health of the dependency graph.

@pakal

pakal commented Aug 4, 2023

The best solution IMHO would be to copy what Yarn does (https://classic.yarnpkg.com/lang/en/docs/selective-version-resolutions/).

It would allow truly pinning a specific version of a specific package MYPACKAGE, completely overriding all other constraints (>, <, == ...) that any (sub)dependency places on that package.

As long as the system keeps warning about the conflicts between this pinned version and the existing version constraints (while bypassing them), the project maintainer should have all the information needed to manage this "wart" in their dependency tree. This should be a rare occurrence amongst the hundreds of packages that common projects require, but for now each occurrence breaks the package-locking workflow.

I have no idea how hard this "little" change would be to implement, though.
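For reference, the Yarn feature linked above works through a `resolutions` field in package.json that forces versions for transitive dependencies, including via glob patterns (package names here are illustrative):

```json
{
  "resolutions": {
    "old-unmaintained-lib/transitive-dep": "1.1.1",
    "**/left-pad": "^1.2.0"
  }
}
```

The first key overrides one package's declared dependency; the second forces every occurrence of a package anywhere in the tree.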

@voxeljorge

> The best solution IMHO would be to copy what Yarn does (https://classic.yarnpkg.com/lang/en/docs/selective-version-resolutions/).
>
> It would allow truly pinning a specific version of a specific package MYPACKAGE, completely overriding all other constraints (>, <, == ...) that any (sub)dependency places on that package.
>
> As long as the system keeps warning about the conflicts between this pinned version and the existing version constraints (while bypassing them), the project maintainer should have all the information needed to manage this "wart" in their dependency tree. This should be a rare occurrence amongst the hundreds of packages that common projects require, but for now each occurrence breaks the package-locking workflow.
>
> I have no idea how hard this "little" change would be to implement, though.

I suspect that this would probably be a lot harder to build into the dependency resolution system here, as it would be a pretty deep behavioral change. It also has the potentially unfortunate side effect that if you introduce a new package which depends on MYPACKAGE, you could easily miss a warning that the new dependency's constraint is violated and end up with a broken package interaction. This is the approach I was talking about above.

A less disruptive solution which would require more effort from the developers involved but would potentially move this issue from being unresolvable to at least having a workaround is to allow a developer to override specific version constraints. Here's an example from the problem I posted in a previous message about apache-flink:

Consider the following conflict:
- apache-flink depends on protobuf<3.18, but we know that it actually works just fine with protobuf<3.21
- gcloud depends on protobuf>3.19

To resolve this we could:

  1. force apache-flink to be installed ignoring version dependencies
  2. force protobuf==3.20 to be installed, ignoring any dependencies
  3. override the apache-flink constraint to be protobuf<3.21

The problem with 1 is that you could very easily force the installation of apache-flink while breaking some other package dependency. Perhaps some warning could be shown, but it would probably be hard to figure out whether the warning is meaningful.

The problem with 2 is that you could easily add some package in the future that has the constraint protobuf>3.20; again you could maybe get a warning, but it would be very easy to gloss over a real problem. It would also be difficult to know why someone decided to pin protobuf==3.20.

With the third option, the specificity means that the only package affected by the override is apache-flink (and anything that depends on it, of course). The override process would likely be a little more arduous for a developer, but in the end at least you'd have an explicit set of overrides written into a requirements.in file which you could review, update, or drop in the future. Hopefully the resolver changes wouldn't be incredibly complex either, because you'd expect the resolver to behave in exactly the same way, except that it would ignore the one dependency declared in setup.py and take it from requirements.in instead.
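The effect of option 3 can be illustrated with a toy constraint checker (a minimal sketch, not a PEP 440 implementation; `satisfies` is a hypothetical helper, not part of any real tool):

```python
def satisfies(version, spec):
    """Check a dotted version string against a single comparison like '<3.18'.

    Minimal sketch: handles only '==', '<', and '>' on numeric dotted
    versions, nothing like PEP 440's full specifier grammar.
    """
    for op in ("==", "<", ">"):
        if spec.startswith(op):
            target = tuple(int(p) for p in spec[len(op):].split("."))
            v = tuple(int(p) for p in version.split("."))
            if op == "==":
                return v == target
            return v < target if op == "<" else v > target
    raise ValueError(f"unsupported spec: {spec}")

candidate = "3.20.0"

# Today: apache-flink declares protobuf<3.18 while gcloud needs protobuf>3.19,
# so no candidate can satisfy both and resolution fails.
print(satisfies(candidate, "<3.18") and satisfies(candidate, ">3.19"))  # False

# After overriding only apache-flink's constraint to protobuf<3.21,
# the same candidate satisfies everything.
print(satisfies(candidate, "<3.21") and satisfies(candidate, ">3.19"))  # True
```

Only the overridden edge of the dependency graph changes; every other package's constraints still participate in resolution as before.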

@pakal

pakal commented Aug 23, 2023

Overriding a specific dependency's subdependencies would indeed be more fine-grained and explicit.

But frankly, as long as "theoretical incompatibilities" are reported each time we use pip-tools (as pip somehow does), developers like me would already be much happier with simple, brutal subdependency pinning.

There is already a gap between declared constraints and actual compatibility (semantic versioning is more and more ditched in favor of calendar versioning; packages can suddenly break if they declare no upper bound for a dependency, as is now increasingly advised...), so investigating weird breakages is, alas, already part of a developer's work in the Python ecosystem.

And this brutal pinning would be a way to voluntarily enter the "Danger Zone", so it would be up to the project to document why such pinnings exist, and to periodically check whether they can be removed (an additional warning could be systematically issued whenever such a forced pinning exists at all).

(Still, I prefer your more advanced solution, of course; I just mean that the easiest solution for the developers involved might be the more pragmatic one in the short term.)

9 participants