Please allow "file:" for setup.cfg install_requires #1951
Comments
There are two things I'd like to address here:
My intuition is that I think that supporting Regardless of how this turns out in
This is probably not a big deal for

I would say I am -0 on both of these propositions. I'm instinctively conservative here and don't see a super compelling use case, though I will admit that for people like @nedbat, who have many different contextual dependencies, it makes sense to provide them some ability to refactor and avoid having a huge monolithic configuration file. One possible compromise here could be supporting

This is something you should expect to see more of now that pipx has simplified the installation of Python CLIs from PyPI. It is an anti-pattern in the context of libraries, but the reality is that setuptools and PyPI are also used for application delivery.

Just because a feature can be misused does not mean it should be forbidden. I have a reason to use this, and I know what I am doing. Is it better that I hack something together with sed and make?
In my mind, things have changed as there is a more compelling use-case presented above. I'd be open to a PR that (a) adds support for
I would recommend that if pinning is done, it should be done outside of setuptools, using tools like

I don't think the

I agree that that's true, but it really depends on baseline rates. If we add this, and for 99.9% of people it is misused while it's still possible to accomplish the goal without this feature, I think we're doing people a disservice by giving them a feature that's so easy to misuse. I think that's the case here, except that encouraging people to use

It seems like the safest way to do this would be to allow

I don't think
Not sure what you mean by "extract dependencies from pip requirement specifiers", but if you look at Ned's original use case I could definitely imagine someone with that many separate extras dependencies not wanting to combine those into one huge
I think neither of these is the case. The problem to solve here is that a lot of people don't understand the difference between

As long as it's not easy to output the

As an aside, this reminds me of the Susan B. Anthony dollar, a US $1 coin that looked almost identical to a quarter ($0.25). People complained because it was very easy to confuse the two and give someone $4 when you meant to give them $1. It was so similar in shape and size that you could even put the $1 coins into vending machines designed for $0.25 coins. My reasoning for using a different format is similar to the reason for making $1 coins a different color and size than $0.25 coins.

This is just an idea, though. I'm willing to be persuaded otherwise.
I mean these:
If no other program is able to interoperate with the new file - the only benefit there being to reduce the size of

I think a lot of people don't appreciate that these are different formats. That you are able to pipe a

Another idea: pip-compile can read source requirements from

Note: I understand the logic of avoiding
CI + caching is another use case where this would be useful: I'd like to cache my venv folder (using, e.g., the hash of requirements.txt as a cache key), but right now this means I have to duplicate the requirements in setup.cfg and requirements.txt for this to work.
I too have the use case where I want to reuse my

At the bare minimum, it would be awesome if setuptools provided an entry point so that a third-party library could read requirements files and set the various

Would you be open to said entry point?
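If setuptools did expose such a hook, a minimal sketch of what it could do is to turn a requirements file into an install_requires-style list. Everything here is an assumption for illustration (the function name and the filtering rules are hypothetical, not setuptools behavior):

```python
# Hypothetical sketch: what a requirements-reading entry point could do.
# The filtering rules are assumptions for illustration only.
from pathlib import Path

def read_requirements(path):
    """Return requirement strings from a requirements file, skipping
    comments, blank lines, and pip-specific lines such as -r/-c
    includes or --index-url options (which install_requires cannot hold)."""
    specs = []
    for raw in Path(path).read_text().splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments
        if not line or line.startswith("-"):
            continue  # blank, pip option, or include line
        specs.append(line)
    return specs
```

The point of the filtering is exactly the format mismatch discussed above: a requirements file may legally contain pip options and includes that are not valid PEP 508 requirements.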
So, for those interested, I managed to hack together something which might also help you: https://pypi.org/project/setuptools-declarative-requirements
Some more info on this: if you have a
I'd take this a step further and argue that Example:
Same thing here: preinstalling dependencies in a container image (that's rebuilt when dependencies change) can yield <10s CI times (incl. testing on all supported versions of CPython, linting, etc.) on most of my projects, but doing that in a sensible way requires having the list of dependencies in a separate file. Quick CI times can do a lot for project health and developer experience:
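As a sketch of that caching idea (the file name and key format are assumptions, and any CI system's cache primitive would work the same way), the cache key can be derived from a hash of the dependency file, so the cached venv or image layer is rebuilt only when the dependencies actually change:

```python
# Sketch: derive a CI cache key from the dependency file's hash, so a
# cached venv/image layer is invalidated only when dependencies change.
# The file name "requirements.txt" is an assumption for illustration.
import hashlib
from pathlib import Path

def cache_key(path="requirements.txt", prefix="venv"):
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    return f"{prefix}-{digest[:16]}"  # short, stable key
```

This only works cleanly when the dependency list lives in its own file, which is exactly why duplicating it between setup.cfg and requirements.txt is painful.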
I know it's perfectly feasible with

PS: I just added to bork (a dev/release automation tool for Python projects) the ability to dump the list of dependencies, using
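For anyone wanting to do something similar without extra tooling: a built package's dependency list also lives in the Requires-Dist fields of its core metadata (PKG-INFO/METADATA), which is plain email-header format. A minimal sketch with made-up sample metadata (this is not bork's actual implementation):

```python
# Sketch: pull Requires-Dist entries out of core metadata text.
# Core metadata (PKG-INFO / METADATA) uses email header syntax, so the
# stdlib email parser handles it. The sample metadata is illustrative.
from email.parser import Parser

SAMPLE_METADATA = """\
Metadata-Version: 2.1
Name: my-package
Version: 1.0
Requires-Dist: requests (>2,<3)
Requires-Dist: mypy ; extra == "dev"
"""

def requires_dist(metadata_text):
    msg = Parser().parsestr(metadata_text)
    return msg.get_all("Requires-Dist") or []

print(requires_dist(SAMPLE_METADATA))
```

Reading the metadata after a build sidesteps the setup.cfg/requirements.txt duplication question entirely, at the cost of requiring a build first.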
For the record:

```ini
# setup.cfg
[metadata]
name = my-package
# ...

[options]
install_requires =
    requests>2,<3

[options.extras_require]
dev =
    mypy
    types-requests
```

```console
$ pip-compile --extra dev --output-file requirements-dev.txt setup.cfg
```

This produces a requirements-dev.txt with requests, mypy, types-requests, and their transitive dependencies pinned to exact versions.
I was looking for this feature for years, but since pip-compile got support for reading setup.cfg, I am no longer sure how critical this is. One of the deep PITAs remaining is that the dependabot team failed to add support for reading setup.cfg, ignoring what is in fact the recommended way to record dependencies for Python projects. Also, I was not able to find a tool that can properly update setup.cfg with new requirements; maybe someone watching this ticket knows one. If support for

My use case for pip-compile is a bit different than the original one: I am using it to build
I don't understand why this is still being debated :|. Let the developer provide install_requires = file: requirements.txt. What's the big deal?

FYI, there's an easy footgun with this feature: if you forget to include the requirements source in the SDist, you'll get broken SDists. Technically, I think you could fall back on the egg-info if (and only if) this happened, but I don't know if it's worth it. Also, this is the requirements file it is including, to see how this is being used in the wild. :D
Probably Setuptools' file inference should always include any files it used to build metadata (i.e. any

I note that this issue would not have occurred in a system where the file is stored in source control and an SCM-based file finder was employed (e.g.
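Until something along those lines lands, the footgun can be avoided by explicitly shipping the referenced file in the sdist. Assuming the file is called requirements.txt (just an example name), a single MANIFEST.in entry suffices:

```
# MANIFEST.in: make sure the file referenced by "file:" ends up in the sdist
include requirements.txt
```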
That is very true. Since this was first proposed, there was a general understanding that this feature is "not for everyone" and that the developer "needs to know what they are doing", so this kind of problem does not really come as a surprise to me. There are a few open FR tickets related to this (#2821, #3570, maybe more) with the

Meanwhile, we can also add a "warning" box in the documentation.
Well, I don't think that the developer needs to know what they are doing to use

The whole reason I was opposed to this feature is that it enables a bad workflow that many people naively think would be a good thing to do (including a requirements.txt). The reason it was added was to make it more convenient for some people who want to do a different thing that most people don't want to do (include a requirements.in file). You absolutely need to know what you are doing to use this feature correctly.

Of course, the much better solution would be for pip-compile to support compiling from PEP 621 metadata or from PEP 517 metadata, since that works no matter what your backend is. When/if that happens, the only people using this feature will be people who don't know that it's not needed and people doing the wrong thing. 🙁
@pganssle Currently I distribute multiple applications through PyPI.
Some versions are properly pinned, and some dependencies use semantic versioning, so I can use a version range in the requirements files. During CI and local testing I install the

Since I am not distributing, e.g., a low-level library, I thought it would be good practice to pin the dependencies and ensure the application installation is reproducible.
@spacemanspiff2007 From my point of view: for applications it is less of an issue than for libraries. I do not think anyone is saying the opposite. But... even for applications it can end up being counter-productive to hard-pin the dependency versions directly in the package metadata. If I want to install one of your applications X years from now, long after you have stopped maintaining it, while its dependencies have received later releases with critical bug and security fixes... how hard is it going to be for me to install the application with the fixed dependencies if their versions are hard-pinned in the package metadata of the application? How many hoops will I have to jump through? My recommendation is:

And finally, there is also not much debate about the fact that wheels (and sdists, and whatever we can find on PyPI right now) are okay-ish but not great means of distribution for applications (for libraries there is not much to complain about). If you are serious about distributing applications, you should think about moving towards additional tooling.

@spacemanspiff2007 Instead of
Thanks for the hint. I just threw it together quickly; of course I'm using

You are assuming that in all that time there were no breaking changes in the application dependencies and that my application will still start and work as expected. I guess it depends on what the actual goal is:
But
I absolutely agree.
The one and only important argument I am trying to make, and I believe you already understood it, is that

So now, if you ask where you should list all pinned dependencies for a good end-user application packaging and distribution flow, then sadly there is no real good answer (but for sure the answer is not

An okay-ish recommendation I can make is to provide and document a file with the

Personally, I feel like we will not have great solutions until we have a great (standardized) lock file format (which is being worked on, so keep an eye out for that, but do not hold your breath). I can also recommend looking at things like

@spacemanspiff2007 Although I feel like everything has already been said here and elsewhere, if you feel like continuing the discussion on this, I suggest continuing in a different location. I can suggest these two places, where you can ping/mention me (
Installing with `pip install -e .` fails due to setup.py being deprecated after setuptools adopted PEP 517 [1]. Migrate to building using a setup.cfg file. This was mostly a 1-to-1 migration, except for setting `install_requires`: setuptools recommends not reading `requirements.txt` as the value for `install_requires` [4].

See also:
- A Practical Guide to Setuptools and Pyproject.toml [2]
- Configuring setuptools using setup.cfg files [3]

[1] https://peps.python.org/pep-0517/
[2] https://godatadriven.com/blog/a-practical-guide-to-setuptools-and-pyproject-toml/
[3] https://setuptools.pypa.io/en/latest/userguide/declarative_config.html
[4] pypa/setuptools#1951
I understand that the

We need this feature to prevent future issues and inconsistencies.

This feature has already been there for a long time...

Since 62.6.0, actually (though it's slightly better in 66.1.0, since then the files get auto-included in the SDist too).

Right, I was using an older version of setuptools. I read issue #3793 and thought it had been rejected. Thanks!
The new declarative setup.cfg syntax is fabulous. It's wonderful to have two-line setup.py files!
I tried to use "install_requires = file: requirements/base.in" and found that "file:" wasn't supported there. Our pip-compile workflow works really well, and it would be great to be able to start with base.in as our source of truth, and easily use it in our setup.cfg.
As a design point, I'm not sure it makes sense to special-case which fields can use which kinds of info-gatherers. Why can't I use "file:" for any field? I can understand if the implementation requires it, but the more the developer can choose how to structure their own world, the better.