
Request: Global configuration file #1685

Closed
tgs opened this issue Feb 12, 2019 · 9 comments
Labels
enhancement Needs Discussion Issues where the implementation still needs to be discussed.

Comments


tgs commented Feb 12, 2019

Hello and thanks for all your work!

I work at an organization where we run a private PyPI (Artifactory) that hosts some private tools and acts as a caching proxy for the public PyPI. It's easy enough to configure pip to use this internal PyPI via /etc/pip.conf. However, there's no global configuration file for setuptools. This wouldn't be a problem, except that we also use some setup_requires build tools, and some of our build environments are isolated from the internet (they have to use our private PyPI proxy). To tell setuptools the URL of our proxy, we have resorted to patching virtualenv to modify the $VIRTUAL_ENV/lib/pythonX.Y/distutils/distutils.cfg that it creates, adding our own index_url. This is a pain to maintain: we have to rebuild and patch every virtualenv release.

If setuptools would look at, for example, /etc/pydistutils.cfg, then we could put the index_url configuration there. This would let us use the vanilla version of virtualenv while still allowing our builds to use the PyPI proxy.
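For concreteness, the kind of file we have in mind would look something like this; the [easy_install] section and index_url option are the ones easy_install already reads from per-user config files, and the URL is a placeholder:

```ini
# /etc/pydistutils.cfg (hypothetical global location)
[easy_install]
index_url = https://artifactory.example.com/api/pypi/pypi-remote/simple
```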

Does this sound like a direction you'd be willing to go?

Thanks again!

@pganssle pganssle added enhancement Needs Discussion Issues where the implementation still needs to be discussed. labels Feb 12, 2019
@pganssle
Member

I think the preferred way to do this is to specify a ${HOME}/.pydistutils.cfg, or to modify virtualenv as you describe.

As a general rule, we're actually deprecating setup_requires and trying to remove easy_install entirely. PEP 517 and PEP 518 already allow you to specify your build-time dependencies, so for your own projects the setup_requires requirements will never be hit (using pip >= 18.0 I think).
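For reference, a minimal PEP 518 declaration in pyproject.toml looks like this (the build-backend key comes from PEP 517; when it is omitted, pip falls back to a legacy setup.py-based build):

```toml
[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"
```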

For projects that you install from a wheel, you also don't have to worry about this, because wheels do not have a build step and thus do not require setup_requires.

So given all that, the only remaining packages that are both out of your control (i.e. you can't just implement PEP 518 for them) and likely to cause problems are those that specify setup_requires, don't use PEP 517, and don't ship a wheel for your platform, which is hopefully a small number. One way to mitigate the immediate effects without patching anything would be to build and upload wheels for the "pain point" packages to your private PyPI endpoint (which I'm guessing is some sort of pass-through caching proxy). Some combination of those steps, plus encouraging the upstream projects to either ship a wheel or implement PEP 517/518, will probably solve most of your problems.
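A sketch of that mitigation, assuming a hypothetical problem package named pain-point-pkg and a placeholder repository URL:

```shell
# Build a wheel for just the problem package (not its dependencies)
pip wheel --no-deps --wheel-dir ./wheels pain-point-pkg

# Upload the result to the private index (URL is a placeholder)
twine upload --repository-url https://artifactory.example.com/api/pypi/pypi-local ./wheels/*.whl
```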

I am aware that it would certainly be simpler if we implemented a global configuration for distutils, and I am not necessarily against this happening, but I am not sure I want to do it or maintain it for what is essentially a deprecated code path. I am mildly worried that it will be a pain to work out the right precedence ordering of these files, but it's probably not a big deal.


tgs commented Feb 12, 2019

Thanks for the discussion! We will look more closely at those PEPs in the next day or two and comment again on this ticket.


tgs commented Feb 12, 2019

OK, reporting back.

PEP 517 does indeed seem like a much simpler system for managing this situation.

I tried it on our internal system, starting from a virtualenv that had the newest pip and setuptools. Pip noticed the pyproject.toml file and successfully installed all the build dependencies! So that was great! Unfortunately the next step failed:

  Getting requirements to build wheel ...   Running command /home/smithtg/.virtualenvs/tmp-492fa4b87abca32/bin/python3.6 /home/smithtg/.virtualenvs/tmp-492fa4b87abca32/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py get_requires_for_build_wheel /tmp/tmpxrgo5a0d
  Traceback (most recent call last):
    File "/home/smithtg/.virtualenvs/tmp-492fa4b87abca32/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py", line 213, in <module>
      main()
    File "/home/smithtg/.virtualenvs/tmp-492fa4b87abca32/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py", line 203, in main
      json_out['return_val'] = hook(**hook_input['kwargs'])
    File "/home/smithtg/.virtualenvs/tmp-492fa4b87abca32/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py", line 54, in get_requires_for_build_wheel
      backend = _build_backend()
    File "/home/smithtg/.virtualenvs/tmp-492fa4b87abca32/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py", line 45, in _build_backend
      obj = getattr(obj, path_part)
  AttributeError: module 'setuptools.build_meta' has no attribute '__legacy__'
  error

I added an open('/tmp/version.txt', 'w').write(repr(pkg_resources.get_distribution('setuptools'))) to the _build_backend function in that last file, and it was actually seeing setuptools==40.4.3 from our base Python installation, not setuptools==40.8.0 from the virtualenv. Is that what you expect? It seems weird for env/bin/pip to be using the global setuptools.
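Condensed, the check I added amounts to this (the same pkg_resources call as above, printed instead of written to a file):

```python
import pkg_resources

# Report which setuptools distribution this interpreter resolves; run under
# a PEP 517 build hook, this reveals whether build isolation is in effect.
print(repr(pkg_resources.get_distribution('setuptools')))
```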

So my test didn't work all the way through, but it was pretty promising. Given that, I think I agree that it's not worth adding functionality to the setup_requires feature as it's getting phased out. We feel okay about maintaining our virtualenv patch a bit longer, and trying this again once some more kinks are worked out.

@tgs tgs closed this as completed Feb 12, 2019
@pganssle
Member

@tgs Are you using --system-site-packages with that virtual env? And can you show the pyproject.toml?

I saw the same bug today and I'm trying to figure out why it's happening.


tgs commented Feb 12, 2019

Oh, I think it did have --system-site-packages (that's default at our site for somewhat silly reasons I won't get into). I can try that again.


tgs commented Feb 12, 2019

@pganssle - Yes, in a virtualenv without --system-site-packages, the installation works correctly. That seems to be the difference.

The pyproject.toml is:

  [build-system]
  requires = ["setuptools", "wheel", "packit"]

@pganssle
Member

@tgs I think the issue is a bug in virtualenv or pip that is breaking build isolation. You can work around it by tightening the bounds on your build requirements to setuptools >= 40.8.0, or by explicitly declaring a build backend (add the line build-backend = "setuptools.build_meta" below the requires line).
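Applied to the pyproject.toml posted earlier in this thread, both workarounds together would look like this (packit kept from the original requires list):

```toml
[build-system]
requires = ["setuptools>=40.8.0", "wheel", "packit"]
build-backend = "setuptools.build_meta"
```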

I'll report the bug to virtualenv later tonight.

@tgs

tgs commented Feb 12, 2019

Thanks!

@dmtucker

Quoting an earlier comment:

  One way to mitigate the immediate effects without having to worry about patching anything would be to build and upload wheels for the "pain point" packages on your private PyPI endpoint (which I'm guessing is some sort of pass-through caching proxy).

Might be worth noting that some options preclude wheel usage:

  /.../.tox/py27/lib/python2.7/site-packages/pip/_internal/commands/install.py:206: UserWarning: Disabling all use of wheels due to the use of --build-options / --global-options / --install-options.
