
[question] How to integrate with setup.py? #209

Closed
robsonpeixoto opened this Issue Feb 7, 2017 · 38 comments


robsonpeixoto commented Feb 7, 2017

Using requirements.txt I can do it:

from pip.req import parse_requirements
requirements = [str(r.req) for r in
                parse_requirements('requirements.txt', session=False)]
test_requirements = [str(r.req) for r in
                     parse_requirements('requirements-test.txt', session=False)]

How can I do the same using Pipfile?


Member

nateprewitt commented Feb 8, 2017

You should be able to accomplish this with the code below. This will load the Pipfile from your current project and return the dependencies in a pip-compatible list.

from pipenv.project import Project
from pipenv.utils import convert_deps_to_pip

pfile = Project(chdir=False).parsed_pipfile
requirements = convert_deps_to_pip(pfile['packages'], r=False)
test_requirements = convert_deps_to_pip(pfile['dev-packages'], r=False)

Let us know if you have any further questions :)


elgertam commented May 10, 2017

The above is very helpful. I have run into issues with setup.py depending on Pipenv being installed in a virtualenv before setup.py install (or anything else) can be run. The only solutions I can think of are either vendorizing Pipenv into a project, which seems less than ideal, or somehow hacking setup.py to install Pipenv before attempting the from pipenv... imports, which seems wrong. Does anyone have any ideas or better solutions?

Should we suggest to others in the Python community (PyPA, etc.) to just bless Pipenv as an officially-included tool in future Python releases? 😄


DanLipsitt commented Jul 12, 2017

Would it help to add pipenv to setup_requires? It looks like it might also be necessary to add it to install_requires, which seems unfortunate.


Contributor

kennethreitz commented Sep 27, 2017

Do not do this.


prcastro commented Oct 7, 2017

@kennethreitz what do you mean by "this"? Do you mean the integration with setup.py, the @nateprewitt solution or the last two suggestions?


iddan commented Oct 17, 2017

So how should users know they need pipenv installed?

Member

techalchemy commented Oct 17, 2017


iddan commented Oct 17, 2017

So how can I use @nateprewitt's code in setup.py?


elgertam commented Oct 17, 2017

@iddan: I have attempted to solve the Pipenv bootstrapping problem by just vendorizing a version of Pipenv into my project skeleton (https://github.com/elgertam/cookiecutter-pypackage/blob/master/%7B%7Bcookiecutter.project_slug%7D%7D/setup.py). So far I haven't had any issues, although I can't say I've had a lot of opportunities to test it out in a situation where I install a package using setup.py.

I can understand why we'd be worried about not running a CLI when loading setup.py, but from what I can tell, the code I'm using (copy-pasted from @nateprewitt's post here) is fairly safe.

I figure this hack will no longer be necessary when pip has sufficient internals to understand the Pipfile format.


Member

nateprewitt commented Oct 17, 2017

@iddan, to be clear, that code is solely to convert a Pipfile's dependencies into a requirements.txt-style format that pip will read. It looks like Kenneth redacted something from the original post, but I'm not sure what.

Pipenv is intended as an environment management and deployment tool, not for distribution like setup.py. We highly suggest documenting that you're using a Pipfile and possibly linking to pipenv.org for installation instructions. Treat this the same way you treat pip; it's not expected that a user installs pip every time they install a new Python package.


iddan commented Oct 17, 2017

I understand that perfectly. What I don't understand is what you expect to happen when a user downloads a package using this script and doesn't have pipenv installed.


isobit commented Oct 17, 2017

@nateprewitt, if we want to distribute a package then (through the usual means, using pip), should we maintain a copy of the dependency list in setup.py or a requirements.txt? I was hoping to use my Pipfile as a single source of truth.


isobit commented Oct 17, 2017

To clarify, I assume the code in the first reply is meant to be run as part of a build, rather than actually being used in a setup.py.


tuukkamustonen commented Oct 17, 2017

Unless I'm badly mistaken, the code in #209 (comment) is safe to use if you're building a wheel (= binary dist), but not if you're building an sdist (= source dist).

For wheels, setup.py does not get included in the package (it is evaluated at build time, and metadata files are constructed based on the information it gathers). With wheels, setup.py is never executed on the machine where the package is being installed, only where it was built.

With an sdist, setup.py is actually run on the installation machine, so pipenv needs to be available there.


isobit commented Oct 17, 2017

Oh, yes. @tuukkamustonen, my particular use case is an sdist. Since I don't want to require the package user to install pipenv prior to doing a pip install, I assume I am stuck with deriving my install_requires outside of setup.py (i.e. manually or as part of a build)?


elgertam commented Oct 17, 2017

If I'm reading correctly, I believe Kenneth and the other maintainers don't want us to treat Pipenv as a project dependency as we might for pytest, or even a normal package dependency. Ideally, it sounds like we should install and update Pipenv in the same way as pip itself, i.e. pip is installed when Python is installed or when a virtualenv is created. That is what Kenneth meant when he said, "Do not do this."

That said, @isobit echoed my thoughts that Pipfile should be the single source of truth. I can see two compelling use cases for favoring Pipfile (and there are others): first, a CI/CD pipeline that depends on Pipfile to set up the build environment is much more robust than one depending on requirements.txt; and second, a contributor who blindly tries to install a Pipfile-based project may be frustrated if python setup.py install doesn't work the way she expects. Considering that neither Pipenv nor Pipfile-aware pip is a standard Python tool yet, and Pipenv is indeed the reference implementation for Pipfile, we have few options to solve the problem:

  1. Specify in your project documentation that your project depends on Pipenv. You can still depend on Pipenv in your setup.py, and this will break if Pipenv is not installed in your Python environment. Concretely, a contributor to your code would have to manually install Pipenv to her virtualenv in order to install the project with setup.py.
  2. Still have setup.py depend on requirements.txt, which you generate periodically based on your Pipfile. This remains fully compatible with pip and setuptools, but requires any maintainer to generate the requirements.txt whenever the project is built and deployed. A possible variation of this would be for a CI/CD pipeline to update the requirements.txt at build time.
  3. Vendorize a version of Pipenv into a project and call it using from _vendor.pipenv.project import Project... inside setup.py. One variation of this could be to only import from the vendorized version when the global import fails.
  4. Some other option that's not presented here and that I'm not smart enough to think of.

I'm personally using (3) (see #209 (comment)) until Pipfile becomes more of a common standard, at which point I will not have any of my projects depend on Pipenv code directly, since Pipenv seems clearly meant to be a tool for managing a virtualenv based on a Pipfile, and not necessarily a Pipfile library itself.

I hope this clarifies the issue based on what I've read here, but if I misspoke or said something egregious, please let me know @nateprewitt.
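Option (2) above can be done without any pipenv imports in setup.py at all: CI regenerates requirements.txt from the Pipfile, and setup.py only reads the committed file. A minimal sketch of the reading side (the function names and strip rules are my own, not any tool's API):

```python
def parse_requirement_lines(lines):
    """Turn pip requirements-file lines into an install_requires list.

    Skips blank lines, comments, and option lines (-e, -r, --index-url, ...),
    which setuptools would not accept in install_requires.
    """
    requirements = []
    for line in lines:
        line = line.split("#", 1)[0].strip()
        if line and not line.startswith("-"):
            requirements.append(line)
    return requirements

def read_requirements(path="requirements.txt"):
    with open(path) as f:
        return parse_requirement_lines(f)

# In setup.py: setup(..., install_requires=read_requirements())
```

Because no pipenv import happens at install time, this stays valid even for sdists installed on machines without pipenv.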


Member

vphilippon commented Oct 17, 2017

I have the feeling that the issue here arises from an original misuse (IMO) of requirements.txt to begin with. That's apart from the usage of pipenv.

I'm going to point to this wonderful article from Donald Stufft, setup.py vs requirements.txt:
https://caremad.io/posts/2013/07/setup-vs-requirement/

TL;DR (but you really should read through): setup.py's install_requires is meant to detail the requirements (dependencies) of a package. The requirements.txt (which would be replaced by the Pipfile/Pipfile.lock combo here) should be used to list which exact packages will be used to satisfy what is required, based on the metadata from the setup.py, in order to make a reproducible environment.
Populating install_requires from requirements.txt is going backward.

setup.py's install_requires =/= requirements.txt (or Pipfile/Pipfile.lock).
Pipfile (or rather Pipfile.lock) should be the single source of truth for the packages to install in the app's environment.
setup.py's install_requires provides metadata used to generate a valid Pipfile.lock.

I think that's where the friction is coming from. I hope that makes sense.


elgertam commented Oct 17, 2017

I like that reply a lot, Vincent, and I absolutely agree on Pipfile.lock being a complete (and better) replacement for requirements.txt.

Having used Pipenv for a few months now, though, my usage of Pipfile leads me to think that install_requires really is almost identical to what goes in the Pipfile. If I need numpy, I pipenv install numpy and a new entry goes into my Pipfile's [packages] group: numpy = "*". In other words, my usage is totally different from requirements.txt, which I used to just generate before committing using pip freeze > requirements.txt.

Perhaps this is just a peculiar way that I am using Pipenv, and I'm going against the grain (I also install my virtualenv in .venv/ inside the project directory, so I'm a rogue Pythonista), in which case I can easily comply with the Python community's convention of having a wall of separation between setup.py and Pipfile|Pipfile.lock|requirements.txt.

What am I missing here, @vphilippon? Why is Pipfile's [packages] too constrained to use in install_requires or tests_require?


isobit commented Oct 17, 2017

Thanks for the info, @vphilippon. Perhaps we are going about this backwards; it sounds like what we really want is the reverse: a way to use abstract deps from install_requires in our Pipfile, like Donald mentions with regard to -e . in the requirements.txt. It looks like there was already an issue about that (#339), but it didn't seem to go anywhere.


isobit commented Oct 17, 2017

Is this already covered by the Pipfile syntax? I just noticed the requests library Pipfile uses "e1839a8" = {path = ".", editable = true, extras=["socks"]} in its packages section. Something similar is apparent in the Pipfile examples, but I don't see any other documentation.


Member

vphilippon commented Oct 17, 2017

First, a disclaimer: my expertise is mainly with lib packages. I might miss some points. I hold the right to be wrong, and I'm ready to use it!
Also, this got me scratching my head a few times. I'd really like some review on this.

Now, let's get to this.

I'll start with this statement @elgertam:

[...] my usage of Pipfile leads me to think that install_requires really is almost identical to what goes in the Pipfile. If I need numpy, I pipenv install numpy and a new entry goes into my Pipfile's [packages] group [...]

You added numpy to your environment, you did not add numpy to the dependencies of your app.
Those are two different things. Keep reading, you'll see what I mean.

In other words, my usage is totally different from requirements.txt, which I used to just generate before committing using pip freeze > requirements.txt.

Surprisingly, your usage is not so different if you think about it:

  • Your previous workflow: pip install stuff -> pip freeze > requirements.txt -> feed install_requires from requirements.txt.
  • Your new (attempted) workflow: pipenv install stuff -> Pipfile automatically updated -> trying to feed install_requires from the Pipfile.
  • The intended idea: add stuff to install_requires -> pipenv install -> environment and Pipfile.lock are updated.

And for that intended way to work, you want a Pipfile that states that you want to install your app.
Something like the requests Pipfile linked by @isobit.

Or, an example:

[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true

[dev-packages]
pytest = ">=2.8.0"
tox = "*"

[packages]
"-e ." = "*"

Your Pipfile describes your environment, not the dependencies of a package. As you can see, the Pipfile above defines what I want to install, which is the local package in editable mode.

This may look a bit "useless", as right now it's all driven by a single package. But let's say you want to install your app along with requests[security], which is not a strict dependency of your app: you do pipenv install requests[security], and then:

[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true

[dev-packages]
pytest = ">=2.8.0"
tox = "*"

[packages]
"-e ." = "*"
requests = { extras = ['security'] }

And voilà, here's an example of the difference between your abstract requirements, and your concrete requirements. Same goes if you wanted to install gunicorn or anything else needed in the environment, but that's not part of the app itself.

What am I missing here, @vphilippon? Why is Pipfile's [packages] too constrained to use in install_requires or tests_require?

If I've explained this well enough, you can see that it's just supposed to be the other way around.
You put your dependencies in install_requires, you put your package in the Pipfile, and then you get an environment, with a Pipfile.lock for reproducibility (as it will resolve and respect your package's dependencies).

For tests_require, I'll admit I'm not sure how that fits into all of this. IIRC, it's a setuptools-specific feature. We could argue that it's a set of abstract dependencies for testing, and expect pipenv to resolve and install those when doing pipenv install --dev, for all packages, but I have a feeling that's not quite right. I don't have a clear idea or opinion on this and the rationale around it, sorry.

I hope this all makes sense somehow.


elgertam commented Oct 18, 2017

@vphilippon You explained it quite well, and I think you've convinced me.

TL;DR: Specify the abstract, absolutely necessary dependencies in setup.py, then add context (and thus concreteness) in Pipfile and Pipfile.lock, including the fantastic "-e ." = "*" entry.


tuukkamustonen commented Oct 18, 2017

A problem with #209 (comment) is that when we want to deploy our app to a server, we're not getting a reproducible environment.

I mean normally, when developing a library that will be used in other projects, we want our install_requires to be pretty loose (=no binding to exact versions). Yes. But when we're building a web app (or any app) and deploying it on a remote server or docker container, then we probably want fixed dependencies. Even if we specify exact versions in install_requires, transitive dependencies are not locked and installation may actually download a different (newer) version of a transitive dependency and that may break your deployment.

(Manually declaring exact versions of transitive dependencies is not an option - way too cumbersome.)

In this use case, we ought to depend on a requirements.txt-style lockfile (one that specifies exact versions even for transitive dependencies). However, it doesn't seem as if pipenv allows excluding development requirements in pipenv lock -r (e.g. pipenv lock --no-dev -r), so that we could actually craft such a requirements.txt (which could then be read into install_requires)?
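The last step mentioned above (reading a locked requirements.txt back into install_requires) can be sketched with the stdlib alone. This is a hypothetical helper, not part of pipenv or pip; the filtering rules are assumptions:

```python
def parse_requirements_txt(text):
    """Parse pip-style requirements text into a list usable as
    install_requires, skipping comments, blank lines, and pip options."""
    requirements = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop inline comments
        if not line or line.startswith("-"):  # skip -r/-e/--index-url etc.
            continue
        requirements.append(line)
    return requirements
```

In a setup.py this would be used as `install_requires=parse_requirements_txt(open('requirements.txt').read())`, trading the pip-internal `parse_requirements` API for something that cannot break between pip releases.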

vphilippon commented Oct 18, 2017

I'll quote Donald Stufft's article:

An application typically has a set of dependencies, often times even a very complex set of dependencies, that it has been tested against. Being a specific instance that has been deployed, it typically does not have a name, nor any of the other packaging related metadata. This is reflected in the abilities of a pip requirements file.

In other words: the app is not the package. The app is the environment, with a set of concrete dependencies installed. An app's dependencies should be represented by a requirements.txt (or Pipfile/Pipfile.lock), not a single package's install_requires.

Personally, I would go as far as to say that the full set of pinned dependencies (including the transitive ones) should be in a requirements.txt for the app, not in the package's setup.py. This outlines the idea that deploying an app is not done with pip install myapp==1.0.0, but rather pip install -r requirements.txt (or pipenv install with a Pipfile.lock), where the requirements.txt includes myapp==1.0.0 as well as all of the other dependencies and transitive dependencies, pinned.
It might look like I'm going a bit far, but I've worked in a context where the "app" deployed is driven by a set of packages. There's not a single package that represents the app itself, so this notion that "the package is not the app" was thrown in my face pretty early on 😄 .

I have a strong feeling that Pipenv/Pipfile/Pipfile.lock follows this idea.
That would be why there seems to be a gap in going from Pipfile.lock to setup.py's install_requires: it's really not meant to be done that way, in any case.

@maintainers I'd like your input here to know if this is indeed how you see all of this. I've been preaching a vision of how we should be treating dependencies, but I don't want to be talking in your stead either.
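To illustrate the model described above (package names and versions entirely made up), an app-level deployment manifest would pin everything, including the app package itself and its transitive dependencies:

```
# requirements.txt for the deployed app -- hypothetical, fully pinned
myapp==1.0.0
requests==2.18.4      # direct dependency of myapp
urllib3==1.22         # transitive dependency, pinned too
certifi==2017.7.27.1  # transitive dependency, pinned too
```

Deployment is then `pip install -r requirements.txt` (or `pipenv install` against the lockfile), not `pip install myapp`.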

tuukkamustonen commented Oct 18, 2017

@vphilippon I think there are different viewpoints and terminology. But ultimately, we want a list of pinned dependencies to install. So a requirements.txt with pinned versions (or a package with such declared dependencies, doesn't really matter). The question is, how to actually craft such a file?

With pip-compile (of pip-tools), I can compile such requirements.txt from requirements.in and it will contain only the non-dev dependencies that my app needs. I'm not sure if I'm interpreting your response correctly - do you really mean that we should maintain a pinned dependencies requirements.txt by hand (also slightly duplicating what's already in setup.py), also for transitive dependencies? That can't be the solution...

If there was pipenv lock --no-dev -r, I think that would solve this problem.

vphilippon commented Oct 18, 2017

@tuukkamustonen Sorry for the confusion, I was really only addressing the idea of install_requires vs. requirements.txt/Pipfile/Pipfile.lock.

So a requirements.txt with pinned versions (or a package with such declared dependencies, doesn't really matter).

I think the distinction is really important, but as you said, there's different viewpoints here. Let's agree to disagree for now. As a side note, it would be nice to have somewhere to keep going on the subject without adding noise to a specific issue. That's the kind of stuff that needs more discussion and sharing in the community, IMO.

However, it doesn't seem as if pipenv allows to exclude development requirements in pipenv lock -r

But ultimately, we want a list of pinned dependencies to install. [...] The question is, how to actually craft such a file? [...] With pip-compile (of pip-tools), I can compile such requirements.txt from requirements.in and it will contain only the non-dev dependencies that my app needs.

Ah, I skipped that part at first, sorry. And it seems you're right: I'm not able to find a way to say pipenv install --not-dev-stuff (I was pretty sure there was one, though, weird), and generate a non-dev environment. What's the point of having two separate sections then? I might be missing something, and that's unrelated to the usage with setup.py. Maybe it's worth discussing in a new issue.

EDIT:
I made a mistake here. I indeed did not find a way to generate a Pipfile.lock without dev packages, but, in a new environment, with existing Pipfile/Pipfile.lock, doing pipenv install does not install the dev packages. This does not solve the point of @tuukkamustonen, but I was wrong when stating there was no way to install a "prod environment", my mistake.

nateprewitt commented Oct 21, 2017

Phew, that was a lot to catch up on.

Unless I'm badly mistaken, the code in #209 (comment) is safe to be used if you're building a wheel (=binary dist), but not if you're building sdist (=source dist).

@tuukkamustonen this is intended only to be used in a standalone script for deployments, do not include it in your setup.py. This is a semi-hacky workaround from before we had pipenv lock -r, many months ago. This approach will work for your use case of splitting packages and dev-packages though.

I'm personally using (3) (see #209 (comment)) until Pipfile becomes more of a common standard, at which point I will not have any of my projects depend on Pipenv code directly, since Pipenv seems clearly meant to be a tool for managing a virtualenv based on a Pipfile, and not necessarily a Pipfile library itself.

@elgertam it seems your opinions may have been swayed since this comment, but I would note that it's probably not a good idea to bundle pipenv with your project. There's nothing explicitly prohibiting that, but we do a lot of path patching which is prone to causing issues when used like this. I guess I'll just wrap this with a "use at your own risk" warning.

@maintainers I'd like your input here to know if this is indeed how you see all of this. I've been preaching a vision of how we should be treating dependencies, but I don't want to be talking in your stead either.

I think you're pretty well in line with what our vision has been for the length of the project. Thanks for compiling all of this and articulating it so well @vphilippon!

It looks like the other relevant parts of this discussion have been moved into #942, so I think we're good here. Please ping me if I didn't address anything.

taion commented Nov 27, 2017

I followed up with a concrete proposal in pypa/pipfile#98 that I believe gives us something actionable and pragmatic that could improve DX for maintaining Python libraries.

cornfeedhobo commented Feb 19, 2018

Thoughts?

from setuptools import find_packages, setup
import json, jmespath

install_requires = []
with open('Pipfile.lock') as f:
    context = json.load(f)
    install_requires.extend(map(
        lambda n, v: n + v,
        jmespath.search('default | keys(@)', context),
        jmespath.search('default.*.version', context),
    ))

setup(
    name='foobar',
    packages=find_packages(),
    setup_requires=['jmespath'],
    install_requires=install_requires,
)

epot commented Feb 20, 2018

@cornfeedhobo my understanding is that setup_requires does not play well with pip. Could you elaborate a bit on how you would suggest to use this sample?

uranusjr commented Feb 20, 2018

The sample will not work unless you install jmespath first because setup.py is evaluated as normal Python code. The setup_requires argument does not actually achieve anything: if the program gets that far, jmespath is guaranteed to be installed.

I mentioned this in another issue, but can’t locate it atm (there are so many duplicated discussions all over the issue tracker it’s impossible to find anything anymore), so I’ll say it again: Please do not put anything not built-in inside setup.py unless you provide a proper fallback, or have a perfect reason. A package containing the above setup.py will not even work with Pipenv, even with jmespath installed in the virtualenv; Pipenv invokes setup.py egg_info in a clean environment, and will fail to execute the jmespath import. This is bad practice. Please avoid it.
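A stdlib-only version of the same idea, sketched here as a hypothetical setup.py helper (the function name and fallback behaviour are assumptions, not an established pattern), sidesteps the third-party-import problem entirely and degrades gracefully when the lockfile is absent:

```python
import json  # standard library only -- always importable in setup.py


def lock_to_install_requires(path="Pipfile.lock"):
    """Read pinned default dependencies from a Pipfile.lock, returning
    an empty list if the file is missing or unparsable (e.g. when
    building from an sdist that does not ship the lockfile)."""
    try:
        with open(path) as f:
            lock = json.load(f)
    except (OSError, ValueError):
        return []  # fallback: let setup() proceed without pins
    return sorted(
        name + data.get("version", "")
        for name, data in lock.get("default", {}).items()
    )
```

Whether pinning library dependencies this way is a good idea at all is a separate question, discussed further below.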

cornfeedhobo commented Feb 20, 2018

@epot I was not aware of that

@uranusjr Thanks for the well thought out answer with details. I am just exploring this whole issue, so I might return for more embarrassment. I'm also tracking pypa/pipfile#98

mschwager commented Apr 19, 2018

What if we don't require jmespath?

import json


install_requires = []
tests_require = []

with open('Pipfile.lock') as fd:
    lock_data = json.load(fd)
    install_requires = [
        package_name + package_data['version']
        for package_name, package_data in lock_data['default'].items()
    ]
    tests_require = [
        package_name + package_data['version']
        for package_name, package_data in lock_data['develop'].items()
    ]
uranusjr commented Apr 19, 2018

@mschwager You don’t want to pin the version, or users will have a difficult time. #1921 is an example where a library using == ends up breaking a user’s build.
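One middle ground, if generating install_requires from a lockfile anyway, is to relax the exact pins into lower bounds before handing them to setup(). A minimal sketch (hypothetical helper, not pipenv API):

```python
def relax_pins(pinned):
    """Turn exact '==' pins (as found in a Pipfile.lock) into '>='
    lower bounds, which are friendlier as library metadata."""
    return [req.replace("==", ">=", 1) for req in pinned]
```

This keeps a tested floor on each dependency without forbidding users from resolving newer releases.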

Ducatel added a commit to TraceSoftwareInternational/HostMyDocs-python-client that referenced this issue May 3, 2018

close #1
Thanks to @mschwager for the solution pypa/pipenv#209 (comment)

Ducatel added a commit to TraceSoftwareInternational/HostMyDocs-python-client that referenced this issue May 3, 2018

close #4
Thanks to @mschwager for the solution pypa/pipenv#209 (comment)
ekhaydarov commented Jun 28, 2018

My apologies, but what's the difference between using setup.py in order to use it as a package, versus requirements.txt/Pipfile to manage dependencies of said package? The required libs HAVE to be identical between setup.py and requirements.txt/Pipfile, right? Therefore there is no reason not to integrate Pipfile. setup.py already parses requirements.txt. Why should it not be able to parse Pipfile?

Would be great to get rid of requirements.txt and just use Pipfile

uranusjr commented Jun 28, 2018

No, there is no reason they have to be identical. That is quite a radical assumption, and many in the community would beg to differ.

There are indeed, however, reasons that they can be identical. Pipenv does not exclude that. It is just out of scope, and not supported by this project. You can totally build a library to support that, and leverage PEP 518, which is implemented in pip 10, to provide build-time support.

As you said, there is no reason not to allow setup.py to parse Pipfile. I look forward to you making that happen :)
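For reference, PEP 518's mechanism is a [build-system] table in pyproject.toml; a Pipfile-aware build helper could declare its parsing dependency there so pip installs it into the isolated build environment before setup.py runs (the `toml` entry below is illustrative, not a real requirement of any existing backend):

```toml
[build-system]
# Installed by pip 10+ into an isolated build environment (PEP 518)
# before setup.py is executed, solving the setup_requires chicken-and-egg.
requires = ["setuptools", "wheel", "toml"]
```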

ekhaydarov commented Jun 29, 2018

Understood that some people like the abstract libs of setup.py, yet that is not a must-have? I mean, golang suffers from only having concrete requirements but still being able to substitute a required lib with your own fork just because it matches the name? Understandable that setup.py integration is just not in scope.

However, it would be interesting to see what the long-term roadmap of pipenv is. It would be great to see it become the go-to tool in python, e.g. somehow replacing setup.py or generating an appropriate setup.py for the user; either way, pipenv being the de facto package manager is awesome. Just wondering if there is a possibility to extend the scope to include setup.py?

If pipenv is like npm etc., then their package.json allows remote installation, so there's no reason pipenv can't interact with or replace setup.py, making it in scope. Am I making sense or does it sound like I am taking crazy pills?

uranusjr commented Jun 29, 2018

Thank you for thinking it to be a must-have. With you considering it so crucial, I believe we will be able to run a Pipfile-based build system in no time.

pypa locked as resolved and limited conversation to collaborators Jun 29, 2018

Sign up for free to subscribe to this conversation on GitHub. Already have an account? Sign in.