Workflow for layered requirements (e.g. prod<-test<-dev requirements)? #398

Open
dan-passaro opened this Issue Oct 6, 2016 · 22 comments

dan-passaro commented Oct 6, 2016

Say I have

requirements.in:

Django~=1.8.0

And also

requirements-dev.in:

django-debug-toolbar

How can I run pip-compile on requirements-dev.in so that it also takes the requirements in requirements.in into account when figuring out which versions to use?

For now I have an ad-hoc script that compiles requirements.in first, while requirements-dev.in has -r requirements.txt as its first line. Is this an okay workflow? I'm worried that if I add a dependency in the future it will try to update a bunch of things I don't want updated, but I haven't used this tool long enough to determine whether that's truly a problem. Has anyone else used pip-tools in this fashion, and do they have any advice?
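
A minimal sketch of such a script, using the file names above (compile flags as shown in pip-compile's autogenerated headers):

#!/usr/bin/env bash
# Compile the base pins first; requirements-dev.in starts with
# "-r requirements.txt", so the dev compile reuses those exact versions.
set -e
pip-compile --output-file requirements.txt requirements.in
pip-compile --output-file requirements-dev.txt requirements-dev.in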

jamescooke commented Nov 13, 2016

requirements-dev.in has -r requirements.txt as its first line. Is this an okay workflow?

Yes, I totally think that's a good strategy.

Wondering if anyone else has used pip-tools in this fashion and has any advice?

I've just published my pip-tools workflow for managing dependent requirements files: http://jamescooke.info/a-successful-pip-tools-workflow-for-managing-python-package-requirements.html

nvie commented Nov 17, 2016

Hi @jamescooke, I just saw your post, it looks great! One suggestion I could make here is that you can include the shared .in file (so not the .txt file!) from within your .in files. That way, pip-compile has just a little more information when compiling the per-env output files.

nvie commented Nov 17, 2016

To answer the original question, you can use this for your requirements-dev.in:

-r requirements.in
django-debug-toolbar

And then use this to compile it:

pip-compile requirements-dev.in

And then it all just works™.

nvie closed this Nov 17, 2016

jamescooke commented Nov 17, 2016

Hi @nvie - thanks for the kind words about the blog post 😊.

The reason I recommended including the .in files rather than the .txt files is to account for changes in the package index, which can mean that a testing requirements file ends up with different package versions than the base file.

As an example, let's say that a project wants any Django that's version 1.8, so in requirements.in:

django<1.9

When we compile that file in October it picks 1.8.15 and makes requirements.txt:

#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile --output-file requirements.txt requirements.in
#
django==1.8.15

Now, in November, a new version of Django is released: 1.8.16. We add or update the testing requirements (without touching the production requirements) in requirements-dev.in:

-r requirements.in
django-debug-toolbar

Using pip-compile requirements-dev.in, we compile that to requirements-dev.txt:

#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile --output-file requirements-dev.txt requirements-dev.in
#
django-debug-toolbar==1.6
django==1.8.16
sqlparse==0.2.2           # via django-debug-toolbar

As you'll see, Django has been bumped from 1.8.15 to 1.8.16 in the dev requirements only, even though the main requirements.in and requirements.txt have not changed. A good developer would spot this for sure - but I've missed something similar on previous projects, with much resulting pain.

It's for this reason that I have been including the txt file instead of the in file - I've found it keeps the versions exactly the same between requirements layers.

So with requirements-dev.in as:

-r requirements.txt
django-debug-toolbar

When this is compiled we now get requirements-dev.txt:

#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile --output-file requirements-dev.txt requirements-dev.in
#
django-debug-toolbar==1.6
django==1.8.15
sqlparse==0.2.2           # via django-debug-toolbar

This maintains the exact Django 1.8.15 version that we're looking for, regardless of the fact that the dev requirements were compiled after the new version of Django was released. When we later update requirements.txt and recompile the dev requirements, that version will be bumped.

I'd be really happy to see the just works™️ version of the kind of pinning that I'm talking about - I'm sure I'm getting the wrong end of the stick somewhere.

One alternative I could see is to pin the Django version in an .in file, but isn't that missing the point of them?

Sorry for the long essay 😞

nvie commented Nov 17, 2016

Thanks for the extensive explanation, @jamescooke! You're absolutely correct. This is why I normally also advise always recompiling both (all?) .in files at once, never one without the other, so the pinned versions remain in sync. But yeah, I agree that this takes discipline, and if you forget, the tooling won't protect you against these diverging pins; your example illustrates that perfectly. I'm not sure how we can build support for this workflow into pip-compile natively.
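
A minimal sketch of that habit, assuming the file names used earlier in this thread:

# Recompile every layer together, base first, so the pins cannot drift apart.
for f in requirements.in requirements-dev.in; do
    pip-compile --output-file "${f%.in}.txt" "$f"
done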

Thanks for shining a little light on this subject.

franklinyu commented Dec 29, 2016

Sorry if I missed something, but what's the issue with -r-ing the .txt? The workflow looks good, except that I may need to intervene when updating a single package (but I typically just update everything).

Groxx commented Dec 29, 2016

I'll just chime in with 100% voting for -r .txt. It's important that prod == dev == test as much as possible, and that's essentially the only way to ensure everything's a strict superset of prod.
We manage ours with a small script that compiles them in the right order, and (though sometimes it breaks) we do a non-upgrade compile in CI that diffs the results, to make sure nobody modified them by hand. It has stopped several mistakes already.
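
A sketch of what such a CI check can look like (not our actual script; it assumes the usual file names and that a non-upgrade compile is reproducible):

# Recompile without --upgrade and fail if the committed file differs;
# this catches both hand edits and forgotten recompiles.
cp requirements.txt requirements.txt.orig
pip-compile --output-file requirements.txt requirements.in
diff -u requirements.txt.orig requirements.txt || {
    echo "requirements.txt is stale or was edited by hand" >&2
    exit 1
}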

Only real downside is that sometimes the requirements.txt pins to a version that's incompatible with something in requirements-dev.txt - in an ideal world, we'd be able to check the rules from everything at once and (sometimes) avoid that. But it's usually pretty easy to trace down.


Maybe a better end-result would be to be able to compile all .in files at once, so pip-compile can detect the dependencies between them, and have it produce multiple corresponding .txt files instead of one combined one? I'd find that more useful in pretty much every scenario I've encountered, and it seems like the right place to do it.

franklinyu commented Dec 30, 2016

Only real downside is that sometimes the requirements.txt pins to a version that's incompatible with something in requirements-dev.txt

@Groxx Does that happen after you upgrade (recompile) requirements.txt? So some new pin in requirements.txt breaks an older package in requirements-dev.in?

For an (imaginary) example: django-debug-toolbar==1.6 (the latest) only works with django<1.9, and you bump Django to 2.0 in requirements.txt? In that case django-debug-toolbar is stuck out of date.

Groxx commented Dec 30, 2016

Yep, exactly. Though maybe more accurately (with fake versions):

  • django in requirements.in
  • django-debug-toolbar in requirements-dev.in
  • run pip-compile --upgrade requirements.in, succeed with django==2.0
  • run pip-compile requirements-dev.in (or with --upgrade)...
  • ... discover django==2.0 is incompatible with all versions of django-debug-toolbar...
  • ... (╯°□°)╯︵ ┻━┻ #ragequit

It's all made worse when the cause is in requirements-do-this-first.in and the conflict is in requirements-dev-test-localdev-really-long-req-chain.in and it takes a while to figure out why django==2.0 is being chosen in the first place. But once people understand the process / know to use --verbose, it doesn't usually take too long.
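
(Concretely, the tracing step is usually just something like:

pip-compile --verbose requirements-dev.in

and then reading the resolver output to see which file pulls in the conflicting pin.)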

franklinyu commented Dec 30, 2016

Even if it doesn't take long, it still wastes time, and I believe that's one of the reasons this project exists. I come from a Ruby background, where Bundler does the right thing: even if you ask it to upgrade only a single package, it reconstructs the entire dependency graph, including development dependencies. Similarly, when upgrading, I believe pip-compile should

  1. take all the requirements-*.in;
  2. list the union of all the packages;
  3. list the union of all requirements-*.txt pins;
  4. try to construct a dependency graph with all the packages, satisfying all the pins;
  5. come up with a list of all the new pins;
  6. for each requirements-*.in, pick some packages from the final (new) pin list, and generate the respective requirements-*.txt.

I'm not sure whether current pip supports this workflow, that is, whether this workflow is feasible for pip-compile to implement.
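
(For comparison: pip-compile can already be given several source files in a single invocation, but it resolves them into one combined lock file rather than per-file outputs — a sketch, assuming multi-file support in the installed version:

pip-compile --output-file requirements-all.txt requirements.in requirements-dev.in

so the missing piece is really the fan-out back into per-file .txt results.)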

jamescooke commented Dec 30, 2016

@Groxx thanks for the django==2.0 and django-debug-toolbar example... this is the exact kind of scenario I've been concerned about. I found your example a good illustration.

@franklinyu - The Bundler strategy might also work. Thanks for illustrating the unions of package requirements 👍

Groxx commented Dec 30, 2016

@jamescooke yeah, it does happen. The alternative though, if you include -r requirements.in in your requirements-dev.in, is that this is possible:

  1. pip-compile --upgrade requirements.in, get django==2.0
  2. pip-compile --upgrade requirements-dev.in, get django==1.9 and django-debug-toolbar
  3. dev, test, etc against 1.9, but release against 2.0.

The mismatch between dev and prod and the lack of any error or warning are largely what pip-tools helps eliminate, so to me this is an entirely unacceptable result. I much prefer to have the second step fail, which reveals that there is a problem, rather than rely on my eyes to catch the disparity between the two.

maxnordlund commented Mar 7, 2017

This bit me today. Also coming from a Ruby/Bundler background, I like having all dependencies in the same lock file, but I don't want to install dev dependencies in production. However, this seems incompatible with how pip currently operates: one requirements.txt to rule them all, but separate common/dev dependencies.

I had hoped that having dev.in -> dev.txt would solve this, but as others have noted, you get conflicts. And while I could have a -r somewhere, it would still produce two lock files, which sooner or later will diverge.

So my question is whether it would be possible to teach pip-compile to write the dependencies for just one input file, while accepting the pinned ones in another. Perhaps an example will clarify this:

# requirements.in
django
# requirements.txt
django==1.9
# dev.in
-r requirements.in
django-debug-toolbar
# dev.txt
-r requirements.txt
django-debug-toolbar==1.6
# note, no direct django dependency here, but still respect the 1.9 bound.

Here I've overloaded -r to point to the other file. Thoughts?

dogweather commented Mar 30, 2017

How about considering eliminating the need for multiple files by supporting sections in requirements.in? This is how the Ruby Gemfile works, and it neatly solves the problem:

# Install in all environments
gem 'rails'
gem 'mysql'

# Install only in test
group 'test' do
  gem 'rspec'
end

# Install only in development
group 'development' do
  gem 'web-console'
end

maxnordlund commented Mar 30, 2017

How about considering eliminating the need for multiple files by supporting sections in requirements.in? This is how the Ruby Gemfile works, and it neatly solves the problem:

That would make it incompatible with vanilla pip, which isn't really an option for this set of tools IMO. For the official project pursuing that idea, see https://github.com/pypa/pipfile.

@maxnordlund This blog post answers your question, I believe: http://jamescooke.info/a-successful-pip-tools-workflow-for-managing-python-package-requirements.html

I've read that; it's linked in the first comment.


In the end I wrote a small Python script to generate the two files, with dev having a -r pointing at base.txt. Then I strip dev.txt of all common dependencies to ensure they cannot diverge. This also forces you to call pip-sync base.txt dev.txt, but that's no biggie, and in my case the script actually runs that as well.
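
A rough shell sketch of that approach (not the actual script; it assumes simple name==version pin lines):

set -euo pipefail
pip-compile --output-file base.txt base.in
pip-compile --output-file dev.txt dev.in
# Drop every pin from dev.txt whose package is already pinned in base.txt,
# then have dev.txt pull the shared pins in via "-r base.txt" instead.
awk -F'==' 'NR==FNR { seen[$1]; next } !($1 in seen)' base.txt dev.txt > dev.stripped
{ echo '-r base.txt'; cat dev.stripped; } > dev.txt
rm dev.stripped
pip-sync base.txt dev.txt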

The sad part here is that you need another layer to get it right, either a script or make, instead of it working out of the box. The only thing I think might be good enough without changing the format too much is the suggestion above: that a -r in an *.in file is translated to mean "use existing versions in that file (or compiled version thereof), and write everything else to output".

davidovich commented Mar 30, 2017

That a -r in an *.in file is translated to mean "use existing versions in that file (or compiled version thereof), and write everything else to output".

I think this would bring value and keep the existing processing. The only change would be in the file collection phase, before invoking the resolver. I believe this is not too hard to implement; I am open to a PR for this functionality (with relevant tests).

davidovich reopened this Mar 30, 2017

dfee commented Jun 27, 2017

@jamescooke thanks for posting that article (though it was a while ago). I made one slight modification to it:

RELATIVE_ROOT=..  # relative path to project's root
%.txt: %.in
        pip-compile --output-file $@ $<
        sed -i '' "s|-e file://$(realpath $(RELATIVE_ROOT))|-e $(RELATIVE_ROOT)|" $@

i.e. it corrects the annoyance by rewriting -e file:///Users/dfee/code/zebra to -e ., making the file useful for users who don't develop or deploy from your directory.
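
One portability note: the empty string after -i is the BSD/macOS sed form; GNU sed on Linux takes no separate suffix argument, so there the recipe line would be:

        sed -i "s|-e file://$(realpath $(RELATIVE_ROOT))|-e $(RELATIVE_ROOT)|" $@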

I know this isn't really the place to discuss your Makefile, but I've grown tired of editing requirements.txt files after pip-compile-ing them. Other folks have too, and there doesn't seem to be a fix on the horizon.

jamescooke commented Jun 29, 2017

Hi @dfee , thanks for sharing this suggestion 👍

I've not been able to get this working on my machine, so I won't update my article just yet. The post is on GitHub at https://github.com/jamescooke/blog/blob/master/content/1611-pip-tools-workflow.rst - feel free to open an issue / PR to discuss.

vladiibine commented Aug 13, 2017

Hi guys,
Great package.

Wanted to jump in with an observation that for me is really important:

I consider that having -r base.txt in a file such as dev.in is the best workflow yet.
One big drawback is that this way we LOSE the comments that tell us why a dependency was installed.

For instance

# base.txt
package0==1
package1==1.4   # via package0

Then in dev.in

# in dev.in
-r base.txt
package2==3.3

Then in the resulted dev.txt

# in dev.txt
package0==1
package1==1.4      # !!!!!!!!!!!! the via comment will be missing here. I'd totally prefer this to remain here... :(
package2==3.3

Anyway, that's all from me. Whoever fixes this, please take this into consideration if you can.

anlutro commented Aug 29, 2017

Why can't -r base.txt statements (as long as they're .txt, not .in) just get copied as-is to the resulting .txt file?

samkk-nuna commented May 7, 2018

I followed @jamescooke's flow and recently ended up in a state where I had to add a constraint to my base.in to help the resolver out, because of the following:

  • Add boto3==1.7.14 to base.in
  • Add moto==1.3.3 to test.in, which starts with -r base.txt

Try compiling .txt files from these with pip-compile: base.in compiles fine, but emits a hard equality pin python-dateutil==2.7.2 into base.txt, which then conflicts with a python-dateutil<2.7.0 constraint emitted from something in moto's dependency tree.

I've hacked around this for now by explicitly stating python-dateutil<2.7.0 in base.in, but that feels gross. Any recommendations on better workarounds, or plans to better support things like this?
