A successful pip-tools workflow for managing Python package requirements
Summary: Using pip-tools with multiple requirements files can be difficult. This post describes my current workflow that manages the complexity with a Makefile.

In this post I present the
pip-tools workflow I've been using over a number
of projects to manage multiple inherited requirements files. At its core is a
GNU Make Makefile to
provide recipes for managing requirements and specifying the dependencies
between the requirements files.
If you are not aware of the excellent
pip-tools package, it provides two commands: pip-compile and
pip-sync. In this post I will be focusing on using pip-compile with
.in files consisting of top level requirements.
pip-compile consults the PyPI index for each top level package required,
looking up the package versions available, and outputs a specific list of pinned
packages in a
.txt file. This extra layer of abstraction (.in files
containing top level requirements rather than just the output of
pip freeze) is very helpful for managing requirements, but does create
some complications which mean that a solid workflow is essential for stable
requirements management.
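As an illustration (the package names and version numbers here are invented, not taken from any real project), a compiled .txt file pairs each pinned package with a comment recording why it is present:

```
certifi==2022.9.24
    # via requests
requests==2.28.1
    # via -r base.in
```

The "# via" annotations written by pip-compile make the provenance of each pin easy to check when reviewing a diff.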
Keep requirements files in their own folder
In order to preserve sanity, I keep my project requirements in their own folder directly inside the project.
$ cd project
$ ls requirements/
base.in  base.txt  Makefile  test.in  test.txt
During this post, I'll use this simple example with one set of "base" requirements and one set of "test" requirements.
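For concreteness, base.in might contain nothing but top level package names (django and requests are placeholders, not packages from the original post):

```
django
requests
```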
Keep .txt files in version control
The compiled .txt files are tracked in the project's revision control
system, for example
git. This allows for shipping of the compiled
files for installation, but more importantly, it presents the opportunity to
check the diff of
.txt files when upgrading packages.
I also tend to keep
.in files sorted alphabetically.
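Keeping them sorted can be automated; this is my own sketch rather than part of the original workflow - sort -o writes the result back over the input file:

```shell
# Alphabetise a requirements .in file in place (demo uses a throwaway path)
printf 'requests\ndjango\n' > /tmp/sorted-demo.in
sort -o /tmp/sorted-demo.in /tmp/sorted-demo.in
cat /tmp/sorted-demo.in    # django now comes before requests
```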
Set .in files to depend on .txt files
In the example project there are two sets of requirements: base and test.
I want the test requirements to add to the base requirements without changing
the versions of the packages compiled for base. Therefore I use
-r to require the
base.txt compiled requirements:
-r base.txt
test-packages
Setting test.in to depend on
base.txt rather than base.in means
that the top level requirements for testing do not override the packages
needed by the main project.
Use a Makefile for common tasks
On each project that has multiple requirements files, I use a Makefile and place it in the requirements folder.
.PHONY: all check clean

objects = $(wildcard *.in)
outputs := $(objects:.in=.txt)

all: $(outputs)

%.txt: %.in
	pip-compile -v --output-file $@ $<

test.txt: base.txt

check:
	@which pip-compile > /dev/null

clean: check
	- rm *.txt
Here is that same file in a current project.
NOTE that because
make requires recipes to be indented by tabs, if you
want to copy this file then it could be helpful to pull the raw file
rather than copying and pasting out of this webpage, where the tabs may not
have been preserved.
Let's go over the key functionality provided by this Makefile:
First, two definitions:

    objects = $(wildcard *.in)

objects is a list containing every
.in file in the requirements folder.
    outputs := $(objects:.in=.txt)

outputs is also a list, made of one
.txt filename for each
.in file in the objects list. The
.txt files do not need to exist yet; this list tells
make what they should be called.
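These two definitions can be watched in action without touching a real project; the /tmp path and the minimal throwaway Makefile below are mine, not from the post:

```shell
# Show what $(wildcard) and the :.in=.txt substitution compute
mkdir -p /tmp/req-demo && cd /tmp/req-demo
touch base.in test.in
printf 'objects = $(wildcard *.in)\noutputs := $(objects:.in=.txt)\nall:\n\t@echo $(outputs)\n' > Makefile
make -s    # prints the computed .txt target names, e.g. base.txt test.txt
```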
A recipe called
all to build all of the .txt files:

    all: $(outputs)

The all recipe has no commands of its own - it solely depends on all the
.txt files in the
outputs list being built. In order to fulfil this recipe,
make will attempt to build every
.txt file in the outputs list.
Up until now,
make does not know how to build a
.txt file, so here we give it a recipe:

    %.txt: %.in
    	pip-compile -v --output-file $@ $<
The first line tells
make that each
.txt file depends on the
.in file with the same name.
make will check the date stamps on the two files and compare them - if the
.txt file is older than the
.in file or does not exist, then
make will build it.
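The timestamp behaviour is easy to watch with a stand-in recipe; in this sketch cp takes the place of pip-compile so nothing needs to be installed or downloaded:

```shell
# Pattern rule demo: cp stands in for pip-compile
mkdir -p /tmp/stamp-demo && cd /tmp/stamp-demo
printf '%%.txt: %%.in\n\tcp $< $@\n' > Makefile
echo django > base.in
make base.txt    # base.txt does not exist yet, so the recipe runs
make base.txt    # now make reports that base.txt is up to date
```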
The next line tells
make the command to use to perform the build - it is the
pip-compile command with the following flags:
-v means pip-compile will give verbose output. I find this helpful for general watchfulness, but you may prefer to remove it.
--output-file $@ means "send the output to the target of the recipe", which is the
.txt file we've asked to be made. For example when invoking
make base.txt, then
--output-file base.txt will be passed.
$< at the end is the corresponding
.in input file. make matches the names using the
% sign in the recipe, so it knows to build base.txt from base.in.
Now we tell
make about the dependency between the requirements files:

    test.txt: base.txt

This creates a dependency chain. It is an additional recipe for test.txt, telling
make that it depends on
base.txt. That means that if
make is asked to build
test.txt, then it should be rebuilt if
base.txt has been updated.
Further, if base.in is updated, then
make knows that it will need to recompile
base.txt in order to make
test.txt. We can see that here:
$ touch base.in      # Update timestamp on base.in
$ make -n test.txt   # What commands will be run to build test.txt?
pip-compile -v --output-file base.txt base.in
pip-compile -v --output-file test.txt test.in
This is exactly what we want for requirements inheritance. If the requirements in our base have changed, then we want our test file to be recompiled too, because of the
-r base.txt line we added to the test.in file.
Of course, this is a trivial example, but I have used multiple lines of dependency in Makefiles to manage multiple levels of inheritance in requirements files.
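As a hypothetical sketch of such a chain, a third "dev" layer inheriting from test (dev.in and dev.txt are invented names, not from the post) would need one more dependency line in the Makefile, alongside a -r test.txt line in dev.in:

```make
# dev.in starts with "-r test.txt"; this line teaches make about the chain
dev.txt: test.txt
```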
Finally, a recipe to help us update requirements.
check:
	@which pip-compile > /dev/null

clean: check
	- rm *.txt
The check recipe will fail if
pip-tools is not installed.
The clean recipe will remove all the
.txt files if the
check recipe is successful. This makes it harder to accidentally delete your requirements files without
pip-tools already installed to be able to build them again.
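The effect of the guard can be demonstrated with a deliberately missing command standing in for pip-compile (the tool name and /tmp path below are invented for the demo):

```shell
# check fails because the tool is absent, so clean's rm never runs
mkdir -p /tmp/guard-demo && cd /tmp/guard-demo
printf 'check:\n\t@which no-such-tool-xyz > /dev/null\nclean: check\n\t- rm *.txt\n' > Makefile
touch precious.txt
make clean || echo "clean aborted"
ls precious.txt    # the .txt file survives
```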
I've explained what the Makefile above does, but not how or when you would use it. So let's continue with some common workflow actions.
Build one or more requirements files
To update all requirements, use the default recipe:
$ make all
To update a particular file, ask for it by name:
$ make test.txt
If make tells you that a file is up-to-date but you want to force it to be
rebuilt, touch the corresponding .in file first:
$ make base.txt
make: 'base.txt' is up to date.
$ touch base.in
$ make base.txt
pip-compile -v --output-file base.txt base.in
...
Add a dependency
To add a dependency, locate the appropriate
.in file and add the new
package name there. A version number is only needed if a particular version
of the library is required; otherwise the latest version will be chosen by
default when compiling:
$ cat >> base.in
ipython
$ make all
Update a package
In order to update a single top level package version, remove its lines from
the corresponding compiled
.txt files. I tend to be quite "aggressive" with
this and remove every package that the top level package depended on, using
sed with a pattern match.
Given that I want to update
ipython and it is not pinned in my .in file:
$ sed '/ipython/d' -i *.txt
$ make all
There is no command for this removal built into the Makefile, but potentially
it could be. Ideally, it would be provided as extra functionality by
pip-tools. Beware that packages often contain each other's names as
substrings, which can lead to bad matching. If in doubt, review your diff and
potentially remove lines from your
.txt files manually.
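One way to reduce the substring risk - my own hedged variant, not from the post - is to anchor the pattern on the start of the line and the == separator:

```shell
# Anchored delete: removes the ipython pin but keeps ipython-genutils
# (uses GNU sed's in-place -i flag; demo runs in a throwaway directory)
mkdir -p /tmp/sed-demo && cd /tmp/sed-demo
printf 'ipython==7.34.0\nipython-genutils==0.2.0\n' > base.txt
sed -i '/^ipython==/d' base.txt
cat base.txt    # only ipython-genutils==0.2.0 remains
```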
The call to
make all will reevaluate the latest version for packages that
do not have corresponding lines in the
.txt file, and they will be updated.
Update all requirements
A full update of all requirements to the latest version (including updating all
packages that are not pinned in the
.in file with a particular version
number) can be achieved with:
$ make clean all
The clean recipe will remove all
*.txt files, provided you have
pip-tools installed. Then the
all recipe will rebuild them all in order.
A tip for working with Makefiles: if you want to see what commands will be run
by a recipe, you can use the
-n flag and inspect the commands that would be run:
$ make -n all
Update Nov 21
For more information on the advantages and disadvantages of setting recursive
requirements to point at
.in files or
.txt files, please see this issue on the pip-tools repository.
In particular, my comment
illustrates how development requirements can become out of sync with base
requirements when .in files are used in recursion, which does not happen
when .txt files are used. It's for this reason that I continue to
recommend pointing at
.txt files with -r.
Happy requirements packing!