Use pip-compile for a lockfile of constraints #2625
Conversation
@richafrank I took a pass here. I had a bunch of observations that are mostly further improvements we could make in the future, but I think most of them are strictly additive, so we shouldn't block on them.
One thing I'm not sure about is how these changes interact with our existing conda machinery. Do you have a high-level overview of whether/how this affects our conda install situation?
@@ -0,0 +1,63 @@
# Incompatible with earlier PIP versions
Do we still actually need pip and setuptools here?
I can investigate - I haven't looked into it. I think this is no worse than the current situation, so I'd love to merge as is and then make further improvements.
I looked into this and removed them in #2610: they are not necessary. (That PR is failing for unrelated reasons, which this one should help to fix.)
+1 on just leaving them here for now though.
numexpr>=2.6.1

# Currency Codes
iso4217>=1.6.20180829
Definitely not a blocker for merging, but another option for the content of this file would be to move it canonically into the `install_requires` of zipline's setup.py, which would reduce the need for parsing/manipulating this file.
What's your thought on what input we would give to pip-compile in lieu of requirements.in?
I think then we could just use `.`.
Ah, you're suggesting moving the list of libs to setup.py and making requirements.in just `.`.
Since we need both requirements.in and setup.py and want one to refer to the other, my preference is to stick with them being listed in requirements.in. Then we can keep the various extras split up in separate files, and have the files well commented, instead of inlining all that in setup.py.
@@ -0,0 +1,2 @@
Cython>=0.25.2
I think a good next step after merging this would be to move these into a pyproject.toml.
# Linting
flake8>=3.3.0

# Algo examples
Not a blocker, but it seems unfortunate to have matplotlib in here. The rest of the dev requirements are pretty lightweight, but mpl requires a bunch of external dependencies. We might want to consider just making mpl fully optional, since none of our tests actually use it.
Yep, I buy that. I didn't want to make this PR any bigger than what's needed to simplify the process; from there, I figure we can iterate on making things more efficient.
@@ -256,11 +227,10 @@ def extras_requires(conda_format=False):
    return extras


def setup_requirements(requirements_path, module_names, strict_bounds,
Do we still need the strict_bounds machinery for the conda build case? It seems like that's what this was used for originally. I don't have a good sense of how these changes interact with our existing conda machinery.
I don't believe we need it anymore. Let's look at the different cases:

- `install_requires` when invoked by pip: Expected output is wide bounds (i.e. `>=`). We did that by rewriting requirements.txt's `==`, so we never passed `strict_bounds` at the call site, using the default value of `False`. This is now accomplished by reading requirements.in, which itself uses wide bounds. A benefit now is that we don't include our transitive dependencies, which makes the install more flexible.
- `install_requires` when invoked by conda-build: Same as the pip case above.
- `setup_requires` when invoked by pip: `strict_bounds` was `False` in this case, so you'd get the latest numpy/cython, which is the whole issue with needing to install the expected `numpy` version in a separate, earlier step. I don't believe it actually matters for cython, but correct me if I'm wrong. This behavior hasn't changed, now that we read from requirements_build.in without rewriting.
- `build_requires` when invoked by conda-build: `strict_bounds` was `True` in this case, but so was `conda_format`, which overrides any numpy version with `x.x`, so that it's pinned to the version that we specified to conda-build. This way, it will be built with the same version as it requires for install, and the package is tagged with the version. Again, I don't think the behavior for cython matters, but there is a difference here, in that it used to be strict and is no longer. No longer having both `strict_bounds` and `conda_format` as ways to rewrite our requirements is a vast improvement.
- `extras_require` when invoked by pip: Previously this used `strict_bounds=True`, since the extras were mostly dev requirements, and the thought was to pin them exactly for consistent dev environments. Now this reads from requirements_dev.in, i.e. just lower bounds, and if you want to match the CI environment, you can install using the constraints.
- `extras_require` when invoked by conda-build: Our conda recipe for zipline doesn't use the `extras_require`.
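The new reading path described in these cases can be sketched roughly as follows. This is a simplified, hypothetical stand-in for the real setup.py helper: the bounds in the .in file are used exactly as written (wide, `>=`), so no `==`-to-`>=` rewriting step is needed.

```python
def read_requirements(lines):
    """Collect requirement specifiers from a requirements-style .in file,
    skipping comment and blank lines. Bounds are kept as written, so a
    wide '>=' lower bound passes through untouched."""
    reqs = []
    for line in lines:
        spec = line.split("#", 1)[0].strip()  # drop comments
        if spec:
            reqs.append(spec)
    return reqs


example = """\
# Currency Codes
iso4217>=1.6.20180829

numexpr>=2.6.1
""".splitlines()

print(read_requirements(example))
# ['iso4217>=1.6.20180829', 'numexpr>=2.6.1']
```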
I think those all make sense to me. Thanks for the rundown.
@ssanderson Thanks for the review. Take a look at my response to your setup.py question about conda-build, and let me know if that doesn't clarify your general conda install question. The net impact should be just that cython isn't pinned at build time, and we don't specify all our transitive dependencies anymore.
Force-pushed from 26092f5 to ab34b1a.
I know this PR is pretty close to merging, but why aren't we using Pipenv instead of pip-tools?
@willianpaixao we've experimented quite a bit with pipenv internally at Quantopian and found that it wasn't stable enough for our needs.
Force-pushed from 110b5ff to c6b44e8.
- Removed cyordereddict recipe that we no longer depend on
- Removed strict bounds rewriting in setup.py. Now setup.py reads from requirements.in, which is not strict in its bounds
- Need to update case for pip-compile's output
- Filter out comments from lockfile, now that we have "via"
- Required setup.py handling a requirement without a spec, since it's the generated lockfile
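The last two points can be illustrated with a sketch (a hypothetical helper, not the actual setup.py code, and the versions below are made up): pip-compile appends `# via ...` annotations to each pin, which must be filtered out, and the generated lockfile can also contain a requirement with no version spec at all.

```python
import re


def parse_pin(line):
    """Split a lockfile line into (name, specifier), dropping pip-compile's
    '# via ...' annotation. A bare name with no version spec is allowed,
    matching the 'requirement without a spec' case from the commits.
    Returns None for comment-only or blank lines."""
    spec = line.split("#", 1)[0].strip()
    if not spec:
        return None
    match = re.match(r"([A-Za-z0-9_.-]+)\s*(.*)", spec)
    return match.group(1), match.group(2).strip()


print(parse_pin("numpy==1.14.1  # via numexpr"))  # ('numpy', '==1.14.1')
print(parse_pin("some-sdist-package"))            # ('some-sdist-package', '')
```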
Force-pushed from 7436b5d to 23f1580.
`==` to `>=`. It's built from all the *.in files, i.e. including dev/test requirements.