Ability to Cross Compile #585
For the record, it's the responsibility of the package requiring environment-specific dependencies to declare them. About having the ability to compile for a specific environment: it's interesting, but really hard to do well. That likely means having to trick pip. In other words, I wouldn't expect this to be done soon. Contributions are always welcome, but I would point toward supporting the upcoming Pipfile format.
Yeah, TensorFlow's packaging is a little weird. What this ends up looking like is that we logically want to specify something in requirements.in with environment markers, roughly tensorflow on macOS and tensorflow-gpu on Linux. But pip-compile fails as soon as one of those packages has no distribution for the platform it's running on.
You should be able to do that currently (or something like it; environment markers are allowed in requirements.in and requirements.txt).
There is no tensorflow-gpu distribution for macOS, though. From this and other issues on Pipenv, this isn't really addressable without some even more invasive hackery, so this is probably a CANTFIX. I might poke around at this a bit on my own, but it's not immediately obvious that there's a solution here that isn't ridiculously gnarly.
But, if the package simply has no distribution for your platform, what exactly happens? Would you give me the exact error output? (Am I "fighting" to keep an issue open? I think I need to consult a professional....)
Hah, not a problem. Here's what happens:
pip-tools can deal with the package itself just fine, but it fails when it tries to grab the package to resolve dependencies.
It's the same sort of problem as https://github.com/kennethreitz/pipenv/issues/857, though the same problems there don't come up here, given that pip-tools itself runs in the virtualenv rather than outside of it.

One mitigation in this case could be that, for packages that do upload their dependencies to PyPI (are these the packages that use twine?), we just use the stated dependencies from PyPI rather than download the package to resolve it. This wouldn't solve the problem in full generality, but it would fix things for, e.g., tensorflow-gpu.

Though frankly Pipenv is a bit of a no-go for us anyway, due to https://github.com/kennethreitz/pipenv/issues/966.
My bad, the environment markers are simply copied to the resulting requirements.txt. It looks like it will still do the lookup and fail here. I have a hunch of how this could be fixed; maybe it wouldn't be so hard (famous last words) in our case. Although, don't hold your breath. I would need to check if PyPI actually provides an API to get those dependencies, but I doubt it.
It's on the JSON payload. See the requires_dist field of https://pypi.org/pypi/<package>/json.
I'm not sure if this API really lets you distinguish between "no dependencies" and "dependencies not published to PyPI", though. Maybe not that important in practice.
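For what it's worth, a minimal sketch of reading those declared dependencies off the JSON payload (requests as an assumed dependency; requires_dist can come back as null, which is exactly the ambiguity above):

```python
# Sketch: ask PyPI's JSON API for a release's declared dependencies instead
# of downloading the distribution. Note that requires_dist is null both when
# a package has no dependencies and when none were published -- the
# ambiguity discussed above.
import requests

def declared_deps(name, version=None):
    url = ("https://pypi.org/pypi/%s/json" % name if version is None
           else "https://pypi.org/pypi/%s/%s/json" % (name, version))
    info = requests.get(url).json()["info"]
    return info["requires_dist"]  # list of PEP 508 strings, or None

print(declared_deps("tensorflow-gpu"))
```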
Note to self: stop making "guesses" past 1:00 AM.
Ah, I see it's not so straightforward in the code, given how you hook into pip's internals.
Actually, there's an existing way to evaluate the markers on a constraint. The filtering line could go just after collecting all the constraints from parsing the requirements, and just before resolving. Here? pip-tools/piptools/scripts/compile.py, line 180 in b6a9f1f. Additionally, the …
I think there's still the problem that we literally can't evaluate the transitive dependencies for a package that we can't install/download, though. The bottleneck here isn't really the evaluation – it's that unless we try to read the deps from the PyPI API (instead of using pip's approach), we don't have a way to get transitive deps at all for non-installable packages.
No, check this out. Say I have a requirements.in containing tensorflow-gpu gated behind an environment marker that doesn't match the machine I'm compiling on (sys_platform == 'linux', say, while compiling on macOS).
Now let's say I use the following code snippet (which is pieced together from a REPL session and basically emulates what pip-compile is doing):
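Something along these lines, against pip 9 / pip-tools 1.9-era internals (a rough sketch; these module paths and signatures have all changed in later releases):

```python
# Sketch against pip 9.x / pip-tools 1.9-era internals; exact module paths
# and signatures differ in later releases.
from pip.req import parse_requirements
from piptools.repositories import PyPIRepository
from piptools.resolver import Resolver
from piptools.scripts.compile import get_pip_command

pip_command = get_pip_command()
pip_options, _ = pip_command.parse_args([])
session = pip_command._build_session(pip_options)
repository = PyPIRepository(pip_options, session)

# Every line of requirements.in becomes a constraint, markers and all.
constraints = list(parse_requirements('requirements.in', session=session))

# resolve() downloads each candidate to discover its dependencies; this is
# the step that blows up for a package with no distribution on this platform.
results = Resolver(constraints, repository).resolve()
```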
If you run this, you get the exact same error as reported. However, if we now filter out the constraints that don't match the specified markers before handing them to the resolver, the compile goes through. We are telling the resolver to simply skip constraints whose markers don't apply to the current environment.
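Roughly like this (assuming the snippet above; match_markers() is pip 9's marker evaluation on InstallRequirement):

```python
# Keep only constraints whose PEP 508 markers match this environment,
# then resolve as before.
constraints = [c for c in constraints if c.match_markers()]
results = Resolver(constraints, repository).resolve()
```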
Oh, hey, nice. So, that does work, but it's not exactly what I want. Ideally, I'd like for this package (and its exclusive dependencies) to show up in my generated requirements.txt. Imagine I started with:
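```
# requirements.in (illustrative; package names and markers are placeholders)
tensorflow ; sys_platform == 'darwin'
tensorflow-gpu ; sys_platform == 'linux'
```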
I'd want something like:
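```
# requirements.txt (hypothetical output; pinned versions are placeholders)
tensorflow==1.3.0 ; sys_platform == 'darwin'
tensorflow-gpu==1.3.0 ; sys_platform == 'linux'
```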
For carrying through dependencies transitively, suppose I had:
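```
# requirements.in (illustrative)
tensorflow-gpu ; sys_platform == 'linux'
```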
Then I would want something like:
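```
# requirements.txt (hypothetical: numpy pinned because tensorflow-gpu
# needs it, carrying the same marker through; versions are placeholders)
numpy==1.13.1 ; sys_platform == 'linux'    # via tensorflow-gpu
tensorflow-gpu==1.3.0 ; sys_platform == 'linux'
```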
I see, sorry for the rat hole, carry on :)
Proper environment marker handling from the resolver's side would require knowing a package's dependencies without installing it. In short, pip-tools has to download and build a package to discover its dependencies, and that step can execute environment-dependent code. So in the current state of things, it's a dead end. If someday there's a deterministic way to know the dependencies of any package without ever having to execute possibly environment-dependent code, then it'll be doable.
I decided to solve this for myself by just dumping the locking into Azure Pipelines and keeping a per-platform requirements.txt output. I also happen to have multiple groupings (base, testing, dev, for example). Obviously it would be nice for packages to be processable on all platforms, but I decided not to wait for that to happen. https://github.com/altendky/boots
Hi all. I came across this issue while looking for info on how to use pip-tools across mac/win32/linux. I started following the approach of running pip-compile on each platform and maintaining separate .txt files per platform, for example one for Windows and one for Linux:
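For example (file names illustrative):

```
# run on Windows
pip-compile requirements.in --output-file requirements-win32.txt
# run on Linux
pip-compile requirements.in --output-file requirements-linux.txt
```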
What is the suggestion on compiling an add-on requirements file (e.g. for dev/test dependencies) on top of these? I have resorted to having a single requirements.in shared across platforms. Any suggestions/plans on how to best approach this?
I'm inviting anyone still interested in multi-environment compilation and sync workflows to pick up discussion @ #826.
@karypid I'm using the following approach to get cross-platform compatible requirements files with constraint support. Assuming a set of input requirements files (note the special constraint syntax), I then invoke this (my) platform-generate.py script on each platform to get platform-specific .in files. Inspecting those shows which constraints apply on each platform. From here on, use pip-compile/pip-sync. Alternatively, …
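A minimal sketch of what such a per-platform filtering script can look like (hypothetical; the actual platform-generate.py may work differently), using packaging to evaluate PEP 508 markers:

```python
# Hypothetical sketch of a "platform-generate" style script: copy only the
# lines whose PEP 508 markers match the current interpreter/platform into a
# per-platform .in file. The real platform-generate.py may differ.
import sys
from packaging.requirements import InvalidRequirement, Requirement

def applies_here(line: str) -> bool:
    try:
        req = Requirement(line)
    except InvalidRequirement:
        return True  # keep anything that isn't a plain requirement line
    return req.marker is None or req.marker.evaluate()

def main(src: str, dst: str) -> None:
    with open(src) as fin, open(dst, "w") as fout:
        for raw in fin:
            line = raw.strip()
            if not line or line.startswith("#") or applies_here(line):
                fout.write(raw)

if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])
```

Run once per platform, e.g. python platform-generate.py requirements.in requirements-linux.in on the Linux box, then pip-compile the result.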
We are developing on Windows but deploying in a Docker Linux environment. My solution was to create a container that imitates our production environment, mount the project into it, and generate the requirements.txt there. Basically:

```dockerfile
FROM python:3.11-slim-bullseye
RUN pip install pip-tools
WORKDIR /app
COPY requirements.in /app/
CMD ["pip-compile", "--output-file=requirements.txt", "--strip-extras", "requirements.in"]
```

Makefile:

```makefile
.PHONY: dev compile

dev:
	pip-compile --output-file=requirements-dev.txt --strip-extras requirements-dev.in requirements.in && \
	pip-sync requirements-dev.txt && \
	black . && \
	isort . --profile black

compile:
	docker build -t pip-compile-env -f ../../setup/Dockerfile.compile .
	powershell -command "docker run --rm -v \"$$(Get-Location):/app\" pip-compile-env"
	docker rmi pip-compile-env
	@echo "requirements.txt has been generated in the current directory."
```

To prevent the image from being rebuilt every time:

```makefile
compile:
	@powershell -Command "if (-Not (docker images -q pip-compile-env)) { \
		Write-Output 'Image pip-compile-env not found. Building...'; \
		docker build -t pip-compile-env -f ../../setup/Dockerfile.compile .; \
	} else { \
		Write-Output 'Image pip-compile-env already exists. Skipping build...'; \
	}"
	@powershell -command "docker run --rm -v \"$$(Get-Location):/app\" pip-compile-env"
	@echo "requirements.txt has been generated in the current directory."
```
Support the ability to run pip-compile specifying the OS / architecture that should be used for resolving dependencies. Currently it uses the OS where pip-compile is run. This causes issues such as #333. It also means that if a package does not exist on the current OS (e.g. tensorflow-gpu on macOS), then the compile fails.

Environment Versions
MacOS

```
$ python -V
Python 3.5.3
$ pip --version
pip 9.0.1
$ pip-compile --version
pip-compile, version 1.9.0
```
Steps to replicate

1. Add tensorflow-gpu>=1.2 to requirements.in
2. Run pip-compile
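In shell form (same two steps):

```
$ echo 'tensorflow-gpu>=1.2' >> requirements.in
$ pip-compile
```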
Expected result

A requirements.txt file with pinned deps (assuming that --arch manylinux1_x86_64 was set).

Actual result