
pixi install installs all environments in multi env setting when pypi-dependency is used #1046

Open
pavelzw opened this issue Mar 22, 2024 · 13 comments
Labels
pypi Issue related to PyPI dependencies

Comments

@pavelzw
Contributor

pavelzw commented Mar 22, 2024

```toml
[project]
name = "test"
channels = ["conda-forge"]
platforms = ["linux-64", "osx-arm64", "osx-64", "win-64"]

[dependencies]
python = ">=3.9"
polars = ">=0.14.24,<0.21"

[pypi-dependencies]
test = { path = "." }

[feature.py39.dependencies]
python = "3.9.*"
[feature.py310.dependencies]
python = "3.10.*"
[feature.py311.dependencies]
python = "3.11.*"
[feature.py312.dependencies]
python = "3.12.*"

[environments]
py39 = ["py39"]
py310 = ["py310"]
py311 = ["py311"]
py312 = ["py312"]
```

```
❯ rm -rf .pixi pixi.lock && pixi install
# ...
❯ ls .pixi/envs
default/ py310/   py311/   py312/   py39/
```
@baszalmstra
Contributor

I assume this always happens, regardless of the type of requirement. This is expected behavior: all environments have pypi-dependencies (inherited from the default feature). To be able to solve these PyPI environments we need a Python interpreter, so the conda packages are first installed into each environment's target prefix.
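The inheritance described above can be sketched as a toy model (illustrative only, not pixi's actual data structures): every environment is the union of its listed features plus the implicit `default` feature, which is where the top-level `[pypi-dependencies]` table lives, so every environment ends up needing its own interpreter.

```python
# Toy model: features an environment inherits, including the implicit
# default feature that carries the [pypi-dependencies] table.
features = {
    "default": {"pypi-dependencies": {"test": {"path": "."}}},
    "py39": {"dependencies": {"python": "3.9.*"}},
    "py310": {"dependencies": {"python": "3.10.*"}},
}
environments = {"py39": ["py39"], "py310": ["py310"]}

def needs_interpreter(env_features):
    # The default feature is always included in the union.
    merged = ["default", *env_features]
    return any("pypi-dependencies" in features[name] for name in merged)

print({env: needs_interpreter(f) for env, f in environments.items()})
# {'py39': True, 'py310': True}
```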

@pavelzw
Contributor Author

pavelzw commented Mar 22, 2024

Hmm, I guess this makes sense... But I still don't like this behavior, as it fills up your drive quite fast if you have a large number of environments to test against 😅
In polarify this totals 3.6 GB, while the default environment (which is used in 99% of local development use cases) is only 309 MB.
I planned on using

```toml
[pypi-dependencies]
polarify = { path = ".", editable = true }
```

to fix #524 but it's annoying that this requires all environments to be actually installed...

@pavelzw pavelzw changed the title pixi install installs all environment in multi env setting when pypi-dependency with path is used pixi install installs all environment in multi env setting when pypi-dependency is used Mar 22, 2024
@pavelzw
Contributor Author

pavelzw commented Mar 22, 2024

Have you envisioned another approach to get rid of pixi run postinstall that doesn't require all envs to be installed?

@pavelzw pavelzw changed the title pixi install installs all environment in multi env setting when pypi-dependency is used pixi install installs all environments in multi env setting when pypi-dependency is used Mar 22, 2024
@baszalmstra
Contributor

Ah, I see your use case. The problem is that to determine the dependencies of your path-based project, we need Python to execute the build backend. Since you have environments with different Python versions, the only way to reliably lock this is to invoke Python.

The only thing I can think of is that we don't need all conda dependencies just to run Python. However, some of them might be used by the build backend, so it's also not really clear which dependencies we can skip during this step.
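Why an interpreter is unavoidable here comes down to PEP 517: the build backend named in pyproject.toml is arbitrary Python code, and its hooks must be executed to learn the project's metadata. A minimal sketch (a stand-in object, not pixi's actual code or hatchling's internals):

```python
# Illustrative sketch of the PEP 517 hook calls a resolver must make.
import types

# Stand-in for a real backend such as hatchling.build; in reality the
# resolver imports this module inside the target environment's interpreter.
backend = types.SimpleNamespace(
    get_requires_for_build_wheel=lambda config_settings=None: ["hatchling"],
    prepare_metadata_for_build_wheel=(
        lambda metadata_directory, config_settings=None: "test-0.1.dist-info"
    ),
)

# The resolver has no choice but to *call* these hooks, because a backend
# may compute dependencies dynamically (e.g. based on the Python version).
build_requires = backend.get_requires_for_build_wheel()
dist_info = backend.prepare_metadata_for_build_wheel("build")
print(build_requires, dist_info)
```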

@pavelzw
Contributor Author

pavelzw commented Mar 22, 2024

> we need python to execute the build backend

It's probably not possible to let pixi evaluate the pypi dependencies itself, because the build backend could theoretically do arbitrary things?
At least in polarify, the pyproject.toml doesn't contain anything that isn't already in pixi.toml.

```toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
# ...
dependencies = [
    "polars >=0.14.24,<0.21",
]
```

Not sure what other shenanigans hatchling supports.

@pavelzw
Contributor Author

pavelzw commented Mar 25, 2024

> The problem is that to determine the dependencies of your path based project we need python to execute the build backend.

Not sure if this is a good idea or not: how about hashing pyproject.toml, setup.py, and setup.cfg, and, if nothing changed, not re-resolving the lockfile and thus not downloading all envs?
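That idea can be sketched as follows (hypothetical, not pixi's implementation): hash the build-relevant manifests and skip re-resolution when the digest stored in the lockfile still matches.

```python
# Hash the manifests that can change a project's metadata.
import hashlib
import tempfile
from pathlib import Path

def manifest_hash(root: Path) -> str:
    """SHA-256 over pyproject.toml, setup.py, and setup.cfg (if present)."""
    h = hashlib.sha256()
    for name in ("pyproject.toml", "setup.py", "setup.cfg"):
        path = root / name
        if path.is_file():
            h.update(name.encode())
            h.update(path.read_bytes())
    return h.hexdigest()

# Demo: any change to a manifest produces a different digest.
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / "pyproject.toml").write_text('[project]\nname = "test"\n')
    before = manifest_hash(root)
    (root / "pyproject.toml").write_text('[project]\nname = "test2"\n')
    after = manifest_hash(root)

print(before != after)  # True: a manifest change invalidates the cache
```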

@baszalmstra
Contributor

Well, the same goes for any source-based dependency (and there are still a lot of them).

But we are indeed planning to lock the path-based dependency based on a hash of the pyproject.

@pavelzw
Contributor Author

pavelzw commented Mar 27, 2024

> The problem is that to determine the dependencies of your path based project we need python to execute the build backend

Actually, how are you handling other platforms?
On macOS, for example, I cannot install a Linux Python (because of ncurses). How is the lockfile resolved for other platforms in this case?

@ruben-arts
Contributor

> Actually, how are you handling other platforms?

Pixi uses the current platform's Python to do the solving. If that is not available, you get this issue: #1051

@tdejager
Contributor

> The problem is that to determine the dependencies of your path based project we need python to execute the build backend

> Actually, how are you handling other platforms? On macOS for example, I cannot install a Linux Python (because of ncurses). How is the lockfile resolved for other platforms in this case?

Also note that the Python interpreter is only used for source dependencies, and for resolving we only care about the metadata. It is currently assumed that, once created, this metadata is static across platforms. There are examples where this is not the case, at least that's what I gather from reading Python threads. With https://peps.python.org/pep-0643/ we can actually check whether the METADATA is static.

But I don't know if it will change the behavior much, as there are a lot of old packages that don't have this. Going forward, though, it will help us avoid needing the Python executable at some point. The uv folks recently helped merge this into Warehouse.
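The PEP 643 check mentioned above can be sketched like this (not pixi or uv code): Metadata-Version 2.2 marks per-build fields with a `Dynamic:` header, so any field not listed there, such as `Requires-Dist`, is guaranteed static and can be trusted without executing a build.

```python
# Parse a METADATA file (RFC 822 headers) and test for static dependencies.
from email.parser import Parser

metadata = Parser().parsestr(
    "Metadata-Version: 2.2\n"
    "Name: example\n"
    "Version: 1.0\n"
    "Requires-Dist: polars>=0.14.24\n"
)

version = tuple(int(part) for part in metadata["Metadata-Version"].split("."))
# PEP 643: fields under `Dynamic` may vary per build; all others are static.
dynamic_fields = set(metadata.get_all("Dynamic") or [])
deps_are_static = version >= (2, 2) and "Requires-Dist" not in dynamic_fields
print(deps_are_static)  # True
```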

@anjos

anjos commented Apr 5, 2024

I'd like to mention that this issue is accentuated if one uses a pyproject.toml file as the pixi manifest, due to the automatic injection of the Python project's dependencies as pypi-dependencies in the default environment.

@ruben-arts
Contributor

For documentation purposes: @anjos, the auto injection is a helper; you are allowed to move it to a feature, e.g.

```diff
-[tool.pixi.pypi-dependencies]
+[tool.pixi.feature.dev.pypi-dependencies]
 test_feature_editable = { path = ".", editable = true }

+[tool.pixi.environments]
+dev = ["dev"]
```

Your comment still holds; I just want to get this out there for users who find this issue.

@ruben-arts ruben-arts added the pypi Issue related to PyPI dependencies label Apr 29, 2024
@anjos

anjos commented May 1, 2024

Thank you, @ruben-arts - that is what we ended up doing for now.
