Replies: 3 comments 7 replies
-
This has been discussed on the Poetry tracker, I think: solving Python dependencies is not a CPU-bound but an IO-bound task (network/disk, plus some CPU), because the index doesn't provide the dependency data: most packages still use dynamic setups and/or don't correctly declare their dependencies. Using C won't help at all 😕 UPDATE: high-level explanation here: https://python-poetry.org/docs/faq/#why-is-the-dependency-resolution-process-slow
-
By the way, do these tools actually exist, or are you just suggesting they could be created? Genuinely curious 🙂
-
Yes, most of the time the solver is downloading distributions, extracting them, and reading the metadata. It is a heavily IO-bound task, so C implementations don't help in this scenario. AFAIK, conda uses mamba, a SAT solver implemented in C, together with a carefully designed package server that calculates and caches the dependency map on the server. But I don't think we can easily adopt that, since we have to stick to the Python Package Index.
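To illustrate the "reading the metadata" step: once a wheel's METADATA file has been downloaded and extracted, the declared dependencies are just `Requires-Dist` headers in an email-style format, which the stdlib can parse. A minimal sketch (the sample metadata text below is made up for illustration):

```python
from email.parser import Parser

# A made-up METADATA snippet in the wheel metadata format (RFC 822 style).
SAMPLE_METADATA = """\
Metadata-Version: 2.1
Name: example-pkg
Version: 1.0.0
Requires-Dist: requests>=2.0
Requires-Dist: tomli>=1.1.0; python_version < "3.11"
"""

def declared_dependencies(metadata_text: str) -> list[str]:
    """Extract the Requires-Dist entries from wheel METADATA text."""
    msg = Parser().parsestr(metadata_text)
    return msg.get_all("Requires-Dist") or []

print(declared_dependencies(SAMPLE_METADATA))
# → ['requests>=2.0', 'tomli>=1.1.0; python_version < "3.11"']
```

The parsing itself is trivially cheap; the cost is that for packages which only ship an sdist with a dynamic `setup.py`, the resolver must download and build the package just to learn these few header lines.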
-
Is your feature request related to a problem? Please describe.
Dependency solving is a high-performance computing task, and I'm not convinced that Python is the right language for it. For some projects, PDM has taken more than an hour trying to solve the dependencies, and it did not succeed.
Describe the solution you'd like
I appreciate that there is a pdm.lock file, which lets a developer solve the dependencies with another high-performance tool where PDM fails. Afterwards we can just use
pdm install
to install the dependencies. But I think it would be better if PDM were based on a high-performance dependency solver with parallelization from the C ecosystem.
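On the parallelization point: since the bottleneck is network IO rather than computation, concurrency in Python itself already helps, no C required. A toy sketch simulating metadata downloads with `time.sleep` (the 0.2 s delay and the package names are made up):

```python
import time
from concurrent.futures import ThreadPoolExecutor

PACKAGES = ["requests", "flask", "numpy", "click"]  # illustrative names

def fetch_metadata(name: str) -> str:
    """Simulate one network round-trip to fetch a package's metadata."""
    time.sleep(0.2)  # stand-in for network latency
    return f"{name}: metadata"

# Sequential: roughly 0.2 s per package, ~0.8 s total.
start = time.perf_counter()
sequential = [fetch_metadata(p) for p in PACKAGES]
seq_time = time.perf_counter() - start

# Threaded: the four sleeps overlap, so the total is roughly 0.2 s.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    threaded = list(pool.map(fetch_metadata, PACKAGES))
thr_time = time.perf_counter() - start

print(f"sequential: {seq_time:.2f}s, threaded: {thr_time:.2f}s")
```

Threads sidestep the GIL here because the workers spend their time blocked on IO, which is also why rewriting the solver in C wouldn't speed up the download-dominated part.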