Poetry is extremely slow when resolving the dependencies #2094
Comments
Could this be due to downloading packages from PyPI to inspect their dependencies when they are not properly specified?
It seems so. I checked the detailed log: Poetry kept retrying to resolve the dependency for botocore, without success. So I assume the dependency could eventually be resolved if given enough time. However, is there any way to get around this? BTW, I also think it would be better to give a warning if some dependencies are not properly specified and cannot be resolved after a number of attempts.
Hi, I encounter a similar problem on macOS. The Python version used is 3.7.6, Poetry is 1.0.5. I just created a new project with no dependencies so far in pyproject.toml, just pytest initially. It takes ages until the new virtualenv is set up with all 11 packages installed. Running it with -vvv does not bring any new findings. Regards, Thomas
same here... 😱
Same here. I just created an empty project then ran
I'm currently using this workaround:

```sh
poetry export -f requirements.txt > requirements.txt
python -m pip install -r requirements.txt
poetry install
```

Installing the package locally takes much less time since all deps are already installed. Make sure to run
Poetry being slow to resolve dependencies seems to be a recurring issue:
Maybe there is a dependency conflict.
First of all, I want to say there is ongoing work to improve dependency resolution. However, there is only so much Poetry can do with the current state of the Python ecosystem. I invite you to read https://python-poetry.org/docs/faq/#why-is-the-dependency-resolution-process-slow to learn a little more about why dependency resolution can be slow. If you report that Poetry is slow, we would appreciate a

@gagarine Could you provide the
It takes about 2 min to resolve dependencies after adding newspaper3k to a fresh project. pyproject.toml:
Hey dudes - as Sebastian implied, the root cause is the Python ecosystem's inconsistent/incomplete way of specifying dependencies and package metadata. Unfortunately, the Pypi team is treating this as a

In particular, using the Pypi JSON endpoint, an empty dep list could mean either "no dependencies" or "dependencies not specified". The Pypi team doesn't want to differentiate between these two cases, for reasoning I don't follow. The solution is to work around this by maintaining a separate cache from Pypi that properly handles this distinction, and perhaps refusing to use packages that don't properly specify deps. However, this latter aspect may be tough, due to long dependency chains. Python's grown a lot over the decades, and it has much remaining from its early days. There's a culture of no-breaking-changes at any cost. Having to run arbitrary python code to find dependencies is fucked, but .... we can do this for each noncompliant package, and save it.
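A minimal sketch of the ambiguity described above, assuming the public PyPI JSON endpoint at https://pypi.org/pypi/<name>/<version>/json (the package and version used here are only examples): `requires_dist` may come back as a list of requirement strings or as null, and an empty/null value does not distinguish "no dependencies" from "dependencies not specified".

```python
import json
import urllib.request


def requires_dist(name: str, version: str):
    """Fetch the requires_dist metadata for one release from PyPI's JSON API."""
    url = f"https://pypi.org/pypi/{name}/{version}/json"
    with urllib.request.urlopen(url) as resp:
        info = json.load(resp)["info"]
    # None here is ambiguous: it can mean "no dependencies" or
    # "dependencies were never extracted for this upload".
    return info.get("requires_dist")


if __name__ == "__main__":
    print(requires_dist("botocore", "1.19.35"))
```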
First, it's capitalized PyPI. Second, there is no way for PyPI to know dependencies for all packages without executing arbitrary code -- which is difficult to do safely and expensive (computationally and financially). PyPI is run on donated infrastructure from sponsors, is maintained by volunteers, and does not have millions of dollars of funding like many other language ecosystems' package indexes. For anyone interested in further reading, here's an article written by a PyPI admin on this topic: https://dustingram.com/articles/2018/03/05/why-pypi-doesnt-know-dependencies/
It's not as tough as you imply. You accept some risk by running the arbitrary code, but accepting things as they are isn't the right approach. We're already forcing this on anyone who installs Python packages; it's what triggers the delays cited in this thread. I have the above repo running on a $10/month Heroku plan, and it works well. I've made the assumption that if dependencies are specified, they're specified correctly, so I only check the ones that show as having no deps. This won't work every time, but it does in a large majority of cases. Related: projects like Poetry are already taking a swing at preventing this in the future: specifying deps in
A personal Heroku app is not going to be as valuable a target as PyPI would be. Neither is a $10/month Heroku app going to be able to support the millions of API requests that PyPI gets every day. The problem isn't in writing a script to run a setup.py file in a sandbox, but in the logistics and challenges of providing it for the entire ecosystem. "It works 90% of the time" is not an approach that can be taken by the canonical package index (which has to be used by everyone) but can be taken by specific tools (which users opt into using). Similar to how

Anyway, I wanted to call out that "just blame PyPI folks because they don't care/are lazy" is straight up wrong IMO -- there are reasons that things are the way they are. That doesn't mean we shouldn't improve them, but it's important to understand why we're where we are. I'm going to step away now.
Before you step away - can you think of a reason PyPI shouldn't differentiate between no dependencies and missing dependency data? If going through existing releases is too bold, what about for new ones?
I'm new to (more serious) Python and don't understand the big drama yet. Can someone post a couple of examples where a txt file is not enough and
Cargo does it like this: https://doc.rust-lang.org/cargo/reference/specifying-dependencies.html#platform-specific-dependencies Is this not enough for Python? Why doesn't Poetry create its own package repository, avoiding setup.py and using its own dependency declaration? It could take time... but a bot could automate the pull requests on most Python modules based on the kind of techniques used in https://github.com/David-OConnor/pydeps
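For comparison, Poetry's pyproject.toml can already express platform-specific dependencies much like the Cargo example linked above; a sketch with an illustrative package name and constraint:

```toml
[tool.poetry.dependencies]
python = "^3.9"
# Only resolved/installed on Windows (package and version are illustrative)
pywin32 = { version = "^305", markers = "sys_platform == 'win32'" }
```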
I think the root cause is that Python's been around for a while and tries to maintain backwards compatibility. I agree - as a new lang,
Not absolutely necessary, but helpful in the following scenario:
With setup.py, you can follow the DRY principle:
For requirements.txt, I'm on the one hand not sure how you denote extras at all, and even if you can, you would need to repeat the requirements of a within the requirements of b. This is prone to human error. However, while creating the package, the package builder could output a text file containing those requirements.
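A minimal sketch of the DRY setup.py scenario described above (package name, versions, and extra names are made up for illustration): extra "b" reuses the requirements of extra "a" instead of repeating them.

```python
# setup.py -- illustrative only
from setuptools import setup

EXTRA_A = ["requests>=2.25"]
EXTRA_B = EXTRA_A + ["pandas>=1.2"]  # reuse "a" rather than repeating it

setup(
    name="example-pkg",
    version="0.1.0",
    install_requires=["click>=7.0"],
    extras_require={"a": EXTRA_A, "b": EXTRA_B},
)
```

With a plain requirements.txt there is no standard way to express that reuse, so the duplication has to be maintained by hand.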
You mean replacing PyPI? Good luck with that. I analyzed the packages on PyPI in January (PyPI Analysis 2020):
I also gave a course about packaging in Python this year to PhD students. They simply want to share their work with a broad audience. I only mentioned Poetry briefly because it is such a niche right now. Changing a big, working system is hard. It took Python 2 -> 3 about 12 years, and it is still not completely finished.
Hi, I would like to invite everyone interested in how dependencies should be declared to this discussion on python.org. fin swimmer
@finswimmer I checked the discussion. Seems like they are reinventing the wheel instead of copy/pasting something that works (Composer, Cargo, ...).
For sure requirements.txt is not good.
Yes. But why make Poetry if it's not to replace PyPI and requirements.txt? If Poetry is compatible with PyPI, there is no incentive to add a pyproject.toml. Perhaps I don't even know I should add one. Now, if every time I try to install a package that has no pyproject.toml the command line proposed that I open an issue on that project with a ready-to-use template, this could speed things up.
It'd be more productive to file an issue on https://github.com/pypa/warehouse to ask this. There's either a good reason, or PyPI would be open to adding this functionality. In the latter case, depending on how the details work out, it might need to be standardized like pyproject.toml was before Poetry adopted it, so that the entire ecosystem can depend on and utilize it.
My two cents solution:
And now it is fine.
For me, the issue was fixed by setting the particular version of the package that I want to install, not
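A sketch of what that looks like in pyproject.toml (package name and versions are illustrative): an exact pin gives the solver a single candidate to check, while an open-ended constraint can force it to walk through hundreds of releases.

```toml
[tool.poetry.dependencies]
python = "^3.9"
# Exact pin: one candidate for the solver to check
boto3 = "1.16.35"
# Open-ended alternative that enlarges the search space considerably:
# boto3 = "*"
```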
Running a `poetry add s3path` produces:

1: fact: no versions of boto3 match >1.16.35,<1.16.36 || >1.16.36,<1.16.37 || >1.16.37,<1.16.38 || >1.16.38,<1.16.39 || >1.16.39,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42>
1: conflict: no versions of boto3 match >1.16.35,<1.16.36 || >1.16.36,<1.16.37 || >1.16.37,<1.16.38 || >1.16.38,<1.16.39 || >1.16.39,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.1>
1: ! boto3 (>1.16.35,<1.16.36 || >1.16.36,<1.16.37 || >1.16.37,<1.16.38 || >1.16.38,<1.16.39 || >1.16.39,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42,<1.16.43 || >1.16.43,<>
1: ! which is caused by "boto3 (1.16.35) depends on botocore (>=1.19.35,<1.20.0)"
1: ! thus: boto3 (>=1.16.35,<1.16.36 || >1.16.36,<1.16.37 || >1.16.37,<1.16.38 || >1.16.38,<1.16.39 || >1.16.39,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42,<1.16.43 || >1.1>
1: ! boto3 (>=1.16.35,<1.16.36 || >1.16.36,<1.16.37 || >1.16.37,<1.16.38 || >1.16.38,<1.16.39 || >1.16.39,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42,<1.16.43 || >1.16.43,>
1: ! which is caused by "boto3 (1.16.36) depends on botocore (>=1.19.36,<1.20.0)"
1: ! thus: boto3 (>=1.16.35,<1.16.37 || >1.16.37,<1.16.38 || >1.16.38,<1.16.39 || >1.16.39,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42,<1.16.43 || >1.16.43,<1.16.44 || >1.1>
1: ! boto3 (>=1.16.35,<1.16.37 || >1.16.37,<1.16.38 || >1.16.38,<1.16.39 || >1.16.39,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42,<1.16.43 || >1.16.43,<1.16.44 || >1.16.44,>
1: ! which is caused by "boto3 (1.16.37) depends on botocore (>=1.19.37,<1.20.0)"
1: ! thus: boto3 (>=1.16.35,<1.16.38 || >1.16.38,<1.16.39 || >1.16.39,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42,<1.16.43 || >1.16.43,<1.16.44 || >1.16.44,<1.16.45 || >1.1>
1: ! boto3 (>=1.16.35,<1.16.38 || >1.16.38,<1.16.39 || >1.16.39,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42,<1.16.43 || >1.16.43,<1.16.44 || >1.16.44,<1.16.45 || >1.16.45,>
1: ! which is caused by "boto3 (1.16.38) depends on botocore (>=1.19.38,<1.20.0)"
1: ! thus: boto3 (>=1.16.35,<1.16.39 || >1.16.39,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42,<1.16.43 || >1.16.43,<1.16.44 || >1.16.44,<1.16.45 || >1.16.45,<1.16.46 || >1.1>
1: ! boto3 (>=1.16.35,<1.16.39 || >1.16.39,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42,<1.16.43 || >1.16.43,<1.16.44 || >1.16.44,<1.16.45 || >1.16.45,<1.16.46 || >1.16.46,>
1: ! which is caused by "boto3 (1.16.39) depends on botocore (>=1.19.39,<1.20.0)"
1: ! thus: boto3 (>=1.16.35,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42,<1.16.43 || >1.16.43,<1.16.44 || >1.16.44,<1.16.45 || >1.16.45,<1.16.46 || >1.16.46,<1.16.47 || >1.1>
1: ! boto3 (>=1.16.35,<1.16.40 || >1.16.40,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42,<1.16.43 || >1.16.43,<1.16.44 || >1.16.44,<1.16.45 || >1.16.45,<1.16.46 || >1.16.46,<1.16.47 || >1.16.47,>
1: ! which is caused by "boto3 (1.16.40) depends on botocore (>=1.19.40,<1.20.0)"
1: ! thus: boto3 (>=1.16.35,<1.16.41 || >1.16.41,<1.16.42 || >1.16.42,<1.16.43 || >1.16.43,<1.16.44 || >1.16.44,<1.16.45 || >1.16.45,<1.16.46 || >1.16.46,<1.16.47 || >1.16.47,<1.16.48 || >1.1>

The first line of constraints is nearly 9000 chars long (the above is truncated), each iteration appears to output one less constraint at a time, and each iteration takes 3~6 sec. What the heck?

edit: It's speeding up as the list gets shorter (this is some O(n^2) business...), and it ends with the following:

1: ! boto3 (>=1.16.35,<1.24.68 || >1.24.68,<1.24.69 || >1.24.69,<2) is partially satisfied by not boto3 (1.24.68)
1: ! which is caused by "boto3 (1.24.68) depends on botocore (>=1.27.68,<1.28.0)"
1: ! thus: boto3 (>=1.16.35,<1.24.69 || >1.24.69,<2) requires botocore (>=1.19.35,<1.28.0)
1: fact: boto3 (>=1.16.35,<1.24.69 || >1.24.69,<2) requires botocore (>=1.19.35,<1.28.0)
1: derived: not boto3 (>=1.16.35,!=1.24.69,<2)
1: derived: s3transfer (>=0.6.0,<0.7.0)
1: conflict: boto3 (1.24.69) depends on botocore (>=1.27.69,<1.28.0)
1: ! boto3 (1.24.69) is partially satisfied by not boto3 (>=1.16.35,<1.24.69 || >1.24.69,<2)
1: ! which is caused by "boto3 (>=1.16.35,<1.24.69 || >1.24.69,<2) requires botocore (>=1.19.35,<1.28.0)"
1: ! thus: boto3 (>=1.16.35,<2) requires botocore (>=1.19.35,<1.28.0)
1: fact: boto3 (>=1.16.35,<2) requires botocore (>=1.19.35,<1.28.0)
1: derived: botocore (>=1.19.35,<1.28.0)
<REDACTED>: 418 packages found for botocore >=1.19.35,<1.28.0

edit2: Now it's repeating all of the above while still within the same run? Seems like there's a serious amount of backtracking going on. Output of `rg -C2 'solving took' poetry_add_s3path.log`:
There are cases where Poetry, during its search, first chooses a version of one package (here, botocore). Then Poetry has to search through all of those versions of another package that depends on it (here, boto3) before it can make progress. When you encounter such cases, the pragmatic thing to do is to put an explicit constraint on the problematic package yourself.
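For example, a sketch of such an explicit constraint in pyproject.toml (the range shown is illustrative, not a recommendation):

```toml
[tool.poetry.dependencies]
# Constrain the transitive dependency directly so the solver does not
# have to enumerate every boto3/botocore combination.
botocore = ">=1.27.0,<1.28.0"
```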
@dimbleby That doesn't seem to always work:

```diff
 slack-sdk = "^3.17.2"
+boto3 = "1.17.49"
+ddtrace = "0.34.2"
```

Yet it still appears to search through unrelated

Any suggestions?
@Kache It appears to search through dependencies depth-first, rather than breadth-first. As a result, you've probably got something earlier in your pyproject.toml that depends on ddtrace, so the dependency resolver grabbed that version and tried to resolve using that, rather than the ddtrace version you've specified. I've had some success moving the dependencies I want exact-version logic for earlier in the pyproject.toml file. (I also disabled IPv6, upgraded to Poetry 1.2.x, and reduced the possible space for the troubling AWS libraries (boto3 and awscli, for me), so those go at the very end of my dependency file and have only a few recent versions to chew through.) I'm seeing dependency resolution times between 5 and 35 seconds most of the time now.
I think if you lose internet connectivity during a download of a dependency, it seems to introduce the possibility of making subsequent retry operations hang. I did a

Running the above workaround resulted in a similar hang, but this time on the lock file:

```
sh-3.2$ poetry export -f requirements.txt > requirements.txt
Configuration file exists at /Users/me/Library/Application Support/pypoetry, reusing this directory.
Consider moving configuration to /Users/me/Library/Preferences/pypoetry, as support for the legacy directory will be removed in an upcoming release.
The lock file does not exist. Locking.
```

It's been locking for 10-20 minutes now. Hope this helps at all.
This fixed my issue:
Re:
I've seen this mentioned a few times. If it works, that's great, but why would it work? Wouldn't clearing the cache cause things to take longer? What am I missing?
@tall-josh Because Poetry had (maybe still has?) a bug where corrupted cache entries make Poetry hang forever when trying to resolve dependencies. For me this seemed to occur if I Ctrl+C'd Poetry while it was doing an install and it was downloading packages. I observed this on 1.1.14, so perhaps it's fixed in 1.2+.
Clearing the cache is most likely related to clearing out downloads that are partial, incomplete, or corrupted by concurrent usage, which can cause an indefinite hang.
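For reference, clearing the cache typically looks like this (assuming the default `pypi` cache name; `poetry cache list` shows which caches exist locally):

```sh
# Show the caches Poetry knows about
poetry cache list

# Remove all entries from the pypi cache
poetry cache clear pypi --all
```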
Adding this to the .toml file works for me:
Still have the same problem on version 1.2.1.
Hi all,

This issue has gotten quite long and meandering, with many disparate causes, solutions, fixed issues, perhaps still extant bugs, and many "me too" comments all discussed. I'm going to close this issue as most of the root causes discussed within have either been solved in 1.2 or 1.3 (the changes in 1.3 are behavior changes not eligible for backport).

If you are having issues with Poetry taking a long time to resolve dependencies, please first open a Discussion or start on Discord, as many of them are related to configuration and large search space (basically, you're creating exponential work for the solver and should tighten your constraints). Tooling to advise the user should be possible (if difficult to develop) in the long run, and anyone interested in tackling this hairy problem should reach out to the team via a Discussion or on Discord.

Past that, please make sure to test with the latest code (both on the 1.2 branch and master branch presently) when trying to reproduce resolver issues, as we are making improvements all the time, and your issue may be fixed and pending release already.

Finally, good reproductions are needed for this category of issue. Many times they are related to transient network issues, pathologically bad cases due to decisions made around (low traffic, private) custom package indexes, or a corrupted cache/bad local config. Reproducing in a container with publicly available packages will mean that someone can dissect your issue and possibly fix it. If you can't reproduce it with public packages, but you can with private packages, there are still options -- everything from sharing details with the team in private, to creating 'imitation' repositories to reproduce an issue.

Please refrain from commenting on this issue more if it's not a mitigation/common solution -- "me too" is not very helpful and will send dozens of emails, and if you can reproduce bad performance consistently/in a clean environment it should be an issue. If you're stuck and need help, ask for support using one of the methods mentioned above.
I have a private repository on GitLab, which is known to have some problems with IPv6. I disabled IPv6 too and added this to my SSH config:

```
Host gitlab.com
    AddressFamily inet
```

Then I got a huge improvement in dependency resolution.
I found one workaround:
Reducing the search space is not a workaround, but a common issue (as mentioned in my comment above). That being said, the version you have written should be equivalent ( |
Actually, they are not equivalent:
That should expand to
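For anyone following along, a sketch of how Poetry's shorthand constraints expand (package names here are illustrative):

```toml
[tool.poetry.dependencies]
# Caret: allows changes up to the next major version
foo = "^2.8.0"   # equivalent to >=2.8.0,<3.0.0
# Tilde: allows changes up to the next minor version
bar = "~2.8.0"   # equivalent to >=2.8.0,<2.9.0
```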
Issue
I created an empty project and ran `poetry add allennlp`. It takes ages to resolve the dependencies.