[perf] uv pip install resolution is slow when installing from VCS #3287
I think the problem might be that when you install from PyPI, they can serve you a wheel, which is a built artifact (since the uploader of the package uploaded wheels for it, for a bunch of platforms). But if you install from VCS, you're required to build the package from source. And building from source can be really long and expensive -- it completely depends on the package, we basically have to call out to their build method, which could involve compiling native code.
Ciao Charlie, thank you very much for the prompt reply.
Yes, we have some Rust plugins for Polars!
Indeed! I should've been more precise, sorry. What bugged me was the resolution time.

From VCS:

```
Resolved 21 packages in 37.95s
Built functime @ git+https://github.com/functime-org/functime.git@0608c78118b5defd42df73b812
Built lightgbm==4.3.0
Downloaded 21 packages in 1m 48s
Installed 21 packages in 282ms
```

From PyPI:

```
Resolved 21 packages in 417ms
Built lightgbm==4.3.0
Downloaded 21 packages in 30.98s
Installed 21 packages in 284ms
```

You say that in the first case it's 38s because it has to download and build the binary? Couldn't … Feel free to close the issue.
Ahh I see! Let me take a look -- we should be able to clone the repo and read the metadata without building the wheel in this case. (But we do need to clone it; we don't do selective reads, e.g., just checkout the …)
Mmm I think for me basically the entire time is spent cloning the repo. That's a bummer. Maybe a datapoint for @ibraheemdev when it comes to seeing if we can make clones any faster.
(But I confirmed that we do read the metadata from …)
Very thorough, thanks!
Why this? I am incredibly naive on the parallelisation side of things, but if you managed to collect the list of requirements from pyproject.toml, then you could parallelise the build and download/installation.
If we have a Git dependency, the first step is that we need to clone the repo. Then we read the …
FYI pip uses blobless clones with git to improve performance: pypa/pip#9086. Maybe uv is already doing this, but I thought I'd mention it just in case.
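For reference, the blobless-clone behaviour referenced above boils down to passing `--filter=blob:none` at clone time. A minimal local sketch (throwaway repo over the `file://` transport; setting `uploadpack.allowFilter` stands in for what servers like GitHub already enable):

```shell
# Sketch: a blobless partial clone. Commits and trees are fetched up front;
# blobs are fetched lazily when actually needed (e.g. at checkout).
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Throwaway source repo standing in for a real remote.
git init -q -b main src
git -C src -c user.email=a@b.c -c user.name=a commit -q --allow-empty -m init
# Hosted servers generally allow filters; a local repo must opt in.
git -C src config uploadpack.allowFilter true

git clone -q --filter=blob:none "file://$tmp/src" dst
# A partial clone records its promisor remote and the active filter:
git -C dst config remote.origin.promisor
git -C dst config remote.origin.partialclonefilter
```

How much this saves depends entirely on how blob-heavy the repository's history is, as the size numbers later in this thread show.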
Shallow clones (`--depth=1`) can also give a nice speedup. However, they have caveats: some libraries (setuptools_scm) rely on git metadata that shallow clones do not have. pypa/pip#2432 discusses the issue in more depth. If shallow cloning is done, it likely needs a flag to control it, to handle situations where a full clone is necessary.
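To illustrate the caveat: a shallow clone truncates history to the requested depth, which is exactly what breaks tools like setuptools_scm that derive versions by walking tags and counting commits. A local sketch with a throwaway repo:

```shell
# Sketch: --depth=1 fetches only the tip commit; older history is absent.
set -e
tmp=$(mktemp -d)
cd "$tmp"

git init -q -b main src
for i in 1 2 3; do
  git -C src -c user.email=a@b.c -c user.name=a commit -q --allow-empty -m "commit $i"
done

# --depth only takes effect over a transport, hence the file:// URL.
git clone -q --depth=1 "file://$tmp/src" dst
git -C dst rev-list --count HEAD   # 1: history walks stop at the shallow boundary
```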
For that particular repo (functime.git @ main), it seems to be downloading 285 MB for a regular clone and 231 MB with `--filter=blob:none`. That's a surprisingly small difference.
Yeah, not sure how much performance impact this will have in general, but one advantage of enabling it is that it is well tested in pip, where it has been enabled for ~2.5 years now.
(We might already be doing that, I haven’t investigated deeply and can’t quite remember.)
I sort of think we do shallow clones already, but a bunch of our git code is vendored and adapted from elsewhere, so it's a little unclear; I'd need to investigate too. Regardless, it sounds like that's not likely to be the root problem here.
Yeah there are separate issues tracking general Git clone performance. |
I only left this open because I think there’s a possibly-interesting thing to try here where we fetch just the pyproject.toml to extract the metadata. |
I don't know if it is useful, but there is something called sparse checkout.
Yes, you could use sparse checkout here to speed things up quite substantially. Normally you'd do it in the CLI in the following way:
Note the first … After that, you do … This reduces the clone size of functime down to … I've been using a similar technique personally on very large repos at work in CI/CD for quite some time.
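The exact commands and sizes in that comment were lost when the snippets collapsed, but the usual recipe it describes is a blobless, no-checkout clone followed by a non-cone sparse-checkout of just pyproject.toml. A hedged reconstruction with a throwaway local repo (real usage would point at the remote URL instead):

```shell
# Sketch: materialize only pyproject.toml without fetching any other blobs.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Throwaway source repo with a metadata file and a large binary blob.
git init -q -b main src
printf '[project]\nname = "demo"\n' > src/pyproject.toml
mkdir src/big
head -c 1048576 /dev/zero > src/big/data.bin
git -C src add -A
git -C src -c user.email=a@b.c -c user.name=a commit -q -m init
git -C src config uploadpack.allowFilter true

# 1. Blobless clone with no checkout, so no file contents are fetched yet.
git clone -q --filter=blob:none --no-checkout "file://$tmp/src" dst
cd dst
# 2. Restrict the worktree to pyproject.toml (non-cone pattern mode).
git sparse-checkout set --no-cone /pyproject.toml
# 3. Checkout now lazily fetches only the blobs the sparse patterns select.
git checkout -q main
ls   # only pyproject.toml; big/data.bin was never downloaded
```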
I looked into this a bit and I think libgit2 doesn't support it (libgit2/libgit2#5564). I'm tempted to rethink our Git strategy a bit more holistically... Right now, we create a single copy of each repo, and then checkout commits by performing a sort of "local" clone into the build directory. I'm wondering if, instead, we should just have a separate clone for each commit, where that clone is a partial clone? It could in theory be less efficient if you're building from the same repo at multiple commits with overlapping blobs, but significantly faster when you're building from a single commit.
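The "separate partial clone per commit" idea can be sketched with plain git plumbing: init an empty repo, fetch exactly one commit with a blob filter, and check out `FETCH_HEAD`. (Local stand-in repo below; `uploadpack.allowAnySHA1InWant` mimics servers that permit fetching arbitrary commits, which GitHub does.)

```shell
# Sketch: one fresh, blobless fetch per requested commit.
set -e
tmp=$(mktemp -d)
cd "$tmp"

git init -q -b main src
git -C src -c user.email=a@b.c -c user.name=a commit -q --allow-empty -m one
sha=$(git -C src rev-parse HEAD)
git -C src config uploadpack.allowFilter true
git -C src config uploadpack.allowAnySHA1InWant true

# One fresh build dir per commit: fetch only that commit, blobless.
git init -q build
git -C build remote add origin "file://$tmp/src"
git -C build fetch -q --filter=blob:none --depth=1 origin "$sha"
git -C build checkout -q FETCH_HEAD
git -C build rev-parse HEAD   # the exact commit that was requested
```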
Ciao! I am installing a library from VCS and I feel it should be faster.
The library in question is functime, a forecasting library I maintain. I installed first a version from a PR I am about to merge, but then I realised it's just as slow if I just install from VCS.
I took the install command from the official pip docs. Here is how much it takes from a regular git repo:
…
As a comparison, here is how much it takes from PyPI, with no cache:
…