Local HTTP cache is confusing for package maintainers #5670
I've seen this issue raised multiple times in different formats and mediums. Most recently on #pypa today, this question was asked:
The reason for this is that pip uses CacheControl to maintain a local HTTP cache, and it will not retry requests made within the last 10 minutes:
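As an illustration of the behavior (a minimal sketch, not pip's or CacheControl's actual code), a freshness-window cache like this serves any response younger than 10 minutes straight from disk, without contacting the index at all. The class and names here are hypothetical:

```python
import time

# Hypothetical sketch of a freshness-based index cache: responses younger
# than MAX_AGE are served from the cache without making a request, which
# is why a just-released package can be invisible for up to 10 minutes.
MAX_AGE = 10 * 60  # the 10-minute window discussed in this issue

class SimpleIndexCache:
    def __init__(self):
        self._store = {}  # url -> (fetched_at, body)

    def get(self, url, fetch, now=None):
        """Return the cached body if still fresh, else call fetch(url)."""
        now = time.time() if now is None else now
        cached = self._store.get(url)
        if cached is not None and now - cached[0] < MAX_AGE:
            return cached[1]  # fresh enough: no request is made at all
        body = fetch(url)
        self._store[url] = (now, body)
        return body
```

Within the window, a release that just landed on the index simply never shows up, which is exactly the maintainer confusion described above.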
Unfortunately, for a lot of package authors/maintainers, this is a common workflow:
I think that this creates enough confusion about PyPI being "slow" or some type of user error that it's worth addressing in some way.
Some ideas for how this could be addressed:
Agreed that caching is great and very useful, for maintainers and developers too! For instance, when running tox, we don't want to eagerly hit the online endpoints for each and every dependency.
Suggestion to improve cache invalidation: If using editable mode (
(5) is pretty easy to do (in fact we already do this! We just have it limited to 10 minutes instead of setting
The main reason it caches at all is just to avoid the case where
I originally chose 10 minutes because I thought it represented a good trade-off between how quickly a new package becomes available and how rarely people release new packages. I'm not married to the idea though, just explaining the thought process of why it is like it is.
If we do change it, then I suggest changing it in such a way that CacheControl is still caching the page, and doing a conditional GET with the
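The revalidation flow being suggested works roughly like this: the client replays the validator it stored (e.g. an ETag) in `If-None-Match`, and the server answers `304 Not Modified` when the page hasn't changed, so the cached copy can be reused without re-downloading it. A hedged sketch of the client side, using a stubbed transport rather than CacheControl's real internals:

```python
# Hypothetical sketch of conditional-GET revalidation, not CacheControl's
# implementation: the ETag is stored alongside the body and replayed in
# If-None-Match; a 304 response means the cached body is still current.
class RevalidatingCache:
    def __init__(self, transport):
        # transport is a callable(url, headers) -> (status, headers, body)
        self._transport = transport
        self._store = {}  # url -> (etag, body)

    def get(self, url):
        headers = {}
        cached = self._store.get(url)
        if cached is not None:
            headers["If-None-Match"] = cached[0]
        status, resp_headers, body = self._transport(url, headers)
        if status == 304:
            return cached[1]  # unchanged on the server: reuse cached body
        etag = resp_headers.get("ETag")
        if etag is not None:
            self._store[url] = (etag, body)
        return body
```

The appeal over a pure time-based window is that every lookup sees the current state of the index, while an unchanged page still costs only a tiny conditional request instead of a full download.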
@dstufft I think everybody is on the same page about the benefits of caching! :) I really <3 that
This isn't enough, since we confused maintainers will release project A and then wonder why it isn't found in CI and in another project B that depends on A.
For my use, it would be enough if