
pipenv lock hangs. It really does. #3827

pilkibun opened this issue Jul 7, 2019 · 10 comments



commented Jul 7, 2019

This is a followup to #2681, which was closed without addressing the issue for some users.
Also #3812, #3829, SO and someone's blog rant.

pipenv lock downloads every available artifact of installed packages and their dependencies. It does this to calculate their hashes, even when the artifact url includes the hash in a fragment. For some large packages, such as scipy, which have large dependencies and many artifacts per version, this behavior can result in unreasonably long delays for some users (893MB vs. 50MB download). It's also bad netiquette towards pypi.
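To make the "hash already in the fragment" point concrete, here is a minimal sketch (with a made-up URL and digest) of pulling the digest straight out of the kind of link PyPI serves, with no download at all:

```python
from urllib.parse import urlparse

# Made-up URL in the style PyPI serves for artifact links: the sha256
# digest ships in the URL fragment, so no download is needed to learn it.
url = ("https://files.pythonhosted.org/packages/ab/cd/"
       "scipy-1.3.0-cp37-cp37m-manylinux1_x86_64.whl"
       "#sha256=" + "0123456789abcdef" * 4)

fragment = urlparse(url).fragment         # "sha256=0123..."
algo, _, digest = fragment.partition("=")
print(algo, len(digest))  # sha256 64
```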


  • @connormclaud noted that lock seemed peculiarly sensitive to network conditions, before the problem disappeared on its own.
  • @Pithikos complained of pipenv lock hanging when installing scipy.
  • @jackiekazil was surprised that the problem later seemed to disappear on its own.

#2681 was closed by @techalchemy with a comment suggesting the delay is due to lengthy build times (which don't affect pipenv lock), and asking users to provide steps to reproduce.

All the packages fetched have wheels.


pipenv lock calls Resolver.resolve(), which enables all artifacts:

Finds acceptable hashes for all of the given InstallRequirements.

```python
with self.repository.allow_all_wheels():
    return {ireq: self.repository.get_hashes(ireq) for ireq in ireqs}
```

For a common setup consisting of scipy, pandas and numpy, here's the list of artifacts:

Artifacts Queued for Hash Retrieval

That's a lot of sequential network round-trips to wait through, and most of these artifacts are for platforms other than the one being installed to. But that's not all.

Each artifact is passed to HashCache.get_hash.

```python
def get_hash(self, location):
    # if there is no location hash (i.e., md5 / sha256 / etc) we don't want to store it
    hash_value = None
    vcs = VcsSupport()
    orig_scheme = location.scheme
    new_location = copy.deepcopy(location)
    if orig_scheme in vcs.all_schemes:
        new_location.url = new_location.url.split("+", 1)[-1]
    can_hash = new_location.hash
    if can_hash:
        # hash url WITH fragment
        hash_value = self.get(new_location.url)
```

The PyPI artifact URLs include the sha256 hash, which is parsed into new_location.hash, but it isn't used. If the HashCache doesn't already hold the location key, the artifact is downloaded, hashed, and the hash is then stored. This is the root cause of the delay, and in the above examples it results in downloading 893MB of artifacts, compared to the 50MB worth that actually get installed.

The problem disappears if the user is patient enough to wait it out, or if the connection is fast. However, it seems most users are surprised by the delay (me included), and expect it to be more or less instantaneous once the packages are installed.

As a quick verification, I patched the get_hash method to use the hash fragment from the url if available, and the pipenv lock run time dropped to a few seconds.
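For the curious, the patch was along these lines. This is a hedged, self-contained sketch rather than the actual diff, with `Link` here being a minimal stand-in for pip's link object:

```python
import hashlib

class Link:
    """Minimal stand-in for pip's Link object (illustrative only)."""
    def __init__(self, url):
        self.url = url
        _, _, frag = url.partition("#")
        self.hash_name, _, self.hash = frag.partition("=")

def get_hash(link, fetch=None):
    # If the index already supplied a digest in the URL fragment, trust it
    # instead of downloading the whole artifact just to re-hash it.
    if link.hash_name == "sha256" and link.hash:
        return "sha256:" + link.hash
    # Fallback: the slow path pipenv currently always takes.
    return "sha256:" + hashlib.sha256(fetch(link.url)).hexdigest()

link = Link("https://example.invalid/pkg-1.0.whl#sha256=" + "ab" * 32)
print(get_hash(link))  # "sha256:" plus the digest, with no network access
```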

Side note: --verbose does not log network requests, so all of this was hidden from view even when trying to debug.

@pilkibun pilkibun changed the title pipenv lock downloads every possible artifact for every pyver/arch of all packages pipenv lock hangs. It really does. Jul 7, 2019



commented Jul 9, 2019

This seems like a good solution; I'm +1 on it. Thanks for your efforts. Would you mind sending a PR with the fix?



commented Jul 9, 2019

I don't think I have ever felt so validated by a comment I left on github in hopes that one day in future someone would stumble upon it. (Turned out to be the near future.) <3 Thank you for this.

@pilkibun pilkibun referenced this issue Jul 9, 2019


commented Jul 9, 2019

> #2681 was closed by @techalchemy with a comment suggesting the delay is due to lengthy build times (which don't affect pipenv lock)

Build times absolutely affect pipenv lock; I am surprised that, having investigated this, you would conclude they don't. pipenv lock resolves dependencies. In Python, for anything that is not a wheel, that requires acquiring the artifact and either building it or parsing the AST of the file in question. There is simply no way around that. On slow internet connections that may not be the limiting factor, and with people building and releasing wheels more, it is becoming less of an issue, but it absolutely happens during locking and remains a significant contributor to the time it takes. At this point even pip builds artifacts in order to resolve dependencies simply for installation.

Dependency resolution is an NP hard problem, there is no hack or easy trick around this, and in python it is also a problem that sometimes requires building artifacts. If you have identified and can avoid extra downloads that is excellent, but I do want to be clear: building is often a part of locking.
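To illustrate the wheel side of that point: a wheel carries static METADATA inside the zip, so its dependencies can be read without running any build step (unlike an sdist, whose setup.py may have to execute). A self-contained sketch using an in-memory stand-in wheel:

```python
import io
import zipfile

# Build an in-memory stand-in for a wheel containing only its METADATA file
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as whl:
    whl.writestr(
        "example-1.0.dist-info/METADATA",
        "Metadata-Version: 2.1\nName: example\nVersion: 1.0\n"
        "Requires-Dist: numpy (>=1.16)\nRequires-Dist: pandas\n",
    )

# Reading dependencies is just text parsing, no build needed
with zipfile.ZipFile(buf) as whl:
    meta = whl.read("example-1.0.dist-info/METADATA").decode()

deps = [line.split(": ", 1)[1] for line in meta.splitlines()
        if line.startswith("Requires-Dist:")]
print(deps)  # ['numpy (>=1.16)', 'pandas']
```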

@jackiekazil apologies if you felt invalidated by the previous responses; we are all definitely aware that locking is slow. As I mentioned in the other threads on the topic, I agree that there are likely multiple downloads occurring, but I am not precisely sure where, and I would need to see debugging info to make any progress. If the accompanying PR here addresses that, that's awesome.



commented Jul 10, 2019

I'm going back to pip. This issue has been around for so long, and it is the main blocker for new users. It makes the whole project seem a little unprofessional to me.



commented Jul 24, 2019

I have also gone back to pip. Every time an issue is opened, it is ignored. Pipenv, fundamentally, is unusable; lock fails for me even with 10 relatively small packages. It's a shame, because this was a great idea, only to be ruined by the locking mechanism.



commented Aug 6, 2019

My very first installation with pipenv got stuck at a never-ending "Locking" step, even though it was just a small package and had already installed successfully.

Installation Succeeded
Pipfile.lock not found, creating…
Locking [dev-packages] dependencies…
Locking [packages] dependencies…
[====] Locking...

1 hour later, it was still there... annoying.



commented Aug 14, 2019

(posting this in the hopes that some reports of hanging are due to the same issue I encountered)
I have been seeing issues with a variety of different pipenv commands, but most recently with 'pipenv lock' in a project that all my colleagues have had no issues with. This project had very few dependencies and I don't believe any of the 'large packages' (mentioned by others) were involved. On my laptop, I was able to consistently repro the 'hang' (actually it would timeout after 30 minutes) when running 'pipenv lock' while my colleagues reported success within a few seconds.

After multiple attempts to debug this, I kept seeing the same package causing problems (in my case it was configparser, but I don't believe that's particularly relevant). I managed to replicate how pipenv runs and recreated that environment in order to run it under pdb. Under those conditions, I found that it would get stuck in a sleep loop, buried in a call stack that looked like it was trying to do some lockfile operations. When I looked at the directory it was trying to create in order to acquire the lock, it became clear what the problem was. The directory already existed and, in fact, had been on my filesystem for months (the length of time I have been having issues).

So in short, once I found:

~/Library/Caches/pipenv/http/f/5/9/3/d $ ll
total 8
17980163 drwx------  4 michio.nikaido  staff   136B May 28 11:59 ./
17980814 -rw-------  1 michio.nikaido  staff   3.6K May 28 11:59 f593d3690af2ee7367984e87eb3821fa2a514d37d808c806b4c7719a
17980812 drwxr-xr-x  2 michio.nikaido  staff    68B May 28 11:59 f593d3690af2ee7367984e87eb3821fa2a514d37d808c806b4c7719a.lock/
17980162 drwx------  3 michio.nikaido  staff   102B May 28 11:59 ../

I moved that subtree to another location and was then able to successfully run pipenv lock again.

So for Mac OS, something like 'find ~/Library/Caches/pipenv/http | grep lock' might help highlight issues of this sort.
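A sketch of that cleanup, demonstrated here on a throwaway scratch directory that mimics the cache layout (substitute your real cache path, and inspect the listing before deleting anything on a live machine):

```shell
# Build a throwaway directory shaped like pipenv's HTTP cache
CACHE="$(mktemp -d)/pipenv/http"
mkdir -p "$CACHE/f/5/9/3/d/deadbeef.lock"

# List stale lock directories first
find "$CACHE" -type d -name '*.lock'

# Then remove them (-prune stops find from descending into them)
find "$CACHE" -type d -name '*.lock' -prune -exec rm -rf {} +
find "$CACHE" -type d -name '*.lock' | wc -l   # now 0
```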



commented Aug 21, 2019

Well, I really cannot understand this.
@allPipenvMembers Hey, everyone, this is a blocking issue, not a trivial one. It DO DOES DID DONE IS a blocking issue, one that really prevents even fanatics from using pipenv.

A long time ago, PyPI didn't provide package metadata, so there was no choice but to download every package to calculate its hash. But it is 2019 now, and PyPI provides what you need.
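For instance, PyPI's JSON API (https://pypi.org/pypi/&lt;name&gt;/&lt;version&gt;/json) reports each file's digests directly. A sketch of parsing that response shape, shown offline with made-up sample data:

```python
import json

# Made-up sample mirroring the shape of PyPI's JSON API response
sample = json.loads("""
{"urls": [{"filename": "scipy-1.3.0-cp37-cp37m-manylinux1_x86_64.whl",
           "digests": {"sha256": "abc123"}}]}
""")

# Each file entry already carries its sha256; no artifact download required
for f in sample["urls"]:
    print(f["filename"], f["digests"]["sha256"])
```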

In the past, I recommended this tool to others. I said, "Oh, this is the next generation of pip.", "It's perfect: it not only manages virtual envs for you but also locks your dependencies.", "Just use it." ...
But now I really cannot open my mouth again.



commented Aug 22, 2019

Thanks @michio-nikaido
I confirm deleting the cache folders solved the issue for me.



commented Sep 6, 2019

Had the same problem: no error, --verbose does nothing, really frustrating.
Fixed by putting the following into my make clean:

```shell
rm -rf ~/Library/Caches/pipenv/http/**/*.lock  # fix weird pipenv bug
```

Check the correct path for your OS with:

```shell
python3 -c 'import appdirs; print(appdirs.user_cache_dir("pipenv"))'
```

Note that the ** needs globstar:

```shell
shopt -s globstar  # allow **/*.lock patterns to work
```
