Deadlock on cache mounts #1322

Closed
cpuguy83 opened this issue Jan 10, 2020 · 2 comments · Fixed by #1355
@cpuguy83 (Member)

I am experiencing deadlocks related to preparing cache mounts.

Here is a stack trace from one of my build agents:
https://gist.github.com/cpuguy83/0e79cd121c780df71eb1cbbcbcc8f1e1

This has wedged most of my build agents (when they try to look up a particular cache mount) and seems to happen rather easily.

I have set up my builds to generate stack dumps whenever the build is cancelled (e.g. due to a timeout), so I can collect more of these fairly easily.
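
A simplified sketch of this kind of setup (not the exact code the agents run), using the standard runtime/pprof goroutine profile to dump all stacks once the build context is cancelled:

```go
// Simplified sketch: dump every goroutine's stack when the build
// context is cancelled (e.g. by a timeout). The helper name and the
// timeout value are made up for illustration.
package main

import (
	"context"
	"os"
	"runtime/pprof"
	"time"
)

// dumpGoroutinesOnCancel writes full stack traces for all goroutines
// (debug=2) to stderr once ctx is cancelled.
func dumpGoroutinesOnCancel(ctx context.Context) {
	go func() {
		<-ctx.Done()
		pprof.Lookup("goroutine").WriteTo(os.Stderr, 2)
	}()
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Minute)
	defer cancel()

	dumpGoroutinesOnCancel(ctx)

	// ... run the build with ctx; on timeout/cancel the stacks are dumped ...
}
```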

@cpuguy83 (Member, Author)

Comments from @tonistiigi on Slack:

Seems like a loop in https://gist.github.com/cpuguy83/0e79cd121c780df71eb1cbbcbcc8f1e1#file-dockerd-goroutines-txt-L2206-L2210

There’s a double clone going on where cacheRefShare’s Mutable ref is another CacheRef. Not sure why it doesn’t happen more frequently, though. Not sure if we just need to handle that case in release, or whether we should do some refactoring to avoid this kind of case completely.
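
As a rough illustration of the shape of the problem (hypothetical code, not buildkit’s actual cacheRefShare implementation): if the “parent” mutable ref resolves back to the same shared object, release ends up re-acquiring a non-reentrant mutex it already holds:

```go
// Hypothetical illustration only — not buildkit's actual code. It shows
// the shape of the deadlock: a ref whose parent resolves back to the
// same object re-locks its own non-reentrant mutex during release.
package main

import (
	"fmt"
	"sync"
)

type ref struct {
	mu     sync.Mutex
	name   string
	parent *ref // normally a distinct mutable ref
}

func (r *ref) release() {
	r.mu.Lock()
	defer r.mu.Unlock()
	fmt.Println("releasing", r.name)
	if r.parent != nil {
		// Fine when parent is a different ref; fatal when the double
		// clone makes parent point back to r itself.
		r.parent.release()
	}
}

func main() {
	shared := &ref{name: "shared-cache-ref"}
	shared.parent = shared // the cycle created by the double clone
	shared.release()       // blocks on its own mutex; the Go runtime
	// reports "all goroutines are asleep - deadlock!"
}
```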

@cpuguy83 (Member, Author)

I know where my concurrency is coming from.
I end up building several Go binaries (all the moby things) simultaneously as part of the build, because of stage parallelization.
