How to deal with the case that Docker data file size reaches the 100G threshold? #21611
Can someone do me a favor? Thanks a lot.
You need to increase the pool size allowed for your containers. To do this, you will need to remove /var/lib/docker, which will destroy all your containers and images.
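As a rough sketch of the steps above (paths, sizes, and the service-manager commands are examples and depend on your distribution; with the devicemapper driver in loopback mode, `dm.loopdatasize` controls the size of the data file, which defaults to 100 GB):

```shell
# WARNING: destroys ALL containers and images on this host.
sudo systemctl stop docker          # or: sudo service docker stop
sudo rm -rf /var/lib/docker         # irreversible!

# Restart the daemon with a larger loopback data file (example: 200 GB
# instead of the 100 GB default).
sudo dockerd --storage-driver=devicemapper \
             --storage-opt dm.loopdatasize=200G
```

Note that loopback devicemapper is intended for testing; for production, direct-lvm on a real block device is the recommended setup.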
Does this solve your issue? Also, try stopping the Docker processes, starting Docker again, and seeing if you can remove the images.
Devicemapper running out of space can be very tricky indeed; we're tracking issues around this in #20272.
For Docker 1.11, there's a new option to specify a minimum amount of free space to keep, which would prevent you from getting into this situation; see #20786
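For reference, a minimal sketch of that option (the `dm.min_free_space` storage option, assuming the devicemapper driver; the 10% value shown is an example watermark):

```shell
# Reserve a minimum free-space watermark in the devicemapper pool so
# that new image/container creation fails cleanly before the pool is
# fully exhausted (available from Docker 1.11 onwards).
sudo dockerd --storage-driver=devicemapper \
             --storage-opt dm.min_free_space=10%
```

With this set, operations that would push free space in the thin pool below the watermark are rejected with an error instead of filling the pool completely.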
I'm going to close this issue because I think it's a support question rather than a bug, but feel free to continue the discussion here.