Docker.raw reserving too much size #2297

Closed
aitrusgit opened this issue Dec 7, 2017 · 93 comments

@aitrusgit commented Dec 7, 2017

Expected behavior

Docker.raw should not grow much larger than the space used by Docker containers, images, and volumes.

Actual behavior

I have very few images downloaded locally, with no active containers and just two volumes with some data.

➜  com.docker.driver.amd64-linux docker system df
TYPE                TOTAL               ACTIVE              SIZE                RECLAIMABLE
Images              19                  0                   3.628GB             3.628GB (100%)
Containers          0                   0                   0B                  0B
Local Volumes       2                   0                   803.5MB             803.5MB (100%)
Build Cache                                                 0B                  0B

Nevertheless, the Docker.raw size is huge!

➜  com.docker.driver.amd64-linux ls -l Docker.raw
-rw-r--r--@ 1 cf  staff  68719476736 Dec  7 10:35 Docker.raw

Information

Diagnose info:

Docker for Mac: version: 17.11.0-ce-mac40 (b2149617ed8d7263c9c035e25fe71d147169348c)
macOS: version 10.13.1 (build: 17B1003)
logs: /tmp/F397F806-6CA7-40E3-9C36-79B5F0ED519D/20171207-103808.tar.gz
[OK]     db.git
[OK]     vmnetd
[OK]     dns
[OK]     driver.amd64-linux
[OK]     virtualization VT-X
[OK]     app
[OK]     moby
[OK]     system
[OK]     moby-syslog
[OK]     kubernetes
[OK]     db
[OK]     env
[OK]     virtualization kern.hv_support
[OK]     slirp
[OK]     osxfs
[OK]     moby-console
[OK]     logs
[OK]     docker-cli
[OK]     menubar
[OK]     disk

@aitrusgit (Author) commented Dec 7, 2017

I also tried to do a full system prune

➜  com.docker.driver.amd64-linux docker volume prune
WARNING! This will remove all volumes not used by at least one container.
Are you sure you want to continue? [y/N] y
Deleted Volumes:
033ce3294aa395bbb8c468eba823622d65b5b1d7f042c4a7ebf29826aed6ea40
2901af7dd8d50ae74ad761fe33cc558881d46ab323e7f6ee6eb87c3337ed8508

Total reclaimed space: 803.5MB
➜  com.docker.driver.amd64-linux docker system prune -a
WARNING! This will remove:
        - all stopped containers
        - all networks not used by at least one container
        - all images without at least one container associated to them
        - all build cache
Are you sure you want to continue? [y/N] y
Deleted Images:
untagged: postgres:9.5.4
untagged: postgres@sha256:1480f2446dabb1116fafa426ac530d2404277873a84dc4a4d0d9d4b37a5601e8
deleted: sha256:2417ea518abc0db32cf2fb0d021ce57dab2e16f480ac924000e52afd07d3b0a4
...
deleted: sha256:b51149973e6a6c4fb1091ef34ff70002ee307e971b9982075cf226004a93c9b7

Total reclaimed space: 3.628GB

Nevertheless, the Docker.raw size didn't shrink, even after restarting the Docker engine.

@aitrusgit (Author) commented Dec 7, 2017

I even tried removing the Docker.raw file manually; after restarting Docker, it gets recreated with the same size:

➜  com.docker.driver.amd64-linux ls -l Docker.raw
-rw-r--r--@ 1 cf  staff  68719476736 Dec  7 10:44 Docker.raw

Is that file size independent of my volumes, images, and containers? 64 GB is quite a big chunk of data for a small SSD.

@djs55 (Contributor) commented Dec 7, 2017

ls -l shows the "size" of the file, not the number of sectors allocated by the filesystem. Try ls -sl:

$ cd ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux
$ ls -sl Docker.raw 
16248952 -rw-r--r--@ 1 user  staff  68719476736 Dec  7 13:57 Docker.raw

In my case Docker.raw is only using 16248952 filesystem sectors which is much less than the maximum file size of 68719476736 bytes. If I download some images then the 16248952 increases, and if I docker system prune then it decreases again.

I think this is working as expected, so I'll close the ticket, but nevertheless it is a little confusing. Perhaps we could present the current space usage in the UI somehow; I'll escalate this suggestion to our UI team.

Thanks for the report and for using Docker for Mac!

@djs55 closed this Dec 7, 2017

@codesman commented Jan 15, 2018

Just updated to 17.12.0-ce-mac47 on 10.13.2 on a 13" MBA, reset the preferences and deleted all the things.
Preferences say the virtual disk image size is 64GB (3.5GB allocated).
Previously, my .qcow2 file was ~6.5GB.
Now ls -sl Docker.raw reads 6840456 -rw-r--r--@ 1 user staff 68719476736 Jan 14 11:03 Docker.raw, i.e. 64GB. When I use GrandPerspective, it also reports that the .raw file is 64GB in size, whereas it used to report that the .qcow2 was 6.5GB, and that one had some images in it, not a fresh install. Looks like there might still be a problem here.

@brown131 commented Jan 15, 2018

I'm on the same version as codesman, and I too am seeing the same problem. My Docker.raw file is 64G even after I cleared all images in my environment.

@thaJeztah (Member) commented Jan 16, 2018

The 64GB you see is the "logical" size of the image, not the physical size; see Docker.raw consumes an insane amount of disk space! in the documentation:

Docker uses the raw format on Macs running the Apple Filesystem (APFS). APFS supports sparse files, which compress long runs of zeroes representing unused space. The output of ls is misleading, because it lists the logical size of the file rather than its physical size. To see the physical size, add the -ks switch; to see the logical size in human readable form, add -lh:
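Put concretely, a quick way to compare the two numbers is something like this (a sketch, using the default disk image path quoted earlier in this thread; adjust it for your install):

cd ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux
ls -ks Docker.raw    # first column: physical size, in 1024-byte blocks
ls -lh Docker.raw    # size column: the logical size that Finder and friends report
du -h Docker.raw     # another view of the physical (allocated) size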

@tjsousa commented Feb 6, 2018

Adding to this information, I believe the issue is that most tools in the Apple ecosystem still rely on information from ls to report on available disk space.

In particular, Apple's own Software Update tool will refuse to install upgrades without reclaiming that space, and removing Docker.raw effectively unblocked the situation for me (by "freeing" 16GB, my predefined size for the Docker image).

@stepmuel commented Feb 14, 2018

I was using rsync to do a simple backup of my home directory when I discovered the humongous size of Docker.raw. I was able to solve my problem by moving the file to /scratch under Preferences > Disk. It also had an option to switch the size from 64GB to 32GB, which is better, but still very large. I'd rather not back up GBs of zeros, or else lose data I actually do want.

Since the file doesn't actually use disk space until it is "filled", I guess this is fine. But additional options for smaller file sizes might still be beneficial in certain situations. All in all, I even doubt that the benefit of preallocating all that space is worth the weird side effects and headaches many users will experience because of it. Keep in mind that many not-very-technical users are instructed to install Docker for a variety of reasons.

@thaJeztah (Member) commented Feb 14, 2018

@stepmuel it’s a sparse file, and the usual approach for VM disk images.

This article may be of interest to you: https://gergap.wordpress.com/2013/08/10/rsync-and-sparse-files/amp/
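For anyone scripting their own backups, rsync can be told to preserve sparseness with -S (a minimal sketch; the destination path is illustrative):

# -a: archive mode; -S: handle sparse files efficiently instead of expanding the zero runs
rsync -aS ~/Library/Containers/com.docker.docker/Data/ /Volumes/Backup/docker-data/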

@matschaffer commented May 16, 2018

I suspect this also causes Finder's "available" number to do strange things. Today I seemed to gain available space by copying data onto my laptop. 😕

@YRM64 commented May 19, 2018

My comments are in line with the statements posted by djs55; I'd add that it's unreasonable to think Docker.raw consumes, or reserves, too much disk space. Docker uses the raw format on Macs running the Apple File System (APFS), and APFS supports sparse files, which "compress long runs of zeroes representing unused space" (source: docs.docker.com).

Another option in determining actual disk usage, and complementing djs55:

$ du -h Docker.raw
2,2G Docker.raw

@sudo-bmitch commented Jul 31, 2018

After you delete a large file from inside the Linux VM, will the space be reclaimed from the Docker.raw file? From this stackoverflow question I'm suspecting the answer is no. My suspicion is that Linux does its normal filesystem delete, where it just unlinks the inode reference, but the underlying sectors on the disk are not zeroed out. And without those filesystem bits being zeroed out, APFS will never reclaim the sparse bits of the file. Is there a better option to reclaim that disk space other than wiping the Docker.raw file and starting fresh?
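One way to test this directly (a sketch built from the VM tty path and the fstrim behaviour that come up later in this thread; not verified on every release):

screen ~/Library/Containers/com.docker.docker/Data/vms/0/tty
# then, inside the VM, ask the filesystem to discard unused blocks by hand
# (assumes the data disk is mounted at /var/lib/docker, as in the restore steps further down):
fstrim /var/lib/docker
# detach from screen with Ctrl-A then D, and compare du -h Docker.raw on the host before and after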

@jonnyijapan commented Sep 19, 2018

Still eating up way too much space, believe it or not...

@shannon-fluellen-aurea commented Sep 26, 2018

While I understand that the space isn't technically being used, it's still very annoying to have my Mac think I'm running out of space because it sees docker.raw eating up a quarter of my SSD. It causes low space notifications and, as mentioned by @tjsousa, will prevent things like software updates if the system thinks there isn't enough space b/c it views docker.raw as using what ls shows it using. I feel like there must be a way to solve this on Docker's end...

@kamil-jakubowski commented Sep 28, 2018

Same problem here.

$ docker system df
TYPE                TOTAL               ACTIVE              SIZE                RECLAIMABLE
Images              10                  6                   1.673GB             1.233GB (73%)
Containers          8                   0                   4.647GB             4.647GB (100%)
Local Volumes       5                   5                   3.529GB             0B (0%)
Build Cache         0                   0                   0B                  0B
$ ls -klsh Docker.raw
37883956 -rw-r--r--@ 1 Ciastek  staff    45G 28 wrz 15:01 Docker.raw

Docker consumes 36GB.
It does not react to deleting volumes, containers, or files inside containers.
Restarting Docker or the system does not help.
The only thing that works is resizing Docker's available disk space.

@ivanlara commented Oct 3, 2018

Same issue. My 128GB Mac keeps telling me that I'm running out of space every 2 minutes or so because of this. How do I take care of this 64GB Docker.raw space illusion?

@aviris commented Oct 3, 2018

I understand how this is supposed to work, but the bottom line is that both ls -sl and du -h docker.raw return that my docker.raw file is 64GB. Doesn't matter how many or few containers I have, as far as macOS 10.13 or 10.14 are concerned, my docker.raw file is 64GB.

Moving my docker.raw file to an external drive freed up a total of 64 GB on the internal drive.

@hktalent commented Oct 4, 2018

$ ls -lah Docker.raw
-rw-r--r--  1 xxx  staff    60G Oct  4 16:15 Docker.raw

@jackfruhecolab commented Oct 4, 2018

I have the same issue as everyone else.

Also, I have an actual Apple sparse image that I use for something else; that image always shows the actual used size and not the provisioned size, so it would seem that Docker is not using an actual Apple sparse image...

@shaneHowearth commented Oct 4, 2018

I am experiencing the same issue, and it is causing real problems.

$ uname -rms
Darwin 17.7.0 x86_64
$ docker --version
Docker version 18.06.1-ce, build e68fc7a

When I try to run my postgres container, I am told that I don't have enough disk space on the device. There's ~35 GB available on the disk, so it must be referring to the container's disk space.

When I resize the amount of space available via Docker preferences to 80GB (62 GB on disk), everything is happy again, but none of my containers are close to using up the previously set disk allocation.

This has only really started being an issue in the last 4-5 days.
s@work ~/go/src/project (develop) $ docker images
REPOSITORY TAG IMAGE ID CREATED SIZE
postgres 10.4-alpine 962ed899c609 2 months ago 72.9MB
docker.elastic.co/elasticsearch/elasticsearch 6.0.1 28259852697e 10 months ago 508MB
openlabs/docker-wkhtmltopdf-aas latest c61ee94c7fcb 3 years ago 747MB
s@work ~/go/src/project (develop) $ docker ps -a
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
72918cb8860c openlabs/docker-wkhtmltopdf-aas "usr/local/bin/gunic…" 24 minutes ago Up 24 minutes 0.0.0.0:8800->80/tcp tap-wkhtmltopdf-aas
b9b5789999b5 docker.elastic.co/elasticsearch/elasticsearch:6.0.1 "/usr/local/bin/dock…" 25 minutes ago Up 25 minutes 0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp tap-elasticsearch
7366131f37b3 postgres:10.4-alpine "docker-entrypoint.s…" 25 minutes ago Up 25 minutes 0.0.0.0:5432->5432/tcp tap-database

@fabiosussetto commented Oct 9, 2018

I don't understand why this issue has been closed.
This is clearly a big problem, as Docker for Mac is eating up all disk space. The fact that the space may be only "virtually" used doesn't change a thing from the user's point of view.

In the FAQs (https://docs.docker.com/docker-for-mac/faqs/#disk-usage), it says:

"This is an illusion. [...] The output of ls is misleading, because it lists the logical size of the file rather than its physical size."

It may well be an illusion, but I can't install software updates and macOS is constantly complaining about running out of space.

Kindly reconsider opening this issue, as it's a very, very annoying thing that IMO should not happen in stable software. I understand it may not be easy to fix, but this is Docker for Mac at the end of the day; saying that it's kind of macOS's fault is a bit of a stretch, at best.

Thanks.

Edit: ping @djs55
Would you please consider at least reopening this issue?
Regarding "I think this is working as expected": people here have their MacBooks rendered unusable because of this disk space issue.

@mgrandi commented Oct 9, 2018

Same thing as @aviris: I just did a git clone + docker compose on https://github.com/Runscope/requestbin#readme, and it installed like 2 images, which aren't that big:

[2018-10-09 16:19:39] markgrandi@Gypaetus:~$ docker system df
TYPE                TOTAL               ACTIVE              SIZE                RECLAIMABLE
Images              3                   2                   323.4MB             58.79MB (18%)
Containers          2                   0                   96.4kB              96.4kB (100%)
Local Volumes       2                   1                   1.238kB             0B (0%)
Build Cache         0                   0                   0B                  0B

but then the file is huge, according to ls -lah and GrandPerspective

[2018-10-09 16:20:46] markgrandi@Gypaetus:~$ ls -lah /Users/markgrandi/Library/Containers/com.docker.docker/Data/vms/0/Docker.raw
-rw-r--r--  1 markgrandi  staff    60G Oct  9 16:20 /Users/markgrandi/Library/Containers/com.docker.docker/Data/vms/0/Docker.raw

Even if it's not actually using up that much space (because it's an APFS sparse file), it's still causing the "low disk" notification popup every 5 minutes, which is highly annoying.

@jjharsch commented Oct 11, 2018

Has anyone found a solution to this? Is this a non-issue on Docker Toolbox?

As it currently stands, Docker for Mac is making my 128GB SSD MacBook unusable.

@fabiosussetto commented Oct 11, 2018

@jjharsch
warning: you are going to lose all your containers and images if you do the following (no way around this, sigh!)

Go to Preferences > Disk and reduce the disk image size allocation. I tried yesterday and it did reduce the .raw file size on my disk

@daliborfilus commented Oct 14, 2018

@aviris You have the docker.raw image on an external disk? Is that disk APFS? This sparse file only works on APFS.

The same applies if you have an HFS(+) volume.

I now have a similar issue. I used Migration Assistant to copy all my files from one Mac to another, and Migration Assistant copied the WHOLE 64GB docker.raw, ignoring that it's a sparse file.

So now I have 64GB allocated (according to du, and the Docker for Mac GUI shows "64GB (64.0GB on disk)"), and I don't know how to re-enable its sparse nature without deleting it.

@leehinde commented Oct 18, 2018

It'll eat up your Time Machine backup quickly as well. I'm in the process of moving it to an external drive / higher-level folder to make blocking it in Time Machine easier. The system is sure acting like it's moving 64GB. :-)

@thaJeztah (Member) commented Oct 18, 2018

@leehinde you should be able to exclude it from Time Machine backups in the configuration panel (de-select the "Include VM in Time Machine Backups" checkbox)
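For those who prefer the command line, the same exclusion can probably be added with tmutil (a sketch, assuming the disk image lives at the vms/0 path mentioned elsewhere in this thread):

tmutil addexclusion ~/Library/Containers/com.docker.docker/Data/vms/0/Docker.raw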

@jjharsch commented Oct 18, 2018

Has anyone figured out how to make the max disk image size less than 16GB?

The macOS "Your disk is almost full" warning is still being triggered at ~12GB free, despite only 4.7GB being used. This is a problem, as it clears caches and indexes, including the Spotlight index.

@onacit commented Aug 14, 2019

The illusion keeps causing side effects.
The OS itself keeps complaining about running out of storage, and some applications, like Outlook, go offline.

@stepmuel commented Aug 14, 2019

Native sparse files are an exotic new feature on macOS. Using them to that extent is confusing and makes system administration harder. Since Docker used to work before APFS, there must be a way to do it the old way. If there are good reasons that justify all the hassle, please let us know, so we can at least have an informed discussion!

As far as I see it, APFS sparse files are mostly intended to speed up the creation of large files. Once a section is written to, its space will never be freed again. It is not some sort of "zero compression" as the documentation would have us think. Also, it makes sense for software to assume that a file with a size of 64 GB actually intends to use that space at some point. Your design implies otherwise, which is controversial for those who understand, and confusing for those who don't.

Docker is used by thousands of people who just want their Mac to work. Please don't make all of us have to deal with that weirdness. Just because a file system feature exists doesn't mean you should use it.

@djs55 (Contributor) commented Aug 19, 2019

Once a section is written to, its space will never be freed again.

In fact, when images are deleted inside the Linux VM, we trigger an fstrim, which calls fcntl(F_PUNCHHOLE) on macOS and causes the blocks to be freed again on the host. The actual space usage of the file should therefore go up and down, just as it used to with the Docker.qcow2. (If this isn't happening in practice then there must be a bug.)

The main advantages of the sparse file are:

  1. container I/O is much faster
  2. reclaiming space is instant and reliable (previously we would mark the Docker.qcow2 blocks as free and run a concurrent GC to move blocks and shrink the file. If the VM keeps allocating, the collector would take a long time to shrink the file. The file would often get very big, exceeding the maximum configured limits)

The Docker.qcow2 code still exists and is used when the disk image is written to a non-APFS filesystem. So one way to force the use of qcow2 is to write the disk to an external drive formatted with some other filesystem.
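A quick way to check whether the hole-punching is working on a given install (a sketch, using the vms/0 path that appears elsewhere in this thread):

cd ~/Library/Containers/com.docker.docker/Data/vms/0
du -sh Docker.raw          # physical size before
docker system prune -a     # delete unused images; this should trigger the fstrim described above
du -sh Docker.raw          # physical size should drop again shortly afterwards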

@dmeinrath commented Dec 25, 2019

Ran into this last night; is the workaround for this issue just to resize the 'disk image' in Preferences?

@WingsLikeEagles commented Jan 13, 2020

How about putting an option in the Disk screen to force the use of qcow2?

@yb66 commented Jan 23, 2020

Just a note: resizing the disk image in Preferences does more than remove containers and images (which wouldn't be so painful); it also removes anonymous and named volumes. No warning, either.

Nice work, Docker team👌 Lucky I have backups…

@sdudley commented Jan 23, 2020

If the existing .raw file will not shrink for whatever reason, one brute-force solution is to take advantage of the fact that Docker for Mac is really just running a little Linux VM.

This means you can make a complete backup of the VM's /var/lib/docker contents (which includes images, containers, tags, everything!) and then restore that into a new Docker.raw file. If you prune your images before running this procedure, the .raw should end up as small as possible.

YMMV: I last followed these instructions in April 2019 and I haven't tried it with the most recent Docker version. Be careful and make backups of everything. I also can't give any assurances that this won't accidentally break something, but I did a full restore like this and have been using it successfully for months with no issues.

Step 1: Make a backup of your Docker.raw image.

Step 2: Run the following on your Mac to open a listening socket to receive data from the Docker VM. (If you don't have netcat, brew install netcat. This method also uses pv to show progress, which is optional. You can brew install pv to use it, or else remove | pv from the commands.) Make sure you have plenty of space in /tmp (or wherever you redirect the output) to store the contents of all your Docker stuff.

nc -l 8888 | pv >/tmp/var-lib-docker.tar

Step 3: In a separate Terminal window, access the Docker VM and send all of your Docker data to the Mac host, replacing <host_ip> with the IP address of your Mac. (You must use the actual LAN IP of your Mac here, since trying to use 127.0.0.1 from within the Docker VM would refer to the Docker VM itself.)

screen ~/Library/Containers/com.docker.docker/Data/vms/0/tty
service docker stop
cd /var/lib/docker
tar cf - . | nc <host_ip> 8888

When it's done, press Control-A then D to exit screen.

Step 4: Move your Docker.raw file somewhere else, then restart Docker to force it to create a new .raw file.

Step 5: In Terminal on the host, run this to get the host ready to pipe all of the Docker data back to a socket:

cat /tmp/var-lib-docker.tar |nc -w 5 -l 8888

Step 6: Access the Docker VM in a separate Terminal window to restore the data to /var/lib/docker.

screen ~/Library/Containers/com.docker.docker/Data/vms/0/tty
service docker stop
cd /var/lib/docker/
rm -rf *
nc -w 5 <host_ip> 8888 | tar xf -

Step 7: Once the above is done, quit and restart Docker again. All of your data should be there, with a nice and small disk image.

Notes:

I sometimes had problems with the output of screen being garbled. Instead of running screen again and specifying the path to the TTY, I found that invoking with just screen -dr worked.

Also, the above method uses netcat to send data back and forth to the host, which is reasonably fast, but it will still take some time.

There may be a way to optimize this procedure to copy data directly to/from the macOS filesystem, at least if you have file sharing set up in Docker. I have not tried to use this method for copying data, but I was able to validate that you are able to see host-mounted filesystems if you enter the correct namespace in the Docker VM.

If this actually works (I haven't tried), you might be able to use these principles to tar and untar directly to the host filesystem without needing netcat. If someone else wants to spend time investigating this, you can do the following, for example:

screen ~/Library/Containers/com.docker.docker/Data/vms/0/tty
service stop docker
ps ax | grep "containerd --config"
nsenter -t <insert_PID_of_containerd_process> -m sh
# Can now see osxfs-mounted directories! For example,
# if /Users is shared in Docker prefs, you should be able to
# run something like this:
ls -l /Users

# ... do whatever ...
 
# Exit back to main container
exit

@dmeinrath commented Jan 23, 2020

I just want to reiterate here that, regardless of whether or not the .raw file is actually using 64GB, when trying to install, say, Xcode from the Mac App Store, the OS (or whatever is checking disk space) considers the 64GB as unavailable and will not begin the installation. Considering the typical MacBook has a ~128-256GB drive, users are likely to assume that they're just out of space, or that "man, Xcode requires a LOT of space"; they're likely to start deleting other apps to make space they already have, rather than conclude there's a hidden file with a quantum file size whose settings they need to adjust. What I'm saying is that there are likely lots of people not represented on this issue who are affected by this and have no idea, and given that many of us now HAVE to use Docker at our jobs, setting the default size to 64GB on this platform does not seem like a user-friendly choice.

@sdudley commented Jan 23, 2020

A possibly-related issue is that if you use APFS on a machine that has Time Machine enabled, deleting a file with rm won't immediately cause that space to become visible to tools like df -h (and potentially Xcode). I believe this occurs because the Time Machine local snapshot still maintains a reference to the deleted file contents, at least until such time as it's pushed to your non-local TM storage volume and the snapshot is removed.

Or, more succinctly, deleting the old Docker.raw still may not show the space as being available to certain tools. You can work around this by running a command to delete the local TM snapshots:

sudo tmutil thinLocalSnapshots / 10000000000 4

If you have more than one volume, you'll need to run this command several times, replacing "/" with the volume mount point (i.e. "/Volumes/Foo").

If you run df before and after this command, especially if you've just deleted (say) a big Docker.raw file, you should be able to see the difference.
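To see whether local snapshots are what is holding the space, you can list them and compare df output before and after thinning (a sketch; both are standard tmutil/df invocations):

tmutil listlocalsnapshots /    # shows any com.apple.TimeMachine.* snapshots pinning deleted data
df -h /                        # available space before
sudo tmutil thinLocalSnapshots / 10000000000 4
df -h /                        # available space after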

@w-A-L-L-e commented Mar 11, 2020

IMHO VirtualBox gets this feature right. You can set a max disk size, but the actual virtual .vmdk only uses as much space as the data inside it (and the file is smaller when you do ls -l). They're doing something different from this implementation, and it's the better way. On the latest macOS 10.15.3 with Docker v2.2.0.3, my docker.raw is indeed still using 64GB while I only have 11GB worth of containers in it. It might be a 'virtual' 53GB that is lost, but macOS itself can't cope with it correctly (meaning for all practical purposes it does consume this much space). Indeed, du -h on the file shows only 11GB, but Finder and all the other disk-analysis tools see the file as being 64GB, so it is using 53GB of space that would otherwise be available for other things.

Please consider re-opening this in order to work on providing an alternative (possibly configurable) way of not having Docker take up 64GB ...

@sandstrom commented Mar 18, 2020

ping @mikeparker @ebriney Would you consider reopening this issue?

Clearly lots of people think this is an issue. At the very least, it's a signal that you should improve your documentation (or better yet, keep this issue open and search for a solution).

@filipre commented Mar 21, 2020

I wonder why my Docker instance was set up to use 64GB even though I never specified that value. Moreover, I don't think 64GB is a good default value. I realize that it does not take up "real" space, but the whole system thinks it does and becomes unusable (apps break, unstoppable notifications, ...). I reduced it to 16GB via the settings, but would prefer an even smaller value without breaking the UI. Or Docker could re-adjust the size by itself until it hits a maximum value; that would be much more convenient.

Please consider reopening this issue.

@damienburke commented Mar 21, 2020

@jjharsch You can set the max disk usage under 16GB by editing the diskSizeMiB field in ~/Library/Group\ Containers/group.com.docker/settings.json.
Watch out: this breaks the UI; the slider will no longer work.

Wow, that really helped me a lot. Finally I can move it to a 16GB SD card... :)

You can also achieve this "resize" via the Docker Desktop GUI. Thanks!
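For reference, that settings.json edit can be scripted (a hedged sketch; the 8192 MiB value is illustrative, and Docker Desktop should be quit first so it doesn't overwrite the file):

# shrink the VM disk to 8 GiB by rewriting the field named in the comment above
sed -i '' 's/"diskSizeMiB" *: *[0-9]*/"diskSizeMiB": 8192/' \
  ~/Library/Group\ Containers/group.com.docker/settings.json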

@partyspy commented Apr 9, 2020

@jjharsch
warning: you are going to lose all your containers and images if you do the following (no way around this, sigh!)

Go to Preferences > Disk and reduce the disk image size allocation. I tried yesterday and it did reduce the .raw file size on my disk

It doesn't help in my situation, though. I tried to downsize the image, but it prompted that there was not enough space left to do the downsizing (I didn't have much space left at that time, only a few GB), and it failed. And THEN it consumed the few remaining GB on the disk. The size of docker.raw then showed as 0, but no space was released. After that I deleted docker.raw and uninstalled Docker Desktop, but it seemed the so-called 64G was never released and reclaimed by the OS. I did some calculation, and it appeared around 50G of space was lost on my disk for good, without being counted by the OS.
Does anybody have any idea how to reclaim the lost space? 😂

@partyspy commented Apr 10, 2020

I tried to resize the image once, but it failed, saying there was not enough disk space left, or something like that. I didn't capture the screenshot. The Docker for Mac app terminated after that.

Then I checked the size with ls -klsh; Docker.raw showed 0 bytes, but the former 64G had not been released. Then I tried uninstalling Docker from the Preferences. It cleaned all the directories, but the 64G of space was still not reclaimed by the OS, even though the Docker.raw file was no longer there.

I have a 256G disk and have actually used less than 200G.

The system still warns that it is out of space. About 64G of space is missing.

@partyspy commented Apr 10, 2020

I did some digging, and it turns out it's because of Disk Drill, which I was using, and which moves deleted files into a hidden directory. I managed to scan them out with DaisyDisk, and docker.raw is in there (physically not 64G but actually 4.8G).

That .cleverfiles/hlink.ref/ directory consumed a huge amount of space, since it was backing up a lot of deleted files as well as non-deleted ones. Removing it finally released more than 50G on the disk. Thanks to the great DaisyDisk!

@xtfer commented Apr 14, 2020

This is the most egregious abuse of a users storage that I think I've seen.

@paulhennell commented May 6, 2020

This reminds me of the time I got falsely reprimanded for eating all the doughnuts in the office.

It was totally unfair; I didn't eat all the doughnuts, I barely ate half a doughnut! Shut that complaint right down.

Ok, so I did move 16 other doughnuts into my office to make it easier for me when I wanted more doughnuts, but I hadn't eaten them. Other people were welcome to eat them from my drawer whenever, the shortage was just an illusion, I don't know why everyone complained.

Didn't even need them in the end, so threw them out a week later to stock up the drawer with cake.

//Eaten or not, docker is hoarding doughnuts.

@maof97 commented May 18, 2020

So no one cares to fix this?

@limiao2008 commented May 29, 2020

Docker resource usage: docker system df
Cleanup commands:
docker system prune

docker volume prune

docker container prune

Delete local Docker images:
for i in $(docker image ls | awk -F' ' '{ print $3 }'); do docker image rm $i; done

@timothyjmtan commented Jun 2, 2020

If you're using Docker for Mac, you can change the reserved space via:

  1. Click on the Docker icon
  2. Preferences > Resources > ADVANCED > Disk Image Size
  3. I changed mine from 64GB to 16GB.
  4. Click 'Apply & Restart'

Check your Mac's storage again and, voilà, you will get some space back.

@paulhennell commented Jun 2, 2020

A nice option @timothyjmtan, but then you get this warning:

Resizing to a smaller size will delete the disk image; all Docker images, containers and volumes will be lost.

So it's less getting space back and more just starting again with a more sensible reservation.

For an option that basically sends you back to the start if you want to change it, why don't they ask at the start what it should be? 64GB is a crazy default to reserve 'just in case' without asking.

@fabb commented Jun 25, 2020

Same issue here, reopen please.

@joshwolff1 commented Jul 8, 2020

I got it down to 8GB in the Preferences of Docker Desktop

@bharatkashyap commented Jul 27, 2020

Hey guys, I'm facing a "file not found" when trying to run du -h Docker.raw, but the OS still says about 57GB is reserved by "Others". Does anyone have experience figuring this out?

@docker-desktop-robot (Collaborator) commented Aug 26, 2020

Closed issues are locked after 30 days of inactivity.
This helps our team focus on active issues.

If you have found a problem that seems similar to this, please open a new issue.

Send feedback to Docker Community Slack channels #docker-for-mac or #docker-for-windows.
/lifecycle locked

@docker locked and limited conversation to collaborators Aug 26, 2020