Backup effectively stops after 20-25 GB while trying to backup large files #3206
Comments
Update: I stopped and restarted Duplicati, and then I had to repair the database because I got an AggregateException. Still, it's not really increasing my confidence ... so if there are any ideas on why this happens, I'd be happy to hear them. Best regards
Update #2: It stopped again 7 hours ago at around ~47 GB, while processing a large file.
Also, I should mention that the /var/tmp directory (which I set explicitly via TMPDIR) grew to ~50 GB, which sounds like way too much IMHO.
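(Not from the original report; a minimal monitoring sketch, assuming the temp directory is /var/tmp as set via TMPDIR above.) Something like this could be left running alongside the backup to correlate temp-directory growth with the point where the transfer stalls:

```python
# Minimal sketch (not part of Duplicati): periodically report the total size
# of the temp directory so its growth can be matched against the stall time.
# TEMP_DIR is an assumption; point it at whatever TMPDIR is set to.
import os
import time

TEMP_DIR = "/var/tmp"

def dir_size_bytes(path):
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # files may disappear while Duplicati cleans up
    return total

while True:
    size_gib = dir_size_bytes(TEMP_DIR) / 1024**3
    print(f"{time.strftime('%H:%M:%S')}  {size_gib:.1f} GiB in {TEMP_DIR}")
    time.sleep(60)
```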
I'm having the same issue on Windows 7 x64, 2.0.3.6_canary_2018-04-23. The issue is caused by the system drive filling up, since Duplicati puts many files in the %LocalAppData%\Temp folder (the AggregateException I got said it was out of space, and I only had 100 MB of free space on drive C). I'm not sure why Duplicati does this, but since I have multiple VM images being backed up (total for VMs 201 GB, largest VM 48 GB, free space on C: 84.7 GB), I guess it doesn't clean up these files during the backup.
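(Again a hedged sketch, not anything from the original post.) A similar check on Windows could log the remaining free space on the drive holding the temp folder, to confirm that the AggregateException coincides with the drive running full; the fallback path below is an assumption for a typical setup:

```python
# Sketch: log free space on the drive that contains %LocalAppData%\Temp.
# The fallback path is hypothetical; the TEMP environment variable normally
# points at the real location.
import os
import shutil
import time

temp_dir = os.environ.get("TEMP", r"C:\Users\Default\AppData\Local\Temp")

while True:
    usage = shutil.disk_usage(temp_dir)
    print(f"{time.strftime('%H:%M:%S')}  free: {usage.free / 1024**3:.1f} GiB "
          f"on the drive holding {temp_dir}")
    time.sleep(60)
```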
I'm having this issue as well with 2.0.3.6 (
Closing this as I think it duplicates #2465. If anyone disagrees, we can re-open this if necessary. |
Environment info
Description
I have a backup set of ~300 GB spread across roughly 300,000 files, ranging from very small text files to rather large VM images.
Every time I try to back up those files with Duplicati, I run into the same problem: it runs fine at my maximum upload speed (~10 MBps) for a few hours, then it drops to a very slow speed (~10 KBps) until it finally seems to stall completely.
The point where the backup stops always seems to be at around 18-25 GB (remote backup size), and always during a big file (a VM image several GBs in size).
I tried various combinations of settings, but the result is basically always the same.
At that point I don't see any CPU, network, or I/O activity from the Duplicati/mono processes, so it seems to be completely stalled. The live profiling log also shows that the last entries are from hours ago (see below).
After reading through previous issues, forum posts, and https://www.duplicati.com/articles/Choosing-Sizes/, I especially hoped that increasing the blocksize would help, but so far it hasn't. The SQLite database is also running on an SSD (two SSDs in RAID 1, actually).
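For context on why the blocksize setting matters here: the local database has to track one entry per block, so the block count scales with the source size divided by the blocksize. A rough back-of-the-envelope calculation (my own illustration, assuming the commonly cited 100 KB default blocksize; these are not measured figures from this backup):

```python
# Back-of-the-envelope arithmetic (illustrative assumptions, not measured data):
# block count ≈ source size / blocksize, which is roughly what the local
# database has to keep track of.
source_bytes = 300 * 1024**3                 # ~300 GiB source set, as in this report

for blocksize_kib in (100, 1024, 10240):     # assumed default (~100 KB), 1 MiB, 10 MiB
    blocks = source_bytes // (blocksize_kib * 1024)
    print(f"blocksize {blocksize_kib:>6} KiB -> ~{blocks:,} blocks to track")
```

So a larger blocksize cuts the number of tracked blocks by an order of magnitude or more, which is why the Choosing-Sizes article suggests it for large backup sets.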
I'm running out of ideas here, so I'd really appreciate any feedback on how I can get this working or do some more analysis.
many thx!
Debug log
This debug log is from ~11:00 AM, so there has been no log activity for several hours ... at least nothing shown in the live profiling log UI.