S3 upload: troubles with big files (temp files) #29589
Labels: 0. Needs triage (pending check for reproducibility or roadmap fit), bug, feature: object storage, needs info
This report is about how Nextcloud prepares an S3 upload, and how that preparation fails with big files.
The trouble is that NC struggles to handle large S3 uploads.
I try to lay out where it fails and why, with an eye to (my own) human errors.
The first stumble is that the small OS disk gets filled with chunks (which I didn't expect).
Upon inspection I also wonder whether the process could be roughly half as long.
Thank you for your time, and all this good stuff!
index.php/settings/integrity/failed: No errors have been found.
Disks
Task
COPY from ['datadirectory'] to S3, via GUI, leaving Browser open

Testfiles
- Failing Bigfile: 26 GB file
- Succeeding Smallfile: 300 MB file
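For anyone trying to reproduce this, files of matching sizes can be created cheaply as sparse files; a minimal sketch (the file names are hypothetical, the report only states the two sizes):

```python
import os

def make_test_file(path: str, size_bytes: int) -> None:
    """Create a sparse file of the given size for upload testing."""
    with open(path, "wb") as f:
        # truncate() sets the file size without writing data; on most
        # filesystems this allocates almost no real disk blocks.
        f.truncate(size_bytes)

# Hypothetical names standing in for the report's test files.
make_test_file("smallfile.bin", 300 * 1024**2)  # ~300 MB
make_test_file("bigfile.bin", 26 * 1024**3)     # ~26 GB, sparse
```

Note that a sparse file compresses to nothing in transfer unless it is filled with data first, so it only exercises size handling, not actual bandwidth.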
Environment
Errors
Overall Description
['tempdirectory'] (first copy stage); ['tempdirectory'] are being emptied. No file exists at S3.

Timeline
This is probably unrelated to timeouts, but I took notes anyway:
Logs (NC Protocol Page)
What if I avoid "disk full"?
If both the PHP and the NC temp files live on the "plentiful" NFS mount, the web notification "FILE could not be copied" still appears, but the S3 upload succeeds and no "disk full" error gets logged.
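Moving both temp locations like this is a matter of two config knobs; a sketch, where the mount point /mnt/nfs-temp is a placeholder and not taken from this report:

```php
// config/config.php — Nextcloud's own temp directory
// (path is an assumed example, adjust to the actual NFS mount)
'tempdirectory' => '/mnt/nfs-temp/nextcloud-temp',
```

PHP's upload staging is controlled separately via the `upload_tmp_dir` and `sys_temp_dir` directives in php.ini, and both need to point at the large mount for the workaround above to hold.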
Desired solution
It seems plausible to skip the first, non-chunked copy stage, because
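Skipping the non-chunked stage would mean streaming the source file to S3 in fixed-size parts, so peak temp usage is one chunk rather than a full 26 GB copy. A minimal sketch of that chunking in Python (names and part size are assumptions, not Nextcloud's actual code path):

```python
import io
from typing import BinaryIO, Iterator

CHUNK_SIZE = 10 * 1024 * 1024  # assumed 10 MiB part size

def iter_chunks(src: BinaryIO, chunk_size: int = CHUNK_SIZE) -> Iterator[bytes]:
    """Yield fixed-size parts of src without ever holding a full copy.

    Each part could be handed straight to an S3 multipart "upload part"
    request, so disk/temp usage stays bounded by one chunk.
    """
    while True:
        part = src.read(chunk_size)
        if not part:
            return
        yield part

# Tiny demo: an in-memory stream standing in for the big source file.
demo = io.BytesIO(b"x" * 25)
parts = list(iter_chunks(demo, chunk_size=10))  # three parts: 10, 10, 5 bytes
```

S3's multipart upload API is built for exactly this shape of producer: parts are uploaded independently and stitched together server-side on completion.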
others:
Best Regards
Manu