[S3] "Failed to copy: multipart upload failed to upload part" using Cloudflare provider #6193
Comments
My guess is that some code path degrades to using presigned URLs.
Can you post
That is surprising as I don't think there are any relevant changes... You said the problem is intermittent? Are you sure it never happens with the old version?
What size of file? Are they bigger than --s3-upload-cutoff? If so, rclone is using multipart upload. I suspect this is something to do with retries on errors not being signed properly, or the error being mis-reported anyway.
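The cutoff behaviour described above can be sketched roughly as follows. This is a simplified illustration, not rclone's actual code: the function name is hypothetical, and the constants mirror rclone's documented S3 defaults (200 MiB upload cutoff, 5 MiB chunk size).

```python
import math

# Defaults mirror rclone's documented S3 defaults:
# --s3-upload-cutoff 200Mi, --s3-chunk-size 5Mi.
UPLOAD_CUTOFF = 200 * 1024 * 1024
CHUNK_SIZE = 5 * 1024 * 1024

def upload_plan(file_size: int) -> tuple[str, int]:
    """Hypothetical sketch: return ("single", 1) or ("multipart", n_parts)."""
    if file_size <= UPLOAD_CUTOFF:
        return ("single", 1)
    return ("multipart", math.ceil(file_size / CHUNK_SIZE))

# A 659 MB file (a size mentioned later in this thread) goes multipart:
print(upload_plan(659 * 1000 * 1000))  # → ('multipart', 126)
```

Anything at or below the cutoff is a single PutObject; anything above it is split into chunk-sized parts, which is where the part-signing errors in this issue occur.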
The issue no longer happens with the latest beta. I am closing the issue for now and will reopen with detailed logs or create a new one if a problem happens again. Thanks for your help.
@ncw I am running into this issue with rclone version:
Will post a log shortly.
@vedantroy we've been exploring this issue in https://forum.rclone.org/t/uploading-large-files-to-r2-250-mib-with-rclone-causes-signature-errors/33267 It seems like R2 doesn't like so much concurrency. Can you try that?
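A lowered-concurrency invocation might look like the sketch below. The remote name and the value 2 are placeholders, not values from this thread; `--s3-upload-concurrency` is a real rclone flag whose documented default is 4.

```sh
# Hypothetical invocation; "r2:bucket/path" and the value 2 are placeholders.
rclone copy ./bigfile r2:bucket/path \
  --s3-upload-concurrency 2 \
  -v
```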
@ncw Just had the same problem on the latest stable version of rclone:
I managed to upload the file (659MB) with the switch you mentioned.
@dalbitresb12 Rclone could decrease the concurrency for R2 by default. Worth changing?
@ncw Is there any way we could keep a little bit more concurrency so that upload speeds aren't affected that much? Changing the default sounds fine to me, just asking if we can keep upload speeds up.
It's up to Cloudflare really to fix their backend... But experiment with
Here's a possibly-relevant R2-related data point, leading me to suspect Cloudflare may have increased customers' permitted concurrency since this thread began. I've just uploaded 400 GB from a very well connected machine (total time: just over an hour) using a 1.58.1 pre-release rclone. With roughly 82,000 files, ranging from multi-GB sizes downwards, I used
@jpluscplusm good data point, thanks.
I also experienced something like this while trying --transfers=32 and --s3-upload-concurrency=32, uploading 1GB chunks of a 1.2TB file to R2 in the past few days. The transfer with 32 was achieving 120MB/s for me.
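For scale, the figures above imply roughly the following transfer time. This is simple arithmetic using the 1.2 TB and 120 MB/s numbers from the comment, assuming decimal units:

```python
# Rough transfer-time estimate from the figures in the comment above.
size_bytes = 1.2e12        # 1.2 TB (decimal)
rate_bytes_per_s = 120e6   # 120 MB/s

seconds = size_bytes / rate_bytes_per_s
hours = seconds / 3600
print(f"{seconds:.0f} s = {hours:.1f} h")  # → 10000 s = 2.8 h
```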
Using the latest rclone beta from 2022/05/23, file uploads using the Cloudflare R2 provider are failing. It seems to happen with medium-sized files (not all file uploads).
Here is the error:
and the rclone configuration used:
Downgrading to an older beta version solves the issue. Here is a working beta version:
https://beta.rclone.org/branch/fix-5422-s3-putobject/v1.59.0-beta.6122.7a0cdbc45.fix-5422-s3-putobject/rclone-v1.59.0-beta.6122.7a0cdbc45.fix-5422-s3-putobject-linux-amd64.deb
It seems the error started to happen between May 21 and May 22.