
AWS S3 upload (Backblaze-compatible S3 API) doesn't work with files larger than ~100 MB #11232

cyberduck opened this issue Nov 12, 2020 · 1 comment



@cyberduck cyberduck commented Nov 12, 2020

3000015 created the issue

When using Backblaze's AWS-compatible S3 endpoint as described in their docs, small files upload fine and fast, but large files stall. The upload never starts: a "initializing large file upload" message appears, but then nothing further happens.
The transfer can't be stopped or removed; only quitting Cyberduck restores a working state (the file upload still fails).
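For reference, S3 clients typically switch from a single PUT to the multipart-upload API once a file exceeds a size threshold, which would explain why the problem appears only above roughly 100 MB and why the stall happens at the "initializing large file upload" step. A minimal sketch of that cutover arithmetic follows; the 100 MB threshold and 5 MiB part size are illustrative assumptions, not Cyberduck's confirmed configuration:

```python
# Sketch: why behavior changes at ~100 MB. Below the threshold the client
# does one plain PUT; above it, the object is split into parts and sent
# through the S3 multipart-upload API (the step that stalls here).
# Threshold and part size are assumed values for illustration only.

MULTIPART_THRESHOLD = 100 * 1024 * 1024  # assumed single-PUT cutover
PART_SIZE = 5 * 1024 * 1024              # 5 MiB, S3's minimum part size

def plan_upload(size_bytes: int):
    """Return ('single', 1) or ('multipart', number_of_parts)."""
    if size_bytes < MULTIPART_THRESHOLD:
        return ("single", 1)
    # Ceiling division: the final part may be smaller than PART_SIZE.
    parts = -(-size_bytes // PART_SIZE)
    return ("multipart", parts)

print(plan_upload(10 * 1024 * 1024))   # small file: one plain PUT
print(plan_upload(150 * 1024 * 1024))  # large file: multipart path
```

A 10 MB file stays on the single-PUT path, while a 150 MB file is split into thirty 5 MiB parts, so any fault in the endpoint's multipart handshake only surfaces for large files.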

Cyberduck is listed as an example integration app in the Backblaze docs.

I'm not sure whether this is an issue with Cyberduck, Backblaze, or their S3 integration, but since Cyberduck ends up in a broken, unstoppable state, there is at least something wrong here.

Collaborator Author

@cyberduck cyberduck commented Nov 12, 2020

@dkocher commented

Duplicate of #11233.


@cyberduck cyberduck closed this Nov 12, 2020
@iterate-ch iterate-ch locked as resolved and limited conversation to collaborators Nov 27, 2021