AWS S3 upload (Backblaze-compatible S3 API) doesn't work with files larger than ~100 MB #11232

Closed
cyberduck opened this issue Nov 12, 2020 · 1 comment
Labels
bug, duplicate, high priority, s3, AWS S3 Protocol Implementation

Comments

@cyberduck (Collaborator)

3000015 created the issue

When using the S3-compatible Backblaze endpoint as described in their docs, small files upload fine and fast, but large files stall. The upload never starts: a message says "initializing large file upload", but then nothing further happens.
The stalled transfer can't be stopped or removed; only quitting Cyberduck restores a working state (the file upload still fails).

Cyberduck is used as an example integration app in the Backblaze docs:
https://help.backblaze.com/hc/en-us/articles/360047425453-Getting-Started-with-the-S3-Compatible-API

I'm not sure whether this is an issue with Cyberduck, Backblaze, or their S3 integration, but since Cyberduck ends up in a broken, unstoppable state, there's at least something wrong here.
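For reference, here is a minimal sketch of the same S3 multipart flow outside Cyberduck, using the AWS SDK for Java v2 against the Backblaze S3-compatible endpoint. This is an assumption-laden reproduction, not Cyberduck's actual code path: the endpoint, region, bucket, key name, credentials, and part sizes below are placeholders, and the ~100 MB threshold presumably corresponds to where Cyberduck switches from a single PUT to a multipart upload.

```java
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.CompleteMultipartUploadRequest;
import software.amazon.awssdk.services.s3.model.CompletedMultipartUpload;
import software.amazon.awssdk.services.s3.model.CompletedPart;
import software.amazon.awssdk.services.s3.model.CreateMultipartUploadRequest;
import software.amazon.awssdk.services.s3.model.UploadPartRequest;

import java.net.URI;
import java.util.ArrayList;
import java.util.List;

public class B2MultipartRepro {
    public static void main(String[] args) {
        // Placeholders: substitute your own bucket, key ID, application key,
        // and the endpoint/region shown in your Backblaze bucket details.
        String bucket = "my-bucket";
        String key = "large-file.bin";

        S3Client s3 = S3Client.builder()
                .endpointOverride(URI.create("https://s3.us-west-002.backblazeb2.com"))
                .region(Region.of("us-west-002"))
                .credentialsProvider(StaticCredentialsProvider.create(
                        AwsBasicCredentials.create("keyId", "applicationKey")))
                .build();

        // Step 1: initiate the multipart upload. This corresponds to the
        // "initializing large file upload" stage where Cyberduck stalls.
        String uploadId = s3.createMultipartUpload(CreateMultipartUploadRequest.builder()
                .bucket(bucket).key(key).build()).uploadId();

        // Step 2: upload parts of at least 5 MB each (here: two 5 MB dummy parts).
        List<CompletedPart> parts = new ArrayList<>();
        for (int partNumber = 1; partNumber <= 2; partNumber++) {
            String eTag = s3.uploadPart(UploadPartRequest.builder()
                            .bucket(bucket).key(key)
                            .uploadId(uploadId).partNumber(partNumber).build(),
                    RequestBody.fromBytes(new byte[5 * 1024 * 1024])).eTag();
            parts.add(CompletedPart.builder().partNumber(partNumber).eTag(eTag).build());
        }

        // Step 3: complete the upload. If the endpoint misbehaves, the stall
        // described above should surface in step 1 or step 2.
        s3.completeMultipartUpload(CompleteMultipartUploadRequest.builder()
                .bucket(bucket).key(key).uploadId(uploadId)
                .multipartUpload(CompletedMultipartUpload.builder().parts(parts).build())
                .build());
        s3.close();
    }
}
```

If this standalone sketch succeeds against the same bucket, that would point at Cyberduck's multipart handling rather than the Backblaze endpoint itself.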

@cyberduck (Collaborator, Author)

@dkocher commented

Duplicate of #11233.

iterate-ch locked this issue as resolved and limited conversation to collaborators on Nov 27, 2021.