I have multi-gigabyte files that always fail uploading.
Is it possible to use S3 CORS + chunked uploads as per
I have tried adding the following options:
maxChunkSize: 1000000, // 1 MB for testing
This results in a series of successive 1 MB POSTs with the appropriate headers,
e.g. the Content-Range header changes with each chunk.
second to last chunk:
However, after the upload completes, the file on S3 is only the size of the final chunk: each chunk is overwriting the object rather than being appended to it.
Any suggestions here?
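For reference, the chunking behaviour described above can be sketched like this. The function name, file size, and chunk size below are hypothetical, but the values mirror the Content-Range headers a chunked uploader such as jQuery-File-Upload with maxChunkSize would send:

```javascript
// Sketch: how a chunked uploader slices a file and builds the
// Content-Range header value for each successive POST.
function chunkRanges(fileSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    // Content-Range byte positions are inclusive, hence the -1.
    const end = Math.min(start + chunkSize, fileSize) - 1;
    ranges.push(`bytes ${start}-${end}/${fileSize}`);
  }
  return ranges;
}

// A 2.5 MB file with 1 MB chunks yields three POSTs:
console.log(chunkRanges(2500000, 1000000));
// → [ 'bytes 0-999999/2500000',
//     'bytes 1000000-1999999/2500000',
//     'bytes 2000000-2499999/2500000' ]
```

Each of those requests carries a different byte range, which matches the header changes observed above.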
First of all, I would switch to an uploader that uses CORS, if you are still using this project; it simplifies the code a lot.
This gem is easy to use: github.com/waynehoover/s3_direct_upload
About the chunked upload: I have never tried it, so I have no idea what is going wrong. It might be a good idea to ask about it on Stack Overflow.
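One likely cause of the overwriting, for what it's worth: a plain S3 POST or PUT ignores the Content-Range header and simply replaces the whole object, so each chunk clobbers the previous one. S3's own chunking mechanism is the Multipart Upload API (initiate an upload, PUT each part with a part number, then complete the upload with an XML body listing the parts' ETags). A minimal sketch of building that final XML body, with hypothetical ETag values, looks like:

```javascript
// Sketch: build the CompleteMultipartUpload XML body that S3 expects
// in the final "complete" request. S3 stitches the parts together only
// at this step; without it, each chunk just replaces the object.
function completeMultipartBody(etags) {
  const parts = etags
    .map((etag, i) =>
      `<Part><PartNumber>${i + 1}</PartNumber><ETag>${etag}</ETag></Part>`)
    .join('');
  return `<CompleteMultipartUpload>${parts}</CompleteMultipartUpload>`;
}

// Hypothetical ETags returned by the per-part PUTs:
console.log(completeMultipartBody(['"etag-1"', '"etag-2"']));
```

So an uploader doing bare Content-Range POSTs against a plain S3 endpoint will always end up with only the last chunk; the multipart API calls are what make the parts accumulate.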