Is chunked uploading supported? #16

robj opened this Issue Oct 23, 2012 · 1 comment



robj commented Oct 23, 2012

I have multi-gigabyte files that always fail to upload.

Is it possible to use S3 CORS + chunked uploads, as per:


I have tried adding the following options:

  maxChunkSize: 1000000, //  1 MB for testing
  maxRetries: 100,
  retryTimeout: 500,

This results in a series of successive 1 MB POSTs with the appropriate headers; e.g. the Content-Range header changes with each chunk.

second to last chunk:

Content-Disposition:attachment; filename="VTS_01_1.VOB"
Content-Range:bytes 3000000-3999999/4014080

last chunk:

Content-Disposition:attachment; filename="VTS_01_1.VOB"
Content-Range:bytes 4000000-4014079/4014080
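The ranges above follow the usual Content-Range form: an inclusive start-end byte range, then the total size. A small sketch of how such header values can be computed; `buildContentRange` is a hypothetical helper for illustration, not part of the plugin:

```javascript
// Hypothetical helper: compute the Content-Range header value for the
// chunk starting at `offset` in a file of `total` bytes, with chunks of
// `chunkSize` bytes. The end offset is inclusive, as in the HTTP header.
function buildContentRange(offset, chunkSize, total) {
  var end = Math.min(offset + chunkSize, total) - 1;
  return 'bytes ' + offset + '-' + end + '/' + total;
}

// The two chunks quoted above, for a 4014080-byte file and 1 MB chunks:
console.log(buildContentRange(3000000, 1000000, 4014080));
// bytes 3000000-3999999/4014080
console.log(buildContentRange(4000000, 1000000, 4014080));
// bytes 4000000-4014079/4014080
```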

However, after the upload completes, the file on S3 is only the size of the final chunk: each chunk is overwriting the object rather than being appended to it.
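This matches S3's object-upload semantics: each POST (or PUT) is a complete object upload, so every chunk sent to the same key replaces whatever is already stored there, and the last write wins. A toy model of that behaviour, where an in-memory `bucket` object stands in for S3 (an illustration of the diagnosis, not S3's actual code):

```javascript
// Toy model: an S3 POST/PUT replaces the object at a key wholesale;
// there is no append. Posting N chunks under one key therefore leaves
// only the last chunk stored.
var bucket = {}; // key -> stored size in bytes (size only, for brevity)

function postObject(key, chunkBytes) {
  bucket[key] = chunkBytes; // replace, never append
}

var total = 4014080;     // file size from the headers above
var chunkSize = 1000000; // 1 MB chunks

for (var offset = 0; offset < total; offset += chunkSize) {
  postObject('VTS_01_1.VOB', Math.min(chunkSize, total - offset));
}

console.log(bucket['VTS_01_1.VOB']); // 14080: only the final chunk survives
```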

Any suggestions here?


ncri commented Oct 23, 2012

First of all, if you are still using this project, I would switch to an uploader that uses CORS; it simplifies the code a lot.
This gem is easy to use:

About the chunked upload: I have never tried it, so I have no idea what is going wrong. It might be a good idea to ask about it on Stack Overflow.
