Chunked file uploads

Sebastian Tschan edited this page Jul 9, 2013 · 32 revisions

Chunked file uploads are only supported by browsers with support for XHR file uploads and the Blob API, which includes Google Chrome and Mozilla Firefox 4+.

Client-side setup

To upload large files in smaller chunks, set the maxChunkSize option (see Options) to a preferred maximum chunk size in bytes:

    maxChunkSize: 10000000 // 10 MB

For chunked uploads to work in Mozilla Firefox 4-6 (XHR upload capable Firefox versions prior to Firefox 7), the multipart option also has to be set to false - see the Options documentation on maxChunkSize for an explanation.
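
Putting both options together, a minimal client-side initialization covering those Firefox versions might look like this (the `#fileupload` selector is illustrative; adjust it to your form element):

```javascript
$('#fileupload').fileupload({
    maxChunkSize: 10000000, // 10 MB
    // Required for chunked uploads in Firefox 4-6, which only support
    // non-multipart blob uploads:
    multipart: false
});
```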

Server-side setup

The example PHP upload handler supports chunked uploads out of the box.

To support chunked uploads, the upload handler makes use of the Content-Range header, which is transmitted by the plugin for each chunk.

How do chunked uploads work?

If maxChunkSize is set to an integer value greater than 0, the File Upload plugin splits up files with a file size larger than maxChunkSize into multiple blobs and submits each of these blobs to the upload URL in sequential order.

The byte range of the blob is transmitted via the Content-Range header.
The file name of the blob is transmitted via the Content-Disposition header.
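
The byte ranges transmitted this way can be sketched as follows (chunkRanges is an illustrative helper, not part of the plugin API; the plugin computes these values internally when slicing the file into blobs):

```javascript
// Sketch: deriving the Content-Range header values for each chunk
// from the file size and the maxChunkSize setting.
function chunkRanges(fileSize, maxChunkSize) {
    var ranges = [];
    for (var start = 0; start < fileSize; start += maxChunkSize) {
        var end = Math.min(start + maxChunkSize, fileSize);
        // Content-Range byte positions are inclusive: "bytes first-last/total"
        ranges.push('bytes ' + start + '-' + (end - 1) + '/' + fileSize);
    }
    return ranges;
}

console.log(chunkRanges(25000000, 10000000));
// A 25 MB file with 10 MB chunks yields three ranges:
// ['bytes 0-9999999/25000000',
//  'bytes 10000000-19999999/25000000',
//  'bytes 20000000-24999999/25000000']
```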


Chunked file uploads trigger the same callbacks as normal file uploads, e.g. the done callback (see API) will only be triggered after the last blob has been successfully uploaded.

Chunked uploads trigger additional callbacks that can be used to track the events of individual chunk uploads:

    $('#fileupload').fileupload({maxChunkSize: 100000})
        .on('fileuploadchunksend', function (e, data) {})
        .on('fileuploadchunkdone', function (e, data) {})
        .on('fileuploadchunkfail', function (e, data) {})
        .on('fileuploadchunkalways', function (e, data) {});

Note: Callbacks set as part of the $.ajax Options (e.g. success, error or complete) will be called for each AJAX request, including uploads of individual chunks.

Cross-site chunked uploads

By default, browsers don't permit the custom headers used for cross-site file uploads unless the server explicitly declares them as allowed with the following response header:

    Access-Control-Allow-Headers: Content-Type, Content-Range, Content-Disposition
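
A complete set of CORS response headers for the upload endpoint might look like this (the allowed origin and method list below are assumptions; restrict them to what your setup actually needs):

```
Access-Control-Allow-Origin: https://example.org
Access-Control-Allow-Methods: OPTIONS, HEAD, GET, POST, PUT, DELETE
Access-Control-Allow-Headers: Content-Type, Content-Range, Content-Disposition
```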

Resuming file uploads

Using the uploadedBytes option (see Options), it is possible to resume aborted uploads:

    maxChunkSize: 10000000, // 10 MB
    add: function (e, data) {
        var that = this;
        $.getJSON('server/php/', {file: data.files[0].name}, function (result) {
            var file = result.file;
            data.uploadedBytes = file && file.size;
            $.blueimp.fileupload.prototype
                .options.add.call(that, e, data);
        });
    }

The above code overrides the add callback and sends a JSON request with the current file name to the server. If a file with the given name exists, the server responds with the file information, including the file size, which is then set as the uploadedBytes option.
If uploadedBytes is set, the plugin uploads only the remaining parts of the file as a blob upload.
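
For this to work, the server has to answer the probe request with the stored file's metadata. Based on the callback above, a response might look like this (the file name and size are illustrative; the exact shape depends on your upload handler):

```json
{
    "file": {
        "name": "example.bin",
        "size": 8000000
    }
}
```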

Automatic resume

The following code snippet implements an automatic resume functionality, based on the previous code:

    /* ... settings as above plus the following ... */
    maxRetries: 100,
    retryTimeout: 500,
    fail: function (e, data) {
        // jQuery Widget Factory uses "namespace-widgetname" since version 1.10.0:
        var fu = $(this).data('blueimp-fileupload') || $(this).data('fileupload'),
            retries = data.context.data('retries') || 0,
            retry = function () {
                $.getJSON('server/php/', {file: data.files[0].name})
                    .done(function (result) {
                        var file = result.file;
                        data.uploadedBytes = file && file.size;
                        // clear the previous data:
                        data.data = null;
                        data.submit();
                    })
                    .fail(function () {
                        fu._trigger('fail', e, data);
                    });
            };
        if (data.errorThrown !== 'abort' &&
                data.uploadedBytes < data.files[0].size &&
                retries < fu.options.maxRetries) {
            retries += 1;
            data.context.data('retries', retries);
            window.setTimeout(retry, retries * fu.options.retryTimeout);
        } else {
            data.context.removeData('retries');
            $.blueimp.fileupload.prototype
                .options.fail.call(this, e, data);
        }
    }

If the upload fails, the code above will automatically resume the file upload after retrieving the uploaded bytes.
To prevent endless loops, the number of retries can be limited with the maxRetries setting.
The retryTimeout setting defines the delay in milliseconds before the file upload is resumed. It is multiplied by the current retry count, so the waiting time grows with every subsequent retry.
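
The resulting waiting times can be sketched with a small helper (retryDelays is illustrative, not part of the plugin):

```javascript
// Sketch: the linear backoff produced by retries * retryTimeout,
// as used in the fail handler above.
function retryDelays(maxRetries, retryTimeout) {
    var delays = [];
    for (var n = 1; n <= maxRetries; n += 1) {
        delays.push(n * retryTimeout);
    }
    return delays;
}

console.log(retryDelays(5, 500)); // [500, 1000, 1500, 2000, 2500]
```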

Deleting aborted chunked uploads

If you don't offer your users the option to resume aborted uploads, you might want to delete incomplete uploads from the server. The recommended way is to do this server-side, e.g. with a cron job that deletes incomplete files.
However, if you want a quick solution, it's possible to send a DELETE request when the chunked upload fails (e.g. when the user aborts the upload):

    maxChunkSize: 10000000, // 10 MB
    fail: function (e, data) {
        $.ajax({
            url: 'server/php/',
            dataType: 'json',
            data: {file: data.files[0].name},
            type: 'DELETE'
        });
    }

