Implement blob stream #477

andrerod opened this Issue Nov 14, 2012 · 3 comments


We should add support for blob streaming, including uploading large blobs as multiple blocks. Partners, customers, Cloud9, and others ask for this constantly. We should just go ahead and add it.

I think this is the most frequently raised topic about the SDK.

Also mentioned in #473

Related to #12 as well.

I implemented this in my own library as follows:

  var size = fs.statSync(file).size;
  var chunkSize = Math.pow(1024, 2) * program.chunksize;
  var chunks = Math.ceil(size / chunkSize);
  var blobName = path.basename(file);

  async.timesSeries(chunks, function (n, next) {
    var start = n * chunkSize;
    // fs.createReadStream treats `end` as an inclusive byte offset
    var end = start + chunkSize - 1;
    if (n === chunks - 1) {
      end = size - 1;
    }

    var blockId = blobName + '--' + pad(n, 4); // pad left-pads n to 4 digits

    var stream = fs.createReadStream(file, {start: start, end: end});

    blobService.createBlobBlockFromStream(blockId, container, blobName, stream, end - start + 1, function (error) {
      if (error) return next(error);
      next(null, blockId);
    });
  }, function (err, blocks) {
    if (err) return callback(err);

    var blockList = {
      LatestBlocks: blocks
    };

    blobService.commitBlobBlocks(container, blobName, blockList, function (err) {
      if (err) return showErrorAndExit(err);
    });
  });

The important bit is createReadStream(file, {start: start, end: end}). That option pair could be used to build a better createBlobBlockFromFile method that supports larger files. For createBlobFromStream you would need to multiplex the incoming stream into blocks somehow; I'm not sure of the best approach there.

@andrerod andrerod closed this Oct 16, 2013
