s3-streaming-upload

s3-streaming-upload is a Node.js library that listens to your stream and uploads its data to Amazon S3 using the ManagedUpload API.

It is heavily inspired by knox-mpu but, unlike it, does not buffer data to disk and is built on top of the official AWS SDK instead of knox.

Changes

Installation

Installation is done via npm by running npm install s3-streaming-upload

Features

  • Super easy to use
  • No need to know the data size beforehand (see the network-streaming sketch after the quick example)
  • Stream is buffered up to a specified size (default 5 MB) and then uploaded to S3
  • Segments are not written to disk, and memory is freed as soon as possible after upload
  • Uploading is asynchronous
  • You can react to upload status through events (see the sketch after this list)
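A minimal sketch of reacting to upload events. The event names 'uploaded' and 'failed' below are assumptions for illustration; check the library's source for the names it actually emits.

var Uploader = require('s3-streaming-upload').Uploader,
    stream = require('fs').createReadStream('/etc/resolv.conf'),
    upload = new Uploader({
      accessKey:  process.env.AWS_S3_ACCESS_KEY,
      secretKey:  process.env.AWS_S3_SECRET_KEY,
      bucket:     process.env.AWS_S3_TEST_BUCKET,
      objectName: "myUploadedFile",
      stream:     stream
    });

// 'uploaded' and 'failed' are assumed event names.
upload.on('uploaded', function (details) {
  console.log('Upload finished:', details);
});

upload.on('failed', function (err) {
  console.error('Upload failed:', err);
});

upload.send(function (err) {
  if (err) {
    console.error('Upload error: ' + err);
  }
});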

Quick example

var Uploader = require('s3-streaming-upload').Uploader,
    upload = null,
    stream = require('fs').createReadStream('/etc/resolv.conf');

upload = new Uploader({
  // credentials to access AWS
  accessKey:  process.env.AWS_S3_ACCESS_KEY,
  secretKey:  process.env.AWS_S3_SECRET_KEY,
  bucket:     process.env.AWS_S3_TEST_BUCKET,
  objectName: "myUploadedFile",
  stream:     stream,
  debug:      true
});

upload.send(function (err) {
  if (err) {
    console.error('Upload error: ' + err);
  }
});
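Because the uploader only needs a readable stream, data of unknown length can be piped straight from the network without touching the disk. A minimal sketch, assuming a placeholder URL and the same environment variables as above:

var http = require('http'),
    Uploader = require('s3-streaming-upload').Uploader;

// http://example.com/some-file is a placeholder URL.
http.get('http://example.com/some-file', function (res) {
  var upload = new Uploader({
    accessKey:  process.env.AWS_S3_ACCESS_KEY,
    secretKey:  process.env.AWS_S3_SECRET_KEY,
    bucket:     process.env.AWS_S3_TEST_BUCKET,
    objectName: "fileFromTheWeb", // hypothetical object name
    stream:     res               // the HTTP response is a readable stream
  });

  upload.send(function (err) {
    if (err) {
      console.error('Upload error: ' + err);
    }
  });
});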

Setting up an ACL

Pass it via objectParams to the Uploader:

upload = new Uploader({
  // credentials to access AWS
  accessKey:  process.env.AWS_API_KEY,
  secretKey:  process.env.AWS_SECRET,
  bucket:     process.env.AWS_S3_TRAFFIC_BACKUP_BUCKET,
  objectName: "myUploadedFile",
  stream:     stream,
  objectParams: {
    ACL: 'public-read'
  }
});
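Since objectParams appears to be handed to the underlying S3 call, other standard S3 object parameters can presumably be set the same way. A sketch, assuming objectParams is forwarded to the AWS SDK unchanged:

upload = new Uploader({
  accessKey:  process.env.AWS_API_KEY,
  secretKey:  process.env.AWS_SECRET,
  bucket:     process.env.AWS_S3_TRAFFIC_BACKUP_BUCKET,
  objectName: "myUploadedFile",
  stream:     stream,
  objectParams: {
    ACL:         'public-read',
    // ContentType is a standard S3 parameter; assuming objectParams
    // is passed through to the SDK as-is.
    ContentType: 'text/plain'
  }
});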