commercetools logo

s3utils


A Command Line Interface providing utilities for managing AWS S3 resources (e.g. converting/resizing images stored in S3 folders). It uses knox as the underlying S3 client.


Features

  • multiple file upload
  • progress notifications
  • image resizing and uploading using ImageMagick

Getting Started

Install the module

$ npm install -g node-s3-utils # make command globally available

Install ImageMagick (used for image conversion)

$ apt-get install imagemagick

# or (macOS)
$ brew install imagemagick

# or download installer http://cactuslab.com/imagemagick/
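
To check that ImageMagick is available on the PATH, you can print its version (the convert binary ships with ImageMagick):

$ convert -version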

S3 Credentials

To be able to access AWS (S3) resources, the following credentials are required:

  • API key
  • API secret
  • bucket

You can provide these credentials in different ways:

  • via the ENV variables S3_KEY, S3_SECRET, S3_BUCKET
  • via a JSON file
    • by passing the path as a command argument
    • if no argument is provided, the tool tries to look up the file in one of the following locations
      • ./.s3-credentials.json
      • ~/.s3-credentials.json
      • /etc/.s3-credentials.json

Example:

// ~/.s3-credentials.json
{
   "key": "1111111",
   "secret": "3333333",
   "bucket": "s3-bucket-name"
}

You can generate a sample JSON file by executing ./create_credentials.sh
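
Alternatively, the same values can be exported as environment variables before invoking the CLI (the values below are placeholders mirroring the sample file above):

$ export S3_KEY='1111111'
$ export S3_SECRET='3333333'
$ export S3_BUCKET='s3-bucket-name'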

Documentation

The module is a CLI tool. To get usage information, just use help

$ s3utils help

The module exposes the following main commands:

  • files
  • images

Subcommands

files

Handle file resources in S3 buckets

$ s3utils help files

Available subcommands:

  • list - Lists files matching prefix and regex
  • upload - Uploads a file to S3
  • delete - Deletes files matching prefix and regex

files list

List files from S3

$ s3utils files help list

Options:

  • credentials <path> - Optional
  • prefix <name> - Required
  • regex [name] - Optional
  • logFile <path> - Optional
  • sendMetrics - Optional
  • metricsPrefix - Optional

Example

# list files with `foo/` prefix, having extension `.txt`
$ s3utils files list -c ./.s3-credentials.json -p foo/ -r 'foo\/(\w)+\.txt'
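
Since the regex is optional, omitting it lists every file under the given prefix:

# list all files with `foo/` prefix
$ s3utils files list -c ./.s3-credentials.json -p foo/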

files upload

Upload one file to a bucket

$ s3utils files help upload

Options:

  • credentials <path> - Optional
  • source <path> - Required
  • target <path> - Required
  • logFile <path> - Optional
  • sendMetrics - Optional
  • metricsPrefix - Optional

Example

$ s3utils files upload -c ./.s3-credentials.json -s ./bar.txt -t foo/bar.txt
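
Because the credentials option is optional, the -c argument can be omitted when a credentials file exists in one of the default locations described above:

# relies on e.g. ~/.s3-credentials.json being present
$ s3utils files upload -s ./bar.txt -t foo/bar.txt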

files delete

Delete files in S3

$ s3utils files help delete

Options:

  • credentials <path> - Optional
  • prefix <name> - Required
  • regex [name] - Optional
  • logFile <path> - Optional
  • dry-run - Optional
  • sendMetrics - Optional
  • metricsPrefix - Optional

Example

# delete files with `foo/` prefix, having extension `.txt`
$ s3utils files delete -c ./.s3-credentials.json -p foo/ -r 'foo\/(\w)+\.txt'
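
To preview which files would be removed without deleting anything, combine the same prefix and regex with the dry-run option listed above (assuming the usual long flag form --dry-run):

# preview the deletion, no objects are removed
$ s3utils files delete -c ./.s3-credentials.json -p foo/ -r 'foo\/(\w)+\.txt' --dry-run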

images

Handle image resources in S3

$ s3utils help images

Available subcommands:

  • convert - Convert/resize images in S3

images convert

Requires ImageMagick to be installed.

Downloads images from the configured S3 source folders, converts them to the defined image sizes, and uploads the resulting files to the corresponding target folders.

$ s3utils images help convert

Options:

  • credentials <path> - Optional
  • descriptions <path> - Required
  • regex [name] - Optional
  • logFile <path> - Optional
  • sendMetrics - Optional
  • metricsPrefix - Optional

The descriptions object defines which AWS S3 folders are used, which image sizes have to be generated, and whether images have to be compressed. It may also define additional headers for querying the content list from S3 and headers used for S3 resources.

A conversion description has to be defined in the configuration file for each S3 image folder that needs to be processed.

Example: converts two S3 folders (products/unprocessed and looks/unprocessed), meaning all images in those folders will be downloaded, converted/resized, and uploaded to their target folders.

// descriptions.json
[
  {
    "prefix_unprocessed": "products/unprocessed", // source S3 path in bucket - stores original images before converting
    "prefix_processed": "products/processed", // target S3 path in bucket - stores original images after converting
    "prefix": "products/", // target S3 path in bucket for resized images
    "compress": true, // compress images
    "headers": { // headers used for querying content list from S3
      "max-keys": 3000 // number of elements return from AWS list query (default is 1000)
    },
    "headers_resource": { // headers used for S3 resources
      "Cache-Control": "max-age=2592000" // set max age in seconds
    },
    "formats": [ // image sizes to upload to S3
      {
        "suffix": "_thumbnail", // will be appended to the file name
        "width": 240, // width for resized image
        "height": 240 // height for resized image
      },
      {
        "suffix": "_small",
        "width": 350,
        "height": 440
      }
    ]
  },
  {
    "prefix_unprocessed": "looks/unprocessed",
    "prefix_processed": "looks/processed",
    "prefix": "looks/",
    "headers": {
      "max-keys": 3000
    },
    "formats": [
      {
        "suffix": "_thumbnail",
        "width": 240,
        "height": 240
      }
    ]
  }
]

$ s3utils images convert -c ./.s3-credentials.json -d ./descriptions.json
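
The optional regex can restrict the conversion to a subset of the unprocessed images. Assuming it accepts the same -r flag and matches S3 keys the way the files commands do, an invocation could look like this:

# only convert .jpg files under products/unprocessed
$ s3utils images convert -c ./.s3-credentials.json -d ./descriptions.json -r 'products\/unprocessed\/(\w)+\.jpg'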

Development in a VM with Vagrant

We also provide a simple Vagrantfile setup to run the tool locally in a small VM. All required tools are installed automatically once the box is provisioned.

$ vagrant up
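
Once the box is up you can log into it and use the tool from inside the VM (assuming the provisioning puts s3utils on the PATH, as described above):

$ vagrant ssh
$ s3utils help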

Tests

Tests are written using jasmine (a behavior-driven development framework for testing JavaScript code). Thanks to jasmine-node, this framework is also available for Node.js.

To run the tests, simply execute the test task using Grunt.

$ grunt test

Contributing

In lieu of a formal style guide, take care to maintain the existing coding style. Add unit tests for any new or changed functionality. Lint and test your code using Grunt. More info here

Releasing

Releasing a new version is completely automated using the Grunt task grunt release.

grunt release // patch release
grunt release:minor // minor release
grunt release:major // major release

Styleguide

We <3 CoffeeScript here at commercetools! So please have a look at the referenced CoffeeScript style guide when making changes to the code.

License

Copyright (c) 2014 Sven Mueller. Licensed under the MIT license.
