Archive and Restore DynamoDB Tables, from the Command Line


These are two simple Node.js scripts that archive and restore an entire AWS DynamoDB table in JSON format.

Install it first (I assume you have Node.js and npm installed already):

$ npm install dynamo-archive

Create a user in Amazon IAM and assign a policy to it (how?):

  "Statement": [
      "Effect": "Allow",
      "Action": ["dynamodb:Scan", "dynamodb:DescribeTable"],
      "Resource": "arn:aws:dynamodb:us-east-1:019644334823:table/test"

Here 019644334823 is your AWS account number, us-east-1 is the AWS region, and test is the name of your DynamoDB table (it can be a *, if you grant access to all tables).

Run it first without arguments and read the output:

$ ./bin/dynamo-archive.js
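Once you've read the usage output, a typical archive run looks something like this (the table name and output file here are placeholders of mine; the --table flag is the one used in the crontab script below):

```shell
# Dump the "test" table as JSON to stdout and redirect it to a file
./bin/dynamo-archive.js --table=test > test.json
```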

To restore a table from a JSON file run:

$ ./bin/dynamo-restore.js
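Assuming the restore script mirrors the archive script's --table flag and reads the JSON dump from stdin (run it without arguments first to confirm its exact options), a restore might look like:

```shell
# Feed a previously archived dump back into the "test" table
./bin/dynamo-restore.js --table=test < test.json
```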

Crontab automation

I'd recommend using this simple bash script to automate backups of your DynamoDB tables and save them to S3 (I'm using s3cmd):

#!/usr/bin/env bash
# optional endpoint for DynamoDB local
declare -a TABLES=(first second third)
for t in "${TABLES[@]}"; do
  dynamo-archive/bin/dynamo-archive.js --table="$t" > "$t.json"
  s3cmd --no-progress put "$t.json" "s3://$t.json"
  rm "$t.json"
done
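To actually schedule it, add a crontab entry that runs the script above; the path, log file, and nightly 02:30 schedule here are illustrative, not part of the project:

```shell
# Run the DynamoDB backup script every night at 02:30,
# appending both stdout and stderr to a log file
30 2 * * * /home/me/backup-dynamo.sh >> /var/log/dynamo-backup.log 2>&1
```

Remember that cron runs with a minimal environment, so use absolute paths and make sure AWS credentials are available to the script.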