Sync folders with s3 using a metadata file with md5 sums.


Install

gem install s3_meta_sync


Usage

# upload local files and remove everything that is not local
s3-meta-sync <local> <bucket:folder> --key <aws-access-key> --secret <aws-secret-key>

# download files and remove everything that is not remote
s3-meta-sync <bucket:folder> <local> --region us-west-2 # no credentials required

Key and secret can also be supplied via the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables.
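
For example (a sketch; the key, secret, and bucket values are placeholders):

AWS_ACCESS_KEY_ID=<aws-access-key> AWS_SECRET_ACCESS_KEY=<aws-secret-key> s3-meta-sync <local> <bucket:folder>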

If a downloaded file does not match its md5 sum in .s3-meta-sync, the whole download is aborted and no changes are made.
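
To spot-check a download by hand, something like the following should work (a sketch, assuming a unix-like system with md5sum; file names are placeholders and the exact layout of .s3-meta-sync may vary between versions):

# list the stored md5 sums
cat <local>/.s3-meta-sync

# compute the md5 of a local file and compare it to the stored value
md5sum <local>/some-file.yml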


Options

    -k, --key KEY                    AWS access key
    -s, --secret SECRET              AWS secret key
    -r, --region REGION              AWS region if not us-standard
    -p, --parallel COUNT             Use COUNT threads for download/upload (default: 10)
        --ssl-none                   Do not verify ssl certs
    -z, --zip                        Zip when uploading to save bandwidth
        --no-local-changes           Do not md5 the local files (assume they did not change)
    -V, --verbose                    Verbose mode
    -h, --help                       Show this
    -v, --version                    Show version
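
Several of these options can be combined. For example, a parallel verbose download followed by a zipped upload might look like this (a sketch; bucket, folder, and credential values are placeholders):

s3-meta-sync company:translations translations --region us-west-2 --parallel 20 --verbose
s3-meta-sync translations company:translations --key <aws-access-key> --secret <aws-secret-key> --zip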

Production setup example


s3-meta-sync company:translations translations # download current translations (will fail on corrupted translations but leave a log)
cp -R translations working # make a copy so corruption detection is used on next download
rake generate_translations
s3-meta-sync working company:translations

Download using multi-timeout:

# download translations from s3
# - timeout after 60 minutes (INT so tempdirs get cleaned up)
# - use a lockfile to not run more than once
# - on failure: print output -> cron email is sent (downloaded files are discarded)
# - on success: amend to log
multi-timeout -INT 59m -KILL 60m /usr/bin/flock -n lock sh -c '(s3-meta-sync company:translations /data/translations > /tmp/downloader.log 2>&1 && date >> /tmp/downloader.log && cat /tmp/downloader.log >> /var/log/downloader.log) || cat /tmp/downloader.log'
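
In cron this could be scheduled like so (a sketch; the schedule is arbitrary and the wrapper script path /usr/local/bin/download-translations is hypothetical, assuming the command above is saved there):

# run the download every 30 minutes; failures produce output, which cron mails
*/30 * * * * /usr/local/bin/download-translations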


Author

Michael Grosser
License: MIT
