Automated serverless logging to S3 via SQS.
A library to persist messages on S3 using a serverless architecture. It is mainly targeted at cheaply archiving low-volume, sporadic events from applications without the need to spin up additional infrastructure.

Overall idea

What it’s not

It is not a replacement for general logging systems or libraries, and it provides no filtering or aggregation.

AWS Alternatives


Configure boto3’s credentials as per:

Make sure you set up:

  • AWS_DEFAULT_REGION (optional)

Take a look at

For help: sqs-s3-logger -h

For example (backup at midnight each Saturday from app-logs queue to app-logs-archive bucket):

sqs-s3-logger create -b app-logs-archive -q app-logs -f app-logs-backup -s 'cron(0 0 ? * SAT *)'
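The -s schedule above uses AWS's six-field cron format (minutes, hours, day-of-month, month, day-of-week, year), not the classic five-field crontab. A minimal sanity check for that format could look like this (the function and its checks are my own sketch, not part of this library):

```python
import re

# Six fields: minutes, hours, day-of-month, month, day-of-week, year.
# Each field may contain digits, names, and the *, ?, -, /, , and # wildcards.
_FIELD = re.compile(r"^[A-Za-z0-9*,/\-?#]+$")

def looks_like_aws_cron(expression):
    """Rough sanity check for an AWS 'cron(...)' schedule expression."""
    if not (expression.startswith("cron(") and expression.endswith(")")):
        return False
    fields = expression[5:-1].split()
    if len(fields) != 6:
        return False
    return all(_FIELD.match(field) for field in fields)
```

For example, looks_like_aws_cron("cron(0 0 ? * SAT *)") accepts the Saturday-midnight schedule above, while a five-field crontab string such as "0 0 * * 6" is rejected.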

Sending messages to a queue

Ideally you should use another AWS IAM user with permissions restricted to getting SQS queues and writing messages.

import boto3

# Locate the queue by name and send it a message.
sqs = boto3.resource('sqs')
queue = sqs.get_queue_by_name(QueueName='<QUEUE_NAME>')
queue.send_message(MessageBody='{"event": "example"}')
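The restricted-permissions suggestion above could look roughly like the following IAM policy, here written as a Python dict. The resource ARN placeholders are mine; the two actions correspond to resolving the queue URL (which get_queue_by_name calls under the hood) and sending messages:

```python
import json

# Hypothetical least-privilege policy for a producer-only IAM user:
# it may resolve the queue URL and send messages, nothing else.
PRODUCER_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["sqs:GetQueueUrl", "sqs:SendMessage"],
            "Resource": "arn:aws:sqs:<REGION>:<ACCOUNT_ID>:<QUEUE_NAME>",
        }
    ],
}

print(json.dumps(PRODUCER_POLICY, indent=2))
```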


  • Maximum SQS message size is 256 KB.
  • A queue can hold no more than 120,000 in-flight messages at a time.
  • SQS messages cannot persist for longer than 14 days.
  • The Lambda environment has up to 512 MB of ephemeral disk capacity.
  • By default, SQS does not guarantee correct time-based ordering of messages.
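SQS rejects payloads over the 256 KB limit, so a sender may want to guard against that before calling send_message. This helper is illustrative only, not part of the library:

```python
MAX_SQS_BYTES = 256 * 1024  # body plus attributes must fit within 256 KB

def fits_in_sqs(body):
    """Return True if the message body's UTF-8 encoding fits the SQS limit."""
    return len(body.encode("utf-8")) <= MAX_SQS_BYTES
```

Note that the limit applies to encoded bytes, not characters, so multi-byte UTF-8 text hits it sooner than its character count suggests.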

You may need to adjust your CRON settings depending on your volume.
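To see why the schedule matters, here is a rough back-of-the-envelope check (the message rate is an assumed example, and the calculation is my own, not something the library performs): the interval between backups has to keep the queue under both the 120,000-message cap and the 14-day retention limit.

```python
# Illustrative capacity check: how many days may pass between backups
# before one of the SQS limits above is hit, at a given message rate?
MAX_QUEUE_MESSAGES = 120_000
MAX_RETENTION_DAYS = 14

def max_backup_interval_days(messages_per_day):
    """Days between backups before hitting the count cap or retention limit."""
    days_until_full = MAX_QUEUE_MESSAGES / messages_per_day
    return min(days_until_full, MAX_RETENTION_DAYS)
```

At 5,000 messages/day the count cap would allow 24 days, so the 14-day retention limit binds; at 60,000 messages/day the queue fills in 2 days, so the backup must run at least that often.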


python3 setup.py test

These tests will use your AWS account to instantiate a temporary integration environment.
