
Postgres Backup S3/R2


Backup and restore PostgreSQL to S3/R2/S3-compatible storage with scheduled backups, encryption, and multiple database support.

✨ Features

  • 🗂️ Multiple Database Support - Backup multiple databases in one container
  • ☁️ Multi-Storage Support - AWS S3, Cloudflare R2, or any S3-compatible service
  • ⏰ Scheduled Backups - Flexible cron scheduling
  • 🔐 Encryption - AES-256-CBC encryption support
  • 🗜️ Compression - Built-in gzip/pigz compression or PostgreSQL custom format
  • 🧹 Auto Cleanup - Automatic deletion of old backups
  • ⚡ Parallel Backups - Optional parallel backup for multiple databases
  • 🚀 Fast & Lightweight - Built with Bun runtime

📦 Quick Start

Basic Backup (AWS S3)

version: '3.8'
services:
  postgres:
    image: postgres:17-alpine
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
    volumes:
      - postgres-data:/var/lib/postgresql/data
    ports:
      - "5432:5432"

  postgres-backup:
    image: ghcr.io/johnnybui/postgres-backup-s3
    depends_on:
      - postgres
    environment:
      # Storage Configuration
      STORAGE_TYPE: S3
      S3_REGION: ap-southeast-1
      S3_BUCKET: my-db-backups
      S3_PREFIX: postgres
      S3_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID}
      S3_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY}
      
      # Postgres Configuration
      POSTGRES_HOST: postgres
      POSTGRES_PORT: 5432
      POSTGRES_DATABASE: myapp
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      
      # Backup Schedule
      SCHEDULE: "@daily"  # or "0 2 * * *" for 2 AM daily

volumes:
  postgres-data:

Run it:

docker compose up -d
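
Once the stack is up, the backup container's logs should show the configured schedule and each backup run; tailing them is a quick way to confirm the scheduler started:

docker compose logs -f postgres-backup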

🗄️ Storage Options

AWS S3 (Standard)

environment:
  STORAGE_TYPE: S3
  S3_REGION: us-west-1
  S3_BUCKET: my-backups
  S3_ACCESS_KEY_ID: AKIAXXXXXXXX
  S3_SECRET_ACCESS_KEY: xxxxxxxxxx

Cloudflare R2 (Recommended - No Egress Fees!)

environment:
  STORAGE_TYPE: R2
  R2_ACCOUNT_ID: 1234567890abcdef1234567890abcdef  # Example Cloudflare Account ID (32 chars)
  S3_BUCKET: my-backups
  S3_ACCESS_KEY_ID: your-r2-key
  S3_SECRET_ACCESS_KEY: your-r2-secret

S3-Compatible Services

Works with Minio, DigitalOcean Spaces, Wasabi, Backblaze B2, and more.

environment:
  STORAGE_TYPE: COMPATIBLE
  S3_ENDPOINT: https://sgp1.digitaloceanspaces.com
  S3_REGION: sgp1
  S3_BUCKET: my-spaces-bucket
  S3_ACCESS_KEY_ID: your-key
  S3_SECRET_ACCESS_KEY: your-secret
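
As another COMPATIBLE example, a Backblaze B2 bucket can be targeted through its S3-compatible endpoint. The endpoint and region below are illustrative and depend on where your bucket was created; check your B2 bucket settings for the exact values:

environment:
  STORAGE_TYPE: COMPATIBLE
  S3_ENDPOINT: https://s3.us-west-004.backblazeb2.com
  S3_REGION: us-west-004
  S3_BUCKET: my-b2-backups
  S3_ACCESS_KEY_ID: your-b2-key-id
  S3_SECRET_ACCESS_KEY: your-b2-application-key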

Minio Example (Self-hosted)

version: '3.8'
services:
  minio:
    image: minio/minio
    command: server /data --console-address ":9001"
    environment:
      MINIO_ROOT_USER: minioadmin
      MINIO_ROOT_PASSWORD: minioadmin
    ports:
      - "9000:9000"
      - "9001:9001"
    volumes:
      - minio-data:/data

  postgres-backup:
    image: ghcr.io/johnnybui/postgres-backup-s3
    depends_on:
      - minio
    environment:
      STORAGE_TYPE: COMPATIBLE
      S3_ENDPOINT: http://minio:9000
      S3_REGION: us-east-1
      S3_BUCKET: postgres-backups
      S3_ACCESS_KEY_ID: minioadmin
      S3_SECRET_ACCESS_KEY: minioadmin
      
      POSTGRES_HOST: postgres
      POSTGRES_DATABASE: myapp
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      SCHEDULE: "@hourly"

volumes:
  minio-data:
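
The example above assumes the postgres-backups bucket already exists. If the tool does not create it for you, a one-off run of the MinIO client can create it; this is only a sketch, and "mynet" stands in for whatever Docker network your compose stack actually uses:

# Hypothetical one-off bucket creation using the MinIO client image
docker run --rm --network mynet \
  -e MC_HOST_local=http://minioadmin:minioadmin@minio:9000 \
  minio/mc mb --ignore-existing local/postgres-backups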

📊 Storage Provider Comparison

| Provider | STORAGE_TYPE | Required Env Vars | Notes |
|---|---|---|---|
| AWS S3 | S3 | S3_REGION, S3_BUCKET, credentials | Standard AWS S3, most reliable |
| Cloudflare R2 | R2 | R2_ACCOUNT_ID, S3_BUCKET, credentials | Cheaper, no egress fees |
| Minio | COMPATIBLE | S3_ENDPOINT, S3_REGION, S3_BUCKET, credentials | Self-hosted, great for local dev |
| DigitalOcean Spaces | COMPATIBLE | S3_ENDPOINT, S3_REGION, S3_BUCKET, credentials | Regional endpoints |
| Wasabi | COMPATIBLE | S3_ENDPOINT, S3_REGION, S3_BUCKET, credentials | Cheaper storage alternative |
| Backblaze B2 | COMPATIBLE | S3_ENDPOINT, S3_REGION, S3_BUCKET, credentials | Via S3-compatible API |

🗂️ Multiple Database Support

Backup multiple databases in a single container by providing a comma-separated list:

environment:
  # Multiple databases (comma-separated)
  POSTGRES_DATABASE: "app_db,analytics_db,logs_db"
  
  # Optional: parallel backup for speed (default: no)
  PARALLEL_BACKUP: "no"  # or "yes"

Sequential vs Parallel Backup

Sequential (default):

  • Safer, less load on database
  • Backups run one after another
  • Easier to debug if errors occur

Parallel:

  • Faster for multiple large databases
  • Higher load on database server
  • Set PARALLEL_BACKUP=yes to enable

Backup All Databases

environment:
  POSTGRES_DATABASE: "all"  # Backup all databases

📅 Backup Schedule

The SCHEDULE variable uses cron syntax:

# Predefined schedules
SCHEDULE: "@hourly"   # Every hour
SCHEDULE: "@daily"    # Every day at midnight
SCHEDULE: "@weekly"   # Every week
SCHEDULE: "@monthly"  # Every month

# Custom cron syntax (minute hour day month weekday)
SCHEDULE: "0 2 * * *"      # Every day at 2 AM
SCHEDULE: "0 */6 * * *"    # Every 6 hours
SCHEDULE: "0 3 * * 0"      # Every Sunday at 3 AM
SCHEDULE: "30 4 1 * *"     # 1st day of month at 4:30 AM

Leave SCHEDULE empty or set it to None to run a one-time backup.

🔐 Encryption

Encrypt backups with AES-256-CBC:

environment:
  ENCRYPTION_PASSWORD: "your-super-strong-password"

Encrypted backups will have an .enc extension and require the same password for restore.
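
If you ever need to decrypt a backup outside the container, standard openssl should be able to do it, assuming the image uses openssl's aes-256-cbc mode; verify the exact options (salt and key-derivation flags) against the backup script before relying on this sketch:

# Hedged sketch: decrypt a downloaded backup locally, then decompress it.
# The -pbkdf2/-md options must match whatever the backup script used.
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in app_db_2025-09-27T03:51:37Z.sql.gz.enc \
  -out app_db_2025-09-27T03:51:37Z.sql.gz \
  -pass pass:"$ENCRYPTION_PASSWORD"
gunzip app_db_2025-09-27T03:51:37Z.sql.gz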

🗜️ Compression Options

Plain Text with Compression (Default)

environment:
  USE_CUSTOM_FORMAT: "no"
  COMPRESSION_CMD: "gzip"         # or "pigz" for parallel compression
  DECOMPRESSION_CMD: "gunzip -c"  # or "pigz -dc"

PostgreSQL Custom Format (Recommended for Large DBs)

environment:
  USE_CUSTOM_FORMAT: "yes"
  PARALLEL_JOBS: 4  # For faster parallel restore

Benefits:

  • Faster backups
  • Smaller backup files
  • Supports parallel restoration
  • Allows selective table/schema restore

Note: Custom format is not available with POSTGRES_DATABASE=all.
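
For context, the custom format is what lets pg_restore run with parallel workers and restore individual tables or schemas. A rough local illustration, assuming you have already downloaded (and, if needed, decrypted) the custom-format archive; the file name and table name are placeholders:

# Parallel restore of the whole archive with 4 jobs
pg_restore -h localhost -U postgres -d myapp -j 4 app_db.dump

# Restore only one table from the same archive
pg_restore -h localhost -U postgres -d myapp -t orders app_db.dump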

🧹 Auto Cleanup Old Backups

Automatically delete backups older than a specified age:

environment:
  DELETE_OLDER_THAN: "30 days ago"  # or "7 days ago", "1 week ago", etc.

⚠️ Warning: This deletes ALL files in the S3_PREFIX path, not just backups created by this tool.
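
Before enabling this, it is worth checking what actually lives under that prefix; the same AWS CLI pattern used later in the restore section works here:

docker compose run --rm postgres-backup \
  sh -c 'aws $AWS_ARGS s3 ls s3://$S3_BUCKET/$S3_PREFIX/'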

🎯 Manual Backup Trigger

You can manually trigger a backup at any time (useful for testing or on-demand backups):

Method 1: Direct Shell Script (Recommended for Manual Trigger)

# Backup databases configured in POSTGRES_DATABASE env var
docker compose exec postgres-backup /backup.sh

# Override to backup specific databases (comma-separated)
docker compose exec -e POSTGRES_DATABASE="db1,db2,db3" postgres-backup /backup.sh

# Override to backup specific databases (space-separated)  
docker compose exec -e POSTGRES_DATABASE="db1 db2 db3" postgres-backup /backup.sh

# From host machine (if container is named 'postgres-backup')
docker exec postgres-backup /backup.sh

# Override from host
docker exec -e POSTGRES_DATABASE="specific_db" postgres-backup /backup.sh

Method 2: One-Time Container (Creates New Container)

# Unset SCHEDULE to run one-time backup via scheduler
docker compose run --rm -e SCHEDULE="" postgres-backup

# Or override specific databases
docker compose run --rm \
  -e SCHEDULE="" \
  -e POSTGRES_DATABASE="db1,db2" \
  postgres-backup

Notes:

  • Method 1 calls the backup script directly in the running container (instant)
  • Method 2 starts a new container, runs backup, then exits (slower startup)
  • The backup script automatically detects and loops through multiple databases
  • Works with both comma-separated and space-separated database lists
  • Each database is backed up sequentially with clear progress indicators
  • If SCHEDULE is set and you call the binary directly, it will wait for cron (not instant)

🔄 Restore Database

Use docker compose run to restore a specific backup:

Basic Restore

docker compose run --rm \
  -e BACKUP_FILE="postgres/app_db_2025-09-27T03:51:37Z.sql.gz" \
  -e POSTGRES_DATABASE="app_db" \
  postgres-backup

All other environment variables (storage config, credentials) are inherited from docker-compose.yml.

List Available Backups

# Use AWS CLI to list backups
docker compose run --rm postgres-backup \
  sh -c 'aws $AWS_ARGS s3 ls s3://$S3_BUCKET/$S3_PREFIX/'

Common Restore Scenarios

Drop and recreate database

docker compose run --rm \
  -e BACKUP_FILE="postgres/mydb_2025-09-27T03:51:37Z.sql.gz" \
  -e POSTGRES_DATABASE="mydb" \
  -e DROP_DATABASE="yes" \
  -e CREATE_DATABASE="yes" \
  -e POSTGRES_EXTRA_OPTS="" \
  postgres-backup

Restore to different database

docker compose run --rm \
  -e BACKUP_FILE="postgres/prod_db_2025-09-27T03:51:37Z.sql.gz" \
  -e POSTGRES_DATABASE="staging_db" \
  -e CREATE_DATABASE="yes" \
  postgres-backup

Restore encrypted backup

# Encryption password inherited from docker-compose.yml
docker compose run --rm \
  -e BACKUP_FILE="postgres/mydb_2025-09-27T03:51:37Z.sql.gz.enc" \
  -e POSTGRES_DATABASE="mydb" \
  postgres-backup

Restore to different server

docker compose run --rm \
  -e BACKUP_FILE="postgres/app_db_2025-09-27T03:51:37Z.sql.gz" \
  -e POSTGRES_DATABASE="app_db" \
  -e POSTGRES_HOST="prod-db.example.com" \
  -e POSTGRES_USER="prod_user" \
  -e POSTGRES_PASSWORD="prod_password" \
  postgres-backup

Important Restore Notes

  • Set POSTGRES_EXTRA_OPTS="" if the backup was created with the --clean flag
  • Use CREATE_DATABASE=yes when restoring to a non-existent database
  • Use DROP_DATABASE=yes with caution; it destroys existing data
  • Encrypted backups require the same ENCRYPTION_PASSWORD used at backup time
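
If the backup was created with USE_CUSTOM_FORMAT=yes, the same docker compose run pattern applies and PARALLEL_JOBS speeds up the restore. The file name and extension below are illustrative; use whatever archive actually appears in your bucket:

docker compose run --rm \
  -e BACKUP_FILE="postgres/app_db_2025-09-27T03:51:37Z.dump" \
  -e POSTGRES_DATABASE="app_db" \
  -e CREATE_DATABASE="yes" \
  -e PARALLEL_JOBS="4" \
  postgres-backup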

📋 Environment Variables

Storage Configuration

| Variable | Default | Required | Description |
|---|---|---|---|
| STORAGE_TYPE | S3 | No | Storage type: S3, R2, or COMPATIBLE |
| S3_BUCKET | | Yes | Bucket name |
| S3_PREFIX | backup | No | Path prefix in bucket |
| S3_ACCESS_KEY_ID | | Yes | Access key |
| S3_SECRET_ACCESS_KEY | | Yes | Secret key |

For AWS S3

| Variable | Default | Required | Description |
|---|---|---|---|
| S3_REGION | us-west-1 | No | AWS region |

For Cloudflare R2

| Variable | Default | Required | Description |
|---|---|---|---|
| R2_ACCOUNT_ID | | Yes | Cloudflare account ID |

For S3-Compatible Services

| Variable | Default | Required | Description |
|---|---|---|---|
| S3_ENDPOINT | | Yes | Full endpoint URL |
| S3_REGION | us-east-1 | No | Region |

PostgreSQL Configuration

| Variable | Default | Required | Description |
|---|---|---|---|
| POSTGRES_DATABASE | | Yes | Database name(s), comma-separated, or all |
| POSTGRES_HOST | | Yes | PostgreSQL host |
| POSTGRES_PORT | 5432 | No | PostgreSQL port |
| POSTGRES_USER | | Yes | PostgreSQL user |
| POSTGRES_PASSWORD | | Yes | PostgreSQL password |
| POSTGRES_EXTRA_OPTS | | No | Extra pg_dump/psql options (e.g., --clean --if-exists) |

Backup Options

| Variable | Default | Required | Description |
|---|---|---|---|
| SCHEDULE | | No | Cron schedule or @daily, @hourly, etc. |
| PARALLEL_BACKUP | no | No | Parallel backup for multiple databases (yes/no) |
| ENCRYPTION_PASSWORD | | No | Password for encryption |
| DELETE_OLDER_THAN | | No | Auto-delete backups older than this (e.g., 30 days ago) |
| USE_CUSTOM_FORMAT | no | No | Use PostgreSQL custom format (yes/no) |
| COMPRESSION_CMD | gzip | No | Compression command (e.g., pigz) |
| DECOMPRESSION_CMD | gunzip -c | No | Decompression command |

Restore Options

| Variable | Default | Required | Description |
|---|---|---|---|
| BACKUP_FILE | | For restore | Path to backup file in bucket (e.g., backup/db_2025-09-27.sql.gz) |
| CREATE_DATABASE | no | No | Create database if not exists (yes/no) |
| DROP_DATABASE | no | No | Drop database before restore (yes/no) |
| PARALLEL_JOBS | 1 | No | Parallel jobs for pg_restore with custom format |

Advanced Options

| Variable | Default | Required | Description |
|---|---|---|---|
| S3_S3V4 | no | No | Use AWS Signature Version 4 (for Minio, set to yes) |

🏗️ Complete Production Example

version: '3.8'

services:
  postgres:
    image: postgres:17-alpine
    restart: unless-stopped
    environment:
      POSTGRES_USER: ${DB_USER:-postgres}
      POSTGRES_PASSWORD: ${DB_PASSWORD}
    volumes:
      - postgres-data:/var/lib/postgresql/data
    networks:
      - db-network
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 10s
      timeout: 5s
      retries: 5

  postgres-backup:
    image: ghcr.io/johnnybui/postgres-backup-s3
    restart: unless-stopped
    depends_on:
      postgres:
        condition: service_healthy
    environment:
      # Storage - Cloudflare R2
      STORAGE_TYPE: R2
      R2_ACCOUNT_ID: ${R2_ACCOUNT_ID}
      S3_BUCKET: ${BACKUP_BUCKET}
      S3_PREFIX: postgres-prod
      S3_ACCESS_KEY_ID: ${R2_ACCESS_KEY}
      S3_SECRET_ACCESS_KEY: ${R2_SECRET_KEY}
      
      # Multiple Databases
      POSTGRES_HOST: postgres
      POSTGRES_PORT: 5432
      POSTGRES_DATABASE: "app_production,analytics,logs"
      POSTGRES_USER: ${DB_USER:-postgres}
      POSTGRES_PASSWORD: ${DB_PASSWORD}
      POSTGRES_EXTRA_OPTS: "--clean --if-exists"
      
      # Backup Settings
      SCHEDULE: "0 3 * * *"  # 3 AM daily
      PARALLEL_BACKUP: "no"
      USE_CUSTOM_FORMAT: "yes"
      ENCRYPTION_PASSWORD: ${BACKUP_ENCRYPTION_KEY}
      DELETE_OLDER_THAN: "30 days ago"
    networks:
      - db-network

networks:
  db-network:
    driver: bridge

volumes:
  postgres-data:

Create .env file:

DB_USER=postgres
DB_PASSWORD=super-secure-password
R2_ACCOUNT_ID=abc123def456
R2_ACCESS_KEY=your-r2-key
R2_SECRET_KEY=your-r2-secret
BACKUP_BUCKET=prod-db-backups
BACKUP_ENCRYPTION_KEY=ultra-secure-encryption-key
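
Since this file holds credentials, keep it out of version control and restrict who can read it:

chmod 600 .env
echo ".env" >> .gitignore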

🛠️ Development

Local Build

# Install dependencies
bun install

# Run locally
bun run src/index.ts

# Build
bun run build

Build Docker Image

docker build -f Dockerfile -t postgres-backup-s3 .
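
To smoke-test the locally built image without waiting for a schedule, run it once against any reachable Postgres and storage backend. Everything below is a placeholder, and --network host (Linux) is just one way to let the container reach services running on the host:

# One-off run of the locally built image (placeholder values)
docker run --rm --network host \
  -e STORAGE_TYPE=COMPATIBLE \
  -e S3_ENDPOINT=http://localhost:9000 \
  -e S3_REGION=us-east-1 \
  -e S3_BUCKET=postgres-backups \
  -e S3_ACCESS_KEY_ID=minioadmin \
  -e S3_SECRET_ACCESS_KEY=minioadmin \
  -e POSTGRES_HOST=localhost \
  -e POSTGRES_DATABASE=myapp \
  -e POSTGRES_USER=postgres \
  -e POSTGRES_PASSWORD=postgres \
  -e SCHEDULE="" \
  postgres-backup-s3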

📝 License

MIT License

Copyright for portions of project postgres-backup-s3 is held by ITBM, 2019, as part of project postgresql-backup-s3. All other copyright for project postgres-backup-s3 is held by johnnybui, 2025.

See LICENSE for full details.

🙏 Credits

This project is inspired by and based on itbm/postgresql-backup-s3.

Key Features

  • ✅ Built with Bun + TypeScript (fast, modern)
  • ✅ Multiple database support
  • ✅ Native Cloudflare R2 support
  • ✅ S3-compatible service support (Minio, DigitalOcean Spaces, etc.)
  • ✅ Parallel backup option
  • ✅ Comprehensive error handling
  • ✅ Production-ready Docker Compose examples

🤝 Contributing

Contributions welcome! Please feel free to submit a Pull Request.

📮 Support

For issues, questions, or feature requests, please open an issue on GitHub.
