Backup Postgres Database

Basic Usage

This simple Python package lets you easily create backups of Postgres databases. You can upload them to cloud storage buckets, for example from a cron job.

    from postgres_backup import Backup

    # Instantiate the backup object with Postgres database_uri
    backup = Backup()

    # Create the file for backup
    backup.create()

You should have the environment variable DATABASE_URL set to the URI of the Postgres database. This URI has the following structure: postgresql://[user[:password]@][host][:port]/[dbname].
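
For example, here is a minimal sketch of a script that could be scheduled with cron. The connection URI, script path, and cron schedule are placeholders, and it assumes Backup falls back to DATABASE_URL when no URI is passed, as the usage above suggests:

    import os

    from postgres_backup import Backup

    # Placeholder URI; in practice DATABASE_URL should already be set
    # in the environment rather than hard-coded in the script
    os.environ.setdefault(
        'DATABASE_URL',
        'postgresql://user:password@localhost:5432/mydb',
    )

    # Create the backup file; to run this nightly, schedule the script
    # with a cron entry such as: 0 3 * * * python3 /path/to/backup_job.py
    backup = Backup()
    backup.create()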

You can also specify a list of the tables for which you want to create the backup:

    backup.create(table_names=['table1', 'table2', ...])

Why?

This package has a proven track record of working well for small- to mid-sized databases.

By uploading backups to buckets of your choice, you make sure you can store them without relying on a single cloud provider or region.

Bucket Storage

The package provides the ability to store those backups in cloud buckets.

Google Cloud Storage

To use this functionality, you need to install the package's optional dependencies:

    pip3 install "postgres-backup[gcs]"

This will also install the google package.

Then, once the backup has been created, we continue with:

    from postgres_backup.schemas import CloudProviders

    # Upload it to Google Cloud Storage
    backup.upload(
        provider=CloudProviders.gcs.value,
    )

Here, google_cloud_credentials is a dictionary with the key-value pairs of the client API keys:

    google_cloud_credentials = {
      "type": "service_account",
      "project_id": "xxx-saas",
      "private_key_id": "xxxxxxxx",
      "private_key": "-----BEGIN PRIVATE KEY-----\nxxxxxxxxxx\n-----END PRIVATE KEY-----\n",
      "client_email": "xxx@xxx-saas.iam.gserviceaccount.com",
      "client_id": "xxx",
      "auth_uri": "https://accounts.google.com/o/oauth2/auth",
      "token_uri": "https://oauth2.googleapis.com/token",
      "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
      "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/xxx%xxx-saas.iam.gserviceaccount.com"
    }

It is recommended to provide each key as an environment variable:

  • GOOGLE_CLOUD_TYPE -> type
  • GOOGLE_CLOUD_PROJECT_ID -> project_id
  • GOOGLE_CLOUD_PRIVATE_KEY_ID -> private_key_id
  • GOOGLE_CLOUD_PRIVATE_KEY -> private_key
  • GOOGLE_CLOUD_CLIENT_EMAIL -> client_email
  • GOOGLE_CLOUD_CLIENT_ID -> client_id
  • GOOGLE_CLOUD_AUTH_URI -> auth_uri
  • GOOGLE_CLOUD_TOKEN_URI -> token_uri
  • GOOGLE_CLOUD_AUTH_PROVIDER_X509_CERT_URL -> auth_provider_x509_cert_url
  • GOOGLE_CLOUD_CLIENT_X509_CERT_URL -> client_x509_cert_url

Additionally, you need PROJECT_NAME and BUCKET_NAME for the Google bucket, and finally DATABASE_URL for the Postgres database.
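
If you need the assembled dictionary, here is a minimal sketch of rebuilding google_cloud_credentials from those environment variables (this snippet is illustrative, not part of the package):

    import os

    # Rebuild the service-account credentials dict from the
    # GOOGLE_CLOUD_* environment variables listed above
    google_cloud_credentials = {
        'type': os.environ['GOOGLE_CLOUD_TYPE'],
        'project_id': os.environ['GOOGLE_CLOUD_PROJECT_ID'],
        'private_key_id': os.environ['GOOGLE_CLOUD_PRIVATE_KEY_ID'],
        # Private keys stored in env vars usually carry escaped newlines
        'private_key': os.environ['GOOGLE_CLOUD_PRIVATE_KEY'].replace('\\n', '\n'),
        'client_email': os.environ['GOOGLE_CLOUD_CLIENT_EMAIL'],
        'client_id': os.environ['GOOGLE_CLOUD_CLIENT_ID'],
        'auth_uri': os.environ['GOOGLE_CLOUD_AUTH_URI'],
        'token_uri': os.environ['GOOGLE_CLOUD_TOKEN_URI'],
        'auth_provider_x509_cert_url': os.environ['GOOGLE_CLOUD_AUTH_PROVIDER_X509_CERT_URL'],
        'client_x509_cert_url': os.environ['GOOGLE_CLOUD_CLIENT_X509_CERT_URL'],
    }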

If we do not already have a bucket for storing the backups, we can pass additional parameters to create one:

    from postgres_backup.schemas import CloudStorageType, CloudProviders

    backup.upload(
        provider=CloudProviders.gcs.value,
        bucket_name=bucket_name,
        create_bucket=True,
        storage_class=CloudStorageType.NEARLINE.value
    )
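
For context, NEARLINE is one of Google Cloud Storage's storage classes (alongside STANDARD, COLDLINE, and ARCHIVE); it is priced for data accessed roughly once a month or less, which usually suits periodic backups. CloudStorageType presumably exposes these same class names.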

Amazon Web Services

To upload to AWS after creating the backup, you first need to install the optional dependencies:

    pip3 install "postgres-backup[aws]"

After that, you can call the upload method of the Backup object:

    # Upload it to AWS storage
    backup.upload(
        provider=CloudProviders.aws.value,
    )

It requires the environment variables AWS_SERVER_PUBLIC_KEY, AWS_SERVER_PRIVATE_KEY, and REGION_NAME to be set.
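
Putting the pieces together, here is a minimal end-to-end AWS sketch (the credential values and region are placeholders; in practice they should already be set in the environment rather than in code):

    import os

    from postgres_backup import Backup
    from postgres_backup.schemas import CloudProviders

    # Placeholder credentials; set these outside the script in practice
    os.environ.setdefault('AWS_SERVER_PUBLIC_KEY', 'xxx')
    os.environ.setdefault('AWS_SERVER_PRIVATE_KEY', 'xxx')
    os.environ.setdefault('REGION_NAME', 'eu-west-1')

    # Create the backup file and push it to AWS storage
    backup = Backup()
    backup.create()
    backup.upload(
        provider=CloudProviders.aws.value,
    )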
