The purpose of this script is to offer a generic way to back up files, databases, or other content, and to send those backups to remote hosts.


This script relies on the pysftp and requests modules:

sudo pip install pysftp
sudo pip install requests

Sidenote: since this script uses Python 3, make sure that pip installs the modules for Python 3. If your system ships with Python 2 as the default interpreter, pip install pysftp will install pysftp for Python 2. In that case, install pip3 and run:

sudo pip3 install pysftp
sudo pip3 install requests

How does it work?

There are two customizable steps in this process:


This first step copies files or dumps databases and packs everything into a single .tar.gz file. All the logic behind this is contained in a plugin. If you cannot find a suitable plugin (check the /plugins dir) for the content you're trying to save, don't hesitate to create a pull request.

A plugin is quite simple: all it does is run commands to create that single file and return its complete file path. It also contains a clean function that deletes any temporary files created during its execution.
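As a sketch, a minimal plugin for a filesystem-style profile might look like the following. The class name, constructor signature, and config keys are illustrative assumptions, not the project's actual plugin API:

```python
import os
import tarfile
import tempfile

# Hypothetical plugin sketch: pack configured paths into one .tar.gz,
# return its path, and clean up afterwards. The real plugin interface
# in /plugins may differ.
class FilesystemPlugin:
    def __init__(self, config):
        # `config` would be the backup profile node from config.json
        self.paths = config.get("paths", [])
        self.archive = None

    def create_backup_file(self):
        """Create a single .tar.gz and return its complete file path."""
        fd, self.archive = tempfile.mkstemp(suffix=".tar.gz")
        os.close(fd)
        with tarfile.open(self.archive, "w:gz") as tar:
            for path in self.paths:
                tar.add(path, arcname=os.path.basename(path))
        return self.archive

    def clean(self):
        """Delete temporary files created during execution."""
        if self.archive and os.path.exists(self.archive):
            os.remove(self.archive)
            self.archive = None
```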


Once the archive is ready, the script sends it to each configured target. Backups can either be uploaded to remote targets or copied to a local directory.

Remote (SFTP) target configuration

Note: SSH Public Key Authentication must be set up or the script won't connect to your remote backup targets:

  1. Generate SSH private (~/.ssh/id_rsa) & public (~/.ssh/ keys if not already done:

    ssh-keygen -t rsa
  2. Copy the public key to the remote server (replace user &

    ssh-copy-id -i ~/.ssh/ user@

    If ~/.ssh/ is your default and only ssh key, you can omit the -i option and simply use

    ssh-copy-id user@
  3. You should now be able to connect without a password:

    ssh user@
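With key authentication in place, the settings a "remote" target maps to can be sketched as follows. The helper name is a hypothetical illustration (the script's real internals may differ); the id_rsa path matches the key generated above:

```python
import os

# Sketch: build pysftp.Connection() keyword arguments for a "remote"
# target, relying on public key authentication (no password).
def sftp_settings(target):
    return {
        "host": target["host"],
        "port": target.get("port", 22),  # optional, default 22
        "username": target["user"],
        "private_key": os.path.expanduser("~/.ssh/id_rsa"),
    }

# Usage (requires a reachable host with the public key installed):
# import pysftp
# with pysftp.Connection(**sftp_settings(target)) as sftp:
#     with sftp.cd(target["dir"]):
#         sftp.put("/tmp/backup.tar.gz")
```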

For both remote and local targets, make sure the user has write permission on the destination directory.


Copy or rename config-sample.json to config.json.

Note: we switched from to config.json. Use --migrate to create config.json.


You can receive an e-mail when a backup fails for whatever reason. Just customize the alert_emails array with one or more email addresses. A null value, an empty array ([]), or an array containing a single empty string ([""]) will disable this feature.

"alert_email": ["", "", ""],

Once done, run the ./ --test-mails command to check that it works.
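The disable rule described above can be sketched as a small predicate (a hypothetical helper for illustration, not part of the script):

```python
# Sketch: e-mail alerts are disabled for None, [] or [""].
def alerts_enabled(alert_emails):
    if not alert_emails:       # None or empty array
        return False
    if alert_emails == [""]:   # single empty string
        return False
    return True
```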

Backup profiles

You can add as many backup profiles as you wish.

"my_backup": {              // That's the backup name: no special chars or spaces, please
    "profile": "",          // That's the name of the plugin ("mysql", "filesystem" or whatever)

                            // The whole backup node is sent to the plugin:
    "databases": ["myDb"],  // here are some specific keys

Check config-sample.json for some examples: it contains a sample configuration for each plugin.


Each backup profile will then be sent/copied to every configured target. A target can be one of the following:

Remote (SFTP)

    "type":     "remote",
    "host":     "",  // Can either be a local/remote IP address
    "port":     22,                 // Optional, default 22
    "user":     "john",
    "dir":      "/home/john/backups/",
    "days_to_keep": 7               // You can override global DAYS_TO_KEEP for each target

Local (simple copy)

    "type":     "local",
    "dir":      "/home/john/backups/",
    "days_to_keep": 7               // You can override global DAYS_TO_KEEP for each target


FTP

    "type":     "ftp",
    "host":     "",  // Can either be a local/remote IP address
    "port":     21,                 // Optional, default 21
    "user":     "john",
    "password": "j0hn",
    "dir":      "/home/john/backups/"


hubiC

Note: this target needs the python-swiftclient package.

Note2: hubiC dependencies do not support Python 3.2.

Note3: this is very experimental.

    "type":             "hubic",
    "dir":              "/Backups/",
    "client_id":        "",
    "client_secret":    "",
    "username":         "",
    "password":         "",
    "container":        "default",
    "days_to_keep":     7           // You can override global DAYS_TO_KEEP for each target

Note: setting days_to_keep to -1 will disable backup rotation.

Important note: the "dir" directory must only contain backups from this instance. Any other file could be deleted during backup rotation.

Once done, run the ./ --test-config command to check if everything's OK.


You can either run it in its interactive mode (default), or specify the backup profile you want to run:

# Interactive mode:

# or
./ --backup my_backup

# or all:
./ --all

You can also restrict the run to a single target by specifying its index in the targets list:

./ --all --target 0

You can configure a daily cron job with crontab -e by adding the following line to the cron file:

0 0 * * * /home/user/Simple-Backup-Script/ --all

Plugin-specific considerations


PostgreSQL

We have to be careful about permissions. You'll have to create a .pgpass file in the cron user's home directory with the following syntax: hostname:port:database:username:password, for example:


If in doubt, you can check postgres's port with sudo netstat -plunt | grep postgres.

Then run chmod 600 /home/cron_user/.pgpass. Afterwards, ls -al /home/cron_user/ must show:

-rw-------  1 cron_user cron_user   [...] .pgpass

