The purpose of this script is to offer a generic way to back up files, databases, or anything else, and to send those backups to remote hosts.
This script relies on the `pysftp` and `requests` modules:

```
pip3 install pysftp requests
```
Sidenote: please be aware that since we use Python 3, you have to make sure that `pip` installs the modules for Python 3. If your system ships with Python 2 as the default interpreter, `pip install pysftp` will install `pysftp` for Python 2. In that case, you might want to install `pip3` and run:

```
pip3 install pysftp
pip3 install requests
```
Optional: if you want errors to be reported to Sentry, install its SDK:

```
pip3 install --upgrade sentry-sdk
```
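To quickly check that the required modules are visible to Python 3, you can run this one-liner (a simple sanity check, not part of the script):

```
python3 -c "import pysftp, requests"
```

If it prints nothing, the modules are installed for the right interpreter.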
There are two customizable steps in this process: creating the backup archive, then transferring it to the configured targets.

The first step copies files or dumps databases and puts everything in a single `.tar.gz` file. All the logic behind this is contained within a plugin. If you cannot find a suitable plugin for the content you're trying to save (check the `/plugins` dir), don't hesitate to create a pull request.

A plugin is quite simple: all it does is run commands to create the single file and return its complete file path. It also contains a `clean` function to delete temporary files created during its execution.
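For illustration, a plugin could look roughly like the sketch below. The function names, signature, and the `config['path']` key are assumptions made for the example; check the existing plugins in the `/plugins` dir for the real interface.

```
# Hypothetical plugin sketch; see the /plugins dir for real examples.
import subprocess

ARCHIVE = '/tmp/my_backup.tar.gz'  # assumed temporary location

def backup(config):
    """Create a single .tar.gz archive and return its complete file path."""
    subprocess.check_call(['tar', 'czf', ARCHIVE, config['path']])
    return ARCHIVE

def clean():
    """Delete temporary files created during backup()."""
    subprocess.check_call(['rm', '-f', ARCHIVE])
```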
Once the archive is ready, the second step sends it to each target: backup files can either be uploaded to remote targets or copied to a local directory.
Note: SSH public key authentication must be set up or the script won't connect to your remote backup targets:

- Generate SSH private (`~/.ssh/id_rsa`) and public (`~/.ssh/id_rsa.pub`) keys if not already done:

  ```
  ssh-keygen -t rsa
  ```

- Copy the public key to the remote server (replace `user` and `123.45.56.78`):

  ```
  ssh-copy-id -i ~/.ssh/id_rsa.pub user@123.45.56.78
  ```

  If `~/.ssh/id_rsa.pub` is your default and only SSH key, you can omit the `-i` option and simply use `ssh-copy-id user@123.45.56.78`.

- Make sure you can successfully connect without any password:

  ```
  ssh user@123.45.56.78
  ```
In both remote and local modes, you have to make sure that the user has the right to write to the destination directory.
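One quick way to check write access on a remote target (reusing the example host and directory from above):

```
ssh user@123.45.56.78 'touch /home/john/backups/.write_test && rm /home/john/backups/.write_test'
```

If the command exits without an error, the user can write to the directory.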
Copy or rename `config-sample.json` to get `config.json`.

Note: we switched from `config.py` to `config.json`. Use `backup.py --migrate` to create `config.json`.
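For example:

```
cp config-sample.json config.json
```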
You can receive an e-mail when a backup fails for whatever reason. Just customize the `alert_email` array with one or more email addresses. A null value, an empty array (`[]`), or an array containing a single empty string (`[""]`) will disable this feature.

```
"alert_email": ["john@doe.com", "it@doe.com", "accounting@doe.com"],
```
Once done, run the `./backup.py --test-mails` command to check that it works.
Errors can be reported to Sentry:

```
"sentry_dsn": "https://123456789abcdef@o1237.ingest.sentry.io/1234567"
```
You can add as many backup profiles as you wish.
"my_backup": { // That's the backup name: no special chars nor spaces please
"profile": "", // That's the name of the plugin ("mysql", "filesystem" or whatever)
// The whole backup node is sent to the plugin:
"databases": ["myDb"], // here are some specific keys
}
Check `config-sample.json` for some examples: it contains a sample configuration for each plugin.
Each backup profile will then be sent/copied to every target configured. A target can be one of the following:
```
{
    "type": "remote",
    "host": "bkup.domain.com", // Can be a hostname or a local/remote IP address
    "port": 22, // Optional, default 22
    "user": "john",
    "dir": "/home/john/backups/",
    "days_to_keep": 7 // You can override global DAYS_TO_KEEP for each target
}
```
```
{
    "type": "local",
    "dir": "/home/john/backups/",
    "days_to_keep": 7 // You can override global DAYS_TO_KEEP for each target
}
```
```
{
    "type": "ftp",
    "host": "bkup.domain.com", // Can be a hostname or a local/remote IP address
    "port": 21, // Optional, default 21
    "user": "john",
    "password": "j0hn",
    "dir": "/home/john/backups/"
}
```
Note: setting `days_to_keep` to `-1` will disable backups rotation.
Important note: the `"dir"` directory must only contain backups from this instance. Any other file could be deleted during backups rotation.
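Putting the pieces together, a complete `config.json` could look roughly like the sketch below. The exact top-level key names (`backups`, `targets`) are assumptions made for the example; treat `config-sample.json` as the authoritative reference:

```
{
    "alert_email": ["john@doe.com"],
    "days_to_keep": 7,
    "backups": {
        "my_backup": {
            "profile": "mysql",
            "databases": ["myDb"]
        }
    },
    "targets": [
        {
            "type": "local",
            "dir": "/home/john/backups/"
        }
    ]
}
```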
Once done, run the `./backup.py --test-config` command to check that everything's OK.
You can either run it in its interactive mode (default), or specify the backup profile you want to run:
```
# Interactive mode:
./backup.py
# or a single profile:
./backup.py --backup my_backup
# or all profiles:
./backup.py --all
```
You can also restrict a run to a single target by specifying its index in the targets list:

```
./backup.py --all --target 0
```
You can configure a daily cron job using `crontab -e`: add the following line to the cron file:

```
0 0 * * * /home/user/Simple-Backup-Script/backup.py --all
```
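Since the crontab entry calls the script directly, it must be executable by the cron user:

```
chmod +x /home/user/Simple-Backup-Script/backup.py
```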
We have to be careful about authorizations. You'll have to create a `.pgpass` file in the cron user's home directory with the following syntax: `hostname:port:database:username:password`, for example:

```
localhost:5432:db_name:db_user:password
```

If in doubt, you can check Postgres's port with `sudo netstat -plunt | grep postgres`.

Then, run `chmod 600 /home/cron_user/.pgpass`. Afterwards, `ls -al /home/cron_user/` must show:

```
-rw------- 1 cron_user cron_user [...] .pgpass
```
When building your profiles list, please be aware of permission issues that may arise. Your (unprivileged) backups user should be able to access all the files referenced in this script's configuration file. For database dumps, it's recommended to create a dedicated user with limited privileges (`SELECT` and `LOCK TABLES` should be fine).
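For MySQL, for instance, such a user could be created like this (the user name and password are placeholders; adjust `*.*` to restrict the grant to the databases you actually back up):

```
mysql -u root -p <<'SQL'
CREATE USER 'backups'@'localhost' IDENTIFIED BY 'choose-a-strong-password';
GRANT SELECT, LOCK TABLES ON *.* TO 'backups'@'localhost';
FLUSH PRIVILEGES;
SQL
```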
For filesystem archives, you can use ACLs to grant read permissions (replace `USER="backups"` with the name of your Unix user):
```
function set_acl() {
    # Grant the backups user read access to a folder, if not already done
    USER="backups"
    FOLDER="$1"
    # Count the ACL entries (access + default) already granting read/execute to $USER
    ACL=$(getfacl "$FOLDER" | grep -E "$USER:..x$" -ic)
    if [[ "$ACL" -lt 2 ]]; then
        echo "setting missing ACLs on $FOLDER"
        # -m sets the access ACL recursively; -d sets the default ACL so that
        # files created later inherit it
        setfacl -R -m u:"$USER":rX "$FOLDER" && setfacl -dR -m u:"$USER":rX "$FOLDER"
    fi
}

set_acl "my/path/to/backup"
```