This repository has been archived by the owner on Feb 24, 2019. It is now read-only.

How backups are handled

Doverstav edited this page Jan 3, 2018 · 1 revision

Daily backups are made of Gakusei's database and saved both locally and on a remote server. The tools used for this are pg_dump and rsync, and cron runs the backup script daily.

The script consists of three phases:

  • Deletion of old backups
    • This is achieved by comparing filenames: each backup is named after the date of its creation, so a backup's age can be read directly from its name. Compare the filename to some cut-off date and, if the file is older than that, remove it with rm.
  • Dump current database
    • Done with pg_dump -U <user> <database> > <filename>. (Note that pg_dump needs the database name; without it, it falls back to defaults such as the PGDATABASE environment variable.)
  • Sync files to remote server
    • Use rsync to sync files to the remote server. For Gakusei: rsync -a --delete <localFile> <user>@<host>:<remoteFile>, where -a copies subdirectories and preserves symlinks, timestamps, etc., and --delete removes locally deleted files from the remote as well.
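The deletion phase above can be sketched as a small shell function. This is a minimal illustration, not Gakusei's actual script: the function name, the .sql suffix, and the 30-day retention window are assumptions. It relies on the fact that ISO dates (YYYY-MM-DD) sort lexically in the same order as chronologically, so a plain string comparison doubles as a date comparison.

```shell
#!/bin/bash
# Sketch of the "delete old backups" phase. Backups are assumed to be
# named after their creation date, e.g. 2018-01-03.sql.
prune_old_backups() {
  local dir=$1
  local retention_days=${2:-30}   # assumed retention window
  local cutoff
  cutoff=$(date -d "$retention_days days ago" +%Y-%m-%d)

  for f in "$dir"/*.sql; do
    [ -e "$f" ] || continue             # skip if the glob matched nothing
    local name
    name=$(basename "$f" .sql)          # e.g. 2018-01-03
    # ISO date strings compare lexically the same way they compare
    # chronologically, so "<" is effectively "older than".
    if [[ "$name" < "$cutoff" ]]; then
      rm -- "$f"
    fi
  done
}
```

In a real run, the dump and rsync phases would follow this function in the same script.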

Finally, as mentioned, cron is used for scheduling. crontab -e opens your cron table for editing, or creates an empty one for you if you don't have one. There, you can schedule various tasks. For example, in order to run Gakusei's backup script every day at 23:59, add the line 59 23 * * * bash <location of backupscript>.
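As a concrete illustration, such a crontab entry could also redirect the script's output to a log file, which makes failed backups much easier to diagnose. The script path and log location below are hypothetical placeholders, not Gakusei's actual paths:

```shell
# m  h   dom mon dow  command
59  23   *   *   *    bash /opt/gakusei/backup.sh >> /var/log/gakusei-backup.log 2>&1
```

The five fields are minute, hour, day of month, month, and day of week; an asterisk means "every".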

Should disaster strike and you need to recover the database from one of the backups, you can simply use psql <database> < <backupFile>. This approach has some caveats (for example, the target database must already exist and be empty), so pg_restore might be worth looking into as well.
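If pg_restore is preferred, note that it cannot read plain-SQL dumps; the dump must be made in a non-plain format. A minimal sketch using the custom format, assuming a running PostgreSQL server (the user, database, and file names are placeholders):

```shell
# Dump in the custom format (-Fc), which produces a restorable archive
# rather than a plain SQL script:
pg_dump -U <user> -Fc <database> > backup.dump

# pg_restore can then load it into an existing database; --clean drops
# existing objects before recreating them:
pg_restore -U <user> -d <database> --clean backup.dump
```

The custom format also allows selective restores (single tables, schema only) that a plain-SQL dump replayed through psql cannot offer.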