
Commit

Initial commit
ramirojoaquin committed Nov 19, 2018
0 parents commit dbeffcc
Showing 14 changed files with 2,093 additions and 0 deletions.
674 changes: 674 additions & 0 deletions LICENSE

Large diffs are not rendered by default.

179 changes: 179 additions & 0 deletions README.md
@@ -0,0 +1,179 @@
# Vesta Control Panel Borg Incremental Backups
A series of bash scripts to perform incremental backups of Vesta Control Panel users and server config, using Borg Backup as the backend.

### The problem
Vesta CP provides a backup system by default. It creates a tar archive for each user every day (by default, 10 copies are kept). But this way of making backups has some disadvantages when you have many users:
* Server overload. Each time the backup runs, a complete copy of the user's files is saved.
* Disk space consumption. Each backup copy contains a full backup of the user's files, so it is very easy to run out of disk space.

### The solution
An incremental backup is one in which successive copies of the data contain only the portion that has changed since the preceding backup copy was made. This way you can store many backup points without making a full backup each time.

Borg Backup does an excellent job at incremental backups, and it provides useful features such as compression, encryption, and good performance.
You can get more info at https://www.borgbackup.org/
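
For orientation, this is roughly how Borg works with repositories and archives (a hedged illustration using this project's default repository layout; the scripts below wrap these commands for you):
```
borg init --encryption=none /backup/borg/home/admin            # create an empty repository
borg create /backup/borg/home/admin::2018-05-20 /home/admin    # first archive stores everything
borg create /backup/borg/home/admin::2018-05-21 /home/admin    # later archives store only changed data
borg list /backup/borg/home/admin                              # list the archives in the repository
```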

# How the script collection works
The main backup script is designed to be run every day via cronjob. It saves a snapshot of all users, the server config, and the Vesta directory into separate Borg repositories.

These Borg repositories are stored under `/backup` by default, organized in folders.

Then, different scripts are provided to automatically restore users, web domains, mail domains, or databases if needed.

Additionally, users who are no longer active can be archived: they are saved into an offline archive directory and the incremental backups no longer run for them, which also saves disk space.

# Install
I use Debian 9. I have not tested it on other distros, but you should not run into any major problems.

## Requirements
* Vesta CP running
* Borg Backup

### 1- Borg install
```
apt update
apt install borgbackup
```

### 2- Install the scripts
In my case I keep the scripts under `/root/scripts`.
To install the script collection, run the following commands as root:
```
mkdir -p /root/scripts
cd /root/scripts
git clone https://github.com/ramirojoaquin/vestacp-borg-incremental-backups.git
```

### 3- Create directory to store logs:
```
mkdir -p /var/log/scripts/backup
```

### 4- Set up the cronjob
As root run:
```
crontab -e
```
And add the following cronjob:
```
0 4 * * * /root/scripts/vestacp-borg-incremental-backups/backup-execute.sh > /var/log/scripts/backup/backup_`date "+\%Y-\%m-\%d"`.log 2>&1
```
This cronjob will run `backup-execute.sh` every day at 4 AM. You can change the hour and the log location.

# Scripts details

## Backup execution
The main backup script `./backup-execute.sh` is designed to be run every day via cronjob and it performs the following actions:

* Dumps all databases to the corresponding user dirs, using the format `/home/userdir/db_dump/database.sql.gz`
* Creates an incremental backup archive/point for each user, using one repository per user. Repos are stored in `/backup/borg/home/USER`
* Creates an incremental backup archive/point of the config dir `/etc` and saves the repo in `/backup/borg/etc`
* Creates an incremental backup archive/point of the Vesta directory `/usr/local/vesta` and saves the repo in `/backup/borg/vesta`
* Creates an incremental backup archive/point of the custom scripts dir `/root/scripts` and saves the repo in `/backup/borg/scripts`
* Syncs the backup folder with a second remote server if needed.

All the paths can be customized in the `config.ini` configuration file.
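
For reference, the variables used throughout the scripts (as seen in `archive-user.sh`) look roughly like this; this is a hypothetical excerpt, so check the `config.ini` shipped with the repository for the real names and defaults:
```
# Hypothetical excerpt of config.ini -- verify against the file in the repository
HOME_DIR=/home                    # where user home directories live
VESTA_DIR=/usr/local/vesta        # Vesta installation directory
ARCHIVE_DIR=/backup/offline       # offline archive directory
DB_DUMP_DIR_NAME=db_dump          # per-user database dump subdirectory
```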

If no backup has been executed yet, the script will initialize the corresponding Borg repositories.

The name of the backup point/archive is set in the following format:
`2018-05-20` (year-month-day)
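
As a rough sketch (assuming the default paths above, not the script's exact code), creating one day's archive for a single user boils down to:
```
USER=admin
REPO=/backup/borg/home/$USER
ARCHIVE=$(date +%Y-%m-%d)
borg create "$REPO::$ARCHIVE" "/home/$USER"
```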

Vesta CLI commands are used to obtain all the information.

### Dump databases
`./dump-databases.sh`

Dumps all databases to the corresponding user dirs, using the format `/home/userdir/db_dump/database.sql.gz`

This script is called by the main `backup-execute.sh`, but it can also be run independently.
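
The core of the dump loop looks roughly like the following simplified sketch (based on the same pattern used in `archive-user.sh`; `$USER` is assumed to be set):
```
DESTINATION=/home/$USER/db_dump
mkdir -p "$DESTINATION"
# Dump every database owned by the user, compressed with gzip
while read DATABASE; do
  mysqldump "$DATABASE" --opt --routines | gzip > "$DESTINATION/$DATABASE.sql.gz"
done < <(v-list-databases "$USER" | cut -d " " -f1 | awk '{if(NR>2)print}')
```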

## Backup restore usage

### Restore entire user
`./restore-user.sh 2018-03-25 user`

This script will restore the given user from a particular point in the incremental backup. If the user exists, it is overwritten. If the user does not exist, a new one is created.

* First argument is the archive/point, using the format YEAR-MONTH-DAY.
* Second argument is the username.

### Restore web domain (optional with database)
`./restore-web.sh 2018-03-25 user domain.com database`

This script will restore the given web domain from a particular point in the incremental backup. The web domain must exist in the system.

* First argument is the archive/point, using the format YEAR-MONTH-DAY.
* Second argument is the username who owns the domain.
* Third argument is the web domain.
* Fourth argument is the database name. This argument is optional.

### Restore database
`./restore-db.sh 2018-03-25 user database`

This script will restore the given database from a particular point in the incremental backup. The database must exist in the system.

* First argument is the archive/point, using the format YEAR-MONTH-DAY.
* Second argument is the username who owns the database.
* Third argument is the database name.

### Restore mail domain
`./restore-mail.sh 2018-03-25 user domain.com`

This script will restore the given mail domain from a particular point in the incremental backup. The mail domain must exist in the system.

* First argument is the archive/point, using the format YEAR-MONTH-DAY.
* Second argument is the username who owns the mail domain.
* Third argument is the mail domain.

## Offline archive scripts

### Archive user
`./archive-user.sh user`

This script will save a copy of the given user to the offline archive directory `/backup/offline`.

* First argument is the user name.

### Restore archived user
`./restore-archived-user.sh user`

This script is similar to `restore-user.sh`, but instead of restoring from the incremental backup, it restores the given user from the offline archive.

If the user does not exist in the system, it will be created.

If the user already exists, it will be overwritten.

* First argument is the user name.
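
The offline archive produced by `archive-user.sh` is a plain gzipped tarball, so you can inspect its contents before restoring (a hedged example using the default offline archive path):
```
tar -tzf /backup/offline/user.tar.gz | less
```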

## Cleaning and disk space saving scripts

### Purge user backup
`./purge-user-backup.sh user`

This script will remove all incremental backup archives/points for the given user.

* First argument is the user name.
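
If you prefer to do this by hand, the equivalent manual operation with the Borg CLI is deleting the user's repository (shown only as a hedged alternative, not necessarily what the script runs internally):
```
borg delete /backup/borg/home/user
```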

### Clean user repositories
`./clean-user-repos.sh`

This script compares the existing user repositories with the currently active users.

If a repository does not correspond to an active user, the following actions are executed:
* The user repo is extracted into the offline archive directory `/backup/offline` for future use.
* The user repo is deleted from the user repo dir, saving disk space.
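
A rough sketch of such a comparison is shown below (hypothetical code, not the script's actual implementation; it only prints the candidates):
```
# List repositories that no longer match an active Vesta user
for REPO in /backup/borg/home/*; do
  USER=$(basename "$REPO")
  if ! v-list-users | awk 'NR>2 {print $1}' | grep -qx "$USER"; then
    echo "$USER has a repository but is not an active user"
  fi
done
```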

# Other useful commands

You can use the Borg CLI to check backups and restore files manually.

Full documentation is available here: https://borgbackup.readthedocs.io/en/stable/index.html#

For example, this command will list the available incremental backup points for the user admin:
`borg list /backup/borg/home/admin`
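
To manually restore files from a backup point you can use `borg extract`. Borg stores paths without the leading slash, so it extracts relative to the current directory; the example below (hedged, adjust dates and paths to your setup) unpacks into `/tmp/restore/home/admin`:
```
mkdir -p /tmp/restore && cd /tmp/restore
borg extract /backup/borg/home/admin::2018-05-20
```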

# Personal note

This is my first git project. I want to share it with the whole community. Use it at your own risk.

I am also open to changes and suggestions. You can write to me at ramirojoaquin@gmail.com.
119 changes: 119 additions & 0 deletions archive-user.sh
@@ -0,0 +1,119 @@
#!/bin/bash
CURRENT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null && pwd )"
source $CURRENT_DIR/config.ini

# This script will save a copy of the given user to the offline archive directory
USAGE="archive-user.sh user"

# Assign arguments
USER=$1

# Set script start time
START_TIME=`date +%s`

# Set dir paths
USER_DIR=$HOME_DIR/$USER
VESTA_USER_DIR=$VESTA_DIR/data/users/$USER
ARCHIVE_USER_DIR=$ARCHIVE_DIR/$USER
ARCHIVE_VESTA_USER_DIR=$ARCHIVE_USER_DIR/vesta/$USER

##### Validations #####

if [[ -z $1 ]]; then
  echo "!!!!! This script needs 1 argument: User name"
  echo "---"
  echo "Usage example:"
  echo $USAGE
  exit 1
fi

# Check if user home directory exists
if [ ! -d "$USER_DIR" ]; then
  echo "!!!!! User $USER does not exist in the system. Aborting..."
  exit 1
fi

# Check if user exists in vesta dir
if [ ! -d "$VESTA_USER_DIR" ]; then
  echo "!!!!! User $USER does not exist in the vesta directory."
  exit 1
fi

# Check if user archive file exists
if [ -f "$ARCHIVE_USER_DIR.tar.gz" ]; then
  echo "!!!!! User archive file $ARCHIVE_USER_DIR.tar.gz already exists."
  read -p "Do you want to overwrite? " -n 1 -r
  echo
  if [[ ! $REPLY =~ ^[Yy]$ ]]
  then
    echo
    echo "########## PROCESS CANCELED ##########"
    exit 1
  fi
fi

if [ -d "$ARCHIVE_USER_DIR" ]; then
  echo "!!!!! User archive directory $ARCHIVE_USER_DIR already exists."
  read -p "Do you want to overwrite? " -n 1 -r
  echo
  if [[ ! $REPLY =~ ^[Yy]$ ]]
  then
    echo
    echo "########## PROCESS CANCELED ##########"
    exit 1
  fi
fi

echo "########## USER $USER FOUND, PROCEEDING WITH ARCHIVE ##########"

echo "-- Dumping databases to user dir"
# Create dir where the user databases will be stored
DESTINATION=$HOME_DIR/$USER/$DB_DUMP_DIR_NAME
mkdir -p $DESTINATION
# Clean destination (remove dumps from previous runs)
rm -f $DESTINATION/*
while read DATABASE ; do
  mysqldump $DATABASE --opt --routines | gzip > $DESTINATION/$DATABASE.sql.gz
  echo "$(date +'%F %T') -- $DATABASE > $DESTINATION/$DATABASE.sql.gz"
done < <(v-list-databases $USER | cut -d " " -f1 | awk '{if(NR>2)print}')
# Fix permissions
chown -R $USER:$USER $DESTINATION

echo "-- Creating user archive directory $ARCHIVE_USER_DIR"
# First remove archive dir and file if they exist
if [ -d "$ARCHIVE_USER_DIR" ]; then
  rm -rf $ARCHIVE_USER_DIR
fi
if [ -f "$ARCHIVE_USER_DIR.tar.gz" ]; then
  rm -rf $ARCHIVE_USER_DIR.tar.gz
fi

# Archive dir creation
mkdir -p $ARCHIVE_USER_DIR
mkdir -p $ARCHIVE_VESTA_USER_DIR

echo "-- Saving vesta config files for user $USER from $VESTA_USER_DIR to $ARCHIVE_VESTA_USER_DIR"
rsync -za $VESTA_USER_DIR/ $ARCHIVE_VESTA_USER_DIR/

echo "-- Saving user files from $USER_DIR to $ARCHIVE_USER_DIR"
rsync -za $USER_DIR/ $ARCHIVE_USER_DIR/

echo "-- Compressing $ARCHIVE_USER_DIR into $ARCHIVE_USER_DIR.tar.gz"
cd $ARCHIVE_DIR
tar -pczf $USER.tar.gz $USER

# Clean archive dir
if [ -d "$ARCHIVE_USER_DIR" ]; then
  rm -rf $ARCHIVE_USER_DIR
fi

echo
echo "$(date +'%F %T') #################### USER $USER ARCHIVED INTO $ARCHIVE_USER_DIR.tar.gz ####################"

END_TIME=`date +%s`
RUN_TIME=$((END_TIME-START_TIME))

echo "-- Execution time: $(date -u -d @${RUN_TIME} +'%T')"
echo
