Back Clone

Back(up/ground) Clone: a task manager to back up your local directories to Google Cloud

Overview

This is a simple task manager that you can use to save your local directories to Google Cloud and keep them updated, with a single-line command 😎

It is based on rclone.


Prerequisites

  1. install rclone
curl https://rclone.org/install.sh | sudo bash
  2. set up rclone for your Google Cloud account: howtogeek guide.

As simple as:

rclone config
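
To double-check the configuration, you can list the remotes rclone knows about (the remote name you chose during rclone config should show up here):

rclone listremotes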

Installation

Just clone back-clone and make the scripts executable:

git clone https://github.com/cccnrc/back-clone.git
cd ./back-clone
chmod +x ./script/*

You can also test it:

./script/rclone-launcher.sh test

(you'll know it if it works 😉)


Usage

You can choose between two different ways to use back-clone:

  1. single folder backup
  2. multiple folders backup

note: there are several ways to call the back-clone scripts: take a look at the Command section below 😉

1. single folder backup

Just call the script with the necessary arguments (see below):

./script/rclone-check-PID-launch.sh \
    $LOCAL_DIR \
    $CLOUD_DIR \
    $PROCESS_NAME
  • note: this command returns the PID of the job

Have a look at your Google Drive and you'll see them 😎

1.1. LOCAL_DIR

Set this to the path of the local directory you want to back up. For example:

LOCAL_DIR=/home/your-username/directory-to-backup

1.2. CLOUD_DIR

This must be set to the full path of the cloud directory; it looks like this:

cloud-name:directory-path

  • cloud-name: corresponds to the name you set during rclone config
  • directory-path: corresponds to the path of the directory on the cloud where you want to store the files. Any directories that do not exist will be created, and all files inside LOCAL_DIR will be stored in that directory (it won't copy LOCAL_DIR itself, only the files inside it)
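
For example, assuming you named your remote gdrive during rclone config (the name is just a placeholder), a valid CLOUD_DIR would be:

CLOUD_DIR=gdrive:backups/directory-to-backup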

1.3. PROCESS_NAME

This is optional and is used to name the logs of this backup. If you set it, you will find all logs in a file created inside the logs/ directory and named after PROCESS_NAME: $PROCESS_NAME.back-clone.log. If not set, the name (not the full path) of LOCAL_DIR will be used as PROCESS_NAME.
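
Putting it all together, a complete single-folder call could look like this (all paths and names here are placeholders; the job's PID is returned as noted above, and its status can be checked later with back-clone-check.sh):

LOCAL_DIR=/home/your-username/directory-to-backup
CLOUD_DIR=gdrive:backups/directory-to-backup
PROCESS_NAME=my-first-backup

./script/rclone-check-PID-launch.sh \
    $LOCAL_DIR \
    $CLOUD_DIR \
    $PROCESS_NAME

Logs for this run would then end up in logs/my-first-backup.back-clone.log.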

2. multiple folders backup

You can create a TSV file listing all the directories you want to back up and their corresponding cloud paths (and a PROCESS_NAME for each, if you wish). Then launch it with:

./script/rclone-launcher.sh $INPUT_TSV

note: this file must contain a specific header, a copy of which you can find in input/process.tsv-HEADER, and for each folder you have to specify:

  • LOCAL_DIR: at column 1
  • CLOUD_DIR: at column 2
  • PROCESS_NAME: at column 3 (optional)

The script will launch the backup for each folder and store the overall logs in logs/ under the name of the $INPUT_TSV file (with the .tsv extension removed). For example, if your INPUT_TSV file is called folders-backup.tsv you will find the logs in logs/folders-backup.log. Moreover, each single backup will have its own logs in logs/PROCESS_NAME.log
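
As a sketch, and assuming the header in input/process.tsv-HEADER simply names the three columns above (check that file for the exact header line), an INPUT_TSV could look like this, with tab-separated columns and placeholder paths:

LOCAL_DIR	CLOUD_DIR	PROCESS_NAME
/home/your-username/documents	gdrive:backups/documents	documents-backup
/home/your-username/photos	gdrive:backups/photos	photos-backup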

Command

As with any script, you have multiple choices for how to run it:

1. local launch

As mentioned, to use the back-clone scripts you can be inside the cloned repository directory and execute them as:

./script/rclone-launcher.sh
./script/rclone-check-PID-launch.sh

2. external launch

You can also refer to the scripts by their absolute path from another directory. Let's say you cloned the repo inside your /home folder:

/home/back-clone/script/rclone-launcher.sh
/home/back-clone/script/rclone-check-PID-launch.sh

3. store it in PATH

If you add the script directory path to your $PATH environment variable, your shell will automatically find the scripts, and you can call them simply as rclone-launcher.sh. For example:

echo "export PATH=$PATH:/home/enrico/back-clone/script" >> ~/.bashrc
source ~/.bashrc
rclone-launcher.sh test

note: the above command stores this in your ~/.bashrc file, so it will be valid in any other terminal window you open from now on 😎


Check Status

You can check the status (running/completed) of all your backup jobs with a single script.

Let's say you added the scripts to your $PATH (as explained above):

back-clone-check.sh

If you want to check the status of a single job just specify its name:

back-clone-check.sh PROCESS_NAME

Isn't that cool?! 😎


Update

You can easily use back-clone to keep your backups updated. For example, you can store the command to back up a folder in your ~/.bashrc file: anytime a new terminal window is opened, it will update that folder 😉

echo "/home/back-clone/script/rclone-launcher $INPUT_TSV" >> ~/.bashrc

You can also set up a crontab job to be executed every day, every hour, etc.
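
As a minimal sketch, to run the multi-folder backup every day at 2 AM you could add a line like this via crontab -e (the paths are placeholders, adjust them to your setup):

0 2 * * * /home/back-clone/script/rclone-launcher.sh /home/back-clone/input/folders-backup.tsv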

There are billions of other possibilities, explore them! 🚀


Overlaps

Don't worry about backup overlaps: through the PID manager (a file called rclone-PID-map.tsv that you'll find inside logs/ once you've started your first backup), back-clone checks whether the previous job with the same PROCESS_NAME has terminated; if it hasn't, you will find the PID that is running the job flagged as (running) 😎
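
If you're curious, you can inspect the PID map yourself; assuming it is a plain whitespace/tab-separated file (a guess based on its .tsv name), this prints it with aligned columns:

column -t logs/rclone-PID-map.tsv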


Closure

That's all for the moment, we hope it was simple and exhaustive. 😄

Please be sure to share with us any nice changes you make to our back-clone scripts, for example by creating a pull request to the back-clone repository.

We would love to incorporate them! 😍
