
Backup/Restore workflow #25

Open
stratosgear opened this issue May 1, 2020 · 7 comments

Comments

@stratosgear

Sorry, this is not really an "issue" per se, but rather a request.

What would be your suggestion for backing up/restoring this server...?

Not so much for the media files (those might add up to some TBs, and they ought to reside on separate partitions or NASes anyway), but for the ~/docker folder that contains ALL the individual app settings and configurations.

@fanuch

fanuch commented May 1, 2020

You're looking at it.

It's very common to store configuration files in git.
Since they are just text files that receive few changes, may need to be rolled back, or have to exist in multiple locations, git is a great fit for storing configs.

There's GitHub, GitLab, or Bitbucket, to name a few, or you can roll your own git server, though that's not really recommended since all of the mentioned services offer private repositories to house your configs.

@stratosgear
Author

I am not so sure that you can keep these folders in git. In my 16-18 hours of using a docker-compose setup very similar to the sample traefik2 one shown here, the ~/docker folder is already close to 520 MB of data!

I'm sure that some of the files are binary files too (images, thumbnails, etc.).

These are NOT good candidates for a git repo.

I was hoping for something along the lines of a Borg backup or similar, and I was wondering if someone had test-driven any dockerized solutions, like the other suite of applications shown in the repo.

Thanks, but stuffing all that into the git repo is not what I would consider a proper backup strategy.

@fanuch

fanuch commented May 2, 2020

I could debate with you that git can handle binary files and images, and that there are heaps of git repos much larger than your 520 MB, but it looks like you don't want that.

Sure, use Borg or rsync. Both can do incremental backups and restores.
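
For reference, a minimal Borg sketch of that kind of incremental backup (the repository path and archive names here are just placeholders):

```sh
# One-time: create an encrypted Borg repository (path is illustrative)
borg init --encryption=repokey /mnt/backups/docker-repo

# Take a deduplicated, incremental snapshot of ~/docker
borg create --stats --compression lz4 \
    /mnt/backups/docker-repo::'docker-{now:%Y-%m-%d}' ~/docker

# Thin out old archives, keeping recent daily/weekly snapshots
borg prune --keep-daily 7 --keep-weekly 4 /mnt/backups/docker-repo
```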

@anandslab
Owner

@stratosgear

I can consider adding a backup system.

But your comment about docker binaries, etc. is easily addressed with .gitignore. My .gitignore ignores all the contents of the folder by default, and I explicitly specify which files I want published in my git repo.

But the downside is you have to make sure you specify all important files.
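
For illustration, a whitelist-style .gitignore along those lines could look like this (the file names below are only examples):

```
# Ignore everything by default
*
# ...but keep descending into directories so the includes below can match
!*/
# ...and explicitly re-include the files worth publishing
!.gitignore
!docker-compose-t2.yml
!traefik2/rules/*.toml
```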

@stratosgear
Author

stratosgear commented May 2, 2020

Sure, yes. I also intend to share my backup solution when I get around to implementing it (along with some other optimizations I have in mind).

The way I see it, all docker-compose and other config files are saved and source-controlled in a git repo (pretty much as this repo is set up), but everything under ~/docker is properly backed up by a separate system (hopefully something as easy as bringing up another docker instance to do the backup).

Of course, you have the issue of not being completely able to back up a set of running docker instances, but you could just run docker-compose -f docker-compose-t2.yml stop && docker-compose -f docker-backup.yml run && docker-compose -f docker-compose-t2.yml up through a cron job...
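
As a rough sketch of that cron-driven approach (paths, file names, and the backup command are placeholders; Borg is used here purely as an example):

```sh
#!/bin/sh
# Nightly backup sketch: stop the stack, snapshot ~/docker, bring the stack back up.
cd /path/to/compose/files || exit 1   # wherever docker-compose-t2.yml lives

docker-compose -f docker-compose-t2.yml stop

# Swap in whatever backup tool you prefer (Borg, rsync, a backup container, ...)
borg create --stats /mnt/backups/docker-repo::'docker-{now:%Y-%m-%d}' "$HOME/docker"

docker-compose -f docker-compose-t2.yml up -d
```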

Also, the .gitignore solution is not ideal because you never know what kind of binary files (or any other files that you want to keep out of the git repo) a new container will generate. By the time you find out, they could already have been committed to the repo, and it would be a never-ending battle trying to keep that .gitignore up to date.

[Edit]: I just saw that you have reversed the usage of .gitignore with the * pattern, and that would indeed address the problem I mentioned above. Still, it would require constant meddling to be sure that you "save" all the important settings, so I still consider this sub-optimal... :)

Thanks...

@ptoulouse

The proposed setup puts everything in the same "top-level" folder:

  1. The docker-compose file
  2. The initial configuration files (authelia, traefik2/rules)
  3. The data generated by the containers

In my own environment, I set it up as 2 folders:

  1. The docker-compose and initial config files are in a folder that can be kept on GitHub with a minimal .gitignore file. A few files need to be backed up separately if you are not using a private repo (e.g. .env).
  2. The data generated by the containers is in a separate folder and is backed up.
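
Roughly, the split looks like this (paths are illustrative):

```
~/homeserver/                 # in git: compose file + initial configs
├── docker-compose-t2.yml
├── .env                      # kept out of git (or in a private repo); backed up separately
├── authelia/
└── traefik2/rules/

~/docker-data/                # not in git: data generated by the containers; backed up instead
```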

@beloso

beloso commented Jun 28, 2020

I've just set up the Duplicati service. It backs up folders to a myriad of cloud services.
I set mine up with Google Drive. It appears to be working.

I've used this image: https://github.com/linuxserver/docker-duplicati
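
For anyone following along, a minimal compose service for that image might look something like this (PUID/PGID, timezone, and host paths are placeholders; the linuxserver image typically serves its web UI on port 8200):

```yaml
services:
  duplicati:
    image: linuxserver/duplicati
    container_name: duplicati
    environment:
      - PUID=1000        # run as your user's UID/GID
      - PGID=1000
      - TZ=Etc/UTC
    volumes:
      - $HOME/docker/duplicati/config:/config   # Duplicati's own settings
      - $HOME/docker:/source                    # the folder tree to back up
    ports:
      - "8200:8200"      # web UI
    restart: unless-stopped
```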
