
Backup doesn't complete correctly #2027

Open
nordurljosahvida opened this issue Sep 2, 2021 · 2 comments

@nordurljosahvida (Contributor)

Long-time user here, with several installations completed and active. One of my instances has backups that stall and never complete.

They stall and, after about six hours, exit without modifying or adding anything on Wasabi. The backups page only shows the first initial full backup and then no incremental or other backups.

  1. Is it correct that the backup script / duplicity doesn't log anywhere?
  2. Is there a way to extract the exact duplicity command the backup script is running and to run it manually with verbose output to see what's going on? (See the sketch below.)

I've already tried emptying out the Wasabi bucket and the local backup/cache directory and running again from scratch.
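
One idea I have for question 2: while a backup is stalled, grab the full command line of the running duplicity process. Something like this (a rough sketch; the grep pattern is just my guess at what the process will look like):

# while the backup is stalled, dump the PID, elapsed time and full command line
ps -eo pid,etime,args | grep '[d]uplicity'
# I assume credentials and PASSPHRASE are passed to duplicity as environment
# variables by management/backup.py, so they won't show up in this listing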

Thanks!

@nordurljosahvida (Contributor, Author)

Found this in the mail alert:

About 100 lines of:

Warning: owncloud/<redacted> has negative mtime, treating as 0.

And at the end:

Warning: owncloud/<redacted> has negative mtime, treating as 0.
Attempt 1 failed. ConnectionResetError: Connection reset by peer
Attempt 2 failed. ConnectionResetError: Connection reset by peer
Attempt 3 failed. ConnectionResetError: Connection reset by peer
Attempt 4 failed. ConnectionResetError: Connection reset by peer
Giving up after 5 attempts. ConnectionResetError: Connection reset by peer
Traceback (most recent call last):
  File "management/backup.py", line 578, in <module>
    perform_backup(full_backup)
  File "management/backup.py", line 279, in perform_backup
    get_env(env))
  File "/home/openspace/mailinabox/management/utils.py", line 123, in shell
    ret = getattr(subprocess, method)(cmd_args, **kwargs)
  File "/usr/lib/python3.6/subprocess.py", line 311, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/usr/bin/duplicity', 'incr', '--verbosity', 'warning', '--no-print-statistics', '--archive-dir', '/home/user-data/backup/cache', '--exclude', '/home/user-data/backup', '--volsize', '250', '--gpg-options', '--cipher-algo=AES256', '/home/user-data', 's3://s3.eu-central-1.wasabisys.com/<redacted>', '--allow-source-mismatch', '--ssh-options= -i /root/.ssh/id_rsa_miab', '--rsync-options= -e "/usr/bin/ssh -oStrictHostKeyChecking=no -oBatchMode=yes -p 22 -i /root/.ssh/id_rsa_miab"']' returned non-zero exit status 50.

So, is Wasabi dropping the connection?

Also, I tried to reconstruct the command:

/usr/bin/duplicity incr --verbosity warning --no-print-statistics --archive-dir /home/user-data/backup/cache --exclude /home/user-data/backup --volsize 250 --gpg-options --cipher-algo=AES256 /home/user-data s3://s3.eu-central-1.wasabisys.com/<redacted> --allow-source-mismatch --ssh-options="-i /root/.ssh/id_rsa_miab" --rsync-options="-e '/usr/bin/ssh -oStrictHostKeyChecking=no -oBatchMode=yes -p 22 -i /root/.ssh/id_rsa_miab'"

but I think the escaping is incorrect, because I get:

'Check your credentials' % (len(names), str(names)))
 boto.exception.NoAuthHandlerFound: No handler was ready to authenticate. 1 handlers were checked. ['S3HmacAuthV4Handler'] Check your credentials
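
Thinking about it more, the command line above doesn't contain any credentials, so I suspect the backup script passes them to duplicity as environment variables, and running the command by hand without them would also explain the boto error. Roughly what I'd try next (a sketch; the values and the passphrase file location are my assumptions, not something I've verified):

# duplicity's boto/S3 backend reads credentials from the environment
export AWS_ACCESS_KEY_ID='<access key>'
export AWS_SECRET_ACCESS_KEY='<secret key>'
# GPG passphrase for the backup (I assume it's the one stored under /home/user-data/backup/)
export PASSPHRASE="$(cat /home/user-data/backup/secret_key.txt)"
# then re-run the same duplicity command, e.g. with --verbosity info for more detail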

Any ideas?

Thanks

@DerBunteBall

Hi,

Check this: https://github.com/mail-in-a-box/mailinabox/blob/main/management/backup.py. The full backup command is assembled starting at line 266. Keep in mind that the services need to be stopped.

My recommendation: disable the built-in backup and do all of it with borg and borgmatic, as sketched below.
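
If you go that route, a minimal borg workflow looks roughly like this (just a sketch; the repository path and the list of services to stop are assumptions you would adapt to your box):

# stop the mail services so the mail store and databases are consistent (service names are examples)
systemctl stop postfix dovecot

# one-time: initialise an encrypted repository (can also be an ssh:// remote)
borg init --encryption=repokey /path/to/borg-repo

# create a dated archive of the user data
borg create --stats --compression lz4 /path/to/borg-repo::miab-{now:%Y-%m-%d} /home/user-data

systemctl start postfix dovecot

borgmatic then wraps this (plus pruning and consistency checks) in a single config file that you can run from cron or a systemd timer.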

Best Regards
