
Corrupted backup with max compression on (Arch) Linux [restic 0.16.0] #4523

Closed · luca-molinaro opened this issue Oct 19, 2023 · 4 comments · Fixed by #4532

Comments

@luca-molinaro

luca-molinaro commented Oct 19, 2023

Output of restic version

restic 0.16.0 compiled with go1.20.7 on linux/amd64

Edit: I have now tested this on macOS as well; here is the output of restic version:

restic 0.16.0 compiled with go1.20.6 on darwin/amd64

How did you run restic exactly?

I used the default environment variables. More details on the command I used are
in the following sections. Note that I was trying to back up about 800 GB of
data, but I also tested this on the specific subfolder that was giving me problems.

The subfolder can be downloaded here:
https://drive.google.com/file/d/1O3XnVzEY6tNCVRo8_TmS1s5_JV0rGOGZ/view?usp=share_link

What backend/server/service did you use to store the repository?

I used a local repository. I tried storing the repository on different HDDs and an NVMe drive to rule out
a hardware failure, but got the same result (with the same hashes) from restic check.

Expected behavior

Restic check should not return any errors after backing up the folder.

Actual behavior

Restic check output:

using temporary cache in /tmp/restic-check-cache-4239199922
enter password for repository:
repository 952e5db1 opened (version 2, compression level auto)
created new cache in /tmp/restic-check-cache-4239199922
create exclusive lock for repository
load indexes
check all packs
check snapshots, trees and blobs
[0:00] 100.00%  1 / 1 snapshots
read all data
pack a7478fd99242cd86ddac79595494c07ff54165cf920554b49cab54c4946408fc contains 1 errors: [blob cf8f8df28d1de9f932ae567f10757c743bc9bf0020c636d27e98eae06b51df21: read blob <data/cf8f8df2> from a7478fd9: wrong data returned, hash is 2bc05a41d9c650277af6b02cd4bc69d6296a6ea532798e1062ec8fecb9dd3eb8]
pack fcd1c4bbdfe01bed5501125bc1a9704f08172dbaff92b632bb0cfe0243321aa2 contains 1 errors: [blob dc82d97be7683ecd41097ab02d7b15de81e8bbcd1c476c50b254b1f458090929: read blob <data/dc82d97b> from fcd1c4bb: wrong data returned, hash is 0b093d9ae6957ba8d3a1b09130623ba517e179347c3291af8d96fe954fd748f6]
[0:04] 100.00%  149 / 149 packs
Fatal: repository contains errors

Steps to reproduce the behavior

  • download files_to_backup.tar.gz from the Google Drive link provided above
  • extract files_to_backup.tar.gz
  • mkdir TEST_REPO
  • restic -r TEST_REPO init
  • restic -r TEST_REPO backup --compression max files_to_backup
  • restic -r TEST_REPO check --read-data
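
For convenience, the same steps collected as a shell snippet (this assumes the archive from the Google Drive link has already been downloaded into the current directory; TEST_REPO is just a local directory path):

```sh
# extract the test data
tar -xzf files_to_backup.tar.gz

# create a fresh local repository and back up with maximum compression
mkdir TEST_REPO
restic -r TEST_REPO init
restic -r TEST_REPO backup --compression max files_to_backup

# verify the repository, including the actual pack contents
restic -r TEST_REPO check --read-data
```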

Do you have any idea what may have caused this?

I don't know exactly. Without --compression max the backup of this folder passes restic check,
but that may just be a coincidence.

Do you have an idea how to solve the issue?

No.

Did restic help you today? Did it make you happy in any way?

I love restic, and hope this will be fixed soon, thank you.

@MichaelEischer
Member

Thanks a lot for the bug report! The underlying issue is a data corruption caused by the zstd library used by restic. I've opened an issue there and added minimal examples (46 and 70KB) from the backup dataset that are sufficient to reproduce the issue: klauspost/compress#875
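
As a side note for anyone who wants to reproduce the corruption outside of restic, a minimal sketch of a compress/decompress round-trip check using the klauspost/compress zstd package is below. The file path is a placeholder, the choice of zstd.SpeedBestCompression is my assumption for what restic's max compression level maps to, and whether a given input actually triggers the bug depends on the library version in use (e.g. the one vendored by restic 0.16.0) and the file contents:

```go
// roundtrip.go: compress a file with zstd at the highest level, decompress it,
// and compare SHA-256 hashes of the original and the round-tripped data.
package main

import (
	"crypto/sha256"
	"fmt"
	"log"
	"os"

	"github.com/klauspost/compress/zstd"
)

func main() {
	if len(os.Args) != 2 {
		log.Fatalf("usage: %s <file>", os.Args[0])
	}
	src, err := os.ReadFile(os.Args[1])
	if err != nil {
		log.Fatal(err)
	}

	// Encoder at the library's highest compression level.
	enc, err := zstd.NewWriter(nil, zstd.WithEncoderLevel(zstd.SpeedBestCompression))
	if err != nil {
		log.Fatal(err)
	}
	defer enc.Close()
	compressed := enc.EncodeAll(src, nil)

	dec, err := zstd.NewReader(nil)
	if err != nil {
		log.Fatal(err)
	}
	defer dec.Close()
	decompressed, err := dec.DecodeAll(compressed, nil)
	if err != nil {
		log.Fatal(err)
	}

	// A hash mismatch here means the round trip did not preserve the data.
	if sha256.Sum256(src) != sha256.Sum256(decompressed) {
		fmt.Println("MISMATCH: round-tripped data differs from the original")
		os.Exit(1)
	}
	fmt.Println("OK: round trip preserved the data")
}
```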

@MichaelEischer
Member

restic 0.15.2 is not affected by this bug.

@luca-molinaro
Author

You're welcome, and thank you for this awesome piece of software.

@JsBergbau
Contributor

Note: you need to run restic check --read-data to verify whether you are affected. Calculating SHA-256 checksums of the source files will not detect this bug.
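
For example (the repository path is a placeholder; you will be prompted for the repository password as usual):

```sh
restic -r /path/to/repo check --read-data
```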
