Unexpected number of deleted filesets 1 vs 0 #1583

Open · TonySturch opened this issue Feb 18, 2016 · 33 comments

@TonySturch commented Feb 18, 2016

Hello -

I was wondering if someone might have some suggestions for troubleshooting my backup. It's a 2.17 TB backup to Amazon Cloud Drive, and for the last couple of weeks it's been throwing the error: Unexpected number of deleted filesets 1 vs 0. I have debug and verbose output turned on, but I can't find what seems to be missing. The error started while I was running 2.0.0.96, but now I'm running 2.0.0.99. I'm on Windows 10 64-bit, 4.5 GHz, 16 GB RAM, 256 GB SSD.

When I click show, it displays this in the General Logs:
VerboseOutput: False
VerboseErrors: False
Messages: [Destination and database are synchronized, not making any changes]
Warnings: []
Errors: []

And then here are the first couple of lines from the remote log:
[
{"Name":"duplicati-bc797c670fb0e4352a6fd0847eae54fb6.dblock.zip.aes","LastAccess":"2016-01-27T03:46:03.782Z","LastModification":"2016-01-27T03:46:03.782Z","Size":1073656989,"IsFolder":false},
{"Name":"duplicati-i91f9137521584e5683ec16949837293a.dindex.zip.aes","LastAccess":"2016-01-27T03:46:04.798Z","LastModification":"2016-01-27T03:46:04.798Z","Size":784941,"IsFolder":false},
{"Name":"duplicati-i74633fd2d4be47428ba0e88694bc22a6.dindex.zip.aes","LastAccess":"2016-01-27T03:57:36.877Z","LastModification":"2016-01-27T03:57:36.877Z","Size":682125,"IsFolder":false},
{"Name":"duplicati-ifd90a204567d437d82fc86d769941817.dindex.zip.aes","LastAccess":"2016-01-27T04:03:17.73Z","LastModification":"2016-01-27T04:03:17.73Z","Size":784589,"IsFolder":false},

I have tried to repair, and also to recreate the database, but the error persists. I also tried creating a bug report, but the file never downloads; it just sits there doing nothing.

Thanks for any advice on troubleshooting this further.
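
For reference, the command-line equivalents of those operations should look roughly like this (the storage URL and database path are placeholders; "recreate" just means deleting the local database file and running repair again so it is rebuilt from the remote dlist/dindex files):

    :: repair the local database / remote storage mismatch (placeholder URL and path)
    Duplicati.CommandLine.exe repair <storage-url> --dbpath="C:\path\to\backup-database.sqlite"

    :: generate an anonymized bug report archive for attaching to the issue
    Duplicati.CommandLine.exe create-report <storage-url> bugreport.zip --dbpath="C:\path\to\backup-database.sqlite"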

@jfspan commented Feb 26, 2016

I also get this error, although I'm not sure it's the same problem as TonySturch's. I updated my wife's machine (a 2013 MacBook Air with a Duplicati backup to OneDrive) to Duplicati 2.0.0.99, and I believe she closed the lid during a backup session. When I opened it, the backup gave this error. After trying repair, then delete-and-repair, the backup still gives the same error as above.

I was able to get an error report. Here is a link to download it: https://onedrive.live.com/redir?resid=A60CC657F0C95379!21564&authkey=!AOm6qIDrgQQJjDY&ithint=file%2czip

Here are the main errors I get when trying to repair or run the backup:

2016-02-20 08:53: Failed while executing "Backup" with id: 1
System.Exception: Unexpected number of deleted filesets 1 vs 0
at Duplicati.Library.Main.Database.LocalDeleteDatabase+c__Iterator0.MoveNext () in :line 0
at System.Linq.Buffer1[TElement]..ctor (IEnumerable1 source) in :line 0
at System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source) in :line 0
at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, IDbTransaction& transaction, Boolean hasVerifiedBacked, Boolean forceCompact) in :line 0
at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, IFilter filter) in :line 0

2016-02-20 08:36: Failed while executing "Repair" with id: 1
System.Exception: The file duplicati-b6cd27dc7174b4c248f7d59c7333195bb.dblock.zip was downloaded and had size 6830 but the size was expected to be 13249197
at Duplicati.Library.Main.AsyncDownloader+AsyncDownloaderEnumerator+AsyncDownloadedFile.get_TempFile () in :line 0
at Duplicati.Library.Main.Operation.RecreateDatabaseHandler.DoRun (Duplicati.Library.Main.Database.LocalDatabase dbparent, Boolean updating, IFilter filter, Duplicati.Library.Main.Operation.NumberedFilterFilelistDelegate filelistfilter, Duplicati.Library.Main.Operation.BlockVolumePostProcessor blockprocessor) in :line 0
at Duplicati.Library.Main.Operation.RecreateDatabaseHandler.Run (System.String path, IFilter filter, Duplicati.Library.Main.Operation.NumberedFilterFilelistDelegate filelistfilter, Duplicati.Library.Main.Operation.BlockVolumePostProcessor blockprocessor) in :line 0
at Duplicati.Library.Main.Operation.RepairHandler.RunRepairLocal (IFilter filter) in :line 0
at Duplicati.Library.Main.Operation.RepairHandler.Run (IFilter filter) in :line 0
at Duplicati.Library.Main.Controller+c__AnonStorey3.<>m__0 (Duplicati.Library.Main.RepairResults result) in :line 0
at Duplicati.Library.Main.Controller.RunAction[T](Duplicati.Library.Main.T result, System.String[]& paths, IFilter& filter, System.Action`1 method) in :line 0

It also appears that the backup continues to 'work' and I can restore newly backed up files; however, they restore to a 'null' folder within the Duplicati folder.

Really appreciate the work done on Duplicati and any help that can be given.

@jfspan commented Feb 29, 2016

I don't know if this is relevant, but I noticed that when I try to edit the configuration for how long to keep the backups, the settings are never 'saved'. Instead, the settings always return to 'forever', '3' and 'days'. Maybe it's related to this problem?

@jfspan commented Mar 7, 2016

I might have something more to add to help diagnose the problem. I had the error happen again this morning on a different backup from the one above. It happened when I had to force-stop an upload that was hanging due to a server timeout. The source size is now 0 bytes, and the backup continues to give the error "Unexpected number of deleted filesets 1 vs 0" even after verify, repair, and delete-and-repair operations have been run on it.

What is interesting is the report generated before and after the problem. It appears as though the backup doesn't recognize the source folders anymore? I have tried adding the source again, but this doesn't fix the problem.

This was the backup that created the errors

2016-03-07 09:18: Result
DeletedFiles: 4173
DeletedFolders: 1195
ModifiedFiles: 0
ExaminedFiles: 0
OpenedFiles: 0
AddedFiles: 0
SizeOfModifiedFiles: 0
SizeOfAddedFiles: 0
SizeOfExaminedFiles: 0
SizeOfOpenedFiles: 0
NotProcessedFiles: 0
AddedFolders: 0
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
VerboseOutput: False
VerboseErrors: False
Messages: [Stopping backup operation on request]
Warnings: []
Errors: []

2016-03-07 09:18: Message
Stopping backup operation on request

This was the last successful backup

2016-03-05 17:17: Result
DeletedFiles: 0
DeletedFolders: 0
ModifiedFiles: 25
ExaminedFiles: 4173
OpenedFiles: 26
AddedFiles: 1
SizeOfModifiedFiles: 99348367
SizeOfAddedFiles: 4776806
SizeOfExaminedFiles: 2400723082
SizeOfOpenedFiles: 104356284
NotProcessedFiles: 0
AddedFolders: 1
TooLargeFiles: 0
FilesWithError: 1
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
VerboseOutput: False
VerboseErrors: False
Messages: [No remote filesets were deleted, Compacting not required]
Warnings: [Failed to process path: /Users/THS/Library/Mobile Documents/com~apple~Numbers/Documents/ClassList.numbers/ => Too large metadata, cannot handle more than 102400 bytes]
Errors: []

After deleting and repairing the database, Duplicati gives the error: "The file duplicati-b2f017bd20ed74a6382345b2309ca985f.dblock.zip was downloaded and had size 6832 but the size was expected to be 656431"

@GuitsBoy commented Mar 7, 2016

Thanks for the added info jfspan.

I believe my issues may have started when the process was interrupted as well.

@kenkendk (Member) commented Mar 14, 2016

The error with "forever" and "3 days" is fixed in source and will be in the next build.

@kenkendk (Member) commented Mar 14, 2016

Some additional fixes for Amazon Cloud Drive are also included in the next build.

@TonySturch (Author) commented Mar 14, 2016

Thanks for the update. Did you mean there were some fixes in 2.0.1.3? Or in a build which has not yet been released?

I installed build 2.0.1.3 this morning and ran a backup successfully, with the usual warning. I then attempted a repair, and got the same warning. Then I attempted a recreate, and that choked on 3 files and made matters worse. I can no longer do a backup, as it errors out with:

Found inconsistency in the following files while validating database:

Run repair to fix it.

Unfortunately Repair does not work. It errors out with:

Failed to repair, after repair 1 blocklisthashes were missing

Any suggestions on a workaround for this? I'd prefer not to have to upload the 2.3 TB again.

@jfspan commented Mar 23, 2016

I installed build 2.0.1.8_canary_2016-03-20 and the delete-and-repair appeared to work, although the backup had to upload almost all the data again (checking the logs, it looked like each file was set to a timestamp of 01-01-0001). However, after it completed, I received the error:
"Unexpected number of remote volumes marked as deleted. Found 0 filesets, but 101 volumes"

Running repair doesn't do anything (the log says the remote and local db are synchronized).

My source size still shows 0 bytes.

Also, I should mention that I had originally set the backups to be kept for at least 5 years so there shouldn't have been any files marked for deletion in the backup.

@jfspan commented Mar 23, 2016

I also installed the same build on my wife's laptop to fix the same error, and now it gives:
"Abort due to constraint violation UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State"

Not sure how these are related, but both gave the same "Unexpected number of deleted filesets 1 vs 0" error before.

Here is the bug report generated:
bugreport 03-23-2016.zip

@agrajaghh (Contributor) commented Mar 23, 2016

I also installed the same build on my wife's laptop to fix the same error, and now it gives:
"Abort due to constraint violation UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State"

this looks like #1644

@kenkendk (Member) commented Mar 24, 2016

@jfspan thanks for the bug report, I will look into it.
@agrajaghh Yes, I think it is the same problem.

@kenkendk (Member) commented Mar 24, 2016

@jfspan there appear to be a lot of errors in that rebuild of the database.
For instance: "Failed to process file: duplicati-20150919T173221Z.dlist.zip", could you try to extract that file with a zip program to see if it is damaged?
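
If it is easier, a small script along these lines will report whether any entry in the archive is damaged and whether the file list still parses (the path is a placeholder):

    import json
    import zipfile

    # placeholder: point this at the downloaded dlist file
    path = "duplicati-20150919T173221Z.dlist.zip"

    with zipfile.ZipFile(path) as zf:
        # testzip() re-reads every member and returns the name of the first
        # entry with a bad CRC, or None if all entries are intact
        print("first corrupt entry:", zf.testzip())

        # the file list itself should also be valid JSON
        with zf.open("filelist.json") as f:
            entries = json.load(f)
        print("filelist.json parsed OK,", len(entries), "entries")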

I can see that the backup initially failed with a "throttled" error, meaning that the destination prevented you from sending more data.

Are you possibly having two backups that point to the same destination folder?

@jfspan commented Mar 24, 2016

I use separate folders for each backup, but I do have two or three separate backups (all managed by Duplicati) from the same machine. I break the data into groups (Documents, which change more often, separately from Photos and Music) and have them back up on different schedules, each to its own folder.

I was able to open the duplicati-20150919T173221Z.dlist.zip file and it appeared to be intact. The filelist.json was readable and listed files with hashes, etc.

Yeah, I was puzzled by the throttled error except that maybe it was being accessed by another app at the same time.

One more thing to add (hopefully helpful for debugging): when I tried to add another backup (using the 2.0.1.8 canary build) on the same MacBook Air to test whether different settings would work, Duplicati read an extremely large dataset for what should have been a small amount of data. I selected about 11GB of data, but Duplicati began reading 212GB to back up (there's only 128GB of disk space). I stopped the backup (not force-stopped), checked the settings, and tried to select a smaller batch in the configuration, but when I started again, Duplicati read 17TB of data to back up.

@jfspan commented Mar 24, 2016

Also, I just wanted to say thank you again for your time and help, since I know you're doing this very part-time and without compensation. We appreciate Duplicati; it is much better than the paid programs I've used previously (in every aspect).

kenkendk added a commit that referenced this issue Mar 24, 2016

@kenkendk (Member) commented Mar 24, 2016

I think the error with reading files in your setup is the same as in #1652.
That issue has now been fixed.

@kenkendk (Member) commented Mar 27, 2016

There is a new canary build that should fix most of the issues mentioned here (except the "unexpected number of remote filesets" error, which I am still trying to figure out):
https://github.com/duplicati/duplicati/releases

@kenkendk (Member) commented Mar 27, 2016

@TonySturch and @jfspan I think I finally found the cause of the "unexpected number of remote filesets" message.

In the bug report from @jfspan I found the volume "duplicati-20160220T012347Z.dlist.zip" is only 323 bytes. Not sure why it was created, but it contains no files, and this confuses the delete procedure and it aborts with the mentioned message.

I will make a workaround for this issue, but it would be nice to know why it happens.
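
If anyone wants to check their own local database for the same mismatch, something along these lines should show it (the database path is a placeholder; Remotevolume.Name/State appear in the constraint error quoted earlier, while the "Fileset" table name and the Type = 'Files' filter are assumptions about the schema):

    import sqlite3

    # placeholder: each backup job has its own local .sqlite database
    db = sqlite3.connect("backup-database.sqlite")

    # one row per backup version known locally (table name assumed)
    filesets = db.execute("SELECT COUNT(*) FROM Fileset").fetchone()[0]

    # one row per remote file; Type = 'Files' is assumed to mean the dlist volumes
    dlists = db.execute(
        "SELECT Name, State, Size FROM Remotevolume WHERE Type = 'Files' ORDER BY Name"
    ).fetchall()

    print("filesets in local database:", filesets)
    print("dlist volumes known to the database:")
    for name, state, size in dlists:
        print(" ", name, "state=" + str(state), "size=" + str(size))

    # a dlist of only a few hundred bytes (like the 323-byte one above), or one with
    # an unexpected State, is the kind of mismatch that can trigger the
    # "unexpected number of deleted filesets" message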

@TonySturch: Could you try to recreate the database with the latest canary build? It has some additional fixes for recreating the database that will hopefully fix your issue.

@TonySturch (Author) commented Mar 27, 2016

Thank you kindly, Sir. I have downloaded the latest binaries and am running a rebuild right now. It may take quite some time to either complete or fail.

kenkendk added a commit that referenced this issue Mar 27, 2016

@TonySturch (Author) commented Mar 28, 2016

The recreate has been running for nearly a day. It was very quick to reach about 85% to 90% complete according to the progress bar, but has since stalled out. The process is still using about 40% to 50% CPU, so it appears to be doing something. I'll leave it going for a couple of days and see if it makes any progress. The previous version would likely have failed by this point.

@kenkendk (Member) commented Mar 28, 2016

It could be because it is looking for additional data that it could not find. In that case it will be busy downloading and checking files to see where the missing data is located.

@GuitsBoy commented Mar 28, 2016

I suppose that would make sense, though I have not yet seen it use much network data. It's currently using about 65% CPU (quad-core 4.6 GHz) and about 15 MB/s of disk I/O to the system disk, not the volume with the archive being backed up. It doesn't look like the progress bar has moved a single pixel since this morning. Should I just sit tight and let it do its thing?
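
If it would help, I can try running the repair from the command line with file logging turned on to see what it is busy with; I believe the options look roughly like this (paths are placeholders, and the exact option names may vary between builds):

    :: write a detailed log of what the repair/recreate is doing
    Duplicati.CommandLine.exe repair <storage-url> --dbpath="C:\path\to\backup-database.sqlite" --log-file="repair.log" --log-level=Profiling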

@kenkendk (Member) commented Mar 28, 2016

In that case it is probably some SQL query that is not well designed and causes slowdowns during the repair. The recreate queries are the next thing @FootStark will work on.

@TonySturch (Author) commented Mar 29, 2016

I found this error this morning:
The OAuth service is currently over quota, try again in a few hours

Thank you so much for all the help. I really hope you don't take my postings as my being impatient; I'm only trying to help by reporting any errors I come across. If there's anything I can do to help in any way, please let me know. Thank you again.

@deHoeninger commented Mar 29, 2016

Hi @TonySturch, I had the OAuth quota message with the Google Drive backend as well this morning, which ultimately caused my backup to fail.

kenkendk added a commit that referenced this issue Mar 29, 2016

@kenkendk (Member) commented Mar 29, 2016

@TonySturch and @deHoeninger: The OAuth service costs some money to run, and it had hit the daily cap I put on it, after a surge of new users came with the announcement of the new experimental release.

I have doubled the daily spending cap, so your backups should be running without this error again.

I have not yet figured out how to get a notification when the daily cap is hit or nearly hit, so feel free to drop me an email if you see the quota message again.

@GuitsBoy commented Apr 1, 2016

I have been running the backup again over the last couple days, and it finally errored out today:

Unknown header: 2632630406

If I hit show, here's what I see in the logs:

Failed to process index file: duplicati-ie6495f3b84a94962baa8b65037933fb3.dindex.zip.aes
Newtonsoft.Json.JsonReaderException: After parsing a value an unexpected character was encountered: :. Path 'blocks[458].hash', line 1, position 32198.
at Newtonsoft.Json.JsonTextReader.ParsePostValue()
at Newtonsoft.Json.JsonTextReader.ReadInternal()
at Newtonsoft.Json.JsonTextReader.Read()
at Duplicati.Library.Main.Volumes.IndexVolumeReader.IndexBlockVolumeEnumerable.IndexBlockVolumeEnumerator.BlockEnumerable.BlockEnumerator.MoveNext()
at Duplicati.Library.Main.Volumes.IndexVolumeReader.IndexBlockVolumeEnumerable.IndexBlockVolumeEnumerator.BlockEnumerable.BlockEnumerator.ReadVolumeProps()
at Duplicati.Library.Main.Volumes.IndexVolumeReader.IndexBlockVolumeEnumerable.IndexBlockVolumeEnumerator.BlockEnumerable.BlockEnumerator.Dispose()
at Duplicati.Library.Main.Operation.RecreateDatabaseHandler.DoRun(LocalDatabase dbparent, Boolean updating, IFilter filter, NumberedFilterFilelistDelegate filelistfilter, BlockVolumePostProcessor blockprocessor)

Failed to process index file: duplicati-i47417beddfda45b6b9912765962fc725.dindex.zip.aes
Newtonsoft.Json.JsonReaderException: Unexpected character encountered while parsing value: T. Path 'blocks[2903].hash', line 1, position 203225.
at Newtonsoft.Json.JsonTextReader.ParseValue()
at Newtonsoft.Json.JsonTextReader.Read()
at Duplicati.Library.Main.Volumes.IndexVolumeReader.IndexBlockVolumeEnumerable.IndexBlockVolumeEnumerator.BlockEnumerable.BlockEnumerator.MoveNext()
at Duplicati.Library.Main.Volumes.IndexVolumeReader.IndexBlockVolumeEnumerable.IndexBlockVolumeEnumerator.BlockEnumerable.BlockEnumerator.ReadVolumeProps()
at Duplicati.Library.Main.Volumes.IndexVolumeReader.IndexBlockVolumeEnumerable.IndexBlockVolumeEnumerator.BlockEnumerable.BlockEnumerator.Dispose()
at Duplicati.Library.Main.Operation.RecreateDatabaseHandler.DoRun(LocalDatabase dbparent, Boolean updating, IFilter filter, NumberedFilterFilelistDelegate filelistfilter, BlockVolumePostProcessor blockprocessor)

Is there anything useful in there? I couldn't actually find the files mentioned in my Amazon Cloud Drive. Should I simply delete the whole thing and start from scratch?
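
For reference, once a dindex file is decrypted (e.g. with AES Crypt and the backup passphrase), its contents can be sanity-checked with something like this; the vol/ layout inside the zip is a guess based on the 'blocks[...].hash' paths in the errors above:

    import json
    import zipfile

    # placeholder: a dindex volume that has already been decrypted from .aes to .zip
    path = "duplicati-ie6495f3b84a94962baa8b65037933fb3.dindex.zip"

    with zipfile.ZipFile(path) as zf:
        for name in zf.namelist():
            # the per-dblock descriptions are assumed to live under vol/ as JSON documents
            if not name.startswith("vol/"):
                continue
            try:
                json.loads(zf.read(name))
                print("OK     ", name)
            except ValueError as e:
                print("BROKEN ", name, "->", e)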

@GuitsBoy commented Apr 1, 2016

Also strange is that Duplicati continues to use 30%-40% CPU after the job has ended. What's even stranger is that after right-clicking and exiting, the icon disappears but the process continues to run, all the while using 30% or more CPU.

@jfspan commented Apr 7, 2016

I am still getting errors on database repairs and the backup still doesn't appear to be completely working. Whenever I run delete and repair, there is still the error "The file duplicati-b0a3381d92b7f48edaa3a254bb1dbed4c.dblock.zip was downloaded and had size 6832 but the size was expected to be 52331007"

The error given when I run the backup is now: "The database was attempted repaired, but the repair did not complete. This database may be incomplete and the backup process cannot continue. You may delete the local database and attempt to repair it again." (followed by a repeat of the error given above).

However, when I try to duplicate the backup settings for a new backup, the source size jumps astronomically from what should be about 25–35GB to well over 90GB. It's as though the filters are not working correctly or are being misapplied in some way. I'm not sure if these are connected. I have a similar problem on my wife's computer with source sizes: when I recreate a backup that isn't working on hers, the source jumps from about 40GB to over 190GB.

Is it possible to 'roll back' to a previous backup point and restart the backup process from there? I only have about 80-90GB of data (not quite the 2.3TB TonySturch has above) if I were to start over (and I'd like to avoid that if possible), but it would be great to know how best to fix this going forward.
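
For context, I know the command-line client has a delete command that removes individual backup versions, which is the closest thing to a roll-back I'm aware of; something like the following (URL, path and version number are placeholders), though I'm not sure whether it is safe to run on a backup in this state:

    :: --version=0 is the most recent backup version
    Duplicati.CommandLine.exe delete <storage-url> --version=0 --dbpath="/path/to/backup-database.sqlite"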

@sinasalek commented Aug 12, 2016

Windows 10 64-bit, having a similar problem; unable to perform any backup.
Tried running it as administrator with full permissions, but nothing changed.
System.Exception: Unexpected number of remote volumes marked as deleted. Found 0 filesets, but 2 volumes
   at Duplicati.Library.Main.Database.LocalDeleteDatabase.<DropFilesetsFromTable>c__Iterator0.MoveNext()
   at Duplicati.Library.Main.Operation.DeleteHandler.DoRun(LocalDeleteDatabase db, IDbTransaction transaction, Boolean hasVerifiedBacked, Boolean forceCompact)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
   at Duplicati.Library.Main.Controller.<Backup>c__AnonStorey0.<>m__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

System.IO.IOException: The process cannot access the file because it is being used by another process.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.File.InternalMove(String sourceFileName, String destFileName, Boolean checkHost)
   at Duplicati.Library.Main.Operation.RepairHandler.Run(IFilter filter)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, Action`1 method)
   at Duplicati.Library.Main.Controller.Repair(IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

System.Data.SQLite.SQLiteException (0x80004005): disk I/O error
   at System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt)
   at System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
   at System.Data.SQLite.SQLiteDataReader.NextResult()
   at System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
   at System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
   at System.Data.SQLite.SQLiteCommand.ExecuteNonQuery(CommandBehavior behavior)
   at Duplicati.Library.SQLiteHelper.DatabaseUpgrader.UpgradeDatabase(IDbConnection connection, String sourcefile, String schema, IList`1 versions)
   at Duplicati.Library.SQLiteHelper.DatabaseUpgrader.UpgradeDatabase(IDbConnection connection, String sourcefile, Type eltype)
   at Duplicati.Library.Main.Database.LocalDatabase.CreateConnection(String path)
   at Duplicati.Library.Main.Database.LocalBackupDatabase..ctor(String path, Options options)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
   at Duplicati.Library.Main.Controller.<Backup>c__AnonStorey0.<>m__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)
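
Since two of those errors point at the local job database (file in use, disk I/O error), one quick sanity check is SQLite's own integrity check, run while Duplicati is closed so the file is not locked; a minimal sketch with the path as a placeholder:

    import sqlite3

    # placeholder: the job's local database file
    db = sqlite3.connect("backup-database.sqlite")

    # PRAGMA integrity_check walks the whole database and prints "ok" if the
    # file is structurally sound, or a list of problems otherwise
    for (result,) in db.execute("PRAGMA integrity_check"):
        print(result)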

@njfox commented Feb 15, 2018

I've been having a similar (or the same) issue on my Arch Linux machine using ACD. I've tried repairing the database but I'm still receiving these error messages on every backup:

Unexpected number of deleted filesets 18 vs 19

Feb 14, 2018 1:01 AM: Message
Re-creating missing index file for duplicati-bcf55e7939a9d404287537f5f92a996fd.dblock.zip.aes
Feb 14, 2018 1:01 AM: Message
Expected there to be a temporary fileset for synthetic filelist (123, duplicati-b10312ffaf6a74f79a238ffef3dcded11.dblock.zip.aes), but none was found?

Were we ever able to figure out the cause or how to fix it?

@jfspan commented Feb 15, 2018

@njfox Unfortunately, I was never able to get it fixed. I had to set up a new backup and delete the old one. I assumed the issue was a bug that only happened in previous versions and could not be repaired. Sorry to hear it is still happening. I haven't had it happen again after setting up the new backups.

@njfox commented Feb 26, 2018

@kenkendk Do you have any ideas for how we can troubleshoot this issue? Would you like me to open a new issue in case these problems aren't related? The original thread is from a couple of years ago so it could be a separate bug.

Pectojin added the bug label Jul 29, 2018

@mr-flibble (Contributor) commented Oct 17, 2018

Same problem in Duplicati - 2.0.3.12-2.0.3.12_canary_2018-10-23

I got this error during purge-broken-files

System.Exception: Unexpected number of remote volumes marked as deleted. Found 1 filesets, but 0 volumes
   at Duplicati.Library.Main.Database.LocalDeleteDatabase.<DropFilesetsFromTable>d__6.MoveNext()
   at System.Linq.Buffer`1..ctor(IEnumerable`1 source)
   at System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
   at Duplicati.Library.Main.Operation.PurgeFilesHandler.DoRun(LocalPurgeDatabase db, IFilter filter, Action`3 filtercommand, Single pgoffset, Single pgspan)
   at Duplicati.Library.Main.Operation.PurgeBrokenFilesHandler.Run(IFilter filter)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.PurgeBrokenFiles(IFilter filter)
   at Duplicati.CommandLine.Commands.PurgeBrokenFiles(TextWriter outwriter, Action`1 setup, List`1 args, Dictionary`2 options, IFilter filter)
   at Duplicati.CommandLine.Program.ParseCommandLine(TextWriter outwriter, Action`1 setup, Boolean& verboseErrors, String[] args)
   at Duplicati.CommandLine.Program.RunCommandLine(TextWriter outwriter, TextWriter errwriter, Action`1 setup, String[] args)
Return code: 100

If I run recreate, I get:
The database was attempted repaired, but the repair did not complete. This database may be incomplete and the backup process cannot continue. You may delete the local database and attempt to repair it again.

So my backup job is KO.
