
Error: The process cannot access the file because another process has locked a portion of the file. #309

Open · TheBestPessimist opened this issue Dec 20, 2017 · 15 comments

@TheBestPessimist (Contributor) commented Dec 20, 2017

2017-12-20 12:45:50.604 WARN OPEN_FAILURE Failed to open file for reading: open D:\backup_software_test\duplicacy repo/c_link/Users/crist/AppData/Local/Microsoft/Windows/SettingSync/metastore/meta.jfm: The process cannot access the file because it is being used by another process.
2017-12-20 12:45:50.604 WARN OPEN_FAILURE Failed to open file for reading: open D:\backup_software_test\duplicacy repo/c_link/Users/crist/AppData/Local/Microsoft/Windows/SettingSync/remotemetastore/v1/edb.log: The process cannot access the file because it is being used by another process.
2017-12-20 12:45:50.646 WARN OPEN_FAILURE Failed to open file for reading: open D:\backup_software_test\duplicacy repo/c_link/Users/crist/AppData/Local/Microsoft/Windows/SettingSync/remotemetastore/v1/meta.edb: The process cannot access the file because it is being used by another process.
2017-12-20 12:45:50.646 WARN OPEN_FAILURE Failed to open file for reading: open D:\backup_software_test\duplicacy repo/c_link/Users/crist/AppData/Local/Microsoft/Windows/SettingSync/remotemetastore/v1/meta.jfm: The process cannot access the file because it is being used by another process.
2017-12-20 12:45:50.654 WARN OPEN_FAILURE Failed to open file for reading: open D:\backup_software_test\duplicacy repo/c_link/Users/crist/AppData/Local/Microsoft/Windows/UPPS/UPPS.bin: Access is denied.
2017-12-20 12:45:50.654 WARN OPEN_FAILURE Failed to open file for reading: open D:\backup_software_test\duplicacy repo/c_link/Users/crist/AppData/Local/Microsoft/Windows/WebCache/V01.log: The process cannot access the file because it is being used by another process.
2017-12-20 12:45:50.675 WARN OPEN_FAILURE Failed to open file for reading: open D:\backup_software_test\duplicacy repo/c_link/Users/crist/AppData/Local/Microsoft/Windows/WebCache/V01tmp.log: The process cannot access the file because it is being used by another process.
2017-12-20 12:45:50.675 WARN OPEN_FAILURE Failed to open file for reading: open D:\backup_software_test\duplicacy repo/c_link/Users/crist/AppData/Local/Microsoft/Windows/WebCache/WebCacheV01.dat: The process cannot access the file because it is being used by another process.
2017-12-20 12:45:50.675 WARN OPEN_FAILURE Failed to open file for reading: open D:\backup_software_test\duplicacy repo/c_link/Users/crist/AppData/Local/Microsoft/Windows/WebCache/WebCacheV01.jfm: The process cannot access the file because it is being used by another process.
2017-12-20 12:45:50.703 DEBUG CHUNK_UPLOAD Uploading was skipped for chunk d36bb82e5c5781ad5ed0ff7c88025ae1885d1380d52a6c047126532b3a036109
2017-12-20 12:45:50.703 INFO UPLOAD_PROGRESS Uploaded chunk 1678 size 8503908, 18.56MB/s 01:32:41 7.5%
2017-12-20 12:45:50.704 DEBUG CHUNK_UPLOAD Uploading was skipped for chunk bb2e5056e24561c6a8ff68ca9d0f92c220208b0e3e1a0d03296dee45203f7b07
2017-12-20 12:45:50.704 INFO UPLOAD_PROGRESS Uploaded chunk 1679 size 1092829, 18.56MB/s 01:32:40 7.5%
2017-12-20 12:45:50.773 DEBUG CHUNK_UPLOAD Uploading was skipped for chunk 394e20d9c257e2b2cb594673cd9b3595989d1ebb1431652351f105bb5ea147dc
2017-12-20 12:45:50.773 INFO UPLOAD_PROGRESS Uploaded chunk 1680 size 7449196, 18.58MB/s 01:32:35 7.5%
2017-12-20 12:45:50.837 DEBUG CHUNK_UPLOAD Uploading was skipped for chunk 8ffc8728ac9b007a57f203292e1a69d2ccdcda438704226edd7a34d42208fb80
2017-12-20 12:45:50.837 INFO UPLOAD_PROGRESS Uploaded chunk 1681 size 7339147, 18.59MB/s 01:32:30 7.5%
2017-12-20 12:45:50.914 DEBUG CHUNK_UPLOAD Uploading was skipped for chunk 53cc639910a9e92fd959404674b4a521a89cc46310be19d724d3117d2d0a3915
2017-12-20 12:45:50.914 INFO UPLOAD_PROGRESS Uploaded chunk 1682 size 3153920, 18.60MB/s 01:32:28 7.5%
2017-12-20 12:45:50.980 DEBUG CHUNK_UPLOAD Uploading was skipped for chunk 6e1d8565050fc03d04509ed093be2c427d5887add5a520c530747a72c34c9d5c
2017-12-20 12:45:50.980 INFO UPLOAD_PROGRESS Uploaded chunk 1683 size 3690215, 18.61MB/s 01:32:26 7.5%
2017-12-20 12:45:51.625 DEBUG CHUNK_UPLOAD Uploading was skipped for chunk 0382ffe4037fdee43dbec5b17dbd358fa323d2fd5f8de6824f930e3092dde663
2017-12-20 12:45:51.625 INFO UPLOAD_PROGRESS Uploaded chunk 1685 size 5196331, 18.58MB/s 01:32:34 7.5%
2017-12-20 12:45:51.654 DEBUG CHUNK_UPLOAD Uploading was skipped for chunk 520e9d6539a4f8d1645058948ed26784391fbd8d3bf71261842cb56926a07b93
2017-12-20 12:45:51.654 INFO UPLOAD_PROGRESS Uploaded chunk 1665 size 1546293, 18.58MB/s 01:32:33 7.5%
2017-12-20 12:45:51.750 DEBUG CHUNK_UPLOAD Uploading was skipped for chunk 791039ab5248ef293752d6233da8fdfe1265332a675fe196b36d30785cfd8502
2017-12-20 12:45:51.750 INFO UPLOAD_PROGRESS Uploaded chunk 1686 size 5256058, 18.59MB/s 01:32:30 7.5%
2017-12-20 12:45:51.828 DEBUG CHUNK_UPLOAD Uploading was skipped for chunk fdb754593eea29cb7e3cef2ef4dd179e14fad521122b2457d73406d586dc7f59
2017-12-20 12:45:51.828 INFO UPLOAD_PROGRESS Uploaded chunk 1687 size 5974861, 18.60MB/s 01:32:26 7.5%
2017-12-20 12:45:52.367 DEBUG CHUNK_UPLOAD Uploading was skipped for chunk b01524e43707d91f3f64a772f7028432ba0978a86f05e5bcb3b28affe80ca6f5
2017-12-20 12:45:52.367 INFO UPLOAD_PROGRESS Uploaded chunk 1684 size 5056151, 18.57MB/s 01:32:34 7.6%
2017-12-20 12:45:53.824 ERROR CHUNK_MAKER Failed to read 0 bytes: read D:\backup_software_test\duplicacy repo/c_link/Users/crist/AppData/Local/Packages/Microsoft.SkypeApp_kzf8qxf38zg5c/LocalState/DataRv/offline-storage.data-shm: The process cannot access the file because another process has locked a portion of the file.

Is it okay that Duplicacy just dies when it cannot read a specific file? I thought it skipped the file and carried on.

(Yes, I am backing up my whole C drive. I am doing this to build a good filter file, which I intend to contribute; if not in code, then at least in the wiki, as a start.)

  • Note: c_link is a symlink to C:\
  • Second note: I am not using Skype at all, neither the modern app nor the Win32 app, so I don't know why Windows runs it.
  • Note 3: As you can see, some files are skipped because of read problems, but this one wasn't.
@gilbertchen (Owner) commented Jan 5, 2018

The current implementation is like this: if a file can't be opened at all, it will be skipped; however, if a file can be opened but can't be read, then Duplicacy will bail out with an error. It can't simply skip the file, because some bytes from it may already have been put into a chunk, leaving the chunk in an inconsistent state.

However, in the case where no bytes have been read when the read error occurs, it may be safe to skip the file.
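In simplified code, the logic looks roughly like this. This is an illustrative sketch, not the actual chunk maker (`addFile` is a made-up name), and the zero-bytes-read skip at the end shows the possible relaxation rather than the current behavior:

```go
package main

import (
	"bytes"
	"io"
	"log"
	"os"
)

// addFile copies one file into the current chunk (illustrative only).
func addFile(path string, chunk *bytes.Buffer) error {
	f, err := os.Open(path)
	if err != nil {
		// OPEN_FAILURE: the file can't be opened at all, so it is
		// warned about and skipped, and the backup continues.
		log.Printf("OPEN_FAILURE skipping %s: %v", path, err)
		return nil
	}
	defer f.Close()

	buf := make([]byte, 64*1024)
	copied := 0
	for {
		n, err := f.Read(buf)
		chunk.Write(buf[:n])
		copied += n
		if err == io.EOF {
			return nil // the whole file is in the chunk
		}
		if err != nil {
			if copied == 0 {
				// The possible relaxation: no bytes from this file
				// have reached the chunk yet, so skipping is safe.
				log.Printf("skipping %s: %v", path, err)
				return nil
			}
			// Bytes from this file are already in the chunk, so the
			// chunk is inconsistent and the backup must bail out.
			return err
		}
	}
}

func main() {
	var chunk bytes.Buffer
	for _, path := range os.Args[1:] {
		if err := addFile(path, &chunk); err != nil {
			log.Fatalf("CHUNK_MAKER %v", err)
		}
	}
	log.Printf("chunk holds %d bytes", chunk.Len())
}
```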

In your case, enabling the -vss option may make that particular file readable.

@TheBestPessimist (Contributor, Author) commented Jan 5, 2018

VSS didn't help!

@gilbertchen (Owner) commented Jan 6, 2018

This suggests that the error is caused by the use of the LockFile function. I couldn't find any documentation on VSS and LockFile, but if VSS didn't help in your test, then they probably aren't compatible with each other.
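For what it's worth, the error is easy to reproduce with a small standalone Go program (a demo only, not Duplicacy code; it assumes the golang.org/x/sys/windows package): lock a byte range with LockFileEx through one handle, and a read of that range through a second handle fails with exactly the message from the logs.

```go
//go:build windows

package main

import (
	"fmt"
	"os"

	"golang.org/x/sys/windows"
)

func main() {
	// Create a test file with some content.
	f, err := os.OpenFile("locked.bin", os.O_RDWR|os.O_CREATE, 0644)
	if err != nil {
		panic(err)
	}
	defer f.Close()
	if _, err := f.Write(make([]byte, 4096)); err != nil {
		panic(err)
	}

	// Exclusively lock the first 1024 bytes through this handle.
	ol := new(windows.Overlapped) // zero offsets: lock starts at byte 0
	err = windows.LockFileEx(windows.Handle(f.Fd()),
		windows.LOCKFILE_EXCLUSIVE_LOCK|windows.LOCKFILE_FAIL_IMMEDIATELY,
		0, 1024, 0, ol)
	if err != nil {
		panic(err)
	}

	// A second handle opens fine, so the OPEN_FAILURE path is not taken...
	g, err := os.Open("locked.bin")
	if err != nil {
		panic(err)
	}
	defer g.Close()

	// ...but reading the locked range fails with ERROR_LOCK_VIOLATION:
	// "The process cannot access the file because another process has
	// locked a portion of the file."
	_, err = g.Read(make([]byte, 512))
	fmt.Println(err)
}
```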

@TheBestPessimist (Contributor, Author) commented Jan 6, 2018

In this case, may I suggest that Duplicacy handle these files the same as any other file it fails to read?

Somehow I feel that backup software should try to back up as much as possible: tell me what couldn't be backed up (give me error codes, etc. so I can give them to the developer), and then carry on.
That way I will have a backup of at least most of my data, instead of none. (Think of someone who sets up a scheduled backup task and never checks the stats at the end; for example, I would be doing this for my grandma's computer.)

@TheBestPessimist (Contributor, Author) commented Mar 18, 2018

This problem again, now with a different file:

2018-03-18 11:58:50.344 ERROR CHUNK_MAKER Failed to read 0 bytes: read C:\duplicacy repo/C__Users_link/crist/AppData/Local/NVIDIA/GLCache/bb8124ee13ed6bd9595b872889a79514/f2245e201e5b7842/8aa002a52e1fef93.bin: The process cannot access the file because another process has locked a portion of the file.
@TheBestPessimist (Contributor, Author) commented Mar 19, 2018

This problem persists even after a restart, so I added the whole AppData/Local/NVIDIA/ folder to my ignore list.
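For anyone else who wants to do the same, the entry in .duplicacy/filters would look something like this (a sketch from memory, not the exact pattern; the regex e: form is used because the cache paths vary):

```
# exclude the NVIDIA shader cache wherever it appears (regex exclude)
e:AppData/Local/NVIDIA/
```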

@ghost commented Apr 15, 2018

I've bumped into this same problem, and it's darn annoying.
I am in agreement with TheBestPessimist:
"backup software should try to back up as much as possible: tell me what couldn't be backed up (give me error codes, etc. so I can give them to the developer), and then carry on."

Anything else is a nightmare! The backup runs for an hour, then hangs. You find it later, start it again, it hangs again, and so on.

@dreamflasher commented May 4, 2018

> It can't simply skip the file, because some bytes from it may already have been put into a chunk, leaving the chunk in an inconsistent state.

Shouldn't the process then be like this: at the time of the read error, we are collecting data for a certain chunk, so there must be a list of which files have gone into the current chunk so far. Simply discard the current chunk, add the file that failed to read to an ignore list, rebuild the chunk from scratch with the files that were read successfully, skip the bad file, and continue as usual.
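A rough sketch of what I mean (made-up names, not Duplicacy code; a real chunker reads files incrementally rather than whole, which is exactly how partial data ends up in a chunk):

```go
package main

import (
	"bytes"
	"log"
	"os"
)

type chunk struct{ buf bytes.Buffer }

// appendFile copies one file into the chunk. A real chunker reads
// incrementally, which is how partial data ends up in a chunk.
func (c *chunk) appendFile(path string) error {
	data, err := os.ReadFile(path)
	if err != nil {
		return err
	}
	c.buf.Write(data)
	return nil
}

// packFiles rebuilds the current chunk whenever a file fails to read,
// instead of aborting the whole backup.
func packFiles(paths []string) *chunk {
	c := &chunk{}
	var packed []string // files whose bytes made it into this chunk
	for _, p := range paths {
		err := c.appendFile(p)
		if err == nil {
			packed = append(packed, p)
			continue
		}
		// Read failure: skip the bad file, discard the half-built
		// chunk, and re-pack the files that were readable.
		log.Printf("skipping unreadable file %s: %v", p, err)
		c = &chunk{}
		for _, q := range packed {
			c.appendFile(q) // these were readable a moment ago
		}
	}
	return c
}

func main() {
	c := packFiles(os.Args[1:])
	log.Printf("packed %d bytes", c.buf.Len())
}
```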

This is a major issue: I constantly run into such files, have to put them on my filter list, and restart the whole process, which takes ages with 3 TB to back up because all files need to be hashed again. (I guess avoiding the rehash after a failed backup, by skipping the files already uploaded, is another ticket.)

@mojimba commented Aug 16, 2018

Duplicacy is also failing for me due to some locked files. This is happening with the -vss option enabled (though I'd prefer not to use it at all, so that a non-admin user can run the backup jobs). I know which files are locked: they're always process-specific memory-mapped files that I don't particularly care about backing up.

I do have the option of messing around with include/exclude patterns to ignore them, but I would prefer a way to tell Duplicacy to simply ignore any read or lock errors and move on. Currently it terminates with exit code 100:

Storage set to \\srv07bck01\backup$\SRV12UTIL01\duplicacy
No previous backup found
Creating a shadow copy for C:\
Shadow copy {EF0B671F-3CEF-4AED-826E-EC046AE529AB} created
Indexing C:\Backups\BackupSources
Incomplete snapshot loaded from C:\Backups\BackupSources/.duplicacy/incomplete
Listing all chunks
Skipped 38127 files from previous incomplete backup
Skipped chunk 2 size 3507432, 3.34MB/s 02:02:11 0.0%
...
Skipped chunk 27 size 1861202, 124.47MB/s 00:03:17 0.5%
Failed to read 0 bytes: read C:\Backups\BackupSources/.duplicacy\shadow\\Backups\BackupSources/repo-home/var/cache/Configs/repo/data077991.bin: The process cannot access the file because another process has locked a portion of the file.

The problem with include/exclude filters is that these filenames change and may appear in different locations (locations which may also contain files that we do want backed up). The other option for us is to do some dev work on the components that create these files in the first place, so they all go into the same folder and/or use a consistent filename.

@NiJoao commented Sep 21, 2018

I'm also having this problem. I found that the dev at FFS was able to solve it, stating only:
"I've fixed the issue, which was related to locked byte ranges not being considered by FFS, but now they are."
Does that ring a bell for a possible solution, @gilbertchen?
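If that is what FFS did, one possible direction here (a guess at an approach, not a patch; it assumes the golang.org/x/sys/windows package) would be to recognize the lock-violation error explicitly and treat the file as skippable, the same way an open failure already is:

```go
//go:build windows

package main

import (
	"errors"
	"fmt"
	"os"

	"golang.org/x/sys/windows"
)

// isLockViolation reports whether err wraps the Windows error behind
// "another process has locked a portion of the file".
func isLockViolation(err error) bool {
	return errors.Is(err, windows.ERROR_LOCK_VIOLATION)
}

func main() {
	f, err := os.Open(os.Args[1])
	if err != nil {
		panic(err) // open failures are already skippable today
	}
	defer f.Close()

	if _, err := f.Read(make([]byte, 4096)); isLockViolation(err) {
		// The chunker could treat this like an open failure:
		// warn, skip the file, and carry on.
		fmt.Println("byte-range lock hit, file would be skipped:", os.Args[1])
	}
}
```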

@TheBestPessimist (Contributor, Author) commented Dec 12, 2018

Another occurrence of this problem:
2018-12-12 20:00:28.118 ERROR CHUNK_MAKER Failed to read 0 bytes: read C:\duplicacy repositories\tbp-nuc/C__Users_link/crist/AppData/Local/Google/Chrome/User Data/Default/LOCK: The process cannot access the file because another process has locked a portion of the file.

@Xipherisroot commented Mar 8, 2019

I was testing 2.1.2 before purchase and ran into this same issue:

"Failed to read 0 bytes: read (file in question here): The process cannot access the file because another process has locked a portion of the file."

Of all the software I've been testing so far, this is the only one to have this issue...?

@praxiq commented Apr 4, 2019

Also hitting this error. In my case the file is in my user directory: AppData/Local/Google/Chrome/User Data/Default/LOCK (not surprising that it's locked, I guess! ;)

@vladgh commented Apr 30, 2019

I was very happy with Duplicacy, and it has ALL the features I want, but this is a deal breaker for me. After a few days, when I checked the logs, I saw that Duplicacy simply bails out mid-backup, with only an error line in the logs. The process doesn't even exit with a non-zero exit code. If I hadn't checked the logs, I would have had no idea that 80% of my backup was skipped.
For me, the failing files are the .nst or .ost files in the Outlook folder.
I tried excluding them, or the entire path, by adding them to the filters, but it doesn't work; the error is still there.
This is great software, so it's a pity that, basically, unless I close Outlook, I cannot back up my computer (and wouldn't even know it).

2019-04-30 10:02:57.225 INFO UPLOAD_PROGRESS Skipped chunk 53 size 16777216, 34.36MB/s 00:00:27 20.7%
2019-04-30 10:02:57.284 INFO UPLOAD_PROGRESS Skipped chunk 54 size 5657007, 35.13MB/s 00:00:26 21.2%
2019-04-30 10:02:57.321 INFO UPLOAD_PROGRESS Skipped chunk 55 size 4015717, 35.67MB/s 00:00:26 21.5%
2019-04-30 10:02:57.425 INFO UPLOAD_PROGRESS Skipped chunk 56 size 8110700, 36.78MB/s 00:00:25 22.2%
2019-04-30 09:42:55.181 ERROR CHUNK_MAKER Failed to read 0 bytes: read C:\duplicacy/.duplicacy\shadow\\duplicacy/vlad/AppData/Local/Microsoft/Outlook/xxx@yyy.edu - zzz.nst: The process cannot access the file because another process has locked a portion of the file.
2019-04-30 09:42:55.197 INFO VSS_DELETE The shadow copy has been successfully deleted
@TheBestPessimist (Contributor, Author) commented Jun 19, 2019