The application remembers downloaded files, so any new file with the same md5/sha1 as an already downloaded file should be skipped. However, when files sharing an md5/sha1 that isn't yet in the database are added to the download queue, the duplicates are only detected if they are all added to the queue at the same time. This leaves a race condition in which duplicate files go undetected:
1. Two or more files in the same json share an md5/sha1 that is not yet in the database.
2. The first file is added to the download queue and downloaded.
3. The second file is added to the download queue only after the first one has completed, so the duplicate file is downloaded as well.
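The race can be made concrete with a minimal sketch (illustrative names, plain Python sets standing in for the real tables): duplicate detection that consults only the set of *completed* downloads misses a file whose duplicate is still in flight.

```python
completed = set()  # stands in for the persistent table of finished downloads

def should_download(md5):
    # Only completed downloads are checked -- this is the race window.
    return md5 not in completed

def finish_download(md5):
    completed.add(md5)

# File A and file B share the same md5 but are enqueued at different times.
assert should_download("abc123")      # file A passes the check
# ... file A is still downloading when file B is enqueued ...
assert should_download("abc123")      # file B also passes: duplicate download
finish_download("abc123")             # file A completes
assert not should_download("abc123")  # too late -- file B was already accepted
```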
This should be rare, if it can happen at all, since files from the same json are added to the download queue at roughly the same time, as soon as their torrents have been downloaded. The downloading of torrents is asynchronous, though, so the race is possible. The longer the download queue, the smaller the chance that one duplicate finishes downloading before the other is added to the queue.
A bigger issue arises when the json is only indexed rather than downloaded, because then nothing is added to the download queue at all. Even in that case, however, the problem is only visible in the logs, since only one file with the duplicated md5/sha1 ends up stored in the database (the other duplicates are overwritten).
This could be fixed by checking every new download not only against the list of already downloaded files in the database but also against files currently being downloaded; for example, every new download could be recorded in ETS and then moved to Mnesia once the download has finished.
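The proposed fix could look roughly like the following sketch (again in plain Python with illustrative names; the real application would use an ETS table for the in-flight set and Mnesia for the completed set): a hash is claimed when the download is enqueued, and rejected as a duplicate while it is either in flight or already completed.

```python
in_flight = set()   # stands in for an ETS table of downloads in progress
completed = set()   # stands in for the Mnesia table of finished downloads

def try_enqueue(md5):
    """Claim a hash for download; returns False for duplicates."""
    if md5 in completed or md5 in in_flight:
        return False
    in_flight.add(md5)
    return True

def finish_download(md5):
    # Move the entry from the in-flight set to the persistent store.
    in_flight.discard(md5)
    completed.add(md5)

assert try_enqueue("abc123")       # first file claims the hash
assert not try_enqueue("abc123")   # duplicate rejected while still in flight
finish_download("abc123")
assert not try_enqueue("abc123")   # still rejected after completion
```

In this single-process sketch the check-and-insert is trivially atomic; in Erlang the same atomic claim can be had with `ets:insert_new/2`, which inserts only if the key is not already present.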