Some items not being removed from Pipeline #129
Hm. Seems near impossible to me that nobody has ever run into this problem before. But it's definitely plausible. Can you post a sequence of file system events to reproduce this problem?
What I believe happened is that I modified the filter settings for the server and then it started to transfer all the items again. My mistake was not clearing the database, so maybe that is the cause of the inconsistency. Btw, excellent work.
Hm, it's hard to help you if you cannot exactly define which steps you took to cause it in the first place :) But it sounds like everything has been sorted out? Or do you still have an actual question?
I "solved" it, but I wanted to let you know about the possible problem (even if I'm the only one :( ). The problematic items numbered nearly 450, so I increased MAX_FILES_IN_PIPELINE to 500 and now it processes 50 items at a time without a problem. If the number of synced files increases, I'll let you know. Otherwise it seems to be working fine.
Ok. Let me know if you can one day reproduce this :)
Hi, I think there is a problem in __process_filter_queue of arbitrator.py.
If a file is not processed because it has already been synced to the server, the item is not removed from the pipeline queue. This prevents more items from being added to the pipeline queue in __process_pipeline_queue.
Increasing MAX_FILES_IN_PIPELINE to a really high value is the only way to work around it.
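To illustrate the pattern being described, here is a minimal, self-contained sketch. The class, method names, and the "already synced" check are assumptions modeled on this thread, not the actual arbitrator.py source; MAX_FILES_IN_PIPELINE is set to 50 purely for the demonstration. The buggy variant skips already-synced items but leaves them in the queue, so they permanently occupy pipeline slots; the fixed variant frees the slot whether or not a transfer happens.

```python
from collections import deque

MAX_FILES_IN_PIPELINE = 50  # illustrative value, not the project's default


class Pipeline:
    """Hypothetical model of the pipeline queue discussed in this issue."""

    def __init__(self, synced_files):
        self.queue = deque()            # items waiting in the pipeline
        self.synced = set(synced_files)  # files already on the server
        self.processed = []              # files actually transferred

    def add(self, item):
        # Mirrors __process_pipeline_queue: refuse new items once the cap is hit.
        if len(self.queue) >= MAX_FILES_IN_PIPELINE:
            return False
        self.queue.append(item)
        return True

    def process_filter_queue_buggy(self):
        # Reported behavior: a synced item is skipped but never removed,
        # so its pipeline slot is never freed.
        for item in list(self.queue):
            if item in self.synced:
                continue  # bug: item stays in self.queue forever
            self.queue.remove(item)
            self.processed.append(item)

    def process_filter_queue_fixed(self):
        # Fix: always dequeue the item; only skip the transfer itself.
        for item in list(self.queue):
            self.queue.remove(item)
            if item in self.synced:
                continue  # transfer skipped, but the slot is freed
            self.processed.append(item)
```

With 50 already-synced files filling the queue, the buggy variant leaves the pipeline permanently full and `add` keeps returning False, which matches the symptom reported above; after the fixed pass, the queue drains and new items are accepted again.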