
Some items not being removed from Pipeline #129

Closed
sarhugo opened this issue Nov 20, 2012 · 5 comments
sarhugo commented Nov 20, 2012

Hi, I think there is a problem in __process_filter_queue of arbitrator.py.
If a file is not processed because it has already been synced to the server, the item is not removed from the pipeline queue. This prevents more items from being added to the pipeline queue in __process_pipeline_queue.

Increasing MAX_FILES_IN_PIPELINE to a very high value is the only way I found to work around it.
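The failure mode described above can be sketched as follows. This is a minimal model, not File Conveyor's actual arbitrator.py: the queue names, the `buggy` flag, and both functions are hypothetical simplifications of the two methods mentioned in the report.

```python
from collections import deque

MAX_FILES_IN_PIPELINE = 3  # tiny limit so the stall is easy to see

pipeline_queue = deque()   # items currently occupying pipeline capacity
filter_queue = deque()     # items awaiting the "still needs syncing?" check

def process_pipeline_queue(discovered_items):
    # Admit newly discovered items into the pipeline, up to capacity.
    while discovered_items and len(pipeline_queue) < MAX_FILES_IN_PIPELINE:
        item = discovered_items.popleft()
        pipeline_queue.append(item)
        filter_queue.append(item)

def process_filter_queue(already_synced, buggy=True):
    # Decide per item whether any work is still needed.
    while filter_queue:
        item = filter_queue.popleft()
        if item in already_synced:
            if not buggy:
                # Fix: an item that needs no work must still leave the
                # pipeline, otherwise it occupies a capacity slot forever.
                pipeline_queue.remove(item)
            continue  # buggy path: the slot is never freed
        # A real arbitrator would hand the item to the transport step,
        # which frees the pipeline slot on completion; modelled here by
        # removing the item immediately.
        pipeline_queue.remove(item)
```

With `buggy=True`, already-synced items keep their pipeline slots, so once the pipeline fills up with them, `process_pipeline_queue` can never admit anything new: exactly the stall described above.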

wimleers (Owner) commented

Hm. It seems nearly impossible to me that nobody has run into this problem before. But it's definitely plausible.

Can you post a sequence of file system events to reproduce this problem?


sarhugo commented Nov 22, 2012

What I believe happened is that I modified the filter settings for the server, and then it started transferring all the items again. My mistake was not clearing the database, so maybe that is the cause of the inconsistency.
I discovered the bug when I noticed the arbitrator wasn't processing any files after running for some time. When the pipeline queue is full of these "already synced" files, the arbitrator falls into a kind of infinite loop: it can't add any item because of the capacity limit, but it also never removes one.
I'm not sure whether there is a reason to keep these synced files in the pipeline, or whether it is safe to remove them.
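The "infinite loop" described here is really a wedged-pipeline invariant: the pipeline is at capacity, yet nothing queued or in flight can ever free a slot. A hypothetical sanity check (not part of File Conveyor; all names are made up for illustration) might look like:

```python
def pipeline_is_stalled(pipeline_queue, filter_queue, max_files, in_flight):
    """Return True when the pipeline can never make progress again.

    The pipeline is wedged when it is at capacity (so nothing new can
    enter) while no queued or in-flight work exists that could free a
    slot: every occupant was silently skipped by the filter step.
    """
    at_capacity = len(pipeline_queue) >= max_files
    nothing_pending = not filter_queue and not in_flight
    return at_capacity and nothing_pending
```

An arbitrator could log a warning (or evict the skipped items) whenever this predicate becomes true, instead of looping forever.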

Btw, excellent work

wimleers (Owner) commented

Hm, it's hard to help you if you can't describe exactly which steps you took to cause it in the first place :)

But it sounds like everything has been sorted out? Or do you still have an actual question?


sarhugo commented Nov 22, 2012

I "solved" but I wanted to let u know the possible problem (even if I'm the only one :( ). The problematic items was near 450.. so I increased the MAX_FILES_IN_PIPELINE to 500 and now it process by 50 items without a problem. If the number of these synced files increases then I'll let u know. Otherway seems to be working fine.

wimleers (Owner) commented

OK. Let me know if you can reproduce this one day :)
