
Overflowing Event Queue in BatchEventProcessor Only Logs a Debug Message, Maybe #352

Closed
jkolenofferup opened this issue Jul 26, 2021 · 4 comments

Comments


jkolenofferup commented Jul 26, 2021

If you overflow the event_queue in BatchEventProcessor, the queue.Full exception is caught and produces a debug-level message in the logger.

        try:
            self.event_queue.put_nowait(user_event)
        except queue.Full:
            self.logger.debug(
                'Payload not accepted by the queue. Current size: {}'.format(str(self.event_queue.qsize()))
            )

This solution is problematic for at least two reasons:

  1. Data loss is not a debug-level issue. It should be at least warn or maybe error.
  2. If the logger is not set, this code silently fails to enqueue the event. Unless the event source owns the queue, there is no way for it to know that the operation failed. (A rough sketch of both fixes follows below.)
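
For illustration, a rough sketch of the requested change, written as a standalone function rather than the SDK's actual BatchEventProcessor code (the function name and the boolean return are illustrative only):

    import logging
    import queue

    logger = logging.getLogger(__name__)

    def try_enqueue(event_queue, user_event):
        # Same handler shape as the SDK snippet above, but the drop is
        # reported at warning level and signalled back to the caller.
        try:
            event_queue.put_nowait(user_event)
            return True
        except queue.Full:
            logger.warning(
                'Payload not accepted by the queue. Current size: %d', event_queue.qsize()
            )
            return False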

Mat001 (Contributor) commented Jul 26, 2021

Thank you for reporting the issue, @jkolenofferup. We are looking into it.

Mat001 (Contributor) commented Jul 29, 2021

@jkolenofferup I implemented the warning log level.
The unit test was tricky to get passing on Py 3.4 and PyPy: they required a large number of overflowing events to trigger the queue.Full exception, while other Python versions work fine with a smaller number of overflowing events.

jkolenofferup (Author) commented:

You could pass a small queue with max elements set to 10

event_queue = queue.Queue(maxsize=10)
bep = event_processor.BatchEventProcessor(
            event_dispatcher,
            event_queue=event_queue)

You'd only need to send 11+ events to trigger the error. My guess is that default queue size is different for Py3.4 and PyPy.
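
Fleshed out into a self-contained test sketch, using only the standard library and a mock logger in place of the real BatchEventProcessor (the inline helper mirrors the handler shape from the snippet in the issue, with the warning-level log discussed above; names and assertions are illustrative, not the SDK's actual test code):

    import queue
    from unittest import mock

    def test_overflow_logs_warning():
        # Small bounded queue, as suggested: 11 events against maxsize=10
        # guarantees one queue.Full without relying on a default queue size.
        event_queue = queue.Queue(maxsize=10)
        logger = mock.MagicMock()

        def enqueue(user_event):
            try:
                event_queue.put_nowait(user_event)
            except queue.Full:
                logger.warning(
                    'Payload not accepted by the queue. Current size: {}'.format(event_queue.qsize())
                )

        for i in range(11):
            enqueue({'event': i})

        assert logger.warning.called
        assert event_queue.qsize() == 10

    test_overflow_logs_warning()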

Mat001 (Contributor) commented Aug 2, 2021

> You could pass a small queue with max elements set to 10
>
>     event_queue = queue.Queue(maxsize=10)
>     bep = event_processor.BatchEventProcessor(
>                 event_dispatcher,
>                 event_queue=event_queue)
>
> You'd only need to send 11+ events to trigger the error. My guess is that default queue size is different for Py3.4 and PyPy.

Yes, I tried that. All Python versions except 3.4 and PyPy/PyPy3 pass fine. The problematic three would consistently fail; they weren't even flaky. As I increased the number of events from one over the queue limit to 20% more, to 100% more, to 1000% more, the tests for Py 3.4 and PyPy became more stable. Very strange.
Feeding 1000 events with a max queue size of 10 seems to make the test very stable. I really don't know why that is, perhaps something specific about the Py 3.4 and PyPy queueing/threading mechanisms.

Mat001 closed this as completed Aug 4, 2021