
add retry in zmqrpc #973

Merged
merged 1 commit into from Mar 14, 2019

@delulu (Contributor) commented Mar 5, 2019

When running Locust in distributed mode, I've noticed an exception thrown from the slave agent: "AssertionError: Only one greenlet can be waiting on this event".

On investigating this issue, I found that there are multiple greenlets in SlaveLocustRunner, as shown in

locust/locust/runners.py

Lines 384 to 388 in 87a9aa1

self.greenlet.spawn(self.heartbeat).link_exception(callback=self.noop)
self.greenlet.spawn(self.worker).link_exception(callback=self.noop)
self.client.send(Message("client_ready", None, self.client_id))
self.slave_state = STATE_INIT
self.greenlet.spawn(self.stats_reporter).link_exception(callback=self.noop)

and they all talk to the Locust master over the same zmq socket, so there can be concurrent calls to self.client.send, which triggers the exception above.

We could use a semaphore to serialize the calls, but that adds a performance cost. Since these concurrent situations are unlikely, it's better to simply add a retry to the socket communication, which makes Locust more robust.
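The retry idea can be sketched roughly as below. This is a hypothetical illustration, not the code merged in this PR (which modifies the zmqrpc module itself); the helper name `send_with_retry` and its parameters are made up for clarity, and `time.sleep` stands in for the `gevent.sleep` a real Locust patch would use.

```python
import time


def send_with_retry(send_fn, msg, retries=3, delay=0.1):
    """Hypothetical sketch: retry a socket send a few times before
    giving up, to ride out rare concurrent-access errors such as
    "Only one greenlet can be waiting on this event"."""
    for attempt in range(retries):
        try:
            return send_fn(msg)
        except Exception:
            if attempt == retries - 1:
                # Out of attempts: surface the original error.
                raise
            # Back off briefly so the competing greenlet can finish.
            # In Locust this would be gevent.sleep, not time.sleep.
            time.sleep(delay)
```

The trade-off versus a semaphore is that the happy path pays no synchronization cost at all; the retry cost is only incurred in the rare case where two greenlets actually collide on the socket.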

@delulu delulu force-pushed the delulu:retry branch 2 times, most recently from 67db43a to 8e73cd2 Mar 5, 2019

@delulu delulu force-pushed the delulu:retry branch from 8e73cd2 to 0be48ed Mar 14, 2019

@delulu (Contributor, Author) commented Mar 14, 2019

@cgoldberg for awareness: please review and let me know if you have any concerns.

@cgoldberg cgoldberg merged commit e6d63b7 into locustio:master Mar 14, 2019

1 check passed

continuous-integration/travis-ci/pr The Travis CI build passed