Removes worker threads and instead launches a new thread per fetch request.
…t_fetch Conflicts: samsa/consumer/partitions.py
```python
msg = self.queue.get(True, timeout)
self._offset = msg.next_offset
return msg.payload
```
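To show the pattern the diff is using in context, here is a minimal, self-contained sketch of a partition consumer that blocks on an internal queue and tracks the offset to resume from. The `PartitionConsumer` class and `Message` tuple are hypothetical stand-ins; only the `queue.get(True, timeout)` call, `next_offset`, and `payload` attributes come from the diff itself.

```python
import queue
from collections import namedtuple

# Hypothetical message type mirroring the attributes used in the diff.
Message = namedtuple("Message", ["payload", "next_offset"])

class PartitionConsumer:
    """Sketch: pull a message off an internal queue, record the offset
    to resume from, and hand back the payload."""

    def __init__(self):
        self.queue = queue.Queue()
        self._offset = 0

    def fetch(self, timeout=None):
        # Block until a message arrives; raises queue.Empty if the
        # timeout expires first.
        msg = self.queue.get(True, timeout)
        self._offset = msg.next_offset
        return msg.payload
```

After `c.queue.put(Message(b"hello", 42))`, a call to `c.fetch(timeout=1)` returns `b"hello"` and leaves `c._offset` at `42`.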
Should be either msg or message throughout.
Conflicts: samsa/config.py
There's now a handler class which abstracts threading vs. gevent. The RequestHandler now accepts that class to figure out how to spawn workers. Also adds a client attribute to ManagedBroker.
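A rough sketch of what that abstraction could look like, under the assumption that the handler exposes a single `spawn()` method: the class names and method signatures below are illustrative, not samsa's actual API. A gevent-based handler would implement the same interface with `gevent.spawn()`.

```python
import threading

class ThreadingHandler:
    """Hypothetical concurrency handler: spawns workers as OS threads.
    A gevent handler would expose the same spawn() interface but
    delegate to gevent.spawn() instead."""

    def spawn(self, target, *args, **kwargs):
        t = threading.Thread(target=target, args=args, kwargs=kwargs)
        t.daemon = True
        t.start()
        return t

class RequestHandler:
    """Takes a handler object so it never needs to know whether its
    workers are threads or greenlets."""

    def __init__(self, handler):
        self.handler = handler

    def submit(self, fn, *args):
        # Spawn the work via whichever concurrency backend was injected.
        return self.handler.spawn(fn, *args)
```

Injecting the backend this way keeps the request-dispatch logic identical across both concurrency models, which is the point of the refactor described above.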
```python
@@ -325,6 +223,7 @@ def connect(self):
        """
        self._socket = socket.create_connection((self.host, self.port),
                                                timeout=self.timeout)
        self._socket.settimeout(self.timeout)
```
Isn't this the same as the timeout kwarg to create_connection?
yup
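The reviewers' point can be checked directly: `socket.create_connection` applies its `timeout` argument to the socket it returns, so the follow-up `settimeout()` call with the same value is a no-op. A small demonstration (the loopback server here is just scaffolding for the check):

```python
import socket
import threading

# Stand up a throwaway loopback listener so create_connection has
# something to connect to.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
host, port = server.getsockname()

# Accept in the background so the connect below can complete.
threading.Thread(target=server.accept, daemon=True).start()

# create_connection(timeout=...) already calls settimeout() on the
# new socket before returning it.
conn = socket.create_connection((host, port), timeout=5.0)
conn_timeout = conn.gettimeout()  # 5.0, without any extra settimeout()

conn.close()
server.close()
```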
Looks pretty good overall; thanks for all your work on this. I don't think there's anything really blocking it from being merged into master at this point, but there's some general cleanup we could do. I'm +1 if you want to merge now, or you can clean up from the notes above and we can merge in after. Your call.
Thanks @tkaemming for your excellent notes. I think I've covered them in 7127300 and e0ec6ea, but I'm rushing, so I'll wait to merge them until I get in.
All tests passing.
Please do take the time to do a thorough code review. We can do it together if you prefer. It also might be a good idea to get a third set of eyes on the threading code, hopefully someone with more experience with it than I have.