Measuring gRPC-side queueing time #8945
Comments
I looked into this some, thinking StreamTracer would be your best bet here. But I can confirm that outboundMessageSent() is indeed called when the message is submitted to the network thread for writing, not when it is actually written. Unfortunately, I don't think there is a way through the public APIs to get exactly what you are looking for. If you wanted to hack something up locally, you could do something when the Netty write Future completes. E.g., NettyClientHandler gets notified when a write has happened here:
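The linked snippet isn't preserved in this thread, but the idea of timing the actual network write off a future's completion can be sketched with a plain `CompletableFuture` stand-in (no Netty dependency; the single-thread executor and the simulated delay are illustrative, not gRPC or Netty API):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Sketch: measure the gap between submitting a message for writing and the
// moment the write actually completes, in the style of a Netty write future.
public class WriteTimingSketch {
    public static void main(String[] args) throws Exception {
        ExecutorService networkThread = Executors.newSingleThreadExecutor();

        long submittedNanos = System.nanoTime();
        // Stand-in for channel.write(msg): the returned future completes only
        // once the (simulated) network thread has actually written the bytes.
        CompletableFuture<Long> writeFuture = CompletableFuture.supplyAsync(() -> {
            try {
                TimeUnit.MILLISECONDS.sleep(50); // simulated queueing + write delay
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return System.nanoTime();
        }, networkThread);

        // Analogous to future.addListener(...) on a Netty ChannelFuture.
        writeFuture.thenAccept(writtenNanos -> {
            long queuedMillis = TimeUnit.NANOSECONDS.toMillis(writtenNanos - submittedNanos);
            System.out.println("submit-to-write time ms: " + queuedMillis);
        }).join();

        networkThread.shutdown();
    }
}
```

With real Netty you would attach the listener to the `ChannelFuture` returned by the write, and the submit-to-write delta is exactly the queueing this issue is about.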
Thanks, that looks promising.
@ulfjack pls update this issue with your findings so that we can close it
Seems like this is resolved. If not, comment, and it can be reopened.
I'm trying to measure the queuing time inside gRPC to disambiguate between a slow service, a slow network, and a slow client.
We're setting server-side max concurrent calls per stream to some value (that we think is appropriate given the machine size and work), but we sometimes have more outgoing requests on the client than that. In that case, gRPC has to wait for an HTTP2 stream to become available, which it does by internally queuing the calls. We'd like to detect / measure that so we have more visibility into where the latency is introduced.
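A stdlib-only analogy of the quantity we want to measure (gRPC's internal queue is not actually exposed like this; all names here are hypothetical): a `Semaphore` plays the role of the per-connection stream limit, and the time spent blocked waiting for a permit is the queueing time we'd like to observe.

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

// Sketch: MAX_CONCURRENT_STREAMS modeled as a semaphore; time spent blocked
// in acquire() corresponds to the client-side queueing we want to surface.
public class StreamQueueSketch {
    static final Semaphore streams = new Semaphore(1); // one concurrent call

    static long timedCall(long serverMillis) throws InterruptedException {
        long enqueued = System.nanoTime();
        streams.acquire();                 // wait for an HTTP/2 stream
        long queuedNanos = System.nanoTime() - enqueued;
        try {
            TimeUnit.MILLISECONDS.sleep(serverMillis); // simulated server delay
        } finally {
            streams.release();
        }
        return TimeUnit.NANOSECONDS.toMillis(queuedNanos);
    }

    public static void main(String[] args) throws Exception {
        Thread first = new Thread(() -> {
            try { timedCall(200); } catch (InterruptedException ignored) { }
        });
        first.start();
        TimeUnit.MILLISECONDS.sleep(50);  // ensure the first call holds the stream
        long queuedMs = timedCall(0);     // this call has to wait its turn
        first.join();
        System.out.println("queued ms: " + queuedMs);
    }
}
```

The second call's `queuedMs` is the metric this issue asks for: time a started call spends waiting for a stream before its request can go out.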
From reading the code and running some experiments, I've not been able to do that with current gRPC (we're on release 1.41.0). I used a simple echo client-server test, where I artificially restricted the server to a single call per stream, the client to a single channel, and also added a server-side delay of 1 second. This clearly shows client-side queueing of calls, where the call is started immediately, but only executed once the previous call completes. This all works as expected.
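For reference, the restricted setup described above might look roughly like this with grpc-java's Netty transport (a config fragment, not a full server; `echoService` is a hypothetical service implementation, and the port is arbitrary):

```java
// Assumes grpc-java with the grpc-netty transport on the classpath.
import io.grpc.Server;
import io.grpc.netty.NettyServerBuilder;

Server server = NettyServerBuilder.forPort(50051)
    .maxConcurrentCallsPerConnection(1) // one in-flight call per connection
    .addService(echoService)            // hypothetical echo service impl
    .build()
    .start();
```

On the client side, a single `ManagedChannel` shared by all calls reproduces the single-channel constraint.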
I then investigated whether there's any way to observe when the outgoing message is actually sent.
In principle, it looks like the onReady handler could cover this case: when an HTTP2 stream becomes available, it gets assigned to the `NettyClientStream`, which in turn triggers an onReady signal. However, this doesn't make it through the `RetriableStream` - release 1.41.0 mentions #7089, which seems to have changed at HEAD; I didn't track it down, but disabled retries for my tests. The onReady signal is also suppressed in `ClientStreamListenerImpl` for calls that only have a single client message. This is consistent with the `isReady` documentation, which explicitly says that a single-message call may always return false for `isReady`. It seems like onReady might work for client-streaming calls, but we need this for unary calls.
There's also the `StreamTracer` interface, but the methods aren't called at the right time. In particular, the `outboundMessageSent` method - despite what the name seems to imply - is called before the message is actually sent on the network. In fact, it's called before we even have an HTTP2 stream available.

Any suggestions?