Receiving "INTERNAL: Panic! This is a bug!" messages #4544
Comments
One more thing: …
Hi @dadadom, I think this is expected behavior and the solution is to raise your maxQueueSize. That size is incredibly low. If there was a spike in callbacks (say, 10 in a short period), you would get this error. Executors don't promise that you won't get a RejectedExecutionException just because there is a free thread. The way ThreadPoolExecutors work is that a group of threads all try to pull work off the queue. But how do they know whether work is available? They have to wait on the queue (i.e. by calling …
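The burst scenario described above is easy to reproduce with plain java.util.concurrent, independent of gRPC. A minimal sketch (the pool size and queue capacity here are illustrative, not the reporter's actual settings): with one worker and a bounded queue of capacity 1, a burst of three tasks overflows the queue and the default AbortPolicy throws, even though the pool is otherwise healthy.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class QueueRejectDemo {
    public static void main(String[] args) throws Exception {
        // Illustrative sizes: 1 thread, bounded queue of capacity 1,
        // default AbortPolicy (throws RejectedExecutionException).
        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(1));
        CountDownLatch release = new CountDownLatch(1);

        // Task 1 occupies the single worker thread until released.
        executor.execute(() -> {
            try {
                release.await();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        // Task 2 fills the one-slot queue.
        executor.execute(() -> { });
        // Task 3 arrives in the same burst: queue full, no spare thread
        // allowed (max pool size is 1), so it is rejected.
        try {
            executor.execute(() -> { });
            System.out.println("accepted");
        } catch (RejectedExecutionException e) {
            System.out.println("rejected");
        }
        release.countDown();
        executor.shutdown();
    }
}
```

The same shape of burst, arriving as gRPC callbacks, produces the panic the reporter sees when the channel's executor queue is this small.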
Panic is legitimate here. If the executor queue is full, the channel can't submit any work to it. It can't even fail the RPC, because the executor can't even run …
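Given that, the practical remedy suggested in this thread is a roomier (or unbounded) queue on the channel's executor, so bursts of callbacks are buffered rather than rejected. A hedged sketch assuming a plain ThreadPoolExecutor (sizes are illustrative): with a LinkedBlockingQueue, a burst of 100 submissions is absorbed with zero rejections. Such an executor would be handed to the channel via the standard grpc-java hook, ManagedChannelBuilder.executor(...).

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class BurstAbsorbDemo {
    public static void main(String[] args) throws Exception {
        // Illustrative sizes: 2 threads, effectively unbounded queue.
        // An executor like this could be passed to
        // ManagedChannelBuilder.executor(pool) so callback bursts queue up
        // instead of triggering RejectedExecutionException.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 2, 0L, TimeUnit.MILLISECONDS,
                new LinkedBlockingQueue<>());
        AtomicInteger rejected = new AtomicInteger();
        CountDownLatch done = new CountDownLatch(100);

        for (int i = 0; i < 100; i++) {   // a burst of 100 "callbacks"
            try {
                pool.execute(done::countDown);
            } catch (RejectedExecutionException e) {
                rejected.incrementAndGet();
                done.countDown();
            }
        }
        done.await();
        System.out.println("rejected: " + rejected.get());
        pool.shutdown();
    }
}
```

The trade-off of an unbounded queue is that sustained overload shows up as memory growth and latency instead of rejections, so a large bounded queue sized for the expected burst is often the safer middle ground.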
We are experiencing TaskRejectedExceptions although the thread pool is not exhausted. We are sending about one unary request every 1-2 seconds, and each request returns within less than 100 ms.
The message we see (on the client) is:
The call is done like this:
The executor which is set for the managed channel is configured as follows:
It seems weird that the queue is rejecting new entries although there are idle threads. Also, knowing that there is only one onNext() and one onCompleted() per request, why are there so many entries being scheduled on that executor? If I understand it correctly, the executor is used to process the on*() events, correct?

I am not sure if this is actually a gRPC issue or an Executor issue, but gRPC is the only place where we see this.

What version of gRPC are you using?
grpc-java 1.12.0
What did you expect to see?
No Panic