
429 Resource has been exhausted #59

Open

liptanbiswas opened this issue Feb 3, 2020 · 2 comments

@liptanbiswas

I am getting these errors in the stackdriver-metadata-agent-cluster-level deployment pod:

W0202 08:37:09.943363       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:38:02.854145       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:38:10.252510       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:39:02.854485       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:39:10.466841       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:40:02.854799       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:40:10.711573       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:41:02.855385       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:41:10.936932       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:42:01.927927       1 trace.go:898] Failed loading config; disabling tracing: open /export/hda3/trace_data/trace_config.proto: no such file or directory
I0202 08:42:02.855644       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:42:11.186921       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:43:02.855968       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:43:11.412187       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:44:02.856350       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:44:11.650193       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:45:02.856618       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:45:11.891921       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:46:02.856854       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:46:12.113489       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:47:01.928158       1 trace.go:898] Failed loading config; disabling tracing: open /export/hda3/trace_data/trace_config.proto: no such file or directory
I0202 08:47:02.857201       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:47:12.308528       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:48:02.857501       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:48:12.450212       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:49:02.857788       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:49:13.512880       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:50:02.858095       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:50:13.827352       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:51:02.858392       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:51:14.062459       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:52:01.928374       1 trace.go:898] Failed loading config; disabling tracing: open /export/hda3/trace_data/trace_config.proto: no such file or directory
I0202 08:52:02.858863       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:52:14.319897       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
I0202 08:53:02.859160       1 binarylog.go:265] rpc: flushed binary log to ""
W0202 08:53:14.570000       1 kubernetes.go:118] Failed to publish resource metadata: rpc error: code = ResourceExhausted desc = Resource has been exhausted (e.g. check quota).
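For reference, logs like the ones above can be pulled from the agent with something like the following (the kube-system namespace is an assumption here; that is where GKE normally runs this agent, and the deployment name is taken from this report):

```shell
# Pull recent agent logs and filter for the quota errors.
# Assumes the agent runs in kube-system, as it does by default on GKE.
kubectl logs -n kube-system \
  deploy/stackdriver-metadata-agent-cluster-level \
  --tail=100 | grep "ResourceExhausted"
```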

On the Google Cloud console I can also see the corresponding request metrics:

[image: Google Cloud console screenshot]

Every 24 hours it makes 202,559 API calls to publish metadata, of which 92% fail. I am using a custom service account, and it has the Stackdriver Resource Metadata Writer role.
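For scale, a quick back-of-the-envelope check (Python, using only the figures reported above):

```python
# Back-of-the-envelope check on the reported rate (figures from this issue).
calls_per_day = 202_559       # publish calls per 24 h, per the console chart
failure_rate = 0.92           # share of calls that fail

calls_per_minute = calls_per_day / (24 * 60)
failed_per_day = calls_per_day * failure_rate

print(f"~{calls_per_minute:.0f} calls/min, ~{failed_per_day:.0f} failures/day")
# ~141 calls/min, ~186354 failures/day
```

That is roughly one publish attempt every 0.4 seconds, which lines up with the once-per-minute warning cadence in the log above times the number of callers behind it.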

Any idea why it is making so many requests? How do I resolve this?

@hypnoglow

Hi @liptanbiswas ,

Have you resolved the issue? We are currently experiencing the same problem.

@liptanbiswas
Author

@hypnoglow No, I thought this was not the right place to ask, so I asked in the Slack channel instead, but did not get any reply. Let me reopen it.

@liptanbiswas liptanbiswas reopened this Feb 4, 2020