
Introduce metric for per-exporter pressure on prometheus itself #1755

Closed
harmw opened this Issue Jun 21, 2016 · 5 comments


harmw commented Jun 21, 2016

Is it possible to expose some kind of metric that shows how much load a given metrics endpoint puts on the Prometheus machine itself? We have several exporters in our platform, but no simple way of figuring out which of them is stressing Prometheus the most. Some more insight into this would be great.

brian-brazil commented Jun 21, 2016

What you're mostly looking for is the number of samples that an exporter is producing. Usually it comes down to just a handful of metrics in one exporter. http://www.robustperception.io/which-are-my-biggest-metrics/
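For example, something along the lines of the query from that post (untested against your setup) should show which metric names have the most time series:

topk(10, count by(__name__) ({__name__=~".+"}))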

fabxc added the kind/question label Jun 21, 2016

fabxc commented Jun 21, 2016

You can also do something along the lines of:

count by(instance) ({__name__=~".+"})

That can be expensive though.
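If you want a ranked list rather than the raw per-instance counts, wrapping it in topk should work, for example:

topk(10, count by(instance) ({__name__=~".+"}))

It's just as expensive though, since it still has to touch every series.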

harmw commented Jun 21, 2016

Thanks, that looks useful.

However:

Error executing query: query timed out in expression evaluation

I'll see if I can do something to get the information out of prom.

brian-brazil commented Feb 13, 2017

scrape_samples_scraped does this now.
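For anyone finding this later, that means a query along these lines (a simple example, adjust the 10 as needed) will show the targets producing the most samples per scrape:

topk(10, scrape_samples_scraped)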

lock bot commented Mar 24, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

lock bot locked and limited conversation to collaborators Mar 24, 2019
