
Expose metadata as OpenMetrics #61

Closed
momorientes opened this issue Dec 28, 2021 · 10 comments


momorientes commented Dec 28, 2021

As discussed with @job and @robert-scheck on IRC, I would appreciate seeing the metadata dict from vrps.json exposed in the OpenMetrics format.

An example output could be:

# HELP rpki_client_roas_count Total number of ROAs
# TYPE rpki_client_roas_count gauge
rpki_client_roas_count 100020

[...]

OpenMetrics requires the software to natively answer HTTP GET requests, which I suppose is a no-go for rpki-client. However, Prometheus users would still be able to ingest the data either via a webserver or by using the textfile collector of node_exporter.
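To sketch what the textfile-collector route could look like: a small script could translate the numeric entries of the metadata dict into gauge samples. This is purely illustrative — the `metadata_to_openmetrics` helper, the metric name prefix, and the metadata keys below are made up, not rpki-client's actual schema:

```python
def metadata_to_openmetrics(metadata: dict) -> str:
    """Render the numeric entries of a metadata dict as OpenMetrics gauges."""
    lines = []
    for key, value in sorted(metadata.items()):
        if not isinstance(value, (int, float)):
            continue  # skip non-numeric entries (e.g. string timestamps)
        name = f"rpki_client_{key}"
        lines.append(f"# HELP {name} Value of '{key}' from the metadata dict")
        lines.append(f"# TYPE {name} gauge")
        lines.append(f"{name} {value}")
    lines.append("# EOF")  # the OpenMetrics text format ends with an EOF marker
    return "\n".join(lines) + "\n"

# Hypothetical metadata, loosely modeled on the example above:
print(metadata_to_openmetrics({"roas_count": 100020, "elapsedtime": 42}))
```

The resulting file could be dropped into node_exporter's textfile collector directory.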

If this is relevant to rpki-client in general, I'll happily help with the OpenMetrics side; sadly, I can't contribute any C code though.


cjeker commented Jan 10, 2022

This is currently not a priority for us, but it is on the to-do list.
We'd be happy if someone else writes an output module for this (it should not be part of the JSON output but a separate file instead).


cjeker commented Dec 17, 2022

Support for writing a metrics file was added a few days ago.
The problem is that the textfile collector of node_exporter is not OpenMetrics compatible, so we had to use "gauge" in place of the "info" and "stateset" types. With that workaround the file can be served by node_exporter. Another option is to serve it directly from a webserver if one is running; how to serve the metrics is outside the scope of rpki-client.

Happy to get feedback on the collected metrics, and happy to extend or adjust them.

robert-scheck (Contributor) commented:

What would be needed to integrate this into the rpki-client container image? Just something like a minimal HTTP daemon serving this file via port 9100?


cjeker commented Dec 18, 2022

Yes, I think that is a decent solution for the container. Not sure if port 9100 is the right one, but let's start with that.

Make sure the Content-Type of the file is application/openmetrics-text; version=1.0.0; charset=utf-8
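As a sketch of the "minimal HTTP daemon" idea — this is not part of rpki-client, and the metrics file path below is an assumption — a handler only needs to read the file and send exactly that Content-Type header:

```python
import http.server

# Content-Type required by the OpenMetrics specification for text exposition.
OPENMETRICS_TYPE = "application/openmetrics-text; version=1.0.0; charset=utf-8"

class MetricsHandler(http.server.BaseHTTPRequestHandler):
    # Path of the metrics file written by rpki-client (assumed location).
    metrics_path = "/var/db/rpki-client/metrics"

    def do_GET(self):
        try:
            with open(self.metrics_path, "rb") as f:
                body = f.read()
        except OSError:
            self.send_error(404, "metrics file not found")
            return
        self.send_response(200)
        self.send_header("Content-Type", OPENMETRICS_TYPE)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve on port 9100:
#   http.server.HTTPServer(("", 9100), MetricsHandler).serve_forever()
```

In a container image, though, even this much Python may be more than needed; any small static-file server that can set the header would do.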


rfc1036 commented Dec 18, 2022

You can pick an unused port here and register it: https://github.com/prometheus/prometheus/wiki/Default-port-allocations

robert-scheck (Contributor) commented:

But aren't the port allocations meant for cases where you actually write your own exporter (which I would like to avoid)? It does not make sense to me to bloat the tiny container image with yet another exporter implementation (most of them seem to be written in Go), especially as the image won't provide multiple exporters anyway. Everybody using the container image can remap the default port individually using -p 12345:9100 (host:container, with Docker/Podman).


cjeker commented Dec 18, 2022

@robert-scheck, I agree that the ports can be remapped. I'm just not sure if a default of 9100 is sensible. We could just grab some other random number like 80 instead ;-)


robert-scheck commented Dec 18, 2022

https://github.com/OpenObservability/OpenMetrics/blob/main/specification/OpenMetrics.md#iana-considerations-iana says:

The port assigned by IANA for clients exposing data is <9099 requested for historical consistency>.


cjeker commented Jan 20, 2023

@robert-scheck do we need to keep this open or did you adjust the rpki-client container image?

robert-scheck (Contributor) commented:

It didn't land in the container image yet, because real life kept me quite busy otherwise. I created rpki-client/rpki-client-container#2 so that this issue can be closed for now, as the feature is in rpki-client itself already.
