
Prometheus output #58

Closed
jelu opened this issue May 8, 2020 · 19 comments · Fixed by #87

jelu (Member) commented May 8, 2020

Move here from DNS-OARC/dsc#204

Daniel15 commented Oct 27, 2021

@danhanks in DNS-OARC/dsc#204 you said that you were writing a script to run in cron that would convert the data to a format that Prometheus can use. Did you end up implementing that?

I haven't looked too far into it but I guess using https://github.com/prometheus/influxdb_exporter could be an option as well. It exposes an InfluxDB-like web API and converts all metrics to Prometheus format.

jelu (Member Author) commented Dec 1, 2021

Hey @Daniel15, since Dan hasn't responded, are you going ahead with the development of this?

danhanks commented Dec 1, 2021

@Daniel15, @jelu,

Yes, I did end up writing that script. I would be happy to contribute it, just need to get permission from my employer.

jelu (Member Author) commented Dec 1, 2021

@danhanks Great to hear, hope you can share. It would be very nice if this can then be turned into an output module for dsc-datatool.

danhanks commented Dec 1, 2021

I can't recall all the reasons why, but I ended up writing it as a standalone daemon/exporter that watches (via inotify) for JSON data files generated by dsc. It parses those files and generates a bunch of metrics that can then be scraped regularly by Prometheus. I imagine some of this code could also be re-purposed into an output module for dsc-datatool, but I need to do more reading about dsc-datatool to see how that would work.
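
For reference, here is a minimal sketch of that kind of watcher/exporter. It is not the script being discussed: it assumes the Python watchdog and prometheus_client packages, a hypothetical /var/lib/dsc drop directory, and a made-up JSON layout, so treat it only as an outline of the approach.

    # Sketch: watch a directory for dsc JSON files and expose counters for Prometheus.
    import json
    import time

    from prometheus_client import Counter, start_http_server
    from watchdog.events import FileSystemEventHandler
    from watchdog.observers import Observer

    # Example metric; real dsc data has many more dimensions.
    QUERIES = Counter("dsc_queries_total", "DNS queries seen by dsc",
                      ["server", "node", "qtype"])

    class DscJsonHandler(FileSystemEventHandler):
        def on_created(self, event):
            if event.is_directory or not event.src_path.endswith(".json"):
                return
            with open(event.src_path) as f:
                data = json.load(f)
            # Hypothetical layout: {"server": ..., "node": ..., "qtype": {"1": 42, ...}}
            for qtype, count in data.get("qtype", {}).items():
                QUERIES.labels(data["server"], data["node"], qtype).inc(count)

    if __name__ == "__main__":
        start_http_server(9123)      # /metrics endpoint for Prometheus to scrape
        observer = Observer()        # uses inotify on Linux
        observer.schedule(DscJsonHandler(), "/var/lib/dsc", recursive=False)
        observer.start()
        try:
            while True:
                time.sleep(60)
        finally:
            observer.stop()
            observer.join()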

danhanks commented:

@Daniel15 @jelu,

I have received approval to contribute this code. Where in the repo would it make sense to put it? contrib, maybe?

jelu (Member Author) commented Dec 15, 2021

@danhanks Just put it anywhere really, or do a gist. Will look at reworking it into a plugin once I'm back from holidays.

jelu (Member Author) commented Jan 10, 2022

@danhanks Back from holidays, did you put the code somewhere?

danhanks commented:

@jelu Thanks for the reminder. Here you go: https://gist.github.com/danhanks/9c59734f380ac56a8c1bdb7bec54bdb4

Let me know if you have any questions.

jelu (Member Author) commented Jan 14, 2022

Thanks @danhanks.

By the looks of it, it's not something I can add to contrib right away, nor make into a dsc-datatool module.

The script seems to be very specific to your needs and your setup. It assumes a lot of things that might not be what others want/have. Someone would need to work on it a bit to make it more generally usable, for example adding command line options for most settings and maybe using dnspython for DNS number-to-text conversion rather than hardcoded lists.
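
As an illustration of the dnspython suggestion (this is not code from the script or from dsc-datatool, just what the library offers out of the box):

    # dnspython can replace hardcoded number-to-text tables.
    import dns.opcode
    import dns.rcode
    import dns.rdatatype

    print(dns.rdatatype.to_text(28))  # AAAA (query/record type 28)
    print(dns.rcode.to_text(3))       # NXDOMAIN (response code 3)
    print(dns.opcode.to_text(5))      # UPDATE (opcode 5)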

An output module could be made, but it would likely depend solely on node_exporter or some other mechanism for delivering the stats to Prometheus. While the formats are similar they are not the same; things like grouping, help text, histograms and summaries are not something that is done in InfluxDB; that was done in Grafana. I could probably create an output module quite quickly, but I would need someone with Prometheus knowledge and a setup to test it.
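
To illustrate the difference (both samples below are made-up examples, not the exact output of either tool): InfluxDB line protocol carries only tags, fields and a timestamp, while the Prometheus exposition format adds metadata lines such as HELP and TYPE that an exporter has to generate itself.

    # InfluxDB line protocol
    pcap_stats,server=test-server,node=test-node,pcap_stat=pkts_captured value=4894 1563520560000000000

    # Prometheus exposition format
    # HELP pcap_stats Packet capture statistics from dsc
    # TYPE pcap_stats counter
    pcap_stats{server="test-server",node="test-node",pcap_stat="pkts_captured"} 4894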

Do any of you (@danhanks @Daniel15) have time/want to take on any of this?

jguidini commented:

Hi all!
A Prometheus exporter for DSC would be great! I've been looking for this for some time. Here I cannot use InfluxDB (as needed by dsc-datatool), so Grafana cannot be used to show the data.
I can test the software if needed.
Thanks to everyone for the idea and effort!

jelu (Member Author) commented Jan 17, 2022

@jguidini Are you able to use Prometheus node_exporter in your setup?

jguidini commented:

@jelu Yes.

jelu (Member Author) commented Jan 18, 2022

@jguidini Please try this branch with the node_exporter.

You'll need to use the new output --output ";Prometheus;file=<file>" and put the file somewhere before moving it to the node_exporter directory (as described in their link above).

This generates output as:

# TYPE pcap_stats counter
pcap_stats{server="test-server",node="test-node",pcap_stat="filter_received",ifname="eth0"} 5625 1563520560000
pcap_stats{server="test-server",node="test-node",pcap_stat="kernel_dropped",ifname="eth0"} 731 1563520560000
pcap_stats{server="test-server",node="test-node",pcap_stat="pkts_captured",ifname="eth0"} 4894 1563520560000

jelu (Member Author) commented Jan 18, 2022

@jguidini If you run into problems or need help setting it up, maybe it's easier if we talk on OARC's Mattermost; find me here: https://chat.dns-oarc.net/community/channels/oarc-software.

jguidini commented:

@jelu I have installed Prometheus and configured node_exporter on a DNS server at our site. I installed dsc-datatool and generated a file in the Prometheus format. In Grafana I added your dashboards, but now I'm working on getting node_exporter to read the file generated by dsc-datatool (using --collector.textfile.directory /opt/prometheus/data) so Prometheus can collect the metrics.

jguidini commented:

@jelu I found the error in node_exporter's debug log:

Jan 18 16:39:15 bee10 node_exporter[4794]: ts=2022-01-18T19:39:15.328Z caller=textfile.go:219 level=error collector=textfile msg="failed to collect textfile data" file=datatool.prom err="failed to parse textfile data from \"/opt/prometheus/data/datatool.prom\": text format parsing error in line 120: invalid escape sequence '\\='"

From file (datatool.prom):

   120 asn_all{server="bee10",node="recursivo",ipversion="IPv4",asn="PzQ\="} 60 1642521960000
   121 asn_all{server="bee10",node="recursivo",ipversion="IPv6",asn="PzY\="} 27 1642521960000
   122 country_code{server="bee10",node="recursivo",countrycode="BR"} 125 1642521960000
   123 country_code{server="bee10",node="recursivo",countrycode="PzQ\="} 60 1642521960000
   124 country_code{server="bee10",node="recursivo",countrycode="PzY\="} 27 1642521960000

To test, I removed the \= from the whole file (old friend sed); then from the node_exporter log:

Jan 18 16:45:15 bee10 node_exporter[5934]: ts=2022-01-18T19:45:15.368Z caller=textfile.go:219 level=error collector=textfile msg="failed to collect textfile data" file=datatool.prom err="failed to parse textfile data from \"/opt/prometheus/data/datatool.prom\": text format parsing error in line 4054: invalid escape sequence '\\ '"

Again sed on some entries... and another error:

Jan 18 16:50:01 bee10 node_exporter[6873]: ts=2022-01-18T19:50:01.268Z caller=textfile.go:219 level=error collector=textfile msg="failed to collect textfile data" file=datatool.prom err="textfile \"/opt/prometheus/data/datatool.prom\" contains unsupported client-side timestamps, skipping entire file"

Now I don't know how to solve this.

jelu (Member Author) commented Jan 19, 2022

@jguidini I've fixed the quoting, you can git pull to get the updated code.

I've also removed timestamps because, while the format supports them, I found "Note: Timestamps are not supported." in the node_exporter docs 😕

I see that you've joined our Mattermost, so let's continue the discussion there 🙂

jelu added a commit to jelu/dsc-datatool that referenced this issue Feb 1, 2022
- Fix DNS-OARC#58: Add Prometheus output
- InfluxDB: fix timestamp option
- Update Copyright year

jelu closed this as completed in #87 on Feb 1, 2022

jelu (Member Author) commented Feb 1, 2022

Is there anyone here who is up for writing a guide on how to set this up?

   Prometheus' node_exporter
       This output can be used together with Prometheus' node_exporter's Textfile Collector
       to automate statistics gathering, but some specific setup and requirements must be
       met.

       You must hide the timestamp with the option timestamp=hide, because timestamps are
       not supported by the Textfile Collector.

       You must make sure only one XML file from a server+node combination is processed at
       a time. Otherwise you will have multiple data points for the same metric in the
       generated files, and because the Textfile Collector does not support timestamps it
       cannot separate the measurements.

       You must make sure that only one file (per server+node combo) is generated for the
       Textfile Collector to read, and it should be the same between runs. See the Textfile
       Collector's documentation for how to set that up atomically.
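
A minimal sketch of that atomic hand-off, assuming node_exporter runs with --collector.textfile.directory /opt/prometheus/data as earlier in this thread, and assuming output options such as timestamp=hide are appended to the --output value with the same ";" separator; the input options for dsc-datatool are left out, see its man page:

    # Sketch: write the Prometheus file inside the textfile directory under a temporary
    # name, then rename it into place so node_exporter never reads a half-written file.
    import os
    import subprocess

    TEXTFILE_DIR = "/opt/prometheus/data"   # node_exporter --collector.textfile.directory

    def publish(datatool_input_args):
        tmp_path = os.path.join(TEXTFILE_DIR, ".datatool.prom.tmp")
        final_path = os.path.join(TEXTFILE_DIR, "datatool.prom")
        subprocess.run(
            ["dsc-datatool", "--output", f";Prometheus;file={tmp_path};timestamp=hide"]
            + list(datatool_input_args),    # server/node/XML options per the man page
            check=True,
        )
        # rename(2) within the same filesystem is atomic, and each run replaces the
        # previous file, so there is only ever one file per server+node combination.
        os.replace(tmp_path, final_path)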
