Cannot extract fields that start with underscore in tag_keys #6705

Closed
MitranLiviuMarian opened this issue Nov 22, 2019 · 3 comments · Fixed by #6744
Labels
bug (unexpected problem or unintended behavior)
Milestone
1.13.0

Comments

@MitranLiviuMarian

Relevant telegraf.conf:

[agent]
interval = "10s"
flush_interval = "10s"
debug = true
quiet = false
logfile = ""
metric_buffer_limit = 50000
metric_batch_size = 1

[[outputs.kafka]]
brokers = ["kafka-collect:9092"]
data_format = "json"
topic = "collect-telegraf-prometheus0"
client_id = "telegraf-prometheus0"

[[inputs.http]]
interval = "10s"
urls = ["http://my-release-prometheus-server/api/v1/query?query=container_cpu_usage_seconds_total"]
json_query = "data.result"
data_format = "json"
tag_keys = ["metric_pod", "metric_instance", "metric___name__"]

System info:

I have a Kubernetes cluster with a Prometheus Operator deployed. In the same cluster I am trying to deploy Telegraf to collect metrics from Prometheus via its HTTP API by passing PromQL queries. For that I am using Telegraf's http input plugin with the json data format.
Because Telegraf ignores string fields by default, I am using tag_keys to extract the values I want from the HTTP API response.

A mandatory field that I want to extract is the metric name, which can be found among the metric labels.
The problem is that this field is named "__name__", and Telegraf is not able to parse it (maybe because the underscore is used in tag_keys to navigate the JSON payload); a sketch of the suspected flattening behavior follows the example response below.

The metric that is returned by calling the Prometheus HTTP API looks like this:
{
  "status": "success",
  "data": {
    "result": [
      {
        "metric": {
          "__name__": "container_memory_usage_bytes",
          "beta_kubernetes_io_arch": "amd64",
          "beta_kubernetes_io_os": "linux",
          "container": "acm",
          "instance": "kind-control-plane",
          "pod": "acm-758b774686-5fbfs"
        },
        "value": [
          1574155650.109,
          "211066880"
        ]
      }
    ]
  }
}
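
Digging into why the field disappears: the json parser seems to flatten nested objects by joining child keys to their parent with "_" and then stripping leading and trailing underscores from the final name. The following is a minimal, hypothetical Go sketch of that assumed behavior (it is not Telegraf's actual source); under that assumption "metric" + "__name__" becomes "metric___name", so the configured tag key "metric___name__" never matches anything:

// flatten_sketch.go — hypothetical illustration only, not Telegraf code.
package main

import (
	"fmt"
	"strings"
)

// flatten walks a decoded JSON object and records leaf values under
// underscore-joined key paths, trimming surrounding underscores (assumed).
func flatten(prefix string, value interface{}, out map[string]interface{}) {
	switch v := value.(type) {
	case map[string]interface{}:
		for k, child := range v {
			key := k
			if prefix != "" {
				key = prefix + "_" + k
			}
			flatten(key, child, out)
		}
	default:
		out[strings.Trim(prefix, "_")] = v
	}
}

func main() {
	// The "metric" object from the example Prometheus response above.
	metric := map[string]interface{}{
		"metric": map[string]interface{}{
			"__name__": "container_memory_usage_bytes",
			"pod":      "acm-758b774686-5fbfs",
		},
	}
	out := map[string]interface{}{}
	flatten("", metric, out)
	fmt.Println(out["metric_pod"])      // acm-758b774686-5fbfs
	fmt.Println(out["metric___name"])   // container_memory_usage_bytes
	fmt.Println(out["metric___name__"]) // <nil> — the configured tag key finds nothing
}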

Steps to reproduce:

  1. Deploy Prometheus Operator
  2. Deploy Telegraf with the above config
  3. The output will contain all the fields except the "__name__" field.

Expected behavior:

The output should look something like this:
{
  "fields": {
    "value_0": 1574271339.87
  },
  "name": "http",
  "tags": {
    "host": "telegraf-prometheus0-7cbbb585b4-tldlv",
    "metric_instance": "kind-control-plane",
    "metric_pod": "etcd-kind-control-plane",
    "url": "http://my-release-prometheus-server/api/v1/query?query=container_cpu_usage_seconds_total",
    "__name__": "some_metric_name"
  },
  "timestamp": 1574271340
}

Actual behavior:

The output is missing the __name__ field:
{
  "fields": {
    "value_0": 1574271339.87
  },
  "name": "http",
  "tags": {
    "host": "telegraf-prometheus0-7cbbb585b4-tldlv",
    "metric_instance": "kind-control-plane",
    "metric_pod": "etcd-kind-control-plane",
    "url": "http://my-release-prometheus-server/api/v1/query?query=container_cpu_usage_seconds_total"
  },
  "timestamp": 1574271340
}

Additional info:

@danielnelson
Contributor

It appears the final field name is being trimmed of trailing underscores. As a workaround you could specify/collect the field as metric___name.
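
For reference, applying the workaround to the config above would only change the last tag_keys entry, dropping the trailing underscores that appear to be stripped:

tag_keys = ["metric_pod", "metric_instance", "metric___name"]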

@danielnelson added the bug (unexpected problem or unintended behavior) label Nov 23, 2019
@danielnelson added this to the 1.13.0 milestone Nov 23, 2019
@MitranLiviuMarian
Author

Hello Daniel,
Using this format "metric___name" worked for me. Thank you!

@danielnelson
Contributor

Thanks for the update; let's keep this open for now since I think we ought to fix this odd behavior.
