
Prometheus sends packages with different Content-Type in HTTP header #4560

Closed
waldauf opened this Issue Aug 29, 2018 · 3 comments

waldauf commented Aug 29, 2018

Proposal

In general, Prometheus supports remote_write to InfluxDB. If I write directly into InfluxDB, everything works great. Scheme:

[Prometheus: remote_write into InfluxDB] ---> [InfluxDB]

I captured the packets sent by Prometheus to InfluxDB with tcpdump. All packets are sent with Content-Type: text/plain in the HTTP header.

Example of one captured packet:

# tcpdump -i ens160 -A -s 0 -ttttnnvvS 'dst port 8086 and src k8s.sw.lab and (((ip[2:2] - ((ip[0]&0xf)<<2)) - ((tcp[12]&0xf0)>>2)) != 0)'
16:02:10.326345 IP k8sm.sw.lab.46838 > tick.sw.lab.8086: Flags [P.], seq 863959075:863961110, ack 855948242, win 1413, options [nop,nop,TS val 1641754224 ecr 91267025], length 2035
E..'jA@.?..D
...
.......3~.#3.......De.....
a..p.p..POST /write?consistency=any&db=telegraf_k8s HTTP/1.1
Host: 10.22.20.17:8086
User-Agent: telegraf
Transfer-Encoding: chunked
Content-Type: text/plain; charset=utf-8
Accept-Encoding: gzip

72a
docker_container_mem,annotation.io.kubernetes.container.hash=5e4e5a97,... total_pgfault=83005582i 1535464922000000000
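For reference, the chunk body above is InfluxDB line protocol: a measurement name, comma-separated tags, fields, and a nanosecond timestamp. A simplified Python sketch of how such a record splits apart (it ignores the line protocol's escaping rules and uses only the parts visible in the capture; the truncated tags are omitted):

```python
def parse_line_protocol(line):
    """Split a simple InfluxDB line-protocol record into
    (measurement, tags, fields, timestamp).
    Simplified: assumes no escaped commas or spaces in tag/field values."""
    head, fields_part, ts = line.rsplit(" ", 2)
    measurement, *tag_pairs = head.split(",")
    tags = dict(p.split("=", 1) for p in tag_pairs)
    fields = dict(p.split("=", 1) for p in fields_part.split(","))
    return measurement, tags, fields, int(ts)

record = (
    "docker_container_mem,annotation.io.kubernetes.container.hash=5e4e5a97 "
    "total_pgfault=83005582i 1535464922000000000"
)
measurement, tags, fields, ts = parse_line_protocol(record)
print(measurement, tags, fields, ts)
```

The trailing `i` on `83005582i` marks an integer field in line protocol; a real parser would also strip that suffix and handle escaping.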




In my case it is not possible to write directly into InfluxDB because I need to save the metrics into Kafka first. That's why I included the stream processor Benthos (I tried Telegraf too) in the flow, between Prometheus and Kafka.

Scheme:

[Prometheus: remote_write into InfluxDB] ---> [http_server input - Benthos - kafka output] ---> [Kafka] ---> [Telegraf] ---> [InfluxDB]

But all packets sent by Prometheus have binary content and it's not possible to work with them. The HTTP header is set to Content-Type: application/x-protobuf:

# tcpdump -i ens160 -A -s 0 -ttttnnvvS 'dst port 4197 and src k8s.sw.lab and (((ip[2:2] - ((ip[0]&0xf)<<2)) - ((tcp[12]&0xf0)>>2)) != 0)'
Host: 10.22.20.15:4197
User-Agent: Go-http-client/1.1
Content-Length: 6891
Content-Encoding: snappy
Content-Type: application/x-protobuf
X-Prometheus-Remote-Write-Version: 0.1.0

....L
..
!
.__name__..container_tasks_state
 
.beta_kubernetes_io_arch..amd64
.
.bF".0os..linux
.
..[.m....POD
.

datacenter..k8s-lab
..
.id.}/kubepods/besteffort/pod3e4f7e20-aac9-11e8-9ec2-0050568fadab/97cf5928643b576f686de352160d7b02e42054087f3d265771ecb452bbd71e3b
#
.image..k8s.gcr.io/pause-amd64:3.1
.
.instance..k8sm
 
.job..kube)-H-nodes-cadvisor
.
.6J..host%. .k8sm
q
...<ik8s_POD_prom01-...etheus-alertmanager-744764c4d9-t9dns_monitoring_3e4f7e2r+.._0
.
  .s.space.
.@.
;
.pod)../......
.
.EAL..running........,
..y..y..y..y..y..y..y..y..y.ry...Qy..sleep2z................................stopped...............................r........uni.O  .ruptibleY......  ..  ..  .53f829-aj. .q513fa45ca21b62368c3abedc421baf05494f4369697afe4d586f16f587fa2047
#
.image..k8s.gcr.io/pause-amd64:3.1
~
~

To me it looks like Prometheus decides on its own which *Content-Type* to use. That's why it's not possible for me to include a stream processor between Prometheus and InfluxDB.

Can I ask for your help: how do I configure Prometheus to keep Content-Type: text/plain?
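If a stream processor does sit in front of the storage, it helps to verify at ingest time which of the two formats a request actually carries. A minimal Python sketch of such a check, using the header values from the captures above (the function name is illustrative, not part of any library):

```python
def check_remote_write_headers(headers):
    """Return None if the headers match Prometheus remote write v1
    (as seen in the tcpdump above), else a description of the mismatch."""
    expected = {
        "Content-Type": "application/x-protobuf",
        "Content-Encoding": "snappy",
        "X-Prometheus-Remote-Write-Version": "0.1.0",
    }
    for name, want in expected.items():
        got = headers.get(name)
        if got != want:
            return f"unexpected {name}: {got!r} (want {want!r})"
    return None  # looks like a remote-write request

# A text/plain line-protocol write (as in the first capture) fails the check:
print(check_remote_write_headers({"Content-Type": "text/plain; charset=utf-8"}))
```

Decoding the body itself would additionally need snappy decompression and the Prometheus `WriteRequest` protobuf definitions, neither of which is in the standard library.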

Environment

  • System information: Linux 4.4.0-116-generic x86_64
  • Prometheus version: 2.2.1
  • Prometheus configuration file: If you think the config file is relevant, I'll send it.
  • Logs: If you think the log file is relevant, I'll send it.
brian-brazil commented Aug 29, 2018

Remote write doesn't use gzip or text/plain, that's some other packet you've captured. All remote write is via snappy-compressed protobufs, and if you're using remote write then https://github.com/prometheus/prometheus/tree/master/documentation/examples/remote_storage/example_write_adapter will help you get going.

waldauf commented Aug 29, 2018

Hello Brian,

yes, you're right. A while ago I found out that my tcpdump had captured packets from another source, exactly as you wrote.

I'll check your link. Thanks for your answer.

@waldauf waldauf closed this Aug 29, 2018

lock bot commented Mar 22, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

lock bot locked and limited conversation to collaborators Mar 22, 2019
