Metricbeat Kafka module does not publish consumergroup metrics (Cluster mode) #4285
Comments
Thanks a lot for the detailed steps on how to reproduce it.
@ruflin Hi Nicolas. Do you have any idea when this bug will be fixed?
We get the same problem.
Got the same problem after updating the Kafka cluster from 0.10.0.1 to 0.10.2.1; reporting of consumer metrics stopped.
Same problem here, anything new?
It doesn't work for me even for a single Kafka node. I have to send partitions with the OffsetFetchRequest to get anything back.
I confirm that the consumergroup metricset is broken with the new Kafka version.
Running Metricbeat 5.3.0 with Elasticsearch 5.3 and Logstash 5.3, I get the following JSON output:
It seems that most of those fields are represented in the Metricbeat consumergroup reference. Am I missing something? The bug doesn't seem to be affecting me.
I'm closing this one as I believe it was solved in the meantime.
Test setup:
Enable the `partition` and `consumergroup` metricsets and send the metrics to Elasticsearch. Turn on debug logging in the Metricbeat configuration. In Logstash, set `bootstrap_servers => "server1,server2,server3"` so that Logstash connects to the cluster.
Sample Logstash configuration for producer:
Sample Logstash configuration for consumer:
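The original sample configurations were not captured in this archive. A minimal sketch of what the producer and consumer pipelines might have looked like, assuming a hypothetical topic `test-topic` and group `test-consumer-group`, with the three-broker `bootstrap_servers` list from the setup above:

```
# Hypothetical producer pipeline: reads stdin and writes to the Kafka topic
input { stdin {} }
output {
  kafka {
    bootstrap_servers => "server1:9092,server2:9092,server3:9092"
    topic_id => "test-topic"            # hypothetical topic name
  }
}

# Hypothetical consumer pipeline: reads from Kafka under a named consumer group
input {
  kafka {
    bootstrap_servers => "server1:9092,server2:9092,server3:9092"
    topics => ["test-topic"]
    group_id => "test-consumer-group"   # the group the consumergroup metricset should report on
  }
}
output { stdout { codec => rubydebug } }
```

The consumer's `group_id` is the key piece for this issue: it registers a consumer group against the cluster, which is what the `consumergroup` metricset is expected to report on.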
Test results:
You will notice in the Metricbeat logs that you are receiving events for the `partition` metricset but not for the `consumergroup` metricset.
For a more detailed discussion of this issue you can check the following thread on the forum:
https://discuss.elastic.co/t/metricbeat-kafka-module-consumergroup-metric-set-does-not-report-metrics/83623
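For reference, a Metricbeat 5.x `kafka` module configuration enabling both metricsets against the cluster might look like the following sketch (hosts and period are placeholder values, not taken from the reporter's setup):

```yaml
- module: kafka
  metricsets: ["partition", "consumergroup"]
  period: 10s
  hosts: ["server1:9092", "server2:9092", "server3:9092"]
```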