Add kafka integration 0.0.2 #428
Conversation
dev/packages/alpha/kafka/0.0.2/dataset/log/agent/stream/log.yml.hbs
```yaml
    required: false
    show_user: true
    default:
      - 'localhost:8778'
```
This module has some metricsets that need to connect to Kafka, and some others that need to connect to Jolokia. I am defining two "hosts" variables, one for each kind of metricset.
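For illustration, the two variables could look something like this in the package's `manifest.yml`. This is only a sketch: the variable names (`hosts`, `jolokia_hosts`), titles, and default ports are assumptions, not taken from the actual package.

```yaml
# Hypothetical sketch of two "hosts" variables:
# one for metricsets speaking the Kafka protocol,
# one for metricsets speaking to Jolokia.
vars:
  - name: hosts
    type: text
    title: Kafka hosts
    multi: true
    required: true
    show_user: true
    default:
      - 'localhost:9092'
  - name: jolokia_hosts
    type: text
    title: Jolokia hosts
    multi: true
    required: false
    show_user: true
    default:
      - 'localhost:8778'
```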
Are there any metricsets that need to connect to both? If not, I think we should specify two different inputs instead.
Oh, right, makes sense, I didn't think of this possibility, I will give it a try, thanks!
How can I define two different inputs for the same module? I have tried with `input: kafka_jolokia/metrics`, but then it configures Metricbeat with `module: kafka_jolokia`, which doesn't exist. And with `input: kafka/jolokia_metrics` it doesn't create any configuration.
I think you should leave the main manifest.yml as is (2 inputs: logs and kafka/metrics) and go deeper into the datasets. `agent/stream/stream.yml.hbs` is the Metricbeat configuration that is delivered to the agent. If you want to look at an integration with 2 stream configurations, you can take a look at "o365".
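As a sketch of that idea, a dataset's `agent/stream/stream.yml.hbs` could render the Metricbeat configuration for one group of metricsets, while a sibling dataset holds a separate template for the Jolokia-based ones. The variable names (`hosts`, `period`) are assumptions here, not copied from the o365 package.

```yaml
# Hypothetical agent/stream/stream.yml.hbs for a broker dataset.
# `hosts` and `period` would be variables defined in the dataset's manifest.yml.
metricsets: ["broker"]
hosts:
{{#each hosts}}
  - {{this}}
{{/each}}
{{#if period}}
period: {{period}}
{{/if}}
```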
I have pushed the change for the approach with three integrations (not thoroughly tested yet).
Sorry for the late answer. When you say 3 integrations, you mean 3 inputs? Could you share a screenshot on how it looks now?
Sorry, I think I mean three packages. Screenshots are updated.
Umm, is it possible to have three different datasources in the same package? Would it work the same as the three packages I am adding now?
We had a discussion offline about this; we have decided for now to go on only with the metricsets intended to monitor brokers, so I will remove the consumer and producer metricsets from this PR.
LGTM. Checked in the UI and looks good. We need to continue the discussion around how to structure this package as a follow up.
@mtojek If this is merged, could you take over the "transition" to package-registry/integrations and cleanup here?
The `broker`, `consumergroup` and `partition` metricsets are tested with Kafka 0.10.2.1, 1.1.0, 2.1.1, and 2.2.2.

`<!-- TODO: Add a link to Jolokia "input" in Metricbeat -->`
This shows up in the UI, but can also be fixed later.
I have fixed this and a couple of typos.
Yes, once it's merged.
Add kafka packages.
One particularity of the Metricbeat module for Kafka is that it monitors Kafka brokers, producers and consumers. Kafka brokers are monitored using the Kafka protocol, and Jolokia for JMX metrics. Producers and consumers are monitored using Jolokia.

Related to these particularities, I have divided the Kafka metricsets into three groups:

- `kafka`, to monitor the Kafka brokers; it uses the Filebeat module, and the `consumergroup`, `broker` and `partition` metricsets of the Metricbeat module.
- `kafka_java_consumer`, to monitor Kafka Java consumers; it uses the `consumer` metricset of the Kafka module.
- `kafka_java_producer`, to monitor Kafka Java producers; it uses the `producer` metricset of the Kafka module.

This PR includes only a package for `kafka`; we will have to discuss if we also want `kafka_java_consumer` and `kafka_java_producer` as packages or as data sources, or whether to follow any other strategy.

Some screenshots:

If we include `kafka`, `kafka_java_consumer`, and `kafka_java_producer` as three different packages, they appear like this (this PR only includes the `kafka` one):

Integration for Kafka brokers. The configuration for the `broker` metricset has a setting for the Jolokia endpoint:

Kafka logs can be configured as files inside the Kafka home directory, following the config of the current Filebeat module:
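For reference, a log stream template following the Filebeat module's conventions could look roughly like the sketch below. The `paths` variable and the multiline pattern are assumptions based on how the Filebeat Kafka module is typically configured, not the exact contents of this package.

```yaml
# Hypothetical agent/stream/log.yml.hbs for the Kafka logs dataset.
# `paths` would be a variable (e.g. defaulting to files under the Kafka
# home directory) defined in the dataset's manifest.yml.
paths:
{{#each paths}}
  - {{this}}
{{/each}}
exclude_files: ['.gz$']
# Kafka log4j entries start with a bracketed timestamp, e.g. "[2020-04-01 ...]";
# join continuation lines (stack traces) to the preceding entry.
multiline:
  pattern: '^\['
  negate: true
  match: after
```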