confluentinc/kafka-connect-monitoring-sandbox

Kafka Connect Monitoring Sandbox

The purpose of this repository is to provide a quick way to bootstrap a Kafka Connect setup with Confluent Platform and Confluent Cloud. In addition, it offers monitoring through Prometheus and Grafana for both Confluent Platform components and Confluent Cloud. You can also configure alerting for your sandbox through Alertmanager to test out a fully functional alerting pipeline.

Architecture

(architecture diagram)

Sandbox Setup

Required software:

  • IDE (e.g. IntelliJ)
  • JDK 11
  • Docker, Docker Compose
  • Make

A. Download Kafka Connectors into connector-plugins:

Go to Confluent Hub to download any needed connectors and add them to the connector-plugins/ directory.
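For example, a downloaded connector ZIP can be unpacked into place like this (the ZIP path is a placeholder for whichever connector you fetched):

```shell
# Placeholder path: substitute the connector ZIP you downloaded from Confluent Hub.
ZIP=~/Downloads/my-connector.zip

# Make sure the plugin directory exists, then unpack the ZIP into it so the
# Connect worker can load the plugin on startup.
mkdir -p connector-plugins
if [ -f "$ZIP" ]; then
  unzip -o "$ZIP" -d connector-plugins/
fi
```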

Confluent Cloud

Credentials

A. Create an env file (template) with the Confluent Cloud IDs and API Keys:

confluent login 
confluent environment list
confluent kafka cluster list

and set CCLOUD_ENV & CCLOUD_CLUSTER values in the env file.
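The resulting entries might look like this (the IDs are placeholders; use the values returned by the commands above):

```shell
# Placeholder IDs: replace with the output of `confluent environment list`
# and `confluent kafka cluster list`.
CCLOUD_ENV=env-xxxxx
CCLOUD_CLUSTER=lkc-xxxxx
```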

B. Then describe your Kafka cluster to get the bootstrap servers URL:

confluent kafka cluster describe ${CCLOUD_CLUSTER}

C. Similarly, enable Schema Registry in your environment, and get the details:

confluent schema-registry cluster describe

and set CCLOUD_SR & CCLOUD_SR_URL values in the env file.

D. Once IDs are defined, create API Keys:

Create API Keys for CCloud Exporter (Monitoring):

make ccloud-exporter-api-keys

and set CCLOUD_EXPORTER_API_KEY & CCLOUD_EXPORTER_API_SECRET values in the env file.

Create API Keys for Kafka Connect:

make ccloud-connect-api-keys

and set CCLOUD_CONNECT_API_KEY & CCLOUD_CONNECT_API_SECRET values in the env file.

Create API Keys for Applications:

Create Service Account:

make ccloud-app-service-account

and save ID as CCLOUD_SERVICE_ACCOUNT in the env file.

E. Create ACLs for Service Account:

make ccloud-app-acl

and finally create API Keys for the applications:

make ccloud-app-api-key

and set the CCLOUD_API_KEY & CCLOUD_API_SECRET values in the env file.

Run Self-Managed Connectors to Confluent Cloud

G. You can create topics with the Confluent CLI:

make ccloud-topic

H. Start Docker Compose

Verify that you have the env var $CP_VERSION set to the preferred CP version first.
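For example (the version number is a placeholder; pick the CP release you want to run):

```shell
# Placeholder version: set CP_VERSION to the Confluent Platform release you want.
export CP_VERSION=7.6.0
```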

make up

I. Deploy Datagen Connectors:

make ccloud-datagen-users

Insert your CCLOUD_SR_URL, CCLOUD_SR_API_KEY and CCLOUD_SR_API_SECRET values in the datagen-users-schema.json file.
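For reference, Schema Registry credentials in a connector config are usually carried by the standard converter properties below (a sketch assuming the Avro converter; the angle-bracket placeholders stand for your own values):

```json
{
  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "value.converter.schema.registry.url": "<CCLOUD_SR_URL>",
  "value.converter.basic.auth.credentials.source": "USER_INFO",
  "value.converter.schema.registry.basic.auth.user.info": "<CCLOUD_SR_API_KEY>:<CCLOUD_SR_API_SECRET>"
}
```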

make ccloud-datagen-users-schema

J. Deploy MYSQL JDBC Connectors:

make ccloud-jdbc-bulk-mode-source
make ccloud-jdbc-incremental-mode-source
make ccloud-jdbc-timestamp-mode-source
make ccloud-jdbc-incremental-timestamp-source
make ccloud-jdbc-mysql
make ccloud-jdbc-mysql-custom-query

K. Deploy SQL Server Debezium Connectors:

First, create the following topics with 1 partition each in your Confluent Cloud cluster.

  • server1.dbo.orders

  • server1.dbo.customers

  • server1.dbo.products

  • server1.dbo.products_on_hand
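Assuming the Confluent CLI is already logged in and targeting your cluster, the four topics can be created in one loop:

```shell
# The four Debezium target topics listed above, created with 1 partition each.
TOPICS="server1.dbo.orders server1.dbo.customers server1.dbo.products server1.dbo.products_on_hand"

for t in $TOPICS; do
  # Keep going if a topic already exists (or the CLI is unavailable).
  confluent kafka topic create "$t" --partitions 1 || echo "could not create $t"
done
```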

Then load the SQL tables into the SQL Server database:

make load-sqlserver

Next, start the Debezium SQL Server connector:

make ccloud-sqlserver

Insert your CCLOUD_SR_URL, CCLOUD_SR_API_KEY and CCLOUD_SR_API_SECRET values in the jdbc-sink-schema.json file.

make ccloud-jdbc-sink-schema

Finally, to monitor the Confluent Cloud cluster, go to Grafana at http://localhost:3000 and check the metrics:

(Grafana dashboard screenshot)

Local Confluent Platform deployment

A. You can create topics with the kafka-utility command line:

make local-topic

B. Deploy one of the Datagen connectors:

make local-datagen-commercials
make local-datagen-inventory

C. Deploy MySQL JDBC connectors:

make local-jdbc-mysql
make local-jdbc-mysql-custom-query
make local-jdbc-sink

Monitoring & Alerting

You can view the connector metrics and Kafka broker metrics in Grafana.

Visit http://localhost:3000/

  • Username: admin
  • Password: admin

Click on the dashboards icon on the left side panel and select the Kafka Connect V2 dashboard.

(Grafana dashboard screenshot)

To configure alerting, see the alertmanager directory at the root of this project. The SMTP server can be set up in alertmanager.yml; currently, MailHog is the SMTP server that Alertmanager uses for notifications.

Custom alerts can be added in PromQL format to the alertrules.yml file in the alertmanager directory. You can view active alerts in the Alertmanager UI at http://localhost:9093
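A custom rule might look like the following sketch; the metric name in expr depends on how the JMX exporter is configured here, so treat both the rule name and the metric as placeholders to adapt:

```yaml
groups:
  - name: connect-alerts
    rules:
      - alert: ConnectorTaskFailed                                  # placeholder rule name
        expr: kafka_connect_connector_task_status{status="failed"} > 0  # placeholder metric
        for: 1m
        labels:
          severity: warning
        annotations:
          summary: "A Kafka Connect task has failed"
```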

Prometheus has Alertmanager set up as its alerting target. Any metric with an alert threshold applied to it will trigger an alert when that threshold is crossed, and the alert is sent out as a notification.

Alert notifications can be found in the MailHog UI at http://localhost:8025

(MailHog UI screenshot)

How to Add Connectors

If you want to add a connector to this sandbox to test it out and see how it can be monitored, follow these steps:

  • If the desired connector's plugin does not come with the Kafka Connect cluster, download and install the jar in the connector-plugins/ folder.

  • Once the plugin has been added, create the connector config file and place it in either the ccloud or local directory depending on which cluster you want to read/write the data to.

  • Deploy the newly added connector. Feel free to add the curl command to the Makefile so you can easily reuse it in the future.
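As a sketch, assuming the Connect worker's REST API listens on its default port 8083 and the config file is named ccloud/my-connector.json (a placeholder), the deploy call is a standard Connect REST request:

```shell
# The Connect worker's REST endpoint; 8083 is the Kafka Connect default port.
CONNECT_URL=http://localhost:8083

# POST the connector config JSON to create the connector.
# ccloud/my-connector.json is a placeholder for your config file.
curl -s -X POST -H "Content-Type: application/json" \
  --data @ccloud/my-connector.json \
  "$CONNECT_URL/connectors" || echo "deploy failed: is the Connect worker up?"
```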

References

  • For a quick reference on what connector configs should look like, visit the examples GitHub repo, where you can find examples of almost all the open-source supported connector configs.
