kafka-mysql

Run Confluent Kafka locally and connect it to MySQL databases to automatically stream data from a MySQL database onto Kafka.

About confluent-kafka

confluent-kafka-python provides a high-level Producer, Consumer and AdminClient compatible with all Apache Kafka™ brokers >= v0.8, Confluent Cloud and Confluent Platform. The client is:

  • Reliable - It's a wrapper around librdkafka (provided automatically via binary wheels) which is widely deployed in a diverse set of production scenarios. It's tested using the same set of system tests as the Java client and more. It's supported by Confluent.

  • Performant - Performance is a key design consideration. Maximum throughput is on par with the Java client for larger message sizes (where the overhead of the Python interpreter has less impact). Latency is on par with the Java client.

  • Future proof - Confluent, founded by the creators of Kafka, is building a streaming platform with Apache Kafka at its core. It's high priority for us that client features keep pace with core Apache Kafka and components of the Confluent Platform.
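As a rough sketch of the Producer/Consumer API described above (the broker address `localhost:9092`, topic `demo-topic`, and group id `demo-group` are illustrative assumptions, not values taken from this repo), a minimal produce/consume round trip with confluent-kafka-python looks like:

```python
# Broker address, topic name, and group id are assumptions for illustration;
# adjust them to match your docker-compose setup.
producer_conf = {"bootstrap.servers": "localhost:9092"}
consumer_conf = {
    "bootstrap.servers": "localhost:9092",
    "group.id": "demo-group",
    "auto.offset.reset": "earliest",
}

def delivery_report(err, msg):
    # Invoked once per produced message (from poll()/flush()) with the result.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}]")

if __name__ == "__main__":
    from confluent_kafka import Producer, Consumer  # pip install confluent-kafka

    producer = Producer(producer_conf)
    producer.produce("demo-topic", key="key", value="hello", callback=delivery_report)
    producer.flush()  # block until outstanding delivery reports have fired

    consumer = Consumer(consumer_conf)
    consumer.subscribe(["demo-topic"])
    msg = consumer.poll(timeout=10.0)
    if msg is not None and msg.error() is None:
        print(msg.value().decode("utf-8"))
    consumer.close()
```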

Usage

  • Clone the repository using
git clone https://github.com/manaschubby/kafka-mysql.git
  • Start the docker containers using
docker-compose up -d
  • Wait about 10-15 minutes (on a good internet connection) for Docker to pull all the required Confluent Kafka images.

  • Once the containers are up, ensure that all of them keep running continuously (especially connect and broker).

  • Once connect has started, restart the consumer. (It retries automatically at regular intervals until connect is up, but you can restart it manually to save time.)

  • You can also run

    python3 consumer.py getting_started.ini

    in the consumer container terminal to manually start consumer.

  • The consumer will automatically initialize the JDBC connector, and you will soon see events appear in the consumer logs.

Any changes to the MySQL database will be reflected there.

  • To point the pipeline at a different MySQL instance, open docker-compose.yaml and change the environment variables of the consumer service.
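The consumer above is started with an .ini file (`python3 consumer.py getting_started.ini`). A common pattern — and only an assumption about how this repo's file is laid out, following Confluent's getting-started convention of a `[default]` section plus a `[consumer]` section — is to merge the sections into one flat dict of the shape `confluent_kafka.Consumer()` expects:

```python
import configparser

# Hypothetical getting_started.ini contents; the actual file in this repo
# may use different section or key names.
SAMPLE_INI = """
[default]
bootstrap.servers=localhost:9092

[consumer]
group.id=python_example_group_1
auto.offset.reset=earliest
"""

def load_config(text):
    # Merge [default] and [consumer] into one flat config dict, the shape
    # accepted by confluent_kafka.Consumer().
    parser = configparser.ConfigParser()
    parser.read_string(text)
    config = dict(parser["default"])
    config.update(parser["consumer"])
    return config

config = load_config(SAMPLE_INI)
print(config["bootstrap.servers"])  # prints localhost:9092
```

With a real file you would read it via `parser.read(path)` instead of `read_string`, where `path` comes from `sys.argv[1]`.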
