Simple Flink + Kafka application

Apache Flink and Apache Kafka

This project uses a simple Flink job to show how to integrate Apache Kafka with Flink using the Flink Connector for Kafka.

Start Kafka and Create Topic

curl -O
tar -xzf kafka_2.11-
cd kafka_2.11-

Kafka uses ZooKeeper; if you do not have ZooKeeper running, you can start it using the following command:

./bin/zookeeper-server-start.sh config/zookeeper.properties

Start a Kafka broker by running the following command in a new terminal:

./bin/kafka-server-start.sh config/server.properties

In another terminal, run the following command to create a Kafka topic called flink-demo:

./bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic flink-demo

Build and Run the Application

In the project folder:

$ mvn clean package 

Then run the Flink Consumer:

$ mvn exec:java -Dexec.mainClass=com.grallandco.demos.ReadFromKafka
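The repository's source is not shown here, but a consumer job like `ReadFromKafka` can be sketched as follows. This is an illustrative assumption, not the project's actual code: the `group.id` value and job name are made up, and it uses the Kafka 0.9 connector (`FlinkKafkaConsumer09`) that was current for Flink at the time.

```java
// Hypothetical sketch of a ReadFromKafka-style job; property values and
// class layout are assumptions, not taken from this repository.
import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class ReadFromKafka {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092"); // default local broker
        properties.setProperty("group.id", "flink-demo-group");        // assumed consumer group

        // Consume the "flink-demo" topic as a stream of strings and print each message
        DataStream<String> stream = env.addSource(
                new FlinkKafkaConsumer09<>("flink-demo", new SimpleStringSchema(), properties));
        stream.print();

        env.execute("Read from Kafka");
    }
}
```

Each record received on the topic is deserialized as a UTF-8 string and printed to the task's standard output, which is what you see in the Consumer console.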

And the Producer:

$ mvn exec:java -Dexec.mainClass=com.grallandco.demos.WriteToKafka
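Similarly, a producer job like `WriteToKafka` might look like the sketch below. Again, this is an assumption rather than the repository's actual code: the message format and the one-second interval are invented for illustration, and it uses the matching Kafka 0.9 sink (`FlinkKafkaProducer09`).

```java
// Hypothetical sketch of a WriteToKafka-style job; the message generator
// and its timing are illustrative assumptions.
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer09;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class WriteToKafka {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Emit one string message per second until the job is cancelled
        DataStream<String> stream = env.addSource(new SourceFunction<String>() {
            private volatile boolean running = true;

            @Override
            public void run(SourceContext<String> ctx) throws Exception {
                long count = 0;
                while (running) {
                    ctx.collect("message #" + count++);
                    Thread.sleep(1000);
                }
            }

            @Override
            public void cancel() {
                running = false;
            }
        });

        // Write each string to the "flink-demo" topic on the local broker
        stream.addSink(new FlinkKafkaProducer09<>(
                "localhost:9092", "flink-demo", new SimpleStringSchema()));

        env.execute("Write to Kafka");
    }
}
```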

You should see messages printed in the Consumer console.

You can also run this application directly on a Flink cluster.
