This example uses spring-cloud-stream to produce and consume messages from Kafka over partitioned topics.
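In spring-cloud-stream, partitioning is configured on the bindings rather than in code. The fragment below is an illustrative sketch of the relevant properties only; the binding name, key expression, group, and topic are assumptions, not necessarily this repo's actual configuration:

```yaml
# Producer side (sketch): route each message by a key expression
# across the 2 partitions of the assumed "test" topic.
spring:
  cloud:
    stream:
      bindings:
        output:
          destination: test                    # assumed topic name
          producer:
            partitionKeyExpression: payload.id # assumed key expression
            partitionCount: 2

# Consumer side (sketch): with the Kafka binder, instances that share a
# consumer group get partitions assigned via normal group rebalancing.
#        input:
#          destination: test
#          group: poc                          # assumed group name
```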
Build the producer:
cd spring-stream-poc-producer/
./gradlew clean build
Build the consumer:
cd spring-stream-poc-consumer/
./gradlew clean build
Run with two partitions:
docker-compose up
Run with four partitions:
docker-compose -f docker-compose-four.yaml up
See the Kafka quickstart.
Create two partitioned topics:
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 3 --partitions 2 --topic test
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 3 --partitions 4 --topic test4
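With more than one partition, the producer has to pick a partition for every message. Unless a custom selector is configured, spring-cloud-stream derives it from the partition key's hash code modulo the partition count. A minimal sketch of that rule (class and method names are mine, not the framework's):

```java
public class PartitionSelection {
    // Default-style selection: hashCode of the computed key, modulo the
    // partition count. Math.abs guards against negative hash codes.
    static int selectPartition(Object key, int partitionCount) {
        return Math.abs(key.hashCode()) % partitionCount;
    }

    public static void main(String[] args) {
        // A given key always maps to the same partition for a fixed count,
        // which is what keeps per-key ordering within one partition.
        System.out.println("2 partitions -> " + selectPartition("order-42", 2));
        System.out.println("4 partitions -> " + selectPartition("order-42", 4));
    }
}
```

Note that changing the partition count (2 vs 4 here) can map the same key to a different partition, which is why the two scenarios use separate topics (test and test4).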
Execute one producer in a console:
cd spring-stream-poc-producer/
./gradlew bootRun
Execute the first consumer in a different console:
cd spring-stream-poc-consumer/
./gradlew bootRun
Execute the second consumer in a different console:
cd spring-stream-poc-consumer/
./gradlew bootRun
For the four-partition topic, execute one producer in a console:
cd spring-stream-poc-producer/
SPRING_PROFILES_ACTIVE=four ./gradlew bootRun
Execute the first consumer in a different console:
cd spring-stream-poc-consumer/
SPRING_PROFILES_ACTIVE=four ./gradlew bootRun
Execute the second consumer in a different console:
cd spring-stream-poc-consumer/
SPRING_PROFILES_ACTIVE=four ./gradlew bootRun
Execute the third consumer in a different console:
cd spring-stream-poc-consumer/
SPRING_PROFILES_ACTIVE=four ./gradlew bootRun
Execute the fourth consumer in a different console:
cd spring-stream-poc-consumer/
SPRING_PROFILES_ACTIVE=four ./gradlew bootRun