awesome-kafka-docker

An awesome Kafka Docker setup script that sets up a Kafka cluster automatically.

Prerequisites

Kafka Versions

If you like, first modify KAFKA_VERSION and SCALA_VERSION in download-kafka.sh.

Current defaults: KAFKA_VERSION=2.1.0, SCALA_VERSION=2.12 (recommended).

Any Kafka version >= 0.8.1.1 and Scala version >= 2.10 should work.
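For reference, the relevant lines in download-kafka.sh should look roughly like the sketch below; their exact placement and format are an assumption, so check the script before editing.

# edit these two variables near the top of download-kafka.sh
# (exact format is an assumption; verify in the script itself)
KAFKA_VERSION=2.1.0
SCALA_VERSION=2.12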

Setup

build

# download kafka
sh download-kafka.sh

# create docker network
docker network create mykafka

# build images
sh kafka.sh build
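To sanity-check the build step, plain Docker commands are enough; the exact image names depend on docker-compose.yml, so the grep below only checks the network.

# verify the network was created
docker network ls | grep mykafka

# list locally built images (names are defined in docker-compose.yml)
docker images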

start

sh kafka.sh start

scale to n kafka containers

sh kafka.sh [n]
# eg: sh kafka.sh 3
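After scaling, you can confirm how many broker containers are running with docker ps; the name filter below is only a guess, since the actual container names come from docker-compose.yml.

# count running containers whose name contains "kafka" (naming is an assumption)
docker ps --filter "name=kafka" --format "{{.Names}}"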

stop kafka and zookeeper

sh kafka.sh stop

Topics

create topic

# create topic test001 with 1 partition and a replication factor of 1
sh topic.sh test001 create

# create topic test002 with 3 partitions and a replication factor of 2
sh topic.sh test002 create 3 2
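topic.sh presumably wraps Kafka's own kafka-topics.sh. For comparison, the equivalent native command for Kafka 2.1.0, run inside a broker container, would look roughly like this (the container name, install path, and ZooKeeper address are assumptions):

# hypothetical direct equivalent of `sh topic.sh test002 create 3 2`
docker exec -it kafka_1 \
  /opt/kafka/bin/kafka-topics.sh --create \
    --zookeeper zookeeper:2181 \
    --topic test002 --partitions 3 --replication-factor 2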

describe topic

sh topic.sh test001 describe

show all topics

sh topic.sh all list

Produce and Consume

Open two terminal tabs: one for the producer and another for the consumer.

Produce

In tab1, try:

# produce messages to topic test001
sh topic.sh test001 produce

# then you can type anything you like when you see `>` and hit Enter
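Under the hood this is presumably Kafka's console producer; a rough native equivalent for Kafka 2.1.0 (container name, install path, and broker address are assumptions) is:

# hypothetical direct equivalent of `sh topic.sh test001 produce`
docker exec -it kafka_1 \
  /opt/kafka/bin/kafka-console-producer.sh \
    --broker-list localhost:9092 --topic test001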

Consume

In tab2, try:

sh topic.sh test001 consume

# you will see the messages you typed in tab1
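Likewise, the consumer side presumably maps to Kafka's console consumer; a rough native equivalent (same assumptions as above) is:

# hypothetical direct equivalent of `sh topic.sh test001 consume`
docker exec -it kafka_1 \
  /opt/kafka/bin/kafka-console-consumer.sh \
    --bootstrap-server localhost:9092 --topic test001 --from-beginning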

Tests

We use kafkacat for testing; you have to install it first.
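Typical ways to install kafkacat:

# Debian/Ubuntu
sudo apt-get install kafkacat

# macOS
brew install kafkacat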

If you have any network problems here, please open an issue.

Resolve hostname

# add Docker IP-to-hostname mappings to /etc/hosts, so that you can reach the Kafka brokers at HOST_IP:HOST_PORT
# /etc/hosts is backed up before it is modified
sh resolve-hostname.sh
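Once the hostnames resolve, a quick way to check external connectivity is kafkacat's metadata listing mode; HOST_IP:HOST_PORT below is a placeholder for whatever your docker-compose.yml exposes.

# list brokers and topics visible from the host
kafkacat -b HOST_IP:HOST_PORT -L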

produce

sh test.sh p

consume

sh test.sh c
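test.sh presumably drives kafkacat's producer (-P) and consumer (-C) modes; the raw commands would look roughly like this (broker address and topic name are placeholders):

# produce: type lines, end with Ctrl-D
kafkacat -P -b HOST_IP:HOST_PORT -t test001

# consume from the beginning
kafkacat -C -b HOST_IP:HOST_PORT -t test001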

You can also try it with a Python Kafka client.
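Installing a Python client is a one-liner; kafka-python is one common choice (confluent-kafka is another).

# install a Python Kafka client
pip install kafka-python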

Features

  • auto-creates topics when consuming or producing, with a default of 1 partition (see the note below)
  • Alpine-based Java 8 image, very small and easy to scale out
  • remote Kafka connections
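Note on automatic topic creation: this is standard Kafka broker behavior, controlled by auto.create.topics.enable (default true) and num.partitions (default 1). Whether start-kafka.sh sets these explicitly is an assumption; you can check inside a broker container (container name and config path are guesses based on a typical /opt/kafka layout):

# inspect the broker's topic-creation settings; no output means the broker defaults apply
docker exec -it kafka_1 grep -E 'auto.create.topics.enable|num.partitions' /opt/kafka/config/server.properties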

TODO

  • Docker Swarm

LICENSE

MIT
