Kafka Python client


This module provides low-level protocol support for Apache Kafka as well as high-level consumer and producer classes. Request batching is supported by the protocol, as is broker-aware request routing. Gzip and Snappy compression are also supported for message sets.

Copyright 2013, David Arthur under Apache License, v2.0. See LICENSE

The current version of this package is 0.9.0 and is compatible with Kafka brokers running version 0.8.1.


High level

from kafka.client import KafkaClient
from kafka.consumer import SimpleConsumer
from kafka.producer import SimpleProducer, KeyedProducer

kafka = KafkaClient("localhost:9092")

# To send messages synchronously
producer = SimpleProducer(kafka)
producer.send_messages("my-topic", "some message")
producer.send_messages("my-topic", "this method", "is variadic")

# To send messages asynchronously
producer = SimpleProducer(kafka, async=True)
producer.send_messages("my-topic", "async message")

# To wait for acknowledgements
# ACK_AFTER_LOCAL_WRITE : server will wait till the data is written to
#                         a local log before sending response
# ACK_AFTER_CLUSTER_COMMIT : server will block until the message is committed
#                            by all in sync replicas before sending a response
producer = SimpleProducer(kafka, async=False,
                          req_acks=SimpleProducer.ACK_AFTER_LOCAL_WRITE,
                          ack_timeout=2000)

response = producer.send_messages("my-topic", "async message")

if response:
    print(response[0].error)
    print(response[0].offset)

# To send messages in batch. You can use any of the available
# producers for doing this. The following producer will collect
# messages in batch and send them to Kafka after 20 messages are
# collected or every 60 seconds
# Notes:
# * If the producer dies before the messages are sent, there will be losses
# * Call producer.stop() to send the messages and cleanup
producer = SimpleProducer(kafka, batch_send=True,
                          batch_send_every_n=20,
                          batch_send_every_t=60)

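The two flush triggers described above (message count and elapsed time) can be sketched in plain Python. `should_flush` is a hypothetical helper for illustration, not part of kafka-python:

```python
# Illustrative flush trigger, mirroring batch_send_every_n / batch_send_every_t:
# a batch is flushed once enough messages accumulate or enough time has passed.
def should_flush(batch, last_flush, now, every_n=20, every_t=60):
    # flush on size threshold, or on elapsed time since the last flush
    return len(batch) >= every_n or (now - last_flush) >= every_t
```

Whichever condition fires first wins, which is why a slow trickle of messages is still delivered within `batch_send_every_t` seconds.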
# To consume messages
consumer = SimpleConsumer(kafka, "my-group", "my-topic")
for message in consumer:
    print(message)

Keyed messages

from kafka.client import KafkaClient
from kafka.producer import KeyedProducer
from kafka.partitioner import HashedPartitioner, RoundRobinPartitioner

kafka = KafkaClient("localhost:9092")

# HashedPartitioner is default
producer = KeyedProducer(kafka)
producer.send("my-topic", "key1", "some message")
producer.send("my-topic", "key2", "this method")

producer = KeyedProducer(kafka, partitioner=RoundRobinPartitioner)
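The idea behind the two partitioners can be sketched in a few lines. This illustrates key-hash and round-robin partitioning in general, not the library's exact implementation:

```python
from itertools import cycle

# Key-hash partitioning: the same key always maps to the same partition,
# so per-key message ordering is preserved.
def hashed_partition(key, partitions):
    return partitions[hash(key) % len(partitions)]

# Round-robin partitioning ignores the key and spreads messages evenly
# across partitions.
def round_robin_partitioner(partitions):
    return cycle(partitions)
```

Use a hashed partitioner when messages with the same key must stay in order on one partition; use round-robin when even load matters more than key affinity.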

Multiprocess consumer

from kafka.client import KafkaClient
from kafka.consumer import MultiProcessConsumer

kafka = KafkaClient("localhost:9092")

# This will split the number of partitions among two processes
consumer = MultiProcessConsumer(kafka, "my-group", "my-topic", num_procs=2)

# This will spawn processes such that each handles 2 partitions max
consumer = MultiProcessConsumer(kafka, "my-group", "my-topic",
                                partitions_per_proc=2)

# All the messages can be consumed as shown below
for message in consumer:
    print(message)

# Messages can also be fetched in batches
for message in consumer.get_messages(count=5, block=True, timeout=4):
    print(message)
Low level

from kafka.client import KafkaClient
from kafka.common import ProduceRequest
from kafka.protocol import KafkaProtocol

kafka = KafkaClient("localhost:9092")
req = ProduceRequest(topic="my-topic", partition=1,
    messages=[KafkaProtocol.encode_message("some message")])
resps = kafka.send_produce_request(payloads=[req], fail_on_error=True)

resps[0].topic      # "my-topic"
resps[0].partition  # 1
resps[0].error      # 0 (hopefully)
resps[0].offset     # offset of the first message sent in this request
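Since each response carries an error code (0 meaning success), a caller typically scans the response list for failures. A hypothetical helper, using a namedtuple to stand in for the real response objects:

```python
from collections import namedtuple

# Stand-in for kafka-python's produce response objects (illustration only).
ProduceResponse = namedtuple("ProduceResponse",
                             ["topic", "partition", "error", "offset"])

# Hypothetical helper (not part of kafka-python): collect
# (topic, partition, error) tuples for every partition that failed.
def failed_partitions(responses):
    return [(r.topic, r.partition, r.error)
            for r in responses if r.error != 0]
```

With `fail_on_error=False` on the send call, a pattern like this lets you retry only the partitions that reported a non-zero error code.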


Install with your favorite package manager


Pip:

git clone https://github.com/mumrah/kafka-python
pip install ./kafka-python

Setuptools:

git clone https://github.com/mumrah/kafka-python
easy_install ./kafka-python

Using setup.py directly:

git clone https://github.com/mumrah/kafka-python
cd kafka-python
python setup.py install

Optional Snappy install

Download and build Snappy from its downloads page:

tar xzvf snappy-1.0.5.tar.gz
cd snappy-1.0.5
./configure
make
sudo make install

Install the python-snappy module

pip install python-snappy


Run the unit tests

These are broken at the moment

tox ./test/

or

python -m test.test_unit

Run the integration tests

First, checkout the Kafka source

git submodule init
git submodule update
cd kafka-src
./sbt update
./sbt package
./sbt assembly-package-dependency

And then run the tests. This will actually start up a real local Zookeeper instance and Kafka brokers, and send messages in using the client.

tox ./test/

or

python -m test.test_integration