A modular, schema-aware messaging engine built in Python with Kafka and Avro, using Redpanda as the Kafka backend.
This project includes:
- ✅ Avro serialization with Schema Registry
- ✅ Producer and consumer interfaces
- ✅ Batch support, delivery reports, tombstone messages
- ✅ Fully configurable via `.env` (see `.env.example`)
- ✅ Managed using Poetry
 
- Python 3.11+
- Poetry for dependency management
- Redpanda (Kafka-compatible streaming platform)
- Confluent Kafka Python Client
- Avro + Schema Registry
 
Redpanda is used to run a Kafka-compatible stack locally with minimal effort.
```bash
git clone https://github.com/shiningflash/kafka-python-messaging-engine.git
cd kafka-python-messaging-engine
docker compose up -d
```

This starts:

- Redpanda Kafka Broker at `localhost:19092`
- Redpanda Schema Registry at `http://localhost:18081`
- Redpanda Console UI at `http://localhost:8080`
 
Copy and modify your environment variables:

```bash
cp .env.example .env
```

Edit `.env` to suit your local or dev settings.
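Inside the application, those settings would typically be read from the environment. The variable names and defaults below are illustrative assumptions, not necessarily the exact keys used in `.env.example` (check that file for the real ones):

```python
import os

# Hypothetical setting names; the authoritative keys live in .env.example.
# Defaults match the local Redpanda endpoints started by docker compose.
BOOTSTRAP_SERVERS = os.getenv("KAFKA_BOOTSTRAP_SERVERS", "localhost:19092")
SCHEMA_REGISTRY_URL = os.getenv("SCHEMA_REGISTRY_URL", "http://localhost:18081")
BATCH_SIZE = int(os.getenv("BATCH_SIZE", "10"))  # consumer batch size
```

Reading settings once at startup, with sane local defaults, keeps the producer and consumer apps runnable out of the box while still letting `.env` override everything for other environments.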
The User Avro schema lives in `src/schemas/user.avsc`. If you modify its fields, update the schema and `.env` accordingly.
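To illustrate what Avro serialization actually puts on the wire, here is a minimal pure-Python sketch for a hypothetical two-field user schema (`name: string`, `age: long`) — the real fields of `user.avsc` may differ, and the project itself uses the Confluent client's serializers rather than hand-rolled encoding. Avro's binary format writes field values in schema order with no field names, using zigzag varints for integers and length-prefixed UTF-8 for strings; Schema Registry clients additionally prefix each payload with a magic byte `0` and the 4-byte big-endian schema ID (the Confluent wire format):

```python
def zigzag_varint(n: int) -> bytes:
    """Encode a signed int as an Avro zigzag varint."""
    z = (n << 1) ^ (n >> 63)  # zigzag maps 0, -1, 1, -2, ... to 0, 1, 2, 3, ...
    out = bytearray()
    while True:
        byte = z & 0x7F
        z >>= 7
        if z:
            out.append(byte | 0x80)  # set continuation bit, more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def encode_string(s: str) -> bytes:
    """Avro string: zigzag-varint byte length, then UTF-8 bytes."""
    data = s.encode("utf-8")
    return zigzag_varint(len(data)) + data

def encode_user(name: str, age: int) -> bytes:
    """Encode one record for the assumed schema {name: string, age: long}."""
    return encode_string(name) + zigzag_varint(age)

def confluent_frame(schema_id: int, payload: bytes) -> bytes:
    """Confluent wire format: magic byte 0 + 4-byte schema ID + Avro payload."""
    return b"\x00" + schema_id.to_bytes(4, "big") + payload
```

For example, `encode_user("Amy", 42)` yields `b"\x06Amy\x54"` — no field names, just a length-prefixed string followed by one varint — which is why the reader needs the registered schema to decode the bytes.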
```bash
poetry install
poetry env activate
python -m src.producer_app
```

This will:

- Connect to Redpanda
- Prompt you to input user data
- Send serialized Avro messages to Kafka
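Two producer features listed above — delivery reports and tombstone messages — could look roughly like the sketch below. The `confluent_kafka` API names in the comment (`produce`, `on_delivery`) are real, but the callback body and topic name are illustrative, not the project's exact code:

```python
def delivery_report(err, msg) -> str:
    """Sketch of a callback passed as on_delivery= to Producer.produce().

    Returns the log line instead of printing, so it is easy to test.
    """
    if err is not None:
        return f"delivery failed: {err}"
    # A tombstone is a record whose value is None; compacted topics and
    # downstream consumers treat it as a delete marker for that key.
    kind = "tombstone" if msg.value() is None else "message"
    return f"{kind} delivered to {msg.topic()}[{msg.partition()}] @ {msg.offset()}"

# With the real client it would be wired up roughly as:
#   producer.produce("users", key=user_id, value=None,  # value=None -> tombstone
#                    on_delivery=lambda e, m: print(delivery_report(e, m)))
#   producer.flush()
```

Because delivery reports fire asynchronously from `poll()`/`flush()`, keeping the callback a small pure function makes failures easy to log and the logic easy to unit-test with a stub message object.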
 
```bash
python -m src.consumer_app
```

This will:

- Connect to the same topic
- Consume and deserialize Avro messages
- Support batch reads (configurable in `.env`)
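Batch reads can be sketched as a small chunking helper around the poll loop. The helper below is a hypothetical illustration (the project's actual batching logic may differ): it groups already-received messages into lists of at most `batch_size`, which could sit on top of repeated `consumer.poll()` calls or `consumer.consume(num_messages=...)` from the Confluent client:

```python
from itertools import islice
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def batched(messages: Iterable[T], batch_size: int) -> Iterator[List[T]]:
    """Yield lists of up to batch_size items; the last batch may be shorter."""
    it = iter(messages)
    while batch := list(islice(it, batch_size)):
        yield batch

# Demonstrated with plain values; in the consumer app the iterable would be
# deserialized Avro messages, with batch_size taken from .env.
```

Processing in batches amortizes per-message overhead (e.g. one offset commit per batch instead of per message), at the cost of slightly higher latency for the last messages in a batch.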
To get familiar with Kafka, its role in streaming pipelines, and design concepts, check the beginner-friendly guide:
Pull requests welcome! Please:
- Keep code clean and modular
- Write meaningful commit messages
- Follow PEP 8 and type-hinting conventions
 
MIT License — feel free to use, fork, and contribute.