This is a simple example of how to use Faust to stream data from Kafka. Each message is a simple JSON object containing a timestamp and a random number; the data is generated by a Python script and sent to Kafka, and Faust is used to read it back. You will need:
- Python 3.6+
- Docker
- Docker Compose
- Clone the repository
- Run `docker-compose up -d` to start the Kafka cluster
- Run `pip install -r requirements.txt` to install the Python dependencies
- Run `python main.py` to start the producer
- Run `faust -A faust_stream worker -l info` to start the Faust worker
The producer will generate a random number every second and send it to Kafka. The Faust worker will read the data from Kafka and print it to the console.
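The producer script itself is not listed here. As a rough sketch of what it might build (the `make_message` helper and the field names are assumptions for illustration, not the repository's actual code), the JSON payload described above could look like this:

```python
import json
import random
import time

def make_message() -> str:
    # Hypothetical payload: a timestamp plus a random number,
    # matching the message shape described in the intro.
    payload = {
        "timestamp": time.time(),
        "number": random.randint(0, 100),
    }
    return json.dumps(payload)

# A real main.py would send one such message to Kafka every second
# via a Kafka producer client; here we just print a sample.
print(make_message())
```

The actual producer in the repository may name its fields differently; the point is simply that the value is a small JSON-encoded dict.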
Let's make a Kafka consumer with the help of the Faust module.
```python
import faust

# The app connects to Kafka (Faust defaults to kafka://localhost:9092).
app = faust.App('agents-demo')

# Topic carrying raw string greetings.
greetings_topic = app.topic('greetings', value_type=str, value_serializer='raw')

# Agent: an async stream processor that prints every greeting it receives.
@app.agent(greetings_topic)
async def greet(stream):
    async for greeting in stream:
        print(greeting)

if __name__ == '__main__':
    app.main()
```
Now that the code is complete, save it as `faust_stream.py` (the module name the `-A` flag refers to) and start the Faust worker:

```shell
faust -A faust_stream worker -l info
```
Now we will test the consumer by sending some data to the topic through the Docker CLI:

```shell
docker exec -it cli-tools kafka-console-producer --topic=greetings --bootstrap-server broker0:29092
```
Now let's make a producer with the help of the Faust module, added to the same file:

```python
# Producer: a timer that fires once per second and publishes to the topic.
@app.timer(interval=1.0)
async def send_greeting():
    await greetings_topic.send(value='Hello, World!')
```
Restart the worker with the same command:

```shell
faust -A faust_stream worker -l info
```

You should now see the "Hello, World!" message printed by the consumer every second.
Full source code: https://github.com/SohaibAnwaar/Kafka-Faust-Python
- Sohaib Anwaar: https://www.sohaibanwaar.com
- Email: sohaibanwaar36@gmail.com
- LinkedIn: Have Some Professional Talk here
- Stack Overflow: Get my help Here
- Kaggle: View my master-pieces here
- GitHub: View my code here