- Collects data from an HTTP server and sends it to a Kafka producer
- Accepts JSON POST requests at /event?name=[kafka_topic_name].
- The JSON data should match the corresponding Avro schema.
- Python-specific dependencies are listed in requirements.txt
- The installation file for Confluent Kafka with Schema Registry
- Fulfill all the above dependencies
- Add the Avro schema you want to send to the schema folder with the extension '.avsc.json'.
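As a sketch, a minimal Avro record schema that would accept a payload like {"msg": "good"} could look as follows (the record name and field are illustrative; adapt them to your actual data):

```json
{
  "type": "record",
  "name": "AccessLog",
  "fields": [
    {"name": "msg", "type": "string"}
  ]
}
```

Save it under the schema folder with the '.avsc.json' extension so the server can pick it up.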
- Create a new file run.env, copy the contents of run.env.sample into it, and set the desired environment variables.
- Run the Kafka instance.
- Run the HTTP server:
./run.sh
- Now, you can send a POST request to the /event endpoint with the query param [kafka_topic_name]. E.g.:
curl -v -H 'Content-Type:application/json' 'localhost:8000/event?name=acesslog' -d '{"msg":"good"}'
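The same request can be issued from Python using only the standard library. This sketch builds the request the server expects (JSON body, topic name as the `name` query param); the host and port are assumptions taken from the curl example above and should match your run.env settings:

```python
import json
import urllib.parse
import urllib.request

# Assumed endpoint; adjust host/port to match your environment.
BASE_URL = "http://localhost:8000/event"

def build_request(topic: str, payload: dict) -> urllib.request.Request:
    """Build the POST request: JSON body, Kafka topic as the 'name' query param."""
    url = f"{BASE_URL}?{urllib.parse.urlencode({'name': topic})}"
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("acesslog", {"msg": "good"})
print(req.full_url)
# urllib.request.urlopen(req)  # uncomment to actually send once the server is running
```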