This repo shows how to publish and consume data to and from Confluent Kafka in JSON format.
conda create -p venv python=3.8 -y
conda activate venv/
pip install -r requirements.txt
- Create cluster API keys
- Create a topic with the same name as the folder in the sample_data directory
- Create Schema Registry API keys
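Since the pipeline exchanges JSON payloads, a record from sample_data is serialized before being published to the topic. A minimal sketch of that round trip using only the standard library (the field names below are illustrative, not taken from the repo's sample files):

```python
import json

# Illustrative sensor record; the real fields come from the files in sample_data
record = {"sensor_id": "s-001", "temperature": 23.4, "unit": "C"}

# Serialize to a JSON string before producing it to the Kafka topic
payload = json.dumps(record)

# A consumer reverses the step after reading the message value
decoded = json.loads(payload)
```

The registered JSON schema lets Schema Registry reject payloads whose structure drifts from what consumers expect.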
- API_KEY
- API_SECRET_KEY
- BOOTSTRAP_SERVER
- SCHEMA_REGISTRY_API_KEY
- SCHEMA_REGISTRY_API_SECRET
- ENDPOINT_SCHEMA_URL
- MONGO_DB_URL (MongoDB database name: sensor)
Create a .env file in the root directory of your project if it does not already exist, paste the content below, and update the credentials:
API_KEY=<API_KEY>
API_SECRET_KEY=<API_SECRET_KEY>
BOOTSTRAP_SERVER=<BOOTSTRAP_SERVER>
SCHEMA_REGISTRY_API_KEY=<SCHEMA_REGISTRY_API_KEY>
SCHEMA_REGISTRY_API_SECRET=<SCHEMA_REGISTRY_API_SECRET>
ENDPOINT_SCHEMA_URL=<ENDPOINT_SCHEMA_URL>
MONGO_DB_URL=<MONGO_DB_URL>
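At runtime these variables are typically read from the environment and assembled into client configuration dictionaries. A hedged sketch using only the standard library (the config key names follow confluent-kafka/librdkafka conventions for a SASL_SSL Confluent Cloud cluster; the placeholder values exist only so the sketch runs standalone, and you should verify the exact keys against the repo's own code):

```python
import os

# Placeholder values so the sketch is self-contained;
# in the real project these come from the .env file
os.environ.setdefault("API_KEY", "demo-key")
os.environ.setdefault("API_SECRET_KEY", "demo-secret")
os.environ.setdefault("BOOTSTRAP_SERVER", "localhost:9092")
os.environ.setdefault("SCHEMA_REGISTRY_API_KEY", "sr-key")
os.environ.setdefault("SCHEMA_REGISTRY_API_SECRET", "sr-secret")
os.environ.setdefault("ENDPOINT_SCHEMA_URL", "http://localhost:8081")

# Kafka client settings (key names follow librdkafka's configuration scheme)
kafka_config = {
    "bootstrap.servers": os.environ["BOOTSTRAP_SERVER"],
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": os.environ["API_KEY"],
    "sasl.password": os.environ["API_SECRET_KEY"],
}

# Schema Registry settings, with basic auth as "key:secret"
schema_registry_config = {
    "url": os.environ["ENDPOINT_SCHEMA_URL"],
    "basic.auth.user.info": (
        f"{os.environ['SCHEMA_REGISTRY_API_KEY']}:"
        f"{os.environ['SCHEMA_REGISTRY_API_SECRET']}"
    ),
}
```

These dictionaries would then be passed to the producer/consumer and Schema Registry clients; loading the .env file itself is usually handled by a helper such as python-dotenv.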
docker build -t sensor-streaming-pipeline:latest .
docker run -it -v $(pwd)/logs:/logs --env-file=$(pwd)/.env sensor-streaming-pipeline:latest