A multi-service IoT data processing system built with Scala, Kafka, and Docker.
This project consists of several microservices:
- iot-simulator - Generates realistic IoT sensor events (temperature, humidity, pressure)
- iot-storage - Consumes IoT events and stores them in Kafka topics for analytics
- alert-handler - Processes alerts and sends notifications for critical conditions
- alert-selector - Filters specific alerts based on conditions (to be implemented)
- service-metrics - Collects data from Kafka and stores in MinIO S3 (to be implemented)
- service-analytics - Loads S3 data and exposes via Grafana (to be implemented)
- spark-analyzer - Advanced analytics with Spark (to be implemented)
Prerequisites:

- Docker and Docker Compose
- SBT (for local development)
- Start all services:

```shell
docker-compose up --build
```
Monitor the logs to see:
- IoT events being generated by the simulator
- Events being processed and stored by iot-storage
- Alerts being handled by alert-handler
Access Kafka UI at http://localhost:8080 to monitor topics and messages
The system creates and uses these Kafka topics:
- `iot-events` - Raw IoT sensor data from the simulator
- `iot-storage` - Processed events for storage
- `iot-analytics` - Enriched events for analytics
- `filtered-alerts` - Filtered alerts for notification
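Because several services produce to and consume from the same topics, the names above are worth centralizing. A minimal sketch (the `Topics` object is illustrative, not the project's actual code):

```scala
// Kafka topic names shared across the services, kept in one place so the
// simulator, storage, and alert services cannot drift out of sync.
object Topics {
  val IotEvents      = "iot-events"      // raw sensor data from the simulator
  val IotStorage     = "iot-storage"     // processed events for storage
  val IotAnalytics   = "iot-analytics"   // enriched events for analytics
  val FilteredAlerts = "filtered-alerts" // filtered alerts for notification

  // Full set, useful for creating all topics at startup.
  val all: Set[String] = Set(IotEvents, IotStorage, IotAnalytics, FilteredAlerts)
}
```

In a real setup this object would live in a shared module that each service depends on.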
Example IoT event:

```json
{
  "deviceId": "sensor-001",
  "timestamp": 1672531200000,
  "temperature": 23.5,
  "humidity": 45.2,
  "pressure": 1013.25,
  "location": {
    "latitude": 48.8566,
    "longitude": 2.3522
  },
  "status": "active"
}
```

Alerts are generated for:
- Temperature: > 40°C (WARNING), > 45°C (CRITICAL), < 0°C (WARNING), < -10°C (CRITICAL)
- Humidity: > 80% (WARNING), > 90% (CRITICAL), < 20% (WARNING)
- Pressure: < 980 hPa (WARNING), < 970 hPa (CRITICAL), > 1030 hPa (WARNING)
- Device Status: "critical" status generates CRITICAL alert
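The threshold rules above can be expressed as pure functions. A sketch of that logic, assuming illustrative names (`Severity`, `AlertRules`) rather than the project's actual API:

```scala
// Alert severities matching the WARNING/CRITICAL levels listed above.
sealed trait Severity
case object Warning  extends Severity
case object Critical extends Severity

object AlertRules {
  // Temperature: > 45 or < -10 is CRITICAL; > 40 or < 0 is WARNING.
  def temperatureAlert(t: Double): Option[Severity] =
    if (t > 45 || t < -10) Some(Critical)
    else if (t > 40 || t < 0) Some(Warning)
    else None

  // Humidity: > 90% is CRITICAL; > 80% or < 20% is WARNING.
  def humidityAlert(h: Double): Option[Severity] =
    if (h > 90) Some(Critical)
    else if (h > 80 || h < 20) Some(Warning)
    else None

  // Pressure (hPa): < 970 is CRITICAL; < 980 or > 1030 is WARNING.
  def pressureAlert(p: Double): Option[Severity] =
    if (p < 970) Some(Critical)
    else if (p < 980 || p > 1030) Some(Warning)
    else None

  // A "critical" device status always raises a CRITICAL alert.
  def statusAlert(status: String): Option[Severity] =
    if (status == "critical") Some(Critical) else None
}
```

Keeping the rules pure like this makes them easy to unit-test independently of Kafka.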
```shell
# Compile all services
sbt compile

# Build a specific service
cd iot-simulator
sbt compile

# Run a specific service locally (requires Kafka running)
sbt run

# Test a specific service
cd alert-handler
sbt test
```

Next steps:

- Implement the `alert-selector` service for advanced filtering
- Add MinIO S3 storage integration in `service-metrics`
- Create Grafana dashboards in `service-analytics`
- Add Spark analytics in `spark-analyzer`
- Add proper error handling and monitoring
- Implement authentication and security