AI-powered break optimization platform for factory workers. Detects fatigue in real time via computer vision and recommends optimal break schedules to reduce errors and workplace accidents.
```
Camera (ml/camera.py)
        │  POST /update
        ▼
FastAPI Backend (back/server.py)
  ├── GET  /telemetry
  ├── POST /update
  └── WS   /ws
        │
        ▼
Next.js Frontend (frontend/)
  ├── /              → Shift overview dashboard
  ├── /analysis      → Per-station fatigue analysis
  └── /current-shift → Live fatigue signal chart
```
Uses MediaPipe Face Landmarker at 30+ FPS to detect two fatigue signals:

- Eye closure — both eyes must be closed for 8 consecutive frames (`EYE_CLOSED_THRESHOLD = 0.012`)
- Yawning — mouth open for 20 consecutive frames (`MOUTH_OPEN_THRESHOLD = 0.04`)
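The consecutive-frame debouncing above can be sketched as follows. Only the thresholds and frame counts come from the description; the class, method names, and the idea of passing in precomputed landmark gaps are illustrative, not the actual `ml/camera.py` code:

```python
EYE_CLOSED_THRESHOLD = 0.012   # normalized eyelid gap below which the eye counts as closed
MOUTH_OPEN_THRESHOLD = 0.04    # normalized lip gap above which the mouth counts as open
EYE_CLOSED_FRAMES = 8          # consecutive frames before an eye-closure event fires
MOUTH_OPEN_FRAMES = 20         # consecutive frames before a yawn event fires

class FatigueDetector:
    """Debounces per-frame landmark signals into fatigue events with frame counters."""

    def __init__(self) -> None:
        self.eye_counter = 0
        self.mouth_counter = 0

    def process_frame(self, eye_gap: float, mouth_gap: float) -> list[str]:
        """eye_gap / mouth_gap are normalized landmark distances for one frame."""
        events = []
        # Eye closure: the gap must stay below the threshold frame after frame.
        self.eye_counter = self.eye_counter + 1 if eye_gap < EYE_CLOSED_THRESHOLD else 0
        if self.eye_counter == EYE_CLOSED_FRAMES:
            events.append("EYES CLOSED")
        # Yawning: the mouth must stay open past the threshold.
        self.mouth_counter = self.mouth_counter + 1 if mouth_gap > MOUTH_OPEN_THRESHOLD else 0
        if self.mouth_counter == MOUTH_OPEN_FRAMES:
            events.append("YAWNING")
        return events
```

Resetting the counter on any non-triggering frame is what makes a single blink or a brief mouth movement insufficient to fire an event.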
On a new fatigue event, the camera POSTs to the backend:
```json
{ "status": "UNFOCUSED (EYES CLOSED & YAWNING)", "fatigue_level": 90.0, "timestamp": "14:35:22" }
```

FastAPI backend with three endpoints:

| Method | Route | Description |
|---|---|---|
| GET | /telemetry | Current worker state |
| POST | /update | Receive fatigue event from camera; broadcast via WebSocket |
| WS | /ws | Real-time WebSocket channel for the frontend |
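A minimal sketch of the `POST /update` behavior described below: only `UNFOCUSED` events mutate the stored worker state, while every event is wrapped in the broadcast envelope sent over `/ws`. The names and the assumption that `accepted` mirrors the `UNFOCUSED` check are illustrative, not the actual `back/server.py` implementation:

```python
import time

# In-memory worker state served by GET /telemetry (illustrative).
worker_state: dict = {}

def handle_update(payload: dict) -> dict:
    """Accept a fatigue event and build the WebSocket broadcast message."""
    accepted = payload.get("status", "").startswith("UNFOCUSED")
    if accepted:
        # Only UNFOCUSED events update the internal state.
        worker_state.update(payload)
    return {
        "type": "webhook_update",
        "received_at": time.time(),
        "accepted": accepted,
        "payload": payload,
    }
```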
Only `UNFOCUSED` events update the internal state. Each WebSocket message has the shape:

```json
{ "type": "webhook_update", "received_at": 1710000000.123, "accepted": true, "payload": { ... } }
```

Next.js 16, React 19, TypeScript, Tailwind CSS, Plotly.js.
- `/` — Overview of all active shifts and stations. Expand any station to see its fatigue curve and AI-optimized break schedule.
- `/analysis` — Select a shift to view an interactive floor map and per-station AI diagnostics (risk summary, detection logic, recommendation, expected impact).
- `/current-shift` — Live Plotly line chart updated every second via WebSocket. Signal weights: yawn = 1, eyes closed = 2, both = 3.
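The signal weights for the `/current-shift` chart (yawn = 1, eyes closed = 2, both = 3) can be derived from the event status string. This helper is hypothetical; only the weights come from the route description above:

```python
def signal_weight(status: str) -> int:
    """Map a fatigue event status string to its chart weight."""
    eyes = "EYES CLOSED" in status   # weight 2
    yawn = "YAWNING" in status       # weight 1
    return 2 * eyes + yawn           # both -> 3, neither -> 0
```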