A demonstration of agentic AI + data analytics for retail demand forecasting and autonomous stock replenishment. When a sale event is submitted, specialized AI agents analyze sell-out velocity, stock coverage, and demand forecasts to produce an actionable replenishment recommendation.
```
┌─────────────┐      POST /events/sale     ┌──────────────────────┐
│   Next.js   │ ──────────────────────────►│   FastAPI Backend    │
│  Frontend   │◄──── SSE /runs/:id/stream ─│    (Orchestrator)    │
│    :3000    │                            │        :8000         │
└─────────────┘                            └──────────┬───────────┘
                                                      │
              ┌───────────────────────────────────────┼─────────────────────────────────────┐
              │                                       │                                     │
    ┌─────────▼─────────┐                  ┌──────────▼──────────┐                ┌─────────▼─────────┐
    │   Sellout Agent   │                  │     Stock Agent     │                │  Forecast Agent   │
    │   (Velocity &     │                  │     (Coverage &     │                │      (Demand      │
    │  Trend Analysis)  │                  │  Risk Assessment)   │                │    Prediction)    │
    └─────────┬─────────┘                  └──────────┬──────────┘                └─────────┬─────────┘
              │                                       │                                     │
    ┌─────────▼─────────┐                  ┌──────────▼──────────┐                ┌─────────▼─────────┐
    │  Eventhouse Mock  │                  │   Warehouse Mock    │                │  Lakehouse Mock   │
    │ (KQL/Sales Data)  │                  │  (Stock Position)   │                │   (ML Forecast)   │
    └───────────────────┘                  └─────────────────────┘                └───────────────────┘
                                                      │
                                           ┌──────────▼──────────┐
                                           │     PostgreSQL      │
                                           │     (Run State)     │
                                           └─────────────────────┘
```
- Docker Desktop (Docker Compose included)
```bash
# Linux/macOS
bash scripts/run-demo.sh

# Windows PowerShell
.\scripts\run-demo.ps1
```

This will:
- Start PostgreSQL, the API, and the Web frontend via Docker Compose
- Wait for all services to be healthy
- Run a happy-path test to verify everything works
| Service | URL |
|---|---|
| Frontend | http://localhost:3000 |
| API Docs | http://localhost:8000/docs |
| Health | http://localhost:8000/health |
Select a store, SKU, quantity, and scenario (normal / ruptura [stockout] / promoção [promotion]) in the UI.
The orchestrator runs three agents sequentially:
| Agent | Data Source | Output |
|---|---|---|
| Sellout | Eventhouse (KQL) | Sales velocity, trend, anomaly |
| Stock | Warehouse (SQL) | Coverage days, risk level |
| Forecast | Lakehouse (ML) | 4-week demand forecast, confidence |
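The sequential fan-out described above can be sketched as follows. This is an illustrative outline only: the function names, stub outputs, and `AgentResult` fields are assumptions, not the repo's actual orchestrator API (the real `AgentResult` contract lives in `/contracts/`).

```python
# Sketch of the sequential three-agent pipeline (illustrative stubs).
from dataclasses import dataclass


@dataclass
class AgentResult:
    agent: str
    outputs: dict


def analyze_sellout(event: dict) -> dict:
    # Stub standing in for the Eventhouse (KQL) query
    return {"velocity": 17.1, "trend": "up", "anomaly": False}


def analyze_stock(event: dict) -> dict:
    # Stub standing in for the Warehouse (SQL) lookup
    return {"coverageDays": 4.2, "risk": "high"}


def analyze_forecast(event: dict) -> dict:
    # Stub standing in for the Lakehouse (ML) forecast
    return {"horizonWeeks": 4, "confidence": 0.82}


def run_pipeline(event: dict) -> list[AgentResult]:
    """Run the three agents in order, each receiving the same sale event."""
    agents = [("sellout", analyze_sellout),
              ("stock", analyze_stock),
              ("forecast", analyze_forecast)]
    return [AgentResult(agent=name, outputs=fn(event)) for name, fn in agents]
```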
The orchestrator combines agent outputs to produce:
- Replenishment recommendation (action, quantity, source CD, urgency)
- Rationale (human-readable explanation)
- Simulated actions (stock updates, order creation, alerts)
- KPI impact (coverage days before/after, risk before/after)
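One plausible shape for that combination step is sketched below. The thresholds, field names, and quantity formula are assumptions for illustration; the real decision rules live in `packages/agents/`.

```python
# Illustrative decision-combination logic (not the repo's actual rules).
def decide(sellout: dict, stock: dict, forecast: dict) -> dict:
    """Fold the three agent outputs into one replenishment decision."""
    target_days = 14                         # assumed coverage target
    daily_demand = forecast["weeklyDemand"] / 7
    qty = max(0, round(daily_demand * target_days - stock["onHand"]))
    urgency = "high" if stock["coverageDays"] < 7 else "normal"
    return {
        "recommendation": {"action": "replenish" if qty else "hold",
                           "qty": qty, "urgency": urgency},
        "rationale": (f"Coverage is {stock['coverageDays']} days with a "
                      f"{sellout['trend']} sell-out trend."),
        "kpiImpact": {"coverageDaysBefore": stock["coverageDays"],
                      "coverageDaysAfter": (stock["onHand"] + qty) / daily_demand},
    }
```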
The UI shows a live timeline via SSE as each agent completes, with detailed outputs and metrics.
All requests (except `/health`) require:

```
Authorization: Bearer demo-token-2026
```
| Method | Path | Description |
|---|---|---|
| POST | /events/sale | Submit a sale event |
| GET | /runs/{traceId} | Get full pipeline run result |
| GET | /runs/{traceId}/stream | SSE stream of pipeline execution |
| GET | /data/stores | List all stores |
| GET | /data/skus | List all SKUs |
| GET | /data/stock/{cd}/{sku} | Check stock position |
| GET | /health | Service health check |
```bash
# Submit a sale event
curl -X POST http://localhost:8000/events/sale \
  -H "Authorization: Bearer demo-token-2026" \
  -H "Content-Type: application/json" \
  -d '{"storeId":"STORE-SP-001","skuId":"SKU-0007","qty":120,"scenario":"normal"}'
# Response: {"traceId":"...","eventId":"...","status":"accepted"}

# Get result
curl http://localhost:8000/runs/{traceId} \
  -H "Authorization: Bearer demo-token-2026"
```

```
/apps/web/             Next.js frontend (TypeScript + Tailwind)
/services/api/         FastAPI backend (Python)
/packages/agents/      Agent implementations (sellout, stock, forecast, orchestrator)
/packages/data_mocks/  Mock data connectors (Eventhouse, Warehouse, Lakehouse)
/contracts/            JSON Schema contracts
/scripts/              Demo scripts (run, happy-path, reset)
/docs/                 Architecture documentation
/tests/                Playwright e2e tests
```
All data contracts are defined in /contracts/ as JSON Schema:
- SaleEvent — Input sale event with store, SKU, quantity, timestamp, scenario
- AgentResult — Standardized agent output with inputs, outputs, citations, metrics
- OrchestratorDecision — Final decision with recommendation, rationale, actions, KPIs
See docs/contracts.md for detailed documentation.
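As a rough illustration of what the SaleEvent contract enforces, here is a stdlib-only check. The required fields and types are inferred from the example payload earlier in this README; the real service validates against the JSON Schema files in `/contracts/`, which may carry additional fields and constraints.

```python
# Minimal stdlib stand-in for SaleEvent validation (illustrative only).
def is_valid_sale_event(payload: dict) -> bool:
    """True if the payload has the fields/types the SaleEvent contract needs."""
    required = {"storeId": str, "skuId": str, "qty": int, "scenario": str}
    if not all(isinstance(payload.get(k), t) for k, t in required.items()):
        return False
    return payload["qty"] >= 1
```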
- All backend logs are structured (JSON) with `traceId` and `eventId` correlation
- Each agent reports `latencyMs` and `dataPointsProcessed` in metrics
- The SSE stream provides real-time status per agent
- Pipeline runs are persisted in PostgreSQL for post-hoc analysis
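The shape of one such log line can be sketched with the standard library (the service itself uses structlog; the exact field set here is an assumption beyond the `traceId`/`eventId`/`latencyMs` names listed above):

```python
# Sketch of one structured JSON log line with trace correlation fields.
import json
import time


def format_log(agent: str, trace_id: str, event_id: str, **metrics) -> str:
    """Render a single JSON log record for one agent step."""
    record = {"ts": time.time(), "agent": agent,
              "traceId": trace_id, "eventId": event_id, **metrics}
    return json.dumps(record)
```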
The system uses deterministic synthetic data (seeded with `Random(42)`), ensuring reproducible results:
- 3 stores (SP, RJ, BH)
- 20 SKUs across electronics, food, beverages, cleaning, hygiene
- 2 distribution centers
- 30 days of sales history per store/SKU
- 4-week demand forecasts
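The reproducibility property is just seeded pseudo-randomness: the same seed always yields the same series. The per-series seed derivation below is an assumption for illustration; the repo itself seeds with `Random(42)`.

```python
# Same seed -> identical synthetic sales history on every run.
import random


def generate_sales_history(store_id: str, sku_id: str, days: int = 30) -> list[int]:
    """30 days of deterministic synthetic unit sales for one store/SKU pair."""
    rng = random.Random(f"42:{store_id}:{sku_id}")  # assumed seed scheme
    return [rng.randint(0, 50) for _ in range(days)]
```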
If any agent fails, the system falls back to mock deterministic results.
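That fallback behavior amounts to a simple wrap-and-recover pattern, sketched here (the function name and fallback shape are assumptions, not the repo's API):

```python
# Fallback sketch: swallow any agent failure and substitute a deterministic
# mock result so the pipeline always completes.
def run_agent_safely(analyze, event: dict, fallback: dict) -> dict:
    """Run an agent callable; on any exception return a copy of `fallback`."""
    try:
        return analyze(event)
    except Exception:
        return dict(fallback)
```

For example, `run_agent_safely(lambda e: 1 / 0, {}, {"mock": True})` returns the mock result instead of raising.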
```bash
cd services/api
pip install -r requirements.txt
PYTHONPATH=../.. uvicorn app.main:app --reload --port 8000
```

```bash
cd apps/web
npm install
npm run dev
```

```bash
# Linux/macOS
bash scripts/reset.sh

# Windows
docker compose down -v
```

| Layer | Technology |
|---|---|
| Frontend | Next.js 15, React 19, Tailwind CSS v4 |
| Backend | Python 3.13, FastAPI, Pydantic |
| Database | PostgreSQL 16 |
| Streaming | Server-Sent Events (SSE) |
| Container | Docker Compose |
| Logging | structlog (structured JSON) |
MIT