Portfolio-grade backend project for monitoring websites and APIs. The service stores projects and monitors, executes recurring health checks, opens and resolves incidents, and sends outbound webhook notifications when a monitor changes state.
- Python
- Django + Django REST Framework
- PostgreSQL
- Redis
- Celery
- SQLite for lightweight local mode
- Docker Compose for full infra mode
- Authenticated REST API for project and monitor management
- Public status page endpoint for portfolio demos
- Async monitor execution with Celery and Redis
- Incident lifecycle management with failure thresholds
- Outbound webhook notifications with HMAC signature support
- Fast local startup without Docker using SQLite and in-process Celery execution
- Dockerized full environment with web, worker, beat, Postgres, and Redis
- `Project`: tenant-style container owned by a user
- `Monitor`: HTTP endpoint check with interval, timeout, expected status codes, keyword, and latency thresholds
- `MonitorCheck`: historical execution result for every run
- `Incident`: open/resolved outage record created after repeated failures
- `NotificationEndpoint`: webhook destination for incident events
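The open/resolve rules implied by `Incident` and `failure_threshold` can be sketched as a small state machine. This is an illustrative helper, not the project's actual code: the `IncidentTracker` name and `record` method are hypothetical, and it only models the counting logic (open after N consecutive failures, resolve on the next success).

```python
from dataclasses import dataclass, field

@dataclass
class IncidentTracker:
    """Sketch of the incident lifecycle; names here are hypothetical."""
    failure_threshold: int = 3
    consecutive_failures: int = 0
    incident_open: bool = False
    events: list = field(default_factory=list)

    def record(self, check_ok: bool) -> None:
        if check_ok:
            # Any success resets the streak and resolves an open incident.
            self.consecutive_failures = 0
            if self.incident_open:
                self.incident_open = False
                self.events.append("resolved")
        else:
            # An incident opens only after failure_threshold consecutive failures.
            self.consecutive_failures += 1
            if (not self.incident_open
                    and self.consecutive_failures >= self.failure_threshold):
                self.incident_open = True
                self.events.append("opened")

tracker = IncidentTracker(failure_threshold=3)
for ok in [False, False, False, True]:
    tracker.record(ok)
# Three failures open an incident; the following success resolves it.
```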
- `POST /api/auth/token/`: obtain auth token
- `GET /api/dashboard/summary/`: high-level metrics
- `GET|POST /api/projects/`
- `GET|POST /api/monitors/`
- `POST /api/monitors/{id}/run/`: queue an immediate check
- `GET /api/checks/`
- `GET /api/incidents/`
- `GET|POST /api/notification-endpoints/`
- `GET /`: service overview with useful links
- `GET /status/{project_slug}/`: public status snapshot
- `GET /health/`: liveness check
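A minimal client sketch against the endpoints above, using only the standard library. The base URL and token are placeholders, and the helpers (`auth_headers`, `create_monitor_request`) are hypothetical names, not part of the repository; the header format follows DRF's TokenAuthentication convention (`Token <key>`, not `Bearer <key>`).

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:8000"  # local dev server; adjust as needed

def auth_headers(token: str) -> dict:
    # DRF TokenAuthentication expects "Authorization: Token <key>"
    return {"Authorization": f"Token {token}", "Content-Type": "application/json"}

def create_monitor_request(token: str, payload: dict) -> urllib.request.Request:
    # Build (but do not send) a POST to /api/monitors/
    return urllib.request.Request(
        f"{BASE_URL}/api/monitors/",
        data=json.dumps(payload).encode(),
        headers=auth_headers(token),
        method="POST",
    )

req = create_monitor_request("demo-token", {"name": "Main API"})

# Sending it is one more line once the server is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```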
- Create a virtual environment: `python -m venv .venv`
- Activate it in PowerShell: `.\.venv\Scripts\Activate.ps1`
- Install dependencies: `pip install -r requirements.txt`
- If `.env` does not exist, copy it from `.env.example`: `Copy-Item .env.example .env`
- Run migrations: `python manage.py migrate`
- Create demo data: `python manage.py seed_demo`
- Start the API: `python manage.py runserver`
Local mode uses:
- SQLite instead of PostgreSQL
- an in-memory cache instead of Redis
- `CELERY_TASK_ALWAYS_EAGER=true`, so `POST /api/monitors/{id}/run/` executes immediately without a worker
- `python manage.py run_due_checks` for manual execution of scheduled checks during development
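The local-mode switches might look like this in Django settings. This is a sketch under assumptions: the `USE_DOCKER` flag and the exact structure are illustrative, not the project's actual settings file.

```python
import os

USE_DOCKER = os.environ.get("USE_DOCKER", "0") == "1"  # hypothetical flag

if not USE_DOCKER:
    # Lightweight local mode: no Postgres, no Redis, no separate worker.
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.sqlite3",
            "NAME": "db.sqlite3",
        }
    }
    CACHES = {
        "default": {"BACKEND": "django.core.cache.backends.locmem.LocMemCache"}
    }
    # Tasks run inline in the calling process, so POST /api/monitors/{id}/run/
    # executes the check immediately without a Celery worker.
    CELERY_TASK_ALWAYS_EAGER = True
```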
After startup:
- Root: `http://127.0.0.1:8000/`
- API: `http://127.0.0.1:8000/api/`
- Health: `http://127.0.0.1:8000/health/`
- Public status page: `http://127.0.0.1:8000/status/portfolio-monitoring/`
Use the token printed by `seed_demo` in the request header:

`Authorization: Token <token>`
- Copy `.env.docker.example` to `.env`.
- Start the stack: `docker compose up --build`
- Seed demo data inside the web container: `docker compose exec web python manage.py seed_demo`
This mode uses real PostgreSQL, Redis, Celery worker, and Celery beat.
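The five services listed above roughly correspond to a compose file like the following sketch. Service names, image tags, and the `-A config` app module are assumptions for illustration, not the repository's actual `docker-compose.yml`.

```yaml
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_DB: uptime        # illustrative credentials
      POSTGRES_USER: uptime
      POSTGRES_PASSWORD: uptime
  redis:
    image: redis:7
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    env_file: .env
    ports: ["8000:8000"]
    depends_on: [db, redis]
  worker:
    build: .
    command: celery -A config worker -l info   # "config" is a guessed module name
    env_file: .env
    depends_on: [db, redis]
  beat:
    build: .
    command: celery -A config beat -l info
    env_file: .env
    depends_on: [db, redis]
```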
Example payload for creating a monitor via `POST /api/monitors/`:

```json
{
  "project": 1,
  "name": "Main API",
  "monitor_type": "http",
  "method": "GET",
  "url": "https://api.example.com/health",
  "headers": {
    "Accept": "application/json"
  },
  "expected_status_codes": [200],
  "expected_keyword": "ok",
  "timeout_seconds": 10,
  "check_interval_seconds": 60,
  "failure_threshold": 3,
  "expected_response_time_ms": 1200,
  "is_active": true
}
```

- Celery beat runs `monitoring.run_due_checks` every minute.
- Due monitors are reserved before dispatch to reduce duplicate execution.
- `perform_monitor_check` stores the raw result, updates current monitor state, and opens or resolves incidents.
- Notification endpoints receive JSON payloads and an optional `X-Uptime-Signature` HMAC header.
- In non-Docker local mode, `.env` is loaded automatically by Django settings.
- In non-Docker local mode, immediate monitor runs work without a separate Celery process.
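A webhook receiver can verify the `X-Uptime-Signature` header along these lines. The exact signing scheme (hex-encoded HMAC-SHA256 over the raw request body with the endpoint's shared secret) is an assumption to illustrate the idea; check the project's webhook code for the real details.

```python
import hashlib
import hmac

def sign(secret: str, body: bytes) -> str:
    # Hex-encoded HMAC-SHA256 of the raw request body (assumed scheme).
    return hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()

def verify(secret: str, body: bytes, header_value: str) -> bool:
    # compare_digest does a constant-time comparison to resist timing attacks.
    return hmac.compare_digest(sign(secret, body), header_value)

# Example: sign an outbound payload, then verify it on the receiving side.
body = b'{"event": "incident_opened", "monitor": "Main API"}'
signature = sign("webhook-secret", body)
```

Verifying against the raw bytes (rather than re-serialized JSON) matters, since any whitespace or key-order difference would change the digest.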
The repository includes basic API tests for token issuance and the public status endpoint. After installing dependencies, run:

`python manage.py test`