A modern centralized logging system for Python apps using the latest observability stack:
- Python with structlog (structured logging)
- Loki (log aggregation & storage)
- Promtail (log shipper)
- Grafana (visualization & dashboards)
This tutorial will teach you:
- Structured Logging with Python structlog
- Log Aggregation using Loki
- Log Shipping with Promtail
- Real-time Visualization in Grafana
- Business Logic Simulation for testing
make up
# or
docker-compose up --build
python-logging-loki/
├── app/                          # Python application
│   ├── main.py                   # Entry point & request simulation
│   ├── business_logic.py         # Simulated business operations
│   └── log_config.py             # structlog configuration
├── config/                       # Service configurations
│   ├── loki-config.yml           # Loki configuration
│   ├── promtail-config.yml       # Promtail log collection rules
│   ├── grafana-datasources.yml   # Grafana data sources
│   └── grafana-dashboards.yml    # Dashboard provisioning
├── dashboards/                   # Grafana dashboards
│   └── structlog-dashboard.json
├── docker-compose.yml            # Orchestration
└── Dockerfile                    # Python app container
graph TD
A[Python App] -->|JSON Logs| B[Docker Logs]
B -->|Scrape| C[Promtail]
C -->|Ship| D[Loki]
D -->|Query| E[Grafana]
E -->|Dashboard| F[User]
graph TB
subgraph "Python Application Container"
A1[structlog Logger] --> A2[JSON Processor]
A2 --> A3[Mask Sensitive Data]
A3 --> A4[Normalize Fields]
A4 --> A5[stdout/stderr]
end
subgraph "Docker Engine"
A5 --> B1[Docker JSON Driver]
B1 --> B2[Log Files<br>/var/lib/docker/containers/...]
end
subgraph "Promtail Container"
B2 --> C1[File Discovery]
C1 --> C2[JSON Parser]
C2 --> C3[Label Extraction]
C3 --> C4[Stream Processing]
end
subgraph "Loki Container"
C4 --> D1[Log Ingestion API]
D1 --> D2[Index Creation]
D2 --> D3[Chunk Storage]
D3 --> D4[Query Engine]
end
subgraph "Grafana Container"
D4 --> E1[LogQL Queries]
E1 --> E2[Data Processing]
E2 --> E3[Dashboard Panels]
E3 --> E4[User Interface]
end
style A1 fill:#e1f5fe
style B2 fill:#f3e5f5
style C4 fill:#e8f5e8
style D3 fill:#fff3e0
style E4 fill:#fce4ec
Code Execution
  ↓
structlog Processors:
  • mask_sensitive_processor() → Hide passwords/tokens
  • normalize_high_cardinality() → Replace UUIDs with {uid}
  • TimeStamper() → Add ISO timestamp
  • JSONRenderer() → Convert to JSON
  ↓
Output to stdout/stderr
Example Log Output:
{
"xtime": "2024-01-15T10:30:45.123456",
"level": "info",
"msg": "User login successful",
"request_id": "req-123",
"user_id": 456,
"method": "POST",
"path": "/api/v1/auth/login",
"password": "***MASKED***"
}
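A minimal structlog setup along these lines could look like the sketch below. The processor names `mask_sensitive_processor` and `normalize_high_cardinality` mirror the pipeline above, but the actual implementation in `log_config.py` may differ; `EventRenamer` (structlog ≥ 22.1) is used here only to produce the `msg` key shown in the example output.

```python
import re
import sys
import structlog

SENSITIVE_KEYS = {"password", "token", "secret", "api_key"}
UUID_RE = re.compile(r"[0-9a-f]{8}-(?:[0-9a-f]{4}-){3}[0-9a-f]{12}")

def mask_sensitive_processor(logger, method_name, event_dict):
    """Replace values of sensitive keys so credentials never reach the logs."""
    for key in SENSITIVE_KEYS & event_dict.keys():
        event_dict[key] = "***MASKED***"
    return event_dict

def normalize_high_cardinality(logger, method_name, event_dict):
    """Swap embedded UUIDs for a {uid} placeholder to keep values low-cardinality."""
    for key, value in event_dict.items():
        if isinstance(value, str):
            event_dict[key] = UUID_RE.sub("{uid}", value)
    return event_dict

structlog.configure(
    processors=[
        structlog.processors.add_log_level,                        # adds "level"
        structlog.processors.TimeStamper(fmt="iso", key="xtime"),  # adds "xtime"
        mask_sensitive_processor,
        normalize_high_cardinality,
        structlog.processors.EventRenamer("msg"),                  # "event" -> "msg"
        structlog.processors.JSONRenderer(),                       # one JSON object per line
    ],
    logger_factory=structlog.PrintLoggerFactory(sys.stdout),
)

logger = structlog.get_logger()
logger.info("User login successful", request_id="req-123", user_id=456,
            method="POST", path="/api/v1/auth/login", password="hunter2")
```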
Docker Container
  ↓
JSON File Driver
  ↓
File Storage: /var/lib/docker/containers/{container_id}/{container_id}-json.log
Docker Log Format:
{
"log": "{\"xtime\":\"2024-01-15T10:30:45.123456\",\"level\":\"info\"...}\n",
"stream": "stdout",
"time": "2024-01-15T10:30:45.123456789Z"
}
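For illustration, here is a small Python sketch that unwraps this nested format the same way the pipeline does, reading Docker's json-file output and decoding the inner structlog event. The path keeps the `{container_id}` placeholder from above and is not a real file:

```python
import json
from pathlib import Path

# Illustrative path only; substitute a real 64-character container ID.
log_file = Path("/var/lib/docker/containers/{container_id}/{container_id}-json.log")

for raw_line in log_file.read_text().splitlines():
    docker_entry = json.loads(raw_line)           # outer: {"log": ..., "stream": ..., "time": ...}
    app_event = json.loads(docker_entry["log"])   # inner: the structlog JSON written to stdout
    print(docker_entry["time"], app_event["level"], app_event["msg"])
```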
Docker Service Discovery
  ↓
File Monitoring (/var/lib/docker/containers/**/*.log)
  ↓
Label Extraction from Docker containers:
  • container_name
  • logging_jobname (from labels)
  • logging="promtail" (filter)
  ↓
JSON Parsing & Stream Processing
  ↓
HTTP Push to Loki API
Promtail Processing:
- Discovery: Auto-detect containers with the logging: "promtail" label
- Parsing: Extract JSON from Docker's nested format
- Labeling: Add metadata (job, container, etc.)
- Streaming: Real-time push to Loki
HTTP API Ingestion (/loki/api/v1/push)
  ↓
Index Creation (based on labels):
  • job="jobname-auth-service"
  • container_name="auth-service"
  • level="info"
  ↓
Chunk Creation (grouped by time + labels)
  ↓
Storage (local filesystem or cloud)
Loki Storage Structure:
chunks/
└── fake/
    └── {chunk-id}/
        ├── {time-range}-{hash}.gz   # Compressed log data
        └── index                    # Label index
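Promtail normally performs the push for you, but the ingestion endpoint is plain HTTP, so a payload shaped like the one Promtail builds can also be sent directly, which is handy for smoke-testing Loki. A rough sketch, assuming Loki is reachable on localhost:3100 and the `requests` package is installed:

```python
import json
import time

import requests

line = json.dumps({"xtime": "2024-01-15T10:30:45.123456", "level": "info",
                   "msg": "User login successful", "request_id": "req-123"})

payload = {
    "streams": [{
        "stream": {                       # label set -> becomes the Loki index
            "job": "jobname-auth-service",
            "container_name": "auth-service",
            "level": "info",
        },
        "values": [
            [str(time.time_ns()), line],  # [nanosecond timestamp as string, raw log line]
        ],
    }]
}

resp = requests.post("http://localhost:3100/loki/api/v1/push", json=payload, timeout=5)
resp.raise_for_status()                   # Loki answers 204 No Content on success
```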
LogQL Query:
{job="jobname-auth-service"} |= "login" | json | level="info"
  ↓
Query Engine Processing
  ↓
Panel Rendering:
  • Time series graphs
  • Log tables
  • Stat panels
  ↓
Dashboard Display
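Grafana runs these LogQL queries against Loki's HTTP API, and you can do the same from Python to check what a panel will see. A quick sketch, again assuming Loki on localhost:3100 and the `requests` package:

```python
import requests

params = {
    "query": '{job="jobname-auth-service"} |= "login" | json | level="info"',
    "limit": 20,          # most recent 20 matching lines in the default time range
}
resp = requests.get("http://localhost:3100/loki/api/v1/query_range",
                    params=params, timeout=10)
resp.raise_for_status()

for stream in resp.json()["data"]["result"]:
    print(stream["stream"])               # the label set identifying this stream
    for ts, line in stream["values"]:     # [nanosecond timestamp, raw log line]
        print(ts, line)
```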
1. Python structlog:
logger.info("Order processed",
order_id="order-550e8400-e29b-41d4-a716-446655440000",
user_id=123,
amount=99.99,
payment_method="credit_card")
2. After structlog processing:
{
"xtime": "2024-01-15T10:30:45.123456",
"level": "info",
"msg": "Order processed",
"order_id": "order-{uid}", // β Normalized!
"user_id": 123,
"amount": 99.99,
"payment_method": "credit_card"
}
3. Docker wrapping:
{
"log": "{\"xtime\":\"2024-01-15T10:30:45.123456\",\"level\":\"info\"...}\n",
"stream": "stdout",
"time": "2024-01-15T10:30:45.123456789Z"
}
4. Promtail adds labels:
{
"streams": [{
"stream": {
"job": "jobname-auth-service",
"container_name": "auth-service",
"level": "info"
},
"values": [["1705315845123456000", "{\"xtime\":\"2024-01-15T10:30:45.123456\"...}"]]
}]
}
5. Grafana LogQL query:
{job="jobname-auth-service"}
|= "Order processed"
| json
| amount > 50
Throughput Capacity:
┌─────────────┬─────────────┬────────────┐
│ Component   │ Logs/Second │ Bottleneck │
├─────────────┼─────────────┼────────────┤
│ structlog   │ 10,000+     │ CPU        │
│ Docker      │ 5,000+      │ Disk I/O   │
│ Promtail    │ 3,000+      │ Network    │
│ Loki        │ 2,000+      │ Storage    │
│ Grafana     │ 1,000+      │ UI Render  │
└─────────────┴─────────────┴────────────┘
- Python App → Generates structured JSON logs using structlog
- Promtail → Reads logs from Docker containers in real time
- Loki → Stores and indexes logs for fast querying
- Grafana → Displays logs in interactive dashboards
- Docker & Docker Compose installed
- Port 3000 (Grafana) available
# Clone this repository
cd python-logging-loki
# Start all services
docker-compose up --build
- Grafana: http://localhost:3000
- Dashboard: Pre-provisioned and ready to use
- Loki: http://localhost:3100 (API)
- Promtail: http://localhost:9080 (metrics)
The dashboard instantly shows:
- Request logs with response times
- Error tracking and alerts
- Business operation metrics
- User activity patterns
This app simulates real-world business scenarios:
# User registration with validation
simulate_user_registration(user_data)
# Authentication with security logging
simulate_authentication(username, password)
# Order processing with inventory & payment
simulate_order_processing(order_data)
# File upload with virus scanning
simulate_file_upload(filename, file_size)
# Data analytics with performance monitoring
simulate_data_analytics(query_type)
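The real implementations live in app/business_logic.py; as a rough idea of the pattern, one of these operations might look like the sketch below (the random latency and failure rate are illustrative only):

```python
import random
import time
import uuid

import structlog

logger = structlog.get_logger()

def simulate_authentication(username: str, password: str) -> bool:
    """Pretend to authenticate a user and emit structured security logs."""
    log = logger.bind(request_id=f"req-{uuid.uuid4()}", username=username)

    start = time.perf_counter()
    time.sleep(random.uniform(0.01, 0.1))          # fake backend latency
    success = random.random() > 0.2                # ~80% of attempts succeed
    duration_ms = round((time.perf_counter() - start) * 1000, 2)

    if success:
        # The masking processor redacts the password before it is rendered.
        log.info("User login successful", duration_ms=duration_ms, password=password)
    else:
        log.warning("User login failed", duration_ms=duration_ms, reason="invalid_credentials")
    return success
```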
- INFO: Successful operations
- WARNING: Business logic warnings
- ERROR: System/business errors
- DEBUG: Development details
- structlog: Structured JSON logging
- Rotating logs: Auto cleanup (10MB files)
- Sensitive data masking: Password/token masking
- Request tracing: UUID-based request tracking
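For the rotating-file part, one common approach is to hand structlog's rendered output to the stdlib `logging` module and let a `RotatingFileHandler` cap files at 10 MB. The sketch below shows that wiring; the file name and backup count are illustrative, and the actual log_config.py may be set up differently:

```python
import logging
from logging.handlers import RotatingFileHandler

import structlog

# Stdlib handler that rolls the file over at 10 MB, keeping 5 backups.
handler = RotatingFileHandler("app.log", maxBytes=10 * 1024 * 1024, backupCount=5)
logging.basicConfig(level=logging.INFO, handlers=[handler], format="%(message)s")

structlog.configure(
    processors=[
        structlog.processors.TimeStamper(fmt="iso", key="xtime"),
        structlog.processors.JSONRenderer(),
    ],
    wrapper_class=structlog.stdlib.BoundLogger,
    logger_factory=structlog.stdlib.LoggerFactory(),   # route events through stdlib logging
)

structlog.get_logger("app").info("rotation-backed logger ready")
```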
# Key features:
- Docker log discovery
- JSON parsing
- Label extraction
- Health check filtering
- Timestamp parsing
- Retention: 30 days by default
- Indexing: Optimized for JSON logs
- Performance: Great for development
- Auto-provisioning: Data sources & dashboards
- Anonymous access: No login required
- Custom dashboard: Pre-built for structlog
- Loki Documentation
- Promtail Configuration
- Grafana Dashboards
- structlog Guide
- Docker Compose
- Logging with Docker, Promtail, and Grafana Loki
- Docker SD Configs on Promtail
- YouTube: 6 Easy Ways to Improve your Log Dashboards with Grafana and Loki
- Blog: Setup Grafana and Loki