Forget LeetCode; let's build systems. This project is a full-stack, real-time geospatial visualization engine designed to handle high-concurrency driver pings and transform them into a live demand heatmap.
The goal? Create "hype" and operational visibility by showing exactly where the action is happening across the city in real time.
The Challenge: How do you process thousands of GPS pings per second from moving drivers and update a web map without crashing the database or lagging the UI?
The Solution: Instead of a traditional "Save to DB -> Poll DB" approach, I built a reactive stream:
- Decoupling with Kafka: Driver pings are offloaded immediately to a message broker. The API stays fast because it doesn't wait for processing.
- Geospatial Aggregation: Used Geohashes to group coordinates into "buckets" in Redis. This reduces the data sent to the frontend from "thousands of points" to "density counts per area."
- Push, Don't Pull: Used WebSockets (Django Channels/Daphne) to push updates to the map only when data changes.
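The geohash bucketing step needs no library at all. A minimal pure-Python encoder, for illustration (binary-search the lat/lng ranges, interleave the bits longitude-first, map 5-bit groups to base32; 6 characters is roughly a city-block-sized cell):

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash_encode(lat: float, lng: float, precision: int = 6) -> str:
    """Encode a coordinate into a geohash string of the given length."""
    lat_lo, lat_hi = -90.0, 90.0
    lng_lo, lng_hi = -180.0, 180.0
    bits, even = [], True
    while len(bits) < precision * 5:
        if even:  # even bit positions refine longitude
            mid = (lng_lo + lng_hi) / 2
            bits.append(1 if lng >= mid else 0)
            if lng >= mid: lng_lo = mid
            else:          lng_hi = mid
        else:     # odd bit positions refine latitude
            mid = (lat_lo + lat_hi) / 2
            bits.append(1 if lat >= mid else 0)
            if lat >= mid: lat_lo = mid
            else:          lat_hi = mid
        even = not even
    # Pack each run of 5 bits into one base32 character.
    return "".join(
        BASE32[int("".join(map(str, bits[i:i + 5])), 2)]
        for i in range(0, precision * 5, 5)
    )
```

In the pipeline, each ping would then bump a counter keyed by its hash (e.g. `INCR heat:<hash>` in Redis; the key naming is an assumption), so the frontend receives a few hundred `(hash, count)` pairs instead of thousands of raw points.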
- Ingestion: Django REST Framework — Validates and pushes pings to Kafka.
- Message Broker: Apache Kafka — Buffers incoming location data.
- State Store: Redis — Stores aggregated geohash counts for sub-millisecond lookups.
- The Bridge: Django Channels (ASGI) — Manages persistent WebSocket connections.
- Visualization: React + Leaflet.heat — Decodes hashes and renders the "glow."
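The ingestion step boils down to "validate fast, serialize, hand off." A sketch of the validation/serialization half (the field names `driver_id`/`lat`/`lng` and the `driver-pings` topic are assumptions, not the project's actual schema):

```python
import json
import time

REQUIRED = ("driver_id", "lat", "lng")

def validate_ping(payload: dict) -> bytes:
    """Validate a driver ping and serialize it for the Kafka topic."""
    for field in REQUIRED:
        if field not in payload:
            raise ValueError(f"missing field: {field}")
    lat, lng = float(payload["lat"]), float(payload["lng"])
    if not (-90.0 <= lat <= 90.0 and -180.0 <= lng <= 180.0):
        raise ValueError("coordinates out of range")
    record = {
        "driver_id": str(payload["driver_id"]),
        "lat": lat,
        "lng": lng,
        "ts": payload.get("ts", time.time()),  # default to server time
    }
    return json.dumps(record).encode("utf-8")

# Inside the DRF view, the bytes would be handed straight to a producer, e.g.:
# producer.send("driver-pings", validate_ping(request.data))
```

The API returns 202 as soon as the message is queued; all heavy lifting happens downstream of the broker.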
The HeatMapConsumer handles the WebSocket lifecycle. It joins a broadcast group (live_heatmap) so that a single worker can update thousands of connected clients simultaneously.
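A sketch of that consumer with Django Channels (the `HeatMapConsumer` name and `live_heatmap` group come from the design above; the payload shape is an assumption):

```python
from channels.generic.websocket import AsyncJsonWebsocketConsumer

class HeatMapConsumer(AsyncJsonWebsocketConsumer):
    GROUP = "live_heatmap"

    async def connect(self):
        # Join the shared broadcast group, then complete the handshake.
        await self.channel_layer.group_add(self.GROUP, self.channel_name)
        await self.accept()

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard(self.GROUP, self.channel_name)

    async def heatmap_update(self, event):
        # Invoked for every {"type": "heatmap.update", ...} sent to the group.
        await self.send_json({"cells": event["cells"]})

# A worker broadcasts to every connected client with one call, e.g.:
# async_to_sync(get_channel_layer().group_send)(
#     "live_heatmap", {"type": "heatmap.update", "cells": {"tdr1y7": 42}})
```

Because the group fan-out lives in the channel layer, the aggregation worker never knows (or cares) how many sockets are open.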
To keep the payload light, the backend sends geohash strings. The frontend uses a Custom Hook and ngeohash to decode these back into Lat/Lng coordinates on the fly.
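The decode ngeohash performs client-side is just the encoding run in reverse: walk the base32 characters, split the lat/lng ranges bit by bit, and return the cell's center. In Python, for illustration:

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash_decode(gh: str) -> tuple[float, float]:
    """Return the (lat, lng) center of a geohash cell."""
    lat_lo, lat_hi = -90.0, 90.0
    lng_lo, lng_hi = -180.0, 180.0
    even = True  # the first bit refines longitude
    for ch in gh:
        n = BASE32.index(ch)
        for shift in range(4, -1, -1):  # unpack 5 bits per character
            bit = (n >> shift) & 1
            if even:
                mid = (lng_lo + lng_hi) / 2
                if bit: lng_lo = mid
                else:   lng_hi = mid
            else:
                mid = (lat_lo + lat_hi) / 2
                if bit: lat_lo = mid
                else:   lat_hi = mid
            even = not even
    return ((lat_lo + lat_hi) / 2, (lng_lo + lng_hi) / 2)
```

Each decoded center becomes one weighted point for Leaflet.heat, so the wire format stays a flat list of short strings plus counts.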
Configured Daphne to handle the ASGI protocol, allowing the server to manage long-lived WebSocket connections while standard HTTP requests still flow through the REST API.
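The split happens in the ASGI entry point: `ProtocolTypeRouter` hands HTTP to plain Django and WebSockets to Channels. A minimal `asgi.py` sketch (the `config.settings` module path and `ws/heatmap/` route are hypothetical):

```python
# asgi.py — routes HTTP to Django, WebSockets to Channels.
import os
from django.core.asgi import get_asgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings")
django_asgi = get_asgi_application()  # must run before importing consumers

from channels.routing import ProtocolTypeRouter, URLRouter
from django.urls import path
from heatmap.consumers import HeatMapConsumer  # hypothetical app path

application = ProtocolTypeRouter({
    "http": django_asgi,
    "websocket": URLRouter([
        path("ws/heatmap/", HeatMapConsumer.as_asgi()),
    ]),
})
```

Daphne then serves both protocols from one process, e.g. `daphne config.asgi:application` (module path again assumed).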
- Python 3.12+
- Node.js & NPM
- Redis (Running on localhost:6379)
- Kafka (Running on localhost:9092)
cd backend
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
python manage.py runserver
cd frontend
npm install
npm run dev
- Dynamic Precision: Adjust geohash length based on map zoom level.
- Historical Replay: Use PostGIS to store and "replay" peak hours.
- Auth: Secure the WebSocket connection with JWT.
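For the dynamic-precision idea, the mapping from map zoom to geohash length can start as a one-line heuristic (the exact breakpoints here are a guess to tune against the map, not a derived formula):

```python
def precision_for_zoom(zoom: int) -> int:
    """Map a Leaflet zoom level (0-18) to a geohash length (1-9).

    Zoomed out -> short hashes (huge cells, few buckets);
    zoomed in  -> long hashes (street-level cells).
    """
    return max(1, min(9, zoom // 2 + 1))
```

The client would send its zoom level on connect, and the aggregator would bucket at the matching precision.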
Built during a 6-hour flow state (8PM - 2AM). Far more satisfying than solving TwoSum.
