During disasters, information arrives too late, from too many places, and without coordination. Satellite images sit unused, tweets go unread, sensors raise alerts in isolation, and citizens file reports that take hours to validate.
We asked a simple question:
What if AI could continuously listen, reason, validate, and act, all in real time?
PulseAI was built to answer that question.
Our inspiration was to create a living, streaming intelligence system that:
- Understands disasters as they unfold
- Correlates citizen reports, satellite imagery, sensors, and social media
- Automatically decides when deeper analysis is required
- Triggers the right response workflows without human delay
PulseAI is an AI-native disaster response platform built on:
- Flink for real-time streaming intelligence
- ADK (Agent Development Kit) for multi-agent reasoning
- MCP (Model Context Protocol) for secure, tool-driven execution
- Google Cloud (BigQuery, GCS, Cloud Run) for scalable storage and compute
At a high level, PulseAI performs four continuous intelligence loops:
- Citizen Case Analysis
- Satellite Damage Detection
- Sensor Anomaly Detection
- Tweet Classification & Verification
All of these flows ensure:
- Deterministic execution
- No duplicated actions
- Fully auditable AI decisions
PulseAI is event-driven:
- Every incoming signal (tweet, sensor alert, image upload, citizen case) enters Kafka
- Flink Compute Pools process these streams
- AI Agents reason over the data
- MCP Tools execute real-world actions (queries, jobs, writes)
- Results are persisted back to Kafka and BigQuery
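The event-driven loop above can be sketched with an in-memory stand-in for Kafka and the streaming agents. All names here (`handle_tweet`, `handle_sensor`, the topic names) are illustrative, not PulseAI's actual identifiers:

```python
# Minimal in-memory sketch of PulseAI's event-driven loop.
# A dict of lists stands in for Kafka topics, and plain functions
# stand in for the Flink streaming agents. Results collect in a dict
# that stands in for BigQuery / downstream topics.

from collections import defaultdict

topics = defaultdict(list)    # topic name -> pending events
results = defaultdict(list)   # stand-in for BigQuery / output topics

def publish(topic, event):
    topics[topic].append(event)

def handle_tweet(event):
    return {"kind": "tweet_alert", "text": event["text"]}

def handle_sensor(event):
    return {"kind": "sensor_alert", "reading": event["value"]}

HANDLERS = {"tweets": handle_tweet, "sensors": handle_sensor}

def drain():
    """Process every pending event once, persisting agent output."""
    for topic, handler in HANDLERS.items():
        while topics[topic]:
            event = topics[topic].pop(0)
            results["assessments"].append(handler(event))

publish("tweets", {"text": "flooding downtown"})
publish("sensors", {"value": 9.7})
drain()
```

In the real system each step is decoupled: Kafka buffers the signals, Flink owns the consume loop, and the agents only ever see already-ingested events.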
- Citizen submits a disaster case
- The case is summarized and pushed into a Kafka topic
- The Flink AI Streaming Agent consumes the case and:
  - Queries tweet alerts via the BigQuery MCP
  - Queries sensor alerts via the BigQuery MCP
  - Forecasts required emergency resources
  - Fetches available nearby resources
  - Runs a custom resource-allocation UDF
- The agent produces:
  - A complete disaster assessment
  - A resource allocation plan
- Results are:
  - Written to BigQuery
  - Published back to Kafka for dashboards and responders
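The case-analysis steps can be sketched end to end. The MCP/BigQuery calls are replaced by stub functions, and the greedy allocation UDF is an assumption for illustration, not PulseAI's actual algorithm:

```python
# Sketch of the citizen-case analysis loop. Stubs replace the
# BigQuery MCP calls; the allocation UDF is a simple greedy grant.

def query_tweet_alerts(region):      # stand-in for BigQuery MCP call
    return ["bridge collapse reported"]

def query_sensor_alerts(region):     # stand-in for BigQuery MCP call
    return ["water level critical"]

def forecast_resources(case, tweets, sensors):
    # Hypothetical heuristic: more corroborating alerts -> more units.
    return {"ambulances": 1 + len(tweets), "boats": len(sensors)}

def fetch_nearby_resources(region):  # stand-in for the resource lookup
    return {"ambulances": 5, "boats": 1}

def allocate(needed, available):
    """Illustrative allocation UDF: grant min(needed, available)."""
    return {r: min(n, available.get(r, 0)) for r, n in needed.items()}

def analyze_case(case):
    tweets = query_tweet_alerts(case["region"])
    sensors = query_sensor_alerts(case["region"])
    needed = forecast_resources(case, tweets, sensors)
    plan = allocate(needed, fetch_nearby_resources(case["region"]))
    return {"assessment": {"case": case["id"], "tweets": tweets,
                           "sensors": sensors},
            "allocation_plan": plan}

result = analyze_case({"id": "case-1", "region": "riverside"})
print(result["allocation_plan"])   # {'ambulances': 2, 'boats': 1}
```

The same shape holds in production: every tool call is a side-effect-free query until the final allocation plan is written back.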
- A satellite image is uploaded to Google Cloud Storage
- A file-creation event is published to Pub/Sub
- Kafka ingests the event into Flink
- The AI Streaming Agent:
  - Fetches pre-disaster & post-disaster image pairs
  - Queries contextual alerts (tweets + sensors)
- The agent evaluates whether damage analysis is required
- Only if required, the agent:
  - Calls a custom MCP Server
  - Triggers the Satellite Damage Detection ADK Agent
- The damage agent:
  - Submits a Cloud Run Job
  - Performs image analysis
  - Summarizes damage severity
- Results are persisted into BigQuery

The flow is fire-and-forget:
- No blocking
- No retries
- No duplicate job triggers
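The gate-then-trigger logic can be sketched as follows. The decision policy, threshold, and names are assumptions for illustration; only the "analyze when required, never trigger twice" shape comes from the flow above:

```python
# Sketch of the satellite-flow gate: run deep analysis only when
# context warrants it, and never trigger the same job twice.

triggered_jobs = set()   # idempotency record: "no duplicate job triggers"
job_log = []

def damage_analysis_required(tweet_alerts, sensor_alerts):
    # Assumed policy: deep analysis only if a corroborating alert exists.
    return bool(tweet_alerts or sensor_alerts)

def trigger_damage_job(image_uri):
    """Fire-and-forget: record the submission and return immediately."""
    if image_uri in triggered_jobs:
        return False                # duplicate event; job already running
    triggered_jobs.add(image_uri)
    job_log.append(image_uri)       # stand-in for the Cloud Run Job call
    return True

def on_image_event(image_uri, tweet_alerts, sensor_alerts):
    if not damage_analysis_required(tweet_alerts, sensor_alerts):
        return "skipped"
    return "triggered" if trigger_damage_job(image_uri) else "duplicate"

r1 = on_image_event("gs://pulse/post.tif", ["flood"], [])   # triggered
r2 = on_image_event("gs://pulse/post.tif", ["flood"], [])   # duplicate
r3 = on_image_event("gs://pulse/calm.tif", [], [])          # skipped
```

Because the trigger only records a submission and returns, the streaming agent never blocks on the Cloud Run Job's completion.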
-
- Live sensor data enters Kafka
- Flink runs `ML_DETECT_ANOMALIES` on the stream
- If an anomaly is detected, an emergency alert is generated
- If not, the event is ignored
This ensures early warning detection without overwhelming responders.
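`ML_DETECT_ANOMALIES` runs inside Flink SQL; as a rough stand-in for the gate it implements, here is a rolling z-score check in plain Python. The window size, threshold, and warm-up length are illustrative, not PulseAI's tuned values:

```python
# Stand-in for the anomaly gate ML_DETECT_ANOMALIES provides in Flink:
# flag a reading whose z-score against the trailing window exceeds a
# threshold, otherwise ignore it.

from collections import deque
from statistics import mean, pstdev

class AnomalyGate:
    def __init__(self, window=20, threshold=3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def check(self, reading):
        alert = False
        if len(self.history) >= 5:          # need a minimal baseline
            mu, sigma = mean(self.history), pstdev(self.history)
            if sigma > 0 and abs(reading - mu) / sigma > self.threshold:
                alert = True                # -> emergency alert
        self.history.append(reading)
        return alert

gate = AnomalyGate()
readings = [1.0, 1.1, 0.9, 1.0, 1.1, 1.0, 0.9, 1.0, 25.0]
alerts = [r for r in readings if gate.check(r)]
print(alerts)   # -> [25.0]: only the spike generates an alert
```

Normal readings pass through silently, which is exactly what keeps responders from being flooded with noise.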
- Tweets stream into Kafka
- The Flink AI Agent classifies tweet criticality
- If critical:
  - The tweet is summarized
  - Added to emergency alerts
- Otherwise, the tweet is ignored
This prevents misinformation from polluting response decisions.
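The classify-then-gate pattern can be sketched like this. The keyword classifier is a placeholder for PulseAI's LLM-based criticality check, and the summarizer is a trivial truncation stand-in:

```python
# Sketch of the tweet classification gate: only tweets judged
# critical are summarized and forwarded; everything else is dropped.

CRITICAL_TERMS = {"trapped", "collapse", "flooding", "fire", "injured"}

def classify(tweet):
    """Return 'critical' or 'ignore' (LLM call stubbed with keywords)."""
    words = set(tweet.lower().split())
    return "critical" if words & CRITICAL_TERMS else "ignore"

def summarize(tweet, limit=60):
    return tweet[:limit]            # stand-in for an LLM summary

def process_tweet(tweet, emergency_alerts):
    if classify(tweet) == "critical":
        emergency_alerts.append(summarize(tweet))
        return True
    return False          # non-critical tweets never reach responders

alerts = []
process_tweet("Family trapped on roof near 5th street", alerts)
process_tweet("Lovely sunset today", alerts)
```

The gate is the important part: summarization and alerting only spend effort on tweets that already passed classification.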
Role: Central coordinator
- Routes requests across all domain agents
- Prevents duplicate actions
- Maintains session-level reasoning
Role: Deep image analysis
- Receives image URLs
- Triggers Cloud Run job
- Performs damage detection
- Produces summarized results
Role: Converts raw inference into insights
- Fetches inference data
- Produces human-readable summaries
- Stores results in BigQuery
Role: Data ingestion & enrichment
- Collects case metadata
- Uploads assets to GCS
- Prepares inputs for downstream agents
Role: Citizen case normalization
- Converts raw citizen input into structured intelligence
Role: Human presence detection
- Runs detection models for trapped survivors
- Feeds results into rescue prioritization
| Layer | Technology |
|---|---|
| Streaming | Flink |
| Messaging | Confluent Kafka |
| AI Agents | ADK |
| Tooling | MCP |
| Storage | BigQuery, GCS |
| Compute | Cloud Run Jobs |
| ML | BigQuery ML, Flink ML |
| Language | Python |
- Preventing duplicate AI tool execution
- Making LLMs deterministic inside streams
- Orchestrating ML + UDF + Agents together
- Designing fire-and-forget MCP tools
- Ensuring low-latency disaster decisions
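One common way to address the duplicate-execution and determinism challenges is an idempotency key derived from the tool call itself. This is an assumed pattern sketched for illustration, not PulseAI's exact mechanism:

```python
# Sketch: keep AI tool execution duplicate-free inside a stream by
# deriving an idempotency key from the tool name and its arguments,
# then executing each key at most once and caching the result.

import hashlib
import json

_executed = {}   # idempotency key -> cached result

def idempotency_key(tool, args):
    payload = json.dumps({"tool": tool, "args": args}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def run_tool_once(tool, args, execute):
    """Execute `execute(args)` at most once per (tool, args) pair."""
    key = idempotency_key(tool, args)
    if key not in _executed:
        _executed[key] = execute(args)
    return _executed[key]

calls = []
def submit_job(args):
    calls.append(args)              # stand-in for a real MCP tool call
    return {"job_id": len(calls)}

# Replayed stream events yield one execution and one stable result.
a = run_tool_once("cloud_run.submit", {"image": "post.tif"}, submit_job)
b = run_tool_once("cloud_run.submit", {"image": "post.tif"}, submit_job)
print(a == b, len(calls))   # True 1
```

Sorting the JSON keys makes the key stable across reorderings of the same arguments, which is what makes replayed or duplicated stream events converge on a single execution.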
- A true streaming AI system, not batch AI
- Deterministic AI reasoning in Flink
- MCP-based real-world action execution
- End-to-end disaster intelligence in seconds
- A design that can scale nationally
- Streaming + Agents is the future of AI systems
- AI must be controlled, not reactive
- Deterministic orchestration beats raw LLM power
- Real-world AI needs strong guardrails
- Drone & live video feeds
- Multilingual citizen reporting
- Government-grade dashboards
- Open APIs for NGOs
- Predictive disaster modeling