IndyVerse Response is a hackathon-built AI emergency-coordination prototype. It uses an iPhone camera/audio feed to detect incidents such as falls, person-down events, distress signals, and possible altercations, then generates a live incident card, recommends responders, and suggests the most appropriate medical destination through a dashboard.
This project is designed as a real-time emergency monitoring and response-support prototype for hackathon use.
It is not a production dispatch platform.
It is an AI-assisted system that:
- ingests live video/audio from an iPhone
- runs emergency detection
- creates a structured incident
- recommends responders
- recommends an appropriate hospital / urgent care destination
- displays everything in a live dashboard
- can optionally narrate alerts with ElevenLabs voice
- can use Gemini API for polished incident summaries
- can store incidents in MongoDB Atlas
- can be deployed to Vultr if local networking becomes limiting
The intended demo is:
iPhone camera -> backend -> AI detection -> incident engine -> Gemini summary -> dashboard -> responder/hospital recommendation -> optional ElevenLabs voice alert
Emergency situations often depend on a bystander noticing the event and calling for help.
That creates delays in cases like:
- falls
- collapse
- person lying motionless
- distress calls
- public altercations
Our prototype explores how AI can act as a faster detection and coordination layer by transforming raw camera/audio input into operationally useful incident alerts.
IndyVerse Response turns a live or prerecorded feed into an incident workflow.
The system:
- captures camera frames from an iPhone
- sends them to a backend over local Wi-Fi or hosted endpoint
- runs computer vision / audio detection
- generates a structured incident
- scores severity
- recommends responders
- recommends a medical destination
- optionally uses Gemini to create a clear incident summary
- optionally uses ElevenLabs to create a spoken alert
- pushes the incident to a real-time dashboard for human review
- stores incident history in MongoDB Atlas
This keeps a human operator in the loop.
The MVP covers:
- mobile camera page on iPhone
- real-time or near-real-time frame streaming
- backend ingestion
- incident detection for:
  - fall / collapse
  - person down
  - motionless person
  - possible altercation (stretch)
- optional distress audio support
- incident card generation
- severity scoring
- responder recommendation
- hospital recommendation
- live dashboard
- operator confirm / reject
- basic incident history / analytics
- optional Gemini summary generation
- optional ElevenLabs voice alert generation
- optional Vultr deployment
- MongoDB Atlas persistence
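One way the severity-scoring item above could work is a small additive rubric over fused detections. This is a minimal sketch with assumed weights and thresholds, not the project's tuned values:

```typescript
// Hypothetical severity rubric: the weights and thresholds below are
// assumptions for illustration, not the project's final rules.
type Detection = "fall" | "person_down" | "motionless" | "distress_audio" | "altercation";

const WEIGHTS: Record<Detection, number> = {
  fall: 2,
  person_down: 3,
  motionless: 3,
  distress_audio: 2,
  altercation: 2,
};

export function scoreSeverity(detections: Detection[]): "low" | "medium" | "high" | "critical" {
  // Sum weights of distinct detections so repeated frames don't inflate severity.
  const total = [...new Set(detections)].reduce((sum, d) => sum + WEIGHTS[d], 0);
  if (total >= 6) return "critical";
  if (total >= 4) return "high";
  if (total >= 2) return "medium";
  return "low";
}
```

Combining signals (e.g. person down and motionless) escalates severity, which matches the fusion idea described later.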
Out of scope:
- actual emergency dispatch integration
- production auth
- deep cloud orchestration
- perfect action recognition
- large custom-trained ML system
- hospital system integration
- forced blockchain usage
Architecture overview:
- Safari / Chrome web app that:
  - captures the camera feed
  - sends frames to the backend
- backend server receives frames
- AI model processes them
- incident engine makes decisions
- Gemini API can summarize incidents
- ElevenLabs can narrate alerts
- MongoDB Atlas stores incidents
- dashboard frontend displays alerts
- same Wi-Fi for local demo
- local IP address for quick testing
- real-time socket connection
- optional hosted endpoint via Vultr
Frontend / mobile camera stack:
- React
- TypeScript
- Vite
- Tailwind CSS
- Socket.IO client
- Leaflet or Google Maps
- browser camera APIs (getUserMedia)
- canvas frame extraction
- socket / HTTP upload
Backend stack:
- Node.js
- Express
- Socket.IO
AI layer:
- Python
- OpenCV
- pretrained or lightweight inference logic
- rule-based incident fusion
Integrations:
- Google Gemini API
Used for:
- incident summary generation
- readable operator explanation
- short post-incident summary
- ElevenLabs
Used for:
- narrated incident alerts
- spoken dashboard briefing
- optional demo voice output
- MongoDB Atlas
Used for:
- incident storage
- event logs
- review state
- analytics history
- hospital recommendations
- Vultr
Used for:
- backend hosting if needed
- dashboard hosting if needed
- remote demo reliability
- Solana is not part of the MVP because it does not naturally fit the current emergency-response workflow
Gemini is used after structured detection to convert technical outputs into clean, human-readable summaries. It helps the product feel more operational and polished.
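Because Gemini runs after structured detection, the integration can be as thin as turning the incident fields into a prompt. A sketch of a hypothetical prompt builder (the incident field names are illustrative, and the actual Gemini API call is omitted):

```typescript
// Hypothetical incident shape; field names are illustrative, not the project's schema.
interface Incident {
  type: string;
  severity: string;
  cameraId: string;
  timestamp: string;
  responders: string[];
}

// Builds the text sent to Gemini for summarization. The network call itself
// (via Google's client library) would wrap this string; it is left out here.
export function buildSummaryPrompt(incident: Incident): string {
  return [
    "Summarize this emergency incident for a dispatch operator in two sentences.",
    "Be factual; do not speculate beyond the fields given.",
    `Type: ${incident.type}`,
    `Severity: ${incident.severity}`,
    `Camera: ${incident.cameraId} at ${incident.timestamp}`,
    `Recommended responders: ${incident.responders.join(", ")}`,
  ].join("\n");
}
```

Keeping the prompt builder pure makes the summary step easy to test without an API key.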
ElevenLabs gives the system a voice. Instead of only visual alerts, the dashboard can also announce important incidents in a natural way.
Incident records are document-like, making MongoDB Atlas a good fit for storing incidents, summaries, status updates, and review history.
Vultr provides a practical hosting option if the team wants a reliable demo URL instead of depending entirely on local Wi-Fi.
An iPhone opens the mobile camera page in Safari or Chrome.
The page:
- asks for camera permission
- starts video preview
- captures frames periodically
- sends them to the backend
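The periodic capture-and-send loop above can be kept independent of browser APIs by injecting the frame source and the upload function. A sketch under that assumption; in the real page, `captureFrame` would draw the `<video>` element to a canvas and export a JPEG blob:

```typescript
// Periodic capture loop, decoupled from browser APIs so the timing logic is
// testable outside Safari/Chrome. captureFrame and sendFrame are injected:
// in the real page they would wrap getUserMedia/canvas and the socket upload.
export async function runCaptureLoop(
  captureFrame: () => Promise<Uint8Array>,
  sendFrame: (frame: Uint8Array) => Promise<void>,
  frameCount: number,
  intervalMs: number,
): Promise<number> {
  let sent = 0;
  for (let i = 0; i < frameCount; i++) {
    const frame = await captureFrame();
    await sendFrame(frame);
    sent++;
    // Throttle so the phone is not uploading at full video frame rate.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return sent;
}
```

A low capture rate (1-2 fps) is usually enough for fall/person-down detection and keeps local Wi-Fi usage modest.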
The backend receives frames over local network or hosted connection.
The backend:
- tags frames with timestamp and camera ID
- forwards them to the inference layer
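The tagging step on the backend can be a small pure function applied before frames are handed to the inference layer. A sketch; the envelope fields are assumptions, not the project's wire format:

```typescript
// Hypothetical frame envelope; the exact fields are illustrative.
export interface FrameEnvelope {
  cameraId: string;
  receivedAt: string; // ISO timestamp assigned on arrival at the backend
  seq: number;        // monotonically increasing per-process sequence number
  bytes: number;      // payload size, useful for debugging dropped frames
}

let seqCounter = 0;

// Tag an incoming frame with camera ID, arrival time, and a sequence number;
// the tagged envelope can then be forwarded to the Python inference service.
export function tagFrame(cameraId: string, frame: Uint8Array, now: Date = new Date()): FrameEnvelope {
  return {
    cameraId,
    receivedAt: now.toISOString(),
    seq: seqCounter++,
    bytes: frame.byteLength,
  };
}
```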
The AI layer checks for:
- fall / collapse
- person down
- motionless person
- distress audio
- possible altercation
The incident engine fuses detections and creates:
- incident type
- severity
- responder recommendation
- hospital recommendation
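A rule-based fusion output like the one above can reduce to a small lookup with a couple of overrides. The mapping below is an assumed example, not the team's final decision table:

```typescript
type IncidentType = "fall" | "person_down" | "altercation" | "distress";
type Severity = "low" | "medium" | "high" | "critical";
type Responder = "EMS" | "EMS + Police" | "Fire + EMS" | "Monitor only";

// Assumed mapping from incident type and severity to a responder recommendation.
// "Fire + EMS" is not reachable here because the sketch has no fire detection.
export function recommendResponder(type: IncidentType, severity: Severity): Responder {
  if (severity === "low") return "Monitor only"; // low-confidence events get watched, not dispatched
  if (type === "altercation") return "EMS + Police";
  return "EMS";
}
```

Keeping this as a pure function lets the operator-facing dashboard explain exactly why a recommendation was made.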
After structured detection:
- Gemini can generate a readable incident summary
- ElevenLabs can optionally turn that summary into a spoken alert
The dashboard updates live and shows:
- incident queue
- severity
- map location
- recommendation details
- operator confirmation actions
- optional voice playback
Incidents and review updates are stored in MongoDB Atlas for history and analytics.
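Since incidents are document-like, Atlas can hold one document per incident with review actions appended to an embedded history array. A sketch of an assumed schema; in MongoDB this update would typically be an `updateOne` with `$set`/`$push`:

```typescript
// Assumed incident document shape for MongoDB Atlas; the field names are
// illustrative, not the project's actual schema.
export interface IncidentDoc {
  incidentId: string;
  type: string;
  severity: string;
  status: "pending" | "confirmed" | "rejected" | "false_positive";
  summary: string | null;
  history: { action: string; at: string }[];
}

export function newIncidentDoc(incidentId: string, type: string, severity: string): IncidentDoc {
  return { incidentId, type, severity, status: "pending", summary: null, history: [] };
}

// Returns the next document state after an operator review action;
// persisting it would be a single update against the incidents collection.
export function applyReview(
  doc: IncidentDoc,
  action: "confirmed" | "rejected" | "false_positive",
  at: string,
): IncidentDoc {
  return { ...doc, status: action, history: [...doc.history, { action, at }] };
}
```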
Detects likely emergency situations from camera/audio input.
Turns raw detections into an actionable alert with severity, confidence, and summary.
Suggests the appropriate response type such as:
- EMS
- EMS + Police
- Fire + EMS
- Monitor only
Suggests the most appropriate hospital / urgent care destination with ETA.
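Destination selection over a list like data/hospitals.json can be a straight haversine scan, with the ETA estimated from an assumed average ground speed. A sketch; the 40 km/h figure and the hospital fields are assumptions:

```typescript
interface Hospital { name: string; lat: number; lon: number; }

// Great-circle distance in kilometres (haversine formula).
function haversineKm(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 6371 * 2 * Math.asin(Math.sqrt(a));
}

// Picks the closest facility and estimates ETA at an assumed 40 km/h average;
// a real deployment would use a routing API instead of straight-line distance.
export function recommendHospital(lat: number, lon: number, hospitals: Hospital[]) {
  let best = hospitals[0];
  let bestKm = Infinity;
  for (const h of hospitals) {
    const km = haversineKm(lat, lon, h.lat, h.lon);
    if (km < bestKm) { bestKm = km; best = h; }
  }
  return { hospital: best, distanceKm: bestKm, etaMinutes: Math.round((bestKm / 40) * 60) };
}
```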
Converts structured AI output into a concise, readable operator-facing summary.
Narrates high-priority incidents so the system can literally speak alerts.
Allows an operator to confirm, reject, or mark an incident as false positive.
Tracks previous incidents and system activity using MongoDB Atlas.
Owns:
- mobile camera page
- frame sending
- AI detection output
Owns:
- backend server
- incident creation
- severity logic
- responder recommendation
- hospital recommendation
- MongoDB Atlas integration
Owns:
- main dashboard
- incident queue
- incident details
- confirm/reject workflow
- optional voice playback UI
Owns:
- Gemini integration
- ElevenLabs integration
- hospital data
- ETA / route logic
- Vultr deployment if used
- prerecorded backup demo
indyverse-response/
│
├── frontend/
│ ├── src/
│ │ ├── components/
│ │ ├── pages/
│ │ ├── hooks/
│ │ ├── services/
│ │ ├── types/
│ │ └── App.tsx
│ └── package.json
│
├── mobile-camera/
│ ├── src/
│ │ ├── pages/
│ │ ├── hooks/
│ │ ├── services/
│ │ └── App.tsx
│ └── package.json
│
├── backend/
│ ├── src/
│ │ ├── routes/
│ │ ├── sockets/
│ │ ├── services/
│ │ ├── engine/
│ │ ├── integrations/
│ │ │ ├── gemini/
│ │ │ ├── elevenlabs/
│ │ │ └── mongodb/
│ │ ├── data/
│ │ ├── types/
│ │ └── index.ts
│ └── package.json
│
├── ai/
│ ├── inference/
│ ├── models/
│ ├── utils/
│ └── main.py
│
├── data/
│ ├── hospitals.json
│ ├── incidents.mock.json
│ └── cameras.json
│
├── demo-assets/
│ ├── videos/
│ ├── screenshots/
│ └── test-scenarios/
│
├── README.md
└── CLAUDE.md