# NeuroVault

WEHack 2026 · The Night at the Museum · Conservatory Lab Track
A biomedical AI platform that turns patient memory interactions into longitudinal cognitive biomarkers, giving clinicians actionable data between clinic visits — all packaged in an immersive museum experience.
| Track | Why We Qualify |
|---|---|
| Conservatory Lab (Biotech) | Automates passive cognitive data collection between clinical visits. Tracks 6 validated speech biomarkers without burdening patients or caregivers. |
| WEHack General | Museum metaphor is literal — patient's past is archived like a museum collection, studied to understand their present. |
| Best Use of Gemini API | Gemini Vision analyzes uploaded photographs for era, emotion, and context; generates clinical longitudinal summaries. |
| Best Use of ElevenLabs | Warm, human voice prompts the patient through each memory — replacing cold UI text with natural conversation. |
| Best Domain (GoDaddy) | neurovault.health — register via GoDaddy Registry |
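The Gemini Vision step named in the table can be sketched as a request builder. The prompt wording, the `buildPhotoAnalysisRequest` name, and the era/emotion/context JSON keys are illustrative assumptions rather than the project's actual code; the payload shape follows the Gemini JS SDK's `contents`/`parts`/`inlineData` format.

```javascript
// Sketch of the Gemini Vision request sent per uploaded photograph.
// The prompt text and the requested JSON keys (era, emotion, context)
// are assumptions based on the feature description above.
function buildPhotoAnalysisRequest(base64Image, mimeType) {
  return {
    contents: [
      {
        role: 'user',
        parts: [
          // The photograph itself, inlined as base64.
          { inlineData: { mimeType, data: base64Image } },
          // The analysis instruction.
          {
            text:
              'Describe this photograph for a reminiscence session. ' +
              'Return JSON with keys: era, emotion, context.',
          },
        ],
      },
    ],
  };
}
```

The returned object would be passed to the SDK's `generateContent` call in `backend/src/services/gemini.js`.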
**Prerequisites**

- Node.js 18+
- Firebase project (Firestore + Google Auth enabled)
- Gemini API key (aistudio.google.com)
- ElevenLabs API key (elevenlabs.io)
```bash
git clone https://github.com/your-team/neurovault
cd neurovault
npm install                  # installs root concurrently
cd frontend && npm install
cd ../backend && npm install
```
**Frontend** — copy `frontend/.env.example` → `frontend/.env`:

```env
VITE_FIREBASE_API_KEY=...
VITE_FIREBASE_AUTH_DOMAIN=...
VITE_FIREBASE_PROJECT_ID=...
VITE_FIREBASE_STORAGE_BUCKET=...
VITE_FIREBASE_MESSAGING_SENDER_ID=...
VITE_FIREBASE_APP_ID=...
```

**Backend** — copy `backend/.env.example` → `backend/.env`:

```env
GEMINI_API_KEY=your_gemini_key
ELEVENLABS_API_KEY=your_elevenlabs_key
ELEVENLABS_VOICE_ID=21m00Tcm4TlvDq8ikWAM
FIREBASE_SERVICE_ACCOUNT_PATH=./serviceAccountKey.json
```

Download your Firebase service account JSON from Firebase Console → Project Settings → Service Accounts → Generate new private key, and save it as `backend/serviceAccountKey.json`.
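To fail fast when a key is missing, the backend could validate its environment on startup. This `missingEnvVars` helper is a sketch, not part of the repo; the variable names come from the backend `.env` above.

```javascript
// Variables the backend expects (from backend/.env above).
const REQUIRED_VARS = [
  'GEMINI_API_KEY',
  'ELEVENLABS_API_KEY',
  'ELEVENLABS_VOICE_ID',
  'FIREBASE_SERVICE_ACCOUNT_PATH',
];

// Returns the names of any required variables missing from `env`,
// so the server can refuse to start with a clear error message.
function missingEnvVars(env) {
  return REQUIRED_VARS.filter((name) => !env[name]);
}

// Example startup check (e.g. at the top of backend/src/index.js):
// const missing = missingEnvVars(process.env);
// if (missing.length) throw new Error(`Missing env vars: ${missing.join(', ')}`);
```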
```bash
# From root (runs both concurrently)
npm run dev

# Or separately:
npm run dev:frontend   # → http://localhost:5173
npm run dev:backend    # → http://localhost:3001
```

- Landing → http://localhost:5173 — cinematic museum entrance with WEHack night theme
- Login → Google OAuth (patient account)
- Onboarding → name, age, gender, primary language
- Museum Home → 4 floating gallery cards (Childhood / Young Adult / Family / Recent)
- Select a Gallery → spotlight reveals the memory artifact photo
- ElevenLabs speaks → "Does this look familiar to you?"
- Patient responds → microphone records voice; waveform animates live
- Silent processing → Gemini analyzes photo, metrics computed
- Warm affirmation → "That's wonderful — thank you for sharing."
- Clinician dashboard → visit http://localhost:5173/clinician (code: `NEURO2025`)
- Live graphs update → fluency, pauses, speech rate plotted; Gemini insight panel renders
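The "silent processing" step above rests on simple energy thresholding of the recorded samples. The sketch below shows one way to flag silent frames and estimate response latency; the 0.01 energy threshold and 50 ms frame size are illustrative assumptions, not the app's tuned values.

```javascript
// Root-mean-square energy of one frame of audio samples (floats in [-1, 1]),
// as produced by the Web Audio API's AnalyserNode or an AudioBuffer.
function rms(frame) {
  const sumSquares = frame.reduce((sum, s) => sum + s * s, 0);
  return Math.sqrt(sumSquares / frame.length);
}

// Given per-frame RMS values and the frame duration in ms, estimate how
// long the patient stayed silent before speaking (response latency).
// If no frame crosses the threshold, the whole recording counts as silence.
function responseLatencyMs(frameEnergies, frameMs, threshold = 0.01) {
  const firstVoiced = frameEnergies.findIndex((e) => e > threshold);
  return firstVoiced === -1
    ? frameEnergies.length * frameMs
    : firstVoiced * frameMs;
}
```

For example, `responseLatencyMs([0, 0.002, 0.3], 50)` reports 100 ms of silence before the first voiced frame.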
| Layer | Technology |
|---|---|
| Frontend | React 18 + Vite + Tailwind + Framer Motion |
| Backend | Node.js + Express |
| AI Vision | Google Gemini 1.5 Flash |
| Voice | ElevenLabs TTS API |
| Auth + DB | Firebase Auth + Firestore |
| Deployment | Vercel |
| Domain | GoDaddy Registry → neurovault.health |
| Feature | What It Captures | Implementation |
|---|---|---|
| Response Latency | Word-finding difficulty | Web Audio API (client-side) |
| Speech Rate | Cognitive processing speed | Whisper transcript + timer |
| Pause Duration | Hesitation, mental searching | Silence detection |
| Fluency Score | Composite verbal fluency | Custom NLP scoring |
| Lexical Diversity | Vocabulary range | Type-token ratio |
| Word Count | Utterance length / engagement | Whisper word count |
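Several of the table's metrics reduce to simple arithmetic over the transcript. The sketch below computes word count, speech rate, and lexical diversity (type-token ratio) from a transcript and its duration — a simplified stand-in for the project's `audioAnalysis.js`, with the tokenizer regex chosen for illustration.

```javascript
// Tokenize a transcript into lowercase words, dropping punctuation.
function words(transcript) {
  return transcript.toLowerCase().match(/[a-z']+/g) || [];
}

// Word count, speech rate (words per minute), and type-token ratio,
// given the transcript and the utterance duration in seconds.
function transcriptMetrics(transcript, durationSec) {
  const tokens = words(transcript);
  const types = new Set(tokens); // distinct words
  return {
    wordCount: tokens.length,
    speechRateWpm: durationSec > 0 ? (tokens.length / durationSec) * 60 : 0,
    lexicalDiversity: tokens.length > 0 ? types.size / tokens.length : 0,
  };
}
```

For example, a 10-second utterance of "The dog and the cat" yields 5 words, 30 wpm, and a type-token ratio of 0.8.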
Critical design principle: every metric is tracked as a within-patient change from that patient's own baseline — never compared to population averages. This mirrors how clinical monitoring tools are actually used.
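The within-patient principle can be sketched as a rolling baseline: compare each new session's value to the mean of that patient's earlier sessions. The function below is illustrative, not the dashboard's actual statistic.

```javascript
// Percent change of the latest value relative to the mean of the
// patient's own prior sessions (the within-patient baseline).
// Returns null until at least one baseline session exists.
function baselineChangePct(history, latest) {
  if (history.length === 0) return null;
  const baseline = history.reduce((a, b) => a + b, 0) / history.length;
  if (baseline === 0) return null;
  return ((latest - baseline) / baseline) * 100;
}
```

So a speech rate of 120 wpm against earlier sessions of 100, 110, and 90 wpm reads as a +20% within-patient change, regardless of how the patient compares to anyone else.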
- Patient view: Dark cinematic museum (deep navy + gold/teal). Feels like walking through an art museum at night. No clinical language — ever.
- Clinician view: Clean clinical analytics dashboard. Recharts time-series graphs. Color-coded trend arrows. AI insight panel.
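The dashboard's color-coded trend arrows can be derived from a least-squares slope over recent sessions. This helper and its ±0.05 "flat" band are assumptions for illustration, not the dashboard's actual cutoff.

```javascript
// Least-squares slope of values over session index (0, 1, 2, ...).
function slope(values) {
  const n = values.length;
  const xMean = (n - 1) / 2;
  const yMean = values.reduce((a, b) => a + b, 0) / n;
  let num = 0;
  let den = 0;
  for (let i = 0; i < n; i++) {
    num += (i - xMean) * (values[i] - yMean);
    den += (i - xMean) ** 2;
  }
  return den === 0 ? 0 : num / den;
}

// Map a metric's recent values to a trend arrow; slopes inside the
// assumed ±flatBand are rendered as "no change".
function trendArrow(values, flatBand = 0.05) {
  const s = slope(values);
  if (s > flatBand) return '↑';
  if (s < -flatBand) return '↓';
  return '→';
}
```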
- NeuroVault is a monitoring tool, not a diagnostic tool
- Zero clinical language in the patient-facing UI (`cognitive`, `recall`, and `assessment` are scrubbed)
- All AI insights are clearly labeled for clinician interpretation only
- Patients are never corrected — emotional affirmation always comes before any follow-up
- Silence and non-responses are recorded as data — not treated as failures
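The language-scrubbing rule above can be enforced with a small filter over strings bound for the patient view. The scrubbed terms come from the bullet list; the gentler substitutes are illustrative, not the app's actual copy.

```javascript
// Terms that must never reach the patient-facing UI, mapped to gentler
// substitutes. The substitutes here are assumed replacements.
const SCRUB_MAP = {
  cognitive: 'memory',
  recall: 'remember',
  assessment: 'visit',
};

// Replace each clinical term (case-insensitive, whole words only) in
// text bound for the patient view.
function scrubClinicalLanguage(text) {
  let out = text;
  for (const [term, substitute] of Object.entries(SCRUB_MAP)) {
    out = out.replace(new RegExp(`\\b${term}\\b`, 'gi'), substitute);
  }
  return out;
}
```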
neurovault/
├── frontend/
│ ├── src/
│ │ ├── pages/
│ │ │ ├── Landing.jsx # Cinematic museum entrance
│ │ │ ├── Login.jsx # Google OAuth
│ │ │ ├── Onboarding.jsx # Patient profile setup
│ │ │ ├── MuseumHome.jsx # Gallery artifact cards
│ │ │ ├── MemorySession.jsx # Core memory flow + recording
│ │ │ ├── ClinicianDashboard.jsx # Analytics + charts
│ │ │ └── ClinicianLogin.jsx # Clinician access gate
│ │ ├── hooks/
│ │ │ └── useAudioAnalysis.js # Web Audio API + waveform
│ │ ├── context/
│ │ │ └── AuthContext.jsx # Firebase auth state
│ │ └── firebase.js # Firebase config
│ └── tailwind.config.js # Museum color theme
│
├── backend/
│ └── src/
│ ├── index.js # Express server
│ ├── firebase.js # Admin SDK + mock store
│ ├── routes/
│ │ ├── analyze.js # Main AI pipeline endpoint
│ │ ├── voice.js # ElevenLabs TTS proxy
│ │ └── sessions.js # Save/retrieve sessions
│ └── services/
│ ├── gemini.js # Gemini Vision analysis
│ ├── elevenlabs.js # ElevenLabs TTS
│ └── audioAnalysis.js # Speech metric computation
│
├── vercel.json # Deployment config
└── README.md
```bash
npm i -g vercel
vercel --prod
```

Add all environment variables in the Vercel dashboard under your project settings.
Built at WEHack 2026 · UTD · April 11–12