Join meetings. Transcribe conversations. Detect conflicts. Propose actions. Remember everything.
Memex is a fully local AI meeting assistant. It joins your calls, transcribes them in real time, cross-references past conversations, and generates strategic action plans, all running on your own machine with your own data.

Run it entirely for free with a local LLM served by Ollama: no API keys, no cloud subscriptions, no data leaving your machine.
https://personal-website-gray-iota-67.vercel.app/Memex/
- Local Web Dashboard
- What Can It Do?
- Running 100% Free & Local
- Architecture
- Quick Start
- Usage Examples
- Custom Skills
- API Endpoints
- Project Structure
- Debugging
- Documentation
- Notes
- License
## Local Web Dashboard

Memex includes a full-featured React dashboard that runs in your browser: no Teams, no Azure, no cloud accounts required. Just start the dashboard and go.
```bash
cd dashboard
npm start
# Opens at http://localhost:5173
```

| Tab | What It Does |
|---|---|
| Chat | Conversational AI interface: ask questions about meetings, request summaries, trigger actions. Supports multiple chat sessions with full history persistence |
| Transcripts | Browse, upload, and manage meeting transcripts. Drag-and-drop `.vtt` or `.txt` files for instant processing. View speaker-labeled transcripts with color-coded speakers |
| Actions | Human-in-the-loop action queue: review, approve, reject, or reset proposed actions (schedule meetings, send emails, set reminders). Filterable by status with real-time counters |
| Memory Search | Semantic search across all stored meetings and memories. Ask natural-language questions like "What did we decide about the API?" and get ranked results |
The dashboard also features:
- Animated mesh gradient background with glassmorphism UI
- Chat sessions persisted to localStorage across browser restarts
- Quick-action buttons for common tasks (join meeting, send email, check status)
- Live stats in the sidebar: meetings stored, memory count, pending actions
- File upload for transcript ingestion directly from the browser
- Renameable transcript labels for easy organization
The dashboard backend is a Flask API (`dashboard/server/api.py`) that wraps the Python orchestrator and provides REST endpoints for everything the frontend needs.
## What Can It Do?

| Capability | Description |
|---|---|
| Live Meeting Transcription | Joins Teams calls as a participant and transcribes audio in real time |
| Strategic Game Plans | Analyzes transcripts to produce summaries, detected conflicts, and proposed action items |
| Cross-Meeting Memory | Stores everything in a local vector database; detects contradictions and missed follow-ups across meetings |
| Human-in-the-Loop Actions | Proposes actions (schedule, email, remind) and waits for your explicit approval before executing |
| File Upload Support | Drop `.vtt` or `.txt` transcript files for instant analysis, via Teams chat or the dashboard |
| Calendar & Email | Schedule meetings and send emails via the Microsoft Graph API (optional) |
| Semantic Search | Ask "What did John say about the API deadline?" and get answers from any past meeting |
| Local Dashboard | Full web UI for chat, transcripts, actions, and search; no cloud required |
## Running 100% Free & Local

Memex can run entirely on your machine at zero cost:

| Component | Free Option |
|---|---|
| LLM | Ollama with any local model (Llama, Mistral, Gemma, etc.) |
| Meeting transcription | Vexa, self-hosted, CPU or GPU |
| Vector memory | ChromaDB in a local Docker container |
| Dashboard | Included: React + Flask, runs locally |
| Teams integration | Optional: Azure Bot free tier (10K messages/month) if you want Teams connectivity |

No API keys. No cloud subscriptions. No data leaves your machine.
To use a local LLM, set `LLM_PROVIDER=ollama` in your `.env` and point it at your Ollama instance. To use a cloud LLM, set your Anthropic or Google Gemini API key instead.
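For the fully free setup, a minimal `.env` might look like the sketch below. The variable names other than `LLM_PROVIDER` are assumptions for illustration; `.env.example` in the repo is the authoritative list of keys.

```ini
; Use a local Ollama instance instead of a cloud LLM
LLM_PROVIDER=ollama
; Hypothetical keys - confirm exact names against .env.example
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3.1
```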
## Architecture

```
┌────────────────────────────────────────────────────────────────────┐
│              Memex Dashboard (http://localhost:5173)               │
│        React UI - Chat, Transcripts, Actions, Memory Search        │
└─────────────────────────────┬──────────────────────────────────────┘
                              │ REST API
                              ▼
┌────────────────────────────────────────────────────────────────────┐
│                  Dashboard API Server (Port 5100)                  │
│               Flask - wraps the Python orchestrator                │
└───────┬────────────────────────────────────────┬───────────────────┘
        │                                        │
        ▼                                        ▼
┌─────────────────────┐          ┌───────────────────────────────┐
│  ChromaDB (8100)    │          │  LLM (local or cloud)         │
│  Vector memory DB   │          │  Ollama / Claude / Gemini     │
└─────────────────────┘          └───────────────────────────────┘

┌────────────────────────────────────────────────────────────────────┐
│                      Microsoft Teams (Cloud)                       │
│             You chat with the bot / join calls here                │
└─────────────────────────────┬──────────────────────────────────────┘
                              │ HTTPS (via ngrok tunnel)
                              ▼
┌────────────────────────────────────────────────────────────────────┐
│                         ngrok (Port 3978)                          │
│         Public HTTPS URL - forwards to your local machine          │
└─────────────────────────────┬──────────────────────────────────────┘
                              │
                              ▼
┌────────────────────────────────────────────────────────────────────┐
│                OpenClaw Agent (Ports 3978 + 18789)                 │
│      The AI brain - receives messages, runs skills, responds       │
└───────┬────────────────────────────────────────┬───────────────────┘
        │                                        │
        ▼                                        ▼
┌─────────────────────┐          ┌───────────────────────────────┐
│  ChromaDB (8100)    │          │  Vexa (WSL2, Port 8056)       │
│  Vector memory DB   │          │  Joins calls & transcribes    │
│  Long-term meeting  │          │  audio in real time           │
│  storage            │          └──────────────┬────────────────┘
└─────────────────────┘                         │
                                                ▼
                                 ┌───────────────────────────────┐
                                 │  Vexa Bridge (Port 3001)      │
                                 │  Relays transcripts, handles  │
                                 │  Graph API (email/calendar)   │
                                 └───────────────────────────────┘
```
| Service | Port | Required? | Role |
|---|---|---|---|
| Dashboard | 5173 + 5100 | For local use | React UI + Flask API |
| ChromaDB | 8100 | Yes | Vector database for long-term meeting memory |
| OpenClaw | 3978 + 18789 | For Teams | AI agent runtime: processes messages via LLM, runs skills |
| ngrok | Tunnels 3978 | For Teams | Exposes the local bot to Teams via public HTTPS |
| Vexa | 8056 | For live transcription | Self-hosted meeting transcription (runs in WSL2) |
| Vexa Bridge | 3001 | For Teams + email/calendar | Relays transcripts + provides Graph API endpoints |
| Azure Bot | Cloud | For Teams | Identity/routing layer for Microsoft Teams |
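For reference, a ChromaDB service exposed on port 8100 can be declared with a compose file along these lines. This is a sketch: the repository's own `docker-compose.yml` is authoritative, and the image tag and volume path here are assumptions.

```yaml
services:
  chromadb:
    image: chromadb/chroma      # official ChromaDB image (tag is an assumption)
    ports:
      - "8100:8000"             # host 8100 -> container's default 8000
    volumes:
      - ./chroma-data:/data     # persist vectors across restarts (path illustrative)
```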
## Quick Start

### Option A: Local Dashboard Only

No Azure, no ngrok, no Teams: just the local dashboard.
```powershell
# 1. Install dependencies
pip install -r requirements.txt
cd dashboard && npm install && cd ..

# 2. Configure
copy .env.example .env
# Edit .env - set at least one LLM provider (or use Ollama)

# 3. Start ChromaDB
docker compose up -d

# 4. Start the dashboard
cd dashboard && npm start
```

Open http://localhost:5173 and you're ready to upload transcripts, chat, and search.
### Option B: Full Setup (Teams + Live Transcription)

```powershell
# 1. Clone & install
git clone https://github.com/YOUR-USERNAME/ws.git
cd ws
powershell -ExecutionPolicy Bypass -File setup.ps1

# 2. Configure
copy .env.example .env
# Edit .env with your API keys and Azure credentials

# 3. Register an Azure Bot (free tier)
# Follow docs/04-azure-setup.md

# 4. Configure OpenClaw
openclaw config set channels.msteams.appId "YOUR_APP_ID"
openclaw config set channels.msteams.appPassword "YOUR_CLIENT_SECRET"
openclaw config set channels.msteams.tenantId "YOUR_TENANT_ID"
openclaw config set channels.msteams.enabled true
openclaw config set channels.msteams.dmPolicy open
openclaw config set channels.msteams.allowFrom '["*"]'

# 5. Start everything (4 terminals)
docker compose up -d                                              # ChromaDB
wsl -d Ubuntu -- bash -c "cd ~/vexa && make up TRANSCRIPTION=cpu" # Vexa
ngrok http --domain=YOUR-STATIC-DOMAIN.ngrok-free.app 3978        # Tunnel
cd openclaw && node dist/entry.js gateway                         # Agent
cd vexa-bridge && npm start                                       # Bridge

# 6. Sign in to Microsoft Graph (one-time, for email/calendar)
Invoke-RestMethod http://localhost:3001/graph/login
```

Tip: use `start-all.ps1` to launch all services at once.
## Usage Examples

**In the dashboard:**

- Upload a transcript file and click "Generate Game Plan"
- Chat: "What were the key decisions from the last meeting?"
- Search memory: "budget discussion Q3"
- Approve or reject proposed actions with one click

**In Teams (full setup), message the bot:**

```
Join this meeting: https://teams.live.com/meet/9375083437515?p=ABC123
Summarize the meeting
Approve Step 1
What did John say about the API deadline?
Schedule a meeting with the team for tomorrow at 2pm
```

Memex joins live meetings like a real participant.

## Custom Skills

Memex uses four custom OpenClaw skills (for Teams mode):

| Skill | Purpose |
|---|---|
| `memex-meeting-processor` | Parses transcripts and generates Strategic Game Plans |
| `memex-action-queue` | Human-in-the-loop approval workflow for proposed actions |
| `memex-vexa-listener` | Handles live meeting capture, join commands, email & calendar |
| `memex-file-ingest` | Processes uploaded `.vtt` and `.txt` transcript files |
## API Endpoints

### Dashboard API (Flask, port 5100)

| Endpoint | Method | Description |
|---|---|---|
| `/api/chat` | POST | Send a message and get an AI response |
| `/api/transcripts` | GET | List all stored meetings |
| `/api/transcripts/upload` | POST | Upload and process a transcript file |
| `/api/transcripts/<id>` | GET | Get the full transcript for a meeting |
| `/api/actions` | GET | List all actions in the queue |
| `/api/actions/<id>/approve` | POST | Approve a pending action |
| `/api/actions/<id>/reject` | POST | Reject a pending action |
| `/api/memory/search?q=...` | GET | Semantic search across all meetings |
| `/api/stats` | GET | System statistics |
| `/api/pending` | GET | List unprocessed transcript files |
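As an illustration, the dashboard endpoints above can be exercised from Python with just the standard library. This assumes the Flask API is running on port 5100; the shape of the JSON responses (e.g. a `results` field) is not documented here, so inspect the actual payloads.

```python
import json
import urllib.parse
import urllib.request

API_BASE = "http://localhost:5100"

def search_url(query: str) -> str:
    """Build the GET URL for the /api/memory/search endpoint."""
    return f"{API_BASE}/api/memory/search?{urllib.parse.urlencode({'q': query})}"

def chat_request(message: str) -> urllib.request.Request:
    """Build a POST request for /api/chat with a JSON body."""
    body = json.dumps({"message": message}).encode("utf-8")
    return urllib.request.Request(
        f"{API_BASE}/api/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With the dashboard running, send the search like so:
#   with urllib.request.urlopen(search_url("budget discussion Q3")) as resp:
#       results = json.load(resp)
```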
### Vexa Bridge (port 3001)

| Endpoint | Method | Description |
|---|---|---|
| `/join-meeting` | POST | Join a Teams meeting via Vexa |
| `/fetch-transcript/:id` | GET | Fetch a transcript by meeting ID |
| `/schedule-meeting` | POST | Schedule a calendar event (Graph API) |
| `/send-email` | POST | Send an email (Graph API) |
| `/memory/store` | POST | Store a transcript in ChromaDB |
| `/memory/query?q=...` | GET | Semantic search across all meetings |
| `/graph/login` | GET | Start Microsoft sign-in (device code flow) |
| `/health` | GET | Service health check |
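A hedged sketch of calling the bridge's `/join-meeting` endpoint from Python. The request body's field name `meeting_url` is an assumption; check `vexa-bridge/index.js` for the exact schema it expects.

```python
import json
import urllib.request

BRIDGE_BASE = "http://localhost:3001"

def join_meeting_request(meeting_url: str) -> urllib.request.Request:
    """Build a POST to /join-meeting (body field name is an assumption)."""
    body = json.dumps({"meeting_url": meeting_url}).encode("utf-8")
    return urllib.request.Request(
        f"{BRIDGE_BASE}/join-meeting",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With the bridge running:
#   urllib.request.urlopen(join_meeting_request("https://teams.live.com/meet/..."))
```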
## Project Structure

```
ws/
├── dashboard/                    # Local web dashboard
│   ├── src/                      # React frontend (Vite + Tailwind + Framer Motion)
│   │   ├── components/
│   │   │   ├── ChatPanel.tsx         # Conversational AI chat interface
│   │   │   ├── TranscriptViewer.tsx  # Transcript browser + upload + viewer
│   │   │   ├── ActionQueue.tsx       # Action approval/rejection queue
│   │   │   ├── MemorySearch.tsx      # Semantic memory search
│   │   │   ├── Sidebar.tsx           # Navigation + session management + stats
│   │   │   └── Background.tsx        # Animated mesh gradient background
│   │   └── App.tsx               # Main app with tab routing
│   └── server/
│       └── api.py                # Flask API backend (port 5100)
├── openclaw/                     # OpenClaw agent runtime (local build with fixes)
│   └── extensions/msteams/       # MS Teams channel integration
├── vexa-bridge/                  # Node.js bridge server (transcript relay + Graph API)
│   └── index.js                  # Main bridge application
├── skills/                       # Custom OpenClaw skills
│   ├── memex-meeting-processor/
│   ├── memex-action-queue/
│   ├── memex-vexa-listener/
│   └── memex-file-ingest/
├── docs/                         # Setup guides and documentation
├── teams-manifest/               # Teams app manifest for sideloading
├── memex_agent_orchestrator.py   # Python orchestrator (ChromaDB + LLM analysis)
├── docker-compose.yml            # ChromaDB container config
├── setup.ps1                     # Automated setup script
├── start-all.ps1                 # Launch all services
├── stop-all.ps1                  # Stop all services
├── .env.example                  # Environment variable template
└── requirements.txt              # Python dependencies
```
## Debugging

See the full Debugging Guide for detailed troubleshooting.

Quick diagnostics:

```powershell
# Check each service
Invoke-RestMethod http://localhost:8100/api/v2/heartbeat   # ChromaDB
Invoke-RestMethod http://localhost:5100/api/health         # Dashboard API
Invoke-RestMethod http://localhost:8056/                   # Vexa
Test-NetConnection localhost -Port 3978                    # OpenClaw bot
Invoke-RestMethod http://localhost:3001/health             # Bridge
```

## Documentation

| Doc | Description |
|---|---|
| Overview | What is Memex and what can it do |
| Architecture | System design and service breakdown |
| Prerequisites | Software and accounts needed |
| Azure Setup | Azure Bot registration (free tier) |
| Installation | Step-by-step installation |
| Startup Guide | Starting all services |
| Testing | End-to-end testing guide |
| Troubleshooting | Common issues and fixes |
| Reference | File reference and known issues |
| Debugging | Deep debugging guide |
| Capabilities | Full bot capabilities with examples |
## Notes

- **Local build of OpenClaw:** this project uses a local build (`openclaw/`) rather than the global `openclaw` CLI. The local build includes fixes to the MS Teams provider that prevent a crash-loop bug. Always start it with `node dist/entry.js gateway` from the `openclaw/` directory.
- **Windows 11 + WSL2:** the full setup (with live meeting transcription) is designed for Windows 11 with WSL2. The dashboard works on any platform.
- **Azure free tier:** if you use the Teams integration, the Azure Bot runs on the F0 (free) pricing tier: 10,000 messages/month at no cost.
## License

Private project. All rights reserved.