An MCP server for serving and querying documentation with AI capabilities. Built for the YC Agents Hackathon.
```bash
# Install uv package manager (same as Dedalus uses)
brew install uv  # or: pip install uv

# Install dependencies
uv sync --no-dev

# Configure API keys for AI features
cp config/.env.example .env.local
# Edit .env.local and add your OpenAI API key

# Test
uv run python tests/test_server.py

# Run
uv run main
```
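The AI features read the OpenAI key at runtime. As a hedged sketch of how `.env.local` might be consumed (assuming the `python-dotenv` package; the actual loading logic in `src/main.py` may differ):

```python
# Hypothetical sketch: load the OpenAI key from .env.local before starting the server.
# Assumes python-dotenv is available; the real server may load its config differently.
import os

from dotenv import load_dotenv

load_dotenv(".env.local")  # copies KEY=VALUE pairs from the file into the environment

if not os.environ.get("OPENAI_API_KEY"):
    raise RuntimeError("OPENAI_API_KEY is not set; AI-powered tools will not work")
```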
Project layout:

- `pyproject.toml` - Package configuration with dependencies
- `main.py` (root) - Entry point that Dedalus expects (sketched after this list)
- `src/main.py` - The actual MCP server code
- `docs/` - Your documentation files
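Because Dedalus looks for an entry point at the repository root while the server code lives in `src/main.py`, the root `main.py` is presumably a thin wrapper. A hypothetical sketch, not the actual file:

```python
# main.py (repo root) -- hypothetical thin wrapper; the real entry point may differ.
# Dedalus invokes this module, which delegates to the MCP server code in src/main.py.
from src.main import main  # assumes src/main.py exposes a main() function

if __name__ == "__main__":
    main()
```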
To deploy on Dedalus:

- Set environment variables in the Dedalus UI: `OPENAI_API_KEY` - your OpenAI API key (required for AI features)
- Deploy: `dedalus deploy . --name "your-docs-server"`
During deployment, Dedalus:

- Installs dependencies from `pyproject.toml` using `uv sync`
- Runs `uv run main` to start the server
- Runs the server in the `/app` directory inside the container
- Serves docs from `/app/docs`
Features:

- Serve markdown documentation
- Search across docs
- AI-powered Q&A (with OpenAI)
- Rate limiting (10 requests/minute) to protect API keys (sketched after this list)
- Ready for agent handoffs
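The project's rate-limiting code isn't shown in this README; below is a minimal sliding-window sketch of the 10-requests-per-minute idea, not the actual implementation:

```python
# Minimal sliding-window rate limiter sketch (illustrative, not the project's code).
import time
from collections import deque


class RateLimiter:
    """Allow at most `limit` calls per `window` seconds."""

    def __init__(self, limit: int = 10, window: float = 60.0):
        self.limit = limit
        self.window = window
        self.calls: deque[float] = deque()

    def allow(self) -> bool:
        now = time.monotonic()
        # Drop timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] > self.window:
            self.calls.popleft()
        if len(self.calls) >= self.limit:
            return False
        self.calls.append(now)
        return True


limiter = RateLimiter(limit=10, window=60.0)
if not limiter.allow():
    raise RuntimeError("Rate limit exceeded: 10 requests/minute")
```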
Available MCP tools:

- `list_docs()` - List documentation files
- `search_docs()` - Search with keywords
- `ask_docs()` - AI answers from docs
- `index_docs()` - Index documents
- `analyze_docs()` - Analyze docs for tasks
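The tool names above suggest a server built on the MCP Python SDK's `FastMCP` helper. A hedged sketch of how two of the tools might be registered; the actual signatures and docs handling in `src/main.py` may differ:

```python
# Hypothetical sketch of tool registration with the MCP Python SDK's FastMCP helper.
# The actual server in src/main.py may register its tools differently.
from pathlib import Path

from mcp.server.fastmcp import FastMCP

DOCS_DIR = Path("docs")  # served from /app/docs inside the container
mcp = FastMCP("docs-server")  # hypothetical server name


@mcp.tool()
def list_docs() -> list[str]:
    """List the markdown documentation files available to agents."""
    return sorted(str(p.relative_to(DOCS_DIR)) for p in DOCS_DIR.rglob("*.md"))


@mcp.tool()
def search_docs(query: str) -> list[str]:
    """Return the docs whose text contains the given keyword."""
    needle = query.lower()
    return [
        str(p.relative_to(DOCS_DIR))
        for p in DOCS_DIR.rglob("*.md")
        if needle in p.read_text(encoding="utf-8").lower()
    ]


if __name__ == "__main__":
    mcp.run()
```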
See the `docs/` directory for the documentation files served by this server.
License: MIT