Definitely OpenAI. (It's definitely not.)
An OpenAI-compatible API proxy that makes any AI provider speak the OpenAI protocol — complete with streaming and function calling — even when the provider supports neither.
DOAI Proxy currently supports only Straico as an AI provider, but its architecture is extensible to future providers.
```
git clone <repository-url>
cd doai-proxy
npm install
cp .env.example .env
# Edit .env and add your provider API key
npm start
```

Verify:

```
curl http://localhost:8000/health
# {"status":"ok","service":"doai-proxy","timestamp":"..."}
```

Some AI providers are almost OpenAI-compatible but lack streaming and function calling. DOAI Proxy fills both gaps:
- Streaming Simulation: Converts non-streaming responses into SSE with configurable chunking
- Function Calling: Injects tool definitions into prompts, parses responses, and formats them as `tool_calls`
- OpenAI Compatibility: Presents `/v1/chat/completions` and `/v1/models` — the client never knows the difference
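The streaming side can be pictured with a small sketch: take the provider's complete answer and replay it to the client as OpenAI-style `chat.completion.chunk` SSE events. The chunk size, ids, and splitting heuristic below are illustrative assumptions, not the proxy's actual implementation:

```javascript
// Hypothetical streaming simulation: given one complete completion from the
// provider, emit it as a sequence of OpenAI-format SSE "data:" lines.
function toSseChunks(text, { id = "chatcmpl-sim", model = "provider-model", size = 8 } = {}) {
  const events = [];
  for (let i = 0; i < text.length; i += size) {
    events.push(
      "data: " +
        JSON.stringify({
          id,
          object: "chat.completion.chunk",
          model,
          choices: [{ index: 0, delta: { content: text.slice(i, i + size) }, finish_reason: null }],
        })
    );
  }
  // Final chunk carries an empty delta and the finish reason, then [DONE].
  events.push(
    "data: " +
      JSON.stringify({
        id,
        object: "chat.completion.chunk",
        model,
        choices: [{ index: 0, delta: {}, finish_reason: "stop" }],
      })
  );
  events.push("data: [DONE]");
  return events;
}
```

A client consuming these events reassembles the original text from the `delta.content` fields, exactly as it would with a genuinely streaming backend.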
```
OpenAI-Compatible Client (OpenCode, etc.)
        ↓
DOAI Proxy (localhost:8000)
        ↓
Provider API (Straico, etc.)
```
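Because the proxy presents the standard endpoints, switching a client over is just a base-URL change. A configuration sketch with the official `openai` Node SDK (the `DOAI_API_KEY` environment variable is a placeholder assumption for whatever key your proxy's auth middleware expects):

```javascript
import OpenAI from "openai";

// Point any OpenAI-compatible client at the proxy instead of api.openai.com.
const client = new OpenAI({
  baseURL: "http://localhost:8000/v1", // the proxy, not the real OpenAI API
  apiKey: process.env.DOAI_API_KEY,    // placeholder; use your proxy's key
});
```

From here, chat-completion calls (including `stream: true` and `tools`) go through the proxy unchanged.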
- server.js - Express server, auth middleware, request orchestration
- streaming.js - SSE streaming simulation (none/smart modes)
- tools.js - Function calling via prompt injection + 4 response parsers
- utils.js - Logging, delays, response formatting
- providers/ - Extensible provider layer (straico, more coming)
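Since the provider has no native tool support, tools.js must round-trip tool calls through plain text. The sketch below is a hypothetical illustration of that idea (injectTools, parseToolCall, and the regex-based parser are assumptions, not the actual tools.js code):

```javascript
// Hypothetical prompt injection: render tool schemas into the system prompt
// and ask the model to reply with a JSON object when it wants to call one.
function injectTools(systemPrompt, tools) {
  const specs = tools.map(t => JSON.stringify(t.function)).join("\n");
  return `${systemPrompt}\n\nYou may call one of these tools by replying ` +
         `with JSON of the form {"name": "...", "arguments": {}}:\n${specs}`;
}

// One possible response parser: find a bare JSON object in the reply and
// convert it into an OpenAI-style tool_call entry.
function parseToolCall(reply) {
  const match = reply.match(/\{[\s\S]*\}/);
  if (!match) return null;
  try {
    const { name, arguments: args } = JSON.parse(match[0]);
    if (!name) return null;
    return {
      id: "call_0", // a real implementation would generate unique ids
      type: "function",
      function: { name, arguments: JSON.stringify(args ?? {}) },
    };
  } catch {
    return null; // fall through to the next parser strategy
  }
}
```

The real module layers several such parsers so that differently formatted model replies still resolve to a valid `tool_calls` array.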
```
cp .env.example .env
docker-compose up -d --build
curl http://localhost:8000/health
```

```
npm run dev       # Hot-reload via node --watch
npm run lint      # ESLint
npm run lint:fix  # Auto-fix lint
npm test          # Run tests
```

Full documentation is available at the docs site (or run `npm run docs:dev` locally):
- Getting Started
- Configuration Reference
- Streaming
- Function Calling
- Authentication
- Summarization
- Adding Providers
- Docker Deployment
- Production
- Security
- Troubleshooting
- API Reference
- Examples
See the LICENSE file for details.