# cursor-openai-api

A standalone CLI that serves Cursor's API as an OpenAI-compatible endpoint.

## Features
- OAuth authentication with Cursor
- OpenAI-compatible `/v1/chat/completions` endpoint
- Automatic model discovery from your Cursor account
- Tool calling support
- Streaming responses
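Tool calling and streaming follow the standard OpenAI chat-completions request format. A sketch of a request body the endpoint accepts (field names are from the OpenAI API; the `get_weather` tool is a hypothetical example, not part of this project):

```ts
// Request body shape for /v1/chat/completions (standard OpenAI
// chat-completions format; `get_weather` is a hypothetical tool
// used only for illustration).
const body = {
  model: "claude-4-sonnet",
  stream: true, // streamed responses
  messages: [{ role: "user", content: "What's the weather in Paris?" }],
  tools: [
    {
      type: "function",
      function: {
        name: "get_weather",
        description: "Get the current weather for a city",
        parameters: {
          type: "object",
          properties: { city: { type: "string" } },
          required: ["city"],
        },
      },
    },
  ],
}
```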
## Installation

```bash
npm i -g cursor-openai-api
cursor-openai-api login
cursor-openai-api serve
```

## Docker

```bash
# Pull the image
docker pull ghcr.io/egoist/cursor-openai-api:latest

# Log in (credentials are written to the mounted volume)
docker run -p 3000:3000 \
  -v ~/.config/cursor-openai-api:/home/appuser/.config/cursor-openai-api \
  ghcr.io/egoist/cursor-openai-api:latest login

# Start the server
docker run -p 3000:3000 \
  -v ~/.config/cursor-openai-api:/home/appuser/.config/cursor-openai-api \
  ghcr.io/egoist/cursor-openai-api:latest serve
```

Credentials are stored in `~/.config/cursor-openai-api/credentials.json`.
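The two `docker run` commands above can also be expressed as a Compose file; a sketch, assuming the same image and volume mount (the service name is illustrative, not from this project):

```yaml
# docker-compose.yml — sketch equivalent to the `docker run ... serve` command
services:
  cursor-openai-api:
    image: ghcr.io/egoist/cursor-openai-api:latest
    command: serve
    ports:
      - "3000:3000"
    volumes:
      - ~/.config/cursor-openai-api:/home/appuser/.config/cursor-openai-api
```

Run `login` once first (as shown above) so the mounted volume contains valid credentials.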
## Commands

Authenticate with Cursor via OAuth browser flow.

```bash
cursor-openai-api login
```

Clear stored credentials.

```bash
cursor-openai-api logout
```

Check authentication status.

```bash
cursor-openai-api whoami
```

List available Cursor models.

```bash
cursor-openai-api models
```

Start the OpenAI-compatible proxy server. It reads the `PORT` environment variable (default: `3000`).

```bash
PORT=3000 cursor-openai-api serve
```

The server exposes:

- `POST /v1/chat/completions` - Chat completions endpoint
- `GET /v1/models` - List available models
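The `PORT` behavior of `serve` can be sketched as follows (`resolvePort` is an illustrative helper, not the project's actual implementation in `src/cli.ts`):

```ts
// Resolve the listen port the way `serve` describes: use the PORT
// environment variable when it is a valid port, else fall back to 3000.
// This helper is a sketch, not the project's real code.
function resolvePort(env: Record<string, string | undefined>): number {
  const parsed = Number(env.PORT)
  return Number.isInteger(parsed) && parsed > 0 ? parsed : 3000
}

console.log(resolvePort({})) // → 3000
console.log(resolvePort({ PORT: "8080" })) // → 8080
```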
Check if the proxy server is running.

```bash
cursor-openai-api status
```

## Usage

```ts
import OpenAI from "openai"

const client = new OpenAI({
  apiKey: "cursor", // any non-empty string
  baseURL: "http://localhost:3000/v1",
})

const chat = await client.chat.completions.create({
  model: "claude-4-sonnet",
  messages: [{ role: "user", content: "Hello!" }],
})
```

## Requirements

- Bun or Node.js 18+
- macOS, Linux, or WSL
- A Cursor account with API access
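Since the proxy supports streaming, the usage example above can also consume responses incrementally. Chunks follow the standard OpenAI streaming format (`choices[0].delta.content`); `collectStream` below is an illustrative helper, not part of this project:

```ts
// Accumulate streamed chat-completion chunks into the full reply text.
// Chunk shape follows the standard OpenAI streaming format; this helper
// is a sketch for illustration.
type Chunk = { choices: { delta: { content?: string } }[] }

async function collectStream(stream: AsyncIterable<Chunk>): Promise<string> {
  let text = ""
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta.content ?? ""
  }
  return text
}

// With the client above:
// const stream = await client.chat.completions.create({ model, messages, stream: true })
// const reply = await collectStream(stream)
```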
## Development

```bash
# Install dependencies
bun install

# Run without building
bun run src/cli.ts serve

# Build for production
bun run build
```

## Release

Push a tag to trigger the Docker build:

```bash
git tag v0.0.1
git push origin v0.0.1
```

## Credits

This project is modified from https://github.com/ephraimduncan/opencode-cursor.