A wrapper around the GitHub Copilot API that exposes it as an OpenAI-compatible server, making it usable with other tools.
Demo: copilot-api-demo.mp4
Prerequisites:

- Bun (>= 1.2.x)
- GitHub account with Copilot Individual subscription
To install dependencies, run:
```sh
bun install
```
You can run the project directly using npx:
```sh
npx copilot-api@latest
```
With options:
```sh
npx copilot-api --port 8080
```
The following command line options are available:
| Option | Description | Default |
| --- | --- | --- |
| `--port`, `-p` | Port to listen on | 4141 |
| `--verbose`, `-v` | Enable verbose logging | false |
Example usage:
```sh
npx copilot-api@latest --port 8080 --verbose
```
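Once the server is running, any tool that speaks the OpenAI API can be pointed at it. Below is a minimal sketch with curl, targeting the server started above on port 8080; the `/v1/chat/completions` path and the `gpt-4o` model name are assumptions for illustration, not documented defaults:

```sh
# Minimal chat completion request against the local wrapper.
# The /v1/chat/completions route and the model name are assumptions.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```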
The project can also be run from source in two ways:

Development mode:

```sh
bun run dev
```

Production mode:

```sh
bun run start
```
To avoid rate limiting and optimize your experience:

- Consider using free models (e.g., Gemini, Mistral, OpenRouter) as the `weak-model` (see the sketch after this list)
- Use architect mode sparingly
- Disable `yes-always` in your aider configuration
- Be mindful that Claude 3.7's thinking mode consumes more tokens
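As a rough sketch of wiring aider to the wrapper, assuming the server runs on the default port and serves an OpenAI-style API under `/v1`; the flags shown are aider's standard options, and both model names are placeholders rather than defaults of this project:

```sh
# Point aider at the local OpenAI-compatible server.
# The /v1 base path and the model names are placeholders/assumptions.
aider \
  --openai-api-base http://localhost:4141/v1 \
  --openai-api-key dummy \
  --model openai/gpt-4o \
  --weak-model gemini/gemini-2.0-flash
```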
Roadmap:

- Manual authentication flow
- Manual request approval system
- Rate limiting implementation
- Token usage tracking and monitoring
- Enhanced error handling and recovery