AI-powered inline code completions for VS Code via a Model Context Protocol server.
```
VS Code Extension (MCP Client)
└── InlineCompletionItemProvider
      │  debounce + context extraction
      │
      └──── stdio ────► MCP Server
                        └── get_inline_completion tool
                              └── Anthropic Claude API
```
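The "debounce + context extraction" step can be sketched as a pure function. This is an illustrative sketch, not the extension's actual code: `extractContext` and `CompletionContext` are hypothetical names, and the parameters mirror the `mcpInlineComplete.prefixLines` / `suffixLines` settings.

```typescript
interface CompletionContext {
  prefix: string; // text before (and including) the cursor line
  suffix: string; // text after the cursor line
}

// Slice a window of lines around the cursor to send to the MCP server.
function extractContext(
  lines: string[],
  cursorLine: number,
  prefixLines: number, // mirrors mcpInlineComplete.prefixLines
  suffixLines: number, // mirrors mcpInlineComplete.suffixLines
): CompletionContext {
  const start = Math.max(0, cursorLine - prefixLines);
  const prefix = lines.slice(start, cursorLine + 1).join("\n");
  const suffix = lines
    .slice(cursorLine + 1, cursorLine + 1 + suffixLines)
    .join("\n");
  return { prefix, suffix };
}
```

Keeping this logic pure makes it easy to unit-test independently of the VS Code API.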
- Node.js ≥ 18
- `ANTHROPIC_API_KEY` environment variable set
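For example, in your shell (the key shown is a placeholder):

```shell
# Placeholder value; substitute your real Anthropic API key
export ANTHROPIC_API_KEY="sk-ant-..."
```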
```sh
# Install all dependencies
npm install

# Build everything
npm run build
```

```sh
# Terminal 1: Watch server
cd packages/server && npm run dev

# Terminal 2: Watch extension
cd packages/extension && npm run dev
```

- Open this repo in VS Code
- Press `F5` → launches the Extension Development Host
- The MCP server starts automatically as a child process
- Type code → ghost text completions appear after the debounce delay
| Setting | Default | Description |
|---|---|---|
| `mcpInlineComplete.enabled` | `true` | Enable/disable completions |
| `mcpInlineComplete.debounceMs` | `300` | Delay before requesting completion (ms) |
| `mcpInlineComplete.prefixLines` | `100` | Lines before cursor sent as context |
| `mcpInlineComplete.suffixLines` | `30` | Lines after cursor sent as context |
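These settings map to entries in VS Code's `settings.json`; the values below are the defaults:

```jsonc
{
  "mcpInlineComplete.enabled": true,
  "mcpInlineComplete.debounceMs": 300,
  "mcpInlineComplete.prefixLines": 100,
  "mcpInlineComplete.suffixLines": 30
}
```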
- Command palette: `MCP Inline Complete: Toggle On/Off`
- Status bar: click the `✦ MCP` indicator
```
packages/
  shared/    ← Shared TypeScript types
  server/    ← MCP server (stdio transport, Anthropic SDK)
  extension/ ← VS Code extension (InlineCompletionItemProvider)
```
Edit `packages/server/src/index.ts` and replace the Anthropic call with
any other provider (OpenAI, Ollama, a local model, etc.). The MCP interface
stays the same.
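As a hedged sketch of such a swap, the snippet below targets a local Ollama instance via its `/api/generate` endpoint. The function names, model choice, and prompt format are illustrative assumptions, not part of this repo; only the returned string would flow back through the unchanged MCP tool.

```typescript
// Request body for Ollama's /api/generate endpoint.
interface OllamaRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

// Build the request from the prefix/suffix context the server already has.
// "codellama" and the prompt wording are placeholder choices.
function buildRequest(prefix: string, suffix: string): OllamaRequest {
  return {
    model: "codellama",
    prompt:
      `Complete the code between these fragments.\n` +
      `<prefix>\n${prefix}\n</prefix>\n<suffix>\n${suffix}\n</suffix>`,
    stream: false,
  };
}

// Replace the Anthropic call with a fetch to the local Ollama server.
async function getCompletion(prefix: string, suffix: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildRequest(prefix, suffix)),
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}
```

Node ≥ 18 (already a prerequisite) provides `fetch` natively, so no extra HTTP client is needed.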
The MCP server can expose additional tools alongside `get_inline_completion`:

- `get_hover_info` — context-aware hover documentation
- `refactor_selection` — AI-powered refactoring suggestions
- `explain_code` — inline code explanations
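A handler for one of these tools might look like the sketch below. The result shape mirrors MCP tool results (`{ content: [{ type: "text", ... }] }`); `explainCode` and the injected `callModel` function are illustrative names, with the model call abstracted out so the handler stays testable.

```typescript
// Minimal MCP-style tool result: a list of text content blocks.
interface ToolResult {
  content: { type: "text"; text: string }[];
}

// Abstract model call so the handler doesn't depend on a specific provider.
type ModelCall = (prompt: string) => Promise<string>;

// Hypothetical handler for an explain_code tool: wrap the snippet in a
// prompt, ask the model, and return the explanation as a text block.
async function explainCode(
  code: string,
  callModel: ModelCall,
): Promise<ToolResult> {
  const explanation = await callModel(`Explain this code:\n\n${code}`);
  return { content: [{ type: "text", text: explanation }] };
}
```

The server would register such a handler alongside `get_inline_completion`, and the extension could surface its output in hovers or a side panel.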