A cross-platform desktop AI chat application with multi-provider support and MCP tool integration.
- Multiple AI providers — OpenAI, Anthropic (Claude), and LM Studio (local models via OpenAI-compatible API)
- MCP tool servers — connect any Model Context Protocol server over HTTP-SSE, HTTP Streamable, or stdio
- Per-tool approval — approve or deny individual tool calls before they execute, with per-server auto-approve override
- Conversation management — persistent conversations with custom titles, per-conversation provider/model selection, and context window tracking
- System prompts — global default and per-conversation system prompts
- Parameter controls — temperature, top-p, and max tokens configurable per conversation
- Markdown rendering — assistant responses rendered with syntax-highlighted code blocks
- Export — export conversations to Markdown
- Auto-updates — update checks via a Cloudflare Worker backed by GitHub Releases
- Cross-platform — macOS (Apple Silicon + Intel), Windows, and Linux
Coming soon.
Pre-built binaries are available on the Releases page.
| Platform | Format |
|---|---|
| macOS (Apple Silicon) | .zip |
| macOS (Intel) | .zip |
| Windows | Squirrel installer (.exe) |
| Linux | .deb / .rpm |
| Layer | Technology |
|---|---|
| Runtime | Electron 42 |
| Frontend | React 19, TypeScript, Tailwind CSS v4, Vite |
| State | Zustand v5 with persist middleware |
| Main-process storage | electron-store v11 |
| Build / package | electron-forge |
| Update server | Cloudflare Worker |
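As a rough illustration of the state layer, here is what a persisted Zustand store for conversations might look like in the renderer. The store name and fields are hypothetical, not the project's actual schema:

```ts
// Minimal sketch of a persisted Zustand v5 store in the renderer.
// Store name and fields are assumptions for illustration only.
import { create } from "zustand";
import { persist } from "zustand/middleware";

interface Conversation {
  id: string;
  title: string;
  provider: "openai" | "anthropic" | "lmstudio";
  model: string;
}

interface ConversationState {
  conversations: Conversation[];
  addConversation: (c: Conversation) => void;
}

export const useConversationStore = create<ConversationState>()(
  persist(
    (set) => ({
      conversations: [],
      addConversation: (c) =>
        set((state) => ({ conversations: [...state.conversations, c] })),
    }),
    { name: "conversations" } // key used by the persist middleware
  )
);
```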
- Node.js 20 or later
- npm 10 or later
```bash
git clone https://github.com/OpenConduit/Client.git
cd Client
npm install
npm start
```

```bash
# Current platform
npm run make

# macOS only
npm run make -- --platform darwin

# Windows only
npm run make -- --platform win32
```

Output lands in `out/make/`.
On first launch, open Settings (⌘, on macOS) and add at least one AI provider:
| Provider | Required fields |
|---|---|
| OpenAI | API key, default model |
| Anthropic | API key, default model |
| LM Studio | Base URL (e.g. http://localhost:1234), default model |
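LM Studio works because it exposes an OpenAI-compatible API, so any OpenAI-style client pointed at the base URL above can talk to it. A minimal sketch, assuming the official `openai` npm SDK (the app's actual provider clients may be implemented differently):

```ts
// Sketch: chatting with a local LM Studio server over its OpenAI-compatible API.
// Use of the `openai` SDK and the model name below are assumptions.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:1234/v1", // LM Studio's local endpoint
  apiKey: "lm-studio",                 // LM Studio ignores the key, but the SDK requires one
});

const response = await client.chat.completions.create({
  model: "your-local-model",           // hypothetical model identifier
  messages: [{ role: "user", content: "Hello!" }],
  temperature: 0.7,
});

console.log(response.choices[0].message.content);
```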
Add MCP tool servers in Settings → MCP Servers. Supported transports:
- HTTP-SSE — URL + optional headers
- HTTP Streamable — URL + optional headers
- stdio — command, args, and optional environment variables
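Each transport needs different connection details. The sketch below shows what server entries could look like as TypeScript values; the field names are assumptions, not the app's actual settings schema.

```ts
// Hypothetical shape of an MCP server entry, one variant per supported transport.
type McpServerConfig =
  | { transport: "http-sse"; url: string; headers?: Record<string, string> }
  | { transport: "http-streamable"; url: string; headers?: Record<string, string> }
  | { transport: "stdio"; command: string; args?: string[]; env?: Record<string, string> };

// Example entries (URLs, tokens, and the server package are placeholders).
const examples: McpServerConfig[] = [
  {
    transport: "http-sse",
    url: "https://mcp.example.com/sse",
    headers: { Authorization: "Bearer <token>" },
  },
  {
    transport: "stdio",
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
  },
];
```

stdio servers are launched as local child processes, so the command has to be installed on the machine running the app.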
```
src/
  main.ts               # Electron entry point
  preload.ts            # Context bridge (window.api)
  main/
    ipc.ts              # All IPC handlers
    providers/          # AI provider clients
    mcp/client.ts       # MCP client
    store/settings.ts   # electron-store (main process)
  renderer/
    App.tsx             # Root component
    components/         # UI components
    hooks/              # Chat lifecycle hooks
    stores/             # Zustand stores (renderer)
  shared/
    types.ts            # Shared TypeScript types
worker/                 # Cloudflare Worker (update checks + feedback)
```
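Renderer code never touches Node or Electron APIs directly; it talks to the main process through the `window.api` surface exposed by `preload.ts`. A minimal sketch of that context-bridge pattern (channel names and method signatures are hypothetical, not the project's real API):

```ts
// preload.ts — sketch of the context bridge between renderer and main process.
import { contextBridge, ipcRenderer } from "electron";

contextBridge.exposeInMainWorld("api", {
  // Renderer asks the main process to send a chat message to the active provider.
  sendMessage: (conversationId: string, text: string): Promise<string> =>
    ipcRenderer.invoke("chat:send", conversationId, text),

  // Main process pushes MCP tool-approval requests back to the renderer.
  onToolApprovalRequest: (handler: (request: unknown) => void): void => {
    ipcRenderer.on("mcp:tool-approval", (_event, request) => handler(request));
  },
});

// The matching ipcMain.handle("chat:send", ...) would live in src/main/ipc.ts.
```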
- Fork the repo and create a branch from `main`
- Run `npm run lint` and ensure it passes before opening a PR
- Fill out the pull request template — in particular the Process Boundary Check section (no Node/Electron imports in renderer files)
- Sign the Contributor License Agreement — the CLA bot will prompt you on your first PR
OpenConduit is source-available under the GNU AGPL v3.0 for personal and open-source use. A separate commercial license is available for commercial deployments — contact contact@openconduit.ai for details.