Dive is an open-source MCP Host Desktop Application that seamlessly integrates with any LLM that supports function calling. ✨
- 🌐 Universal LLM Support: Compatible with ChatGPT, Anthropic, Ollama and OpenAI-compatible models
- 💻 Cross-Platform: Available for Windows, macOS, and Linux
- 🔄 Model Context Protocol: Enabling seamless MCP AI agent integration in both stdio and SSE modes
- ☁️ OAP Cloud Integration: One-click access to managed MCP servers via OAPHub.ai - eliminates complex local deployments
- 🏗️ Dual Architecture: Modern Tauri version alongside traditional Electron version for optimal performance
- 🌍 Multi-Language Support: Traditional Chinese, Simplified Chinese, English, Spanish, Japanese, Korean with more coming soon
- ⚙️ Advanced API Management: Multiple API keys and model switching support via `model_settings.json`
- 🛠️ Granular Tool Control: Enable/disable individual MCP tools for precise customization
- 💡 Custom Instructions: Personalized system prompts for tailored AI behavior
- 🔄 Auto-Update Mechanism: Automatically checks for and installs the latest application updates
- 🏗️ Dual Architecture Support: Dive now supports both Electron and Tauri frameworks simultaneously
- ⚡ Tauri Version: New modern architecture with optimized installer size (Windows < 30MB)
- 🌐 OAP Platform Integration: Native support for OAPHub.ai cloud services with one-click MCP server deployment
- 🔐 OAP Authentication: Comprehensive OAP login and authentication support
- 📁 Enhanced Model Configuration: Complete restructuring around `model_settings.json` for managing multiple models
- 🛠️ Granular MCP Control: Individual tool enable/disable functionality for better customization
- 🎨 UI/UX Enhancements: Various interface improvements across the application
- 🔄 Updated dive-mcp-host: Latest architectural improvements incorporated
- Windows: Available in both Electron and Tauri versions ✅
- macOS: Currently Electron only 🔜
- Linux: Currently Electron only 🔜
Migration Note: Existing local MCP/LLM configurations remain fully supported. OAP integration is additive and does not affect current workflows.
Get the latest version of Dive:
Choose between two architectures:
- Tauri Version (Recommended): Smaller installer (<30MB), modern architecture
- Electron Version: Traditional architecture, fully stable
- Python and Node.js environments will be downloaded automatically after launching
- Electron Version: Download the .dmg version
- You need to install the Python and Node.js environments (including `npx` and `uvx`) yourself; a quick version check is shown below
- Follow the installation prompts to complete setup
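As a quick sanity check, you can verify the prerequisites from a terminal (these are standard version flags; exact output will vary):

```bash
# Confirm the runtimes the Electron build expects are installed
node --version     # Node.js (ships with npx)
npx --version
python3 --version  # Python
uvx --version      # uvx is provided by the uv package manager
```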
- Electron Version: Download the .AppImage version
- You need to install the Python and Node.js environments (including `npx` and `uvx`) yourself
- For Ubuntu/Debian users:
- You may need to add the `--no-sandbox` parameter, or modify system settings to allow the sandbox
- Run `chmod +x` on the AppImage file to make it executable
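For example, assuming the downloaded file is named `Dive-linux-x86_64.AppImage` (the real filename varies by release):

```bash
# Make the AppImage executable, then launch it
chmod +x Dive-linux-x86_64.AppImage
# Append --no-sandbox only if you hit the sandbox error mentioned above
./Dive-linux-x86_64.AppImage --no-sandbox
```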
Dive offers two ways to access MCP tools: OAP Cloud Services (recommended for beginners) and Local MCP Servers (for advanced users).
The easiest way to get started! Access enterprise-grade MCP tools instantly:
- Sign up at OAPHub.ai
- Connect to Dive using one-click deep links or configuration files
- Enjoy managed MCP servers with zero setup - no Python, Docker, or complex dependencies required
Benefits:
- ✅ Zero configuration needed
- ✅ Cross-platform compatibility
- ✅ Enterprise-grade reliability
- ✅ Automatic updates and maintenance
For advanced users who prefer local control: Dive ships with a default echo MCP Server, and you can add more powerful tools such as Fetch and yt-dlp. Add the following JSON configuration to your Dive MCP settings to enable local tools:
"mcpServers":{
"fetch": {
"command": "uvx",
"args": [
"mcp-server-fetch",
"--ignore-robots-txt"
],
"enabled": true
},
"filesystem": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-filesystem",
"/path/to/allowed/files"
],
"enabled": true
},
"youtubedl": {
"command": "npx",
"args": [
"@kevinwatt/yt-dlp-mcp"
],
"enabled": true
}
}
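The `/path/to/allowed/files` value is a placeholder: the filesystem server only exposes the directories passed as arguments, and you can list more than one. A sketch with assumed local paths (replace them with real directories on your machine):

```json
"filesystem": {
  "command": "npx",
  "args": [
    "-y",
    "@modelcontextprotocol/server-filesystem",
    "/home/alice/Documents",
    "/home/alice/Projects"
  ],
  "enabled": true
}
```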
You can connect to external cloud MCP servers via the Streamable HTTP transport. Here is an example Dive configuration for the SearXNG service from OAPHub:
{
  "mcpServers": {
    "SearXNG_MCP_Server": {
      "transport": "streamable",
      "url": "https://proxy.oaphub.ai/v1/mcp/181672830075666436",
      "headers": {
        "Authorization": "GLOBAL_CLIENT_TOKEN"
      }
    }
  }
}
Reference: https://oaphub.ai/mcp/181672830075666436
You can also connect to external MCP servers (not local ones) via SSE (Server-Sent Events). Add this configuration to your Dive MCP settings:
{
  "mcpServers": {
    "MCP_SERVER_NAME": {
      "enabled": true,
      "transport": "sse",
      "url": "YOUR_SSE_SERVER_URL"
    }
  }
}
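For instance, a filled-in sketch with a hypothetical server name and URL (both are placeholders; substitute your provider's values):

```json
{
  "mcpServers": {
    "weather": {
      "enabled": true,
      "transport": "sse",
      "url": "https://mcp.example.com/sse"
    }
  }
}
```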
yt-dlp-mcp requires the yt-dlp package. Install it based on your operating system:
- Windows: `winget install yt-dlp`
- macOS: `brew install yt-dlp`
- Linux: `pip install yt-dlp`
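After installing, you can confirm yt-dlp is on your `PATH`:

```bash
yt-dlp --version
```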
See BUILD.md for more details.
- 💬 Join our Discord
- 🐦 Follow us on Twitter/X and join the Reddit thread
- ⭐ Star us on GitHub
- 🐛 Report issues on our Issue Tracker