Didim Agent CLI is an open-source AI agent that brings the power of multiple AI providers directly into your terminal. Built on the Gemini CLI foundation, it supports Gemini, Claude, OpenAI, and OpenAI-compatible (vLLM, Ollama, LM Studio) endpoints through a unified provider adapter architecture, giving you the most direct path from your prompt to your preferred model.
Learn all about Didim Agent CLI in our documentation.
- 🧠 Multi-provider support: Use Gemini, Claude, OpenAI, or local models (vLLM/Ollama) — switch providers and models with `/model` or `/auth login`.
- 🔧 Built-in tools: Google Search grounding, file operations, shell commands, web fetching — all tools work across providers.
- 🔌 Extensible: MCP (Model Context Protocol) support with deterministic tool naming and sLM-compatible parameter normalization.
- 🤖 Sub-agent support: Sub-agents work with all providers via the provider-independent `llm*` pipeline.
- 💻 Terminal-first: Designed for developers who live in the command line.
- 🛡️ Open source: Apache 2.0 licensed.
- Node.js version 20 or higher
- macOS, Linux, or Windows
Using npx (no installation required):

```shell
npx @didim365/agent-cli
```

Install globally with npm:

```shell
npm install -g @didim365/agent-cli
```

Install with Homebrew:

```shell
brew install gemini-cli
```

Install with MacPorts:

```shell
sudo port install gemini-cli
```

Install with conda:

```shell
# Create and activate a new environment
conda create -y -n gemini_env -c conda-forge nodejs
conda activate gemini_env

# Install Gemini CLI globally via npm (inside the environment)
npm install -g @didim365/agent-cli
```

See Releases for more details.
- Preview: New preview releases are published each week on Tuesdays at 23:59 UTC. These releases have not been fully vetted and may contain regressions or other outstanding issues. Please help us test, and install with the `preview` tag:

  ```shell
  npm install -g @didim365/agent-cli@preview
  ```

- Stable: New stable releases are published each week on Tuesdays at 20:00 UTC. Each is the full promotion of the previous week's `preview` release, plus any bug fixes and validations. Use the `latest` tag:

  ```shell
  npm install -g @didim365/agent-cli@latest
  ```

- Nightly: New releases are published each day at 00:00 UTC, containing all changes from the main branch at the time of release. Assume there are pending validations and issues. Use the `nightly` tag:

  ```shell
  npm install -g @didim365/agent-cli@nightly
  ```

- Query and edit large codebases
- Generate new apps from PDFs, images, or sketches using multimodal capabilities
- Debug issues and troubleshoot with natural language
- Automate operational tasks like querying pull requests or handling complex rebases
- Use MCP servers to connect new capabilities, including media generation with Imagen, Veo or Lyria
- Run non-interactively in scripts for workflow automation
- Ground your queries with built-in Google Search for real-time information
- Conversation checkpointing to save and resume complex sessions
- Custom context files (AGENTS.md) to tailor behavior for your projects
Integrate Gemini CLI directly into your GitHub workflows with Gemini CLI GitHub Action:
- Pull Request Reviews: Automated code review with contextual feedback and suggestions
- Issue Triage: Automated labeling and prioritization of GitHub issues based on content analysis
- On-demand Assistance: Mention `@gemini-cli` in issues and pull requests for help with debugging, explanations, or task delegation
- Custom Workflows: Build automated, scheduled and on-demand workflows tailored to your team's needs
Choose the authentication method that best fits your needs. You can also use
/auth login inside the CLI to interactively select a provider and enter your
API key.
Note: Both `DIDIM_*` and `GEMINI_*` environment variable prefixes are supported. The CLI uses a central `resolveEnv()` utility that checks `DIDIM_*` first, then falls back to `GEMINI_*` for backward compatibility.
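The precedence can be pictured with a small sketch (hypothetical code; the CLI's actual `resolveEnv()` may differ in signature and details):

```typescript
// Hypothetical sketch of the DIDIM_* -> GEMINI_* fallback described above;
// the CLI's real resolveEnv() may differ in signature and details.
type Env = Record<string, string | undefined>;

function resolveEnv(name: string, env: Env): string | undefined {
  // Prefer the DIDIM_-prefixed variable, then fall back to GEMINI_.
  return env[`DIDIM_${name}`] ?? env[`GEMINI_${name}`];
}

// DIDIM_* wins when both prefixes are set:
console.log(resolveEnv("API_KEY", { DIDIM_API_KEY: "new", GEMINI_API_KEY: "old" })); // "new"
// GEMINI_* still works on its own for backward compatibility:
console.log(resolveEnv("API_KEY", { GEMINI_API_KEY: "old" })); // "old"
```

In practice this means existing `GEMINI_*`-based setups keep working unchanged, and `DIDIM_*` variables override them when both are present.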
✨ Best for: Individual developers and Gemini Code Assist license holders.
```shell
didim
# Select "Login with Google" and follow the browser authentication flow
```

For organization accounts, set your Google Cloud project first:

```shell
export GOOGLE_CLOUD_PROJECT="YOUR_PROJECT_ID"
didim
```

✨ Best for: Developers who need specific Gemini model control.
```shell
export GEMINI_API_KEY="YOUR_API_KEY"
didim
```

✨ Best for: Developers who prefer Claude models (Opus, Sonnet, Haiku).
```shell
export ANTHROPIC_API_KEY="YOUR_API_KEY"
didim
```

✨ Best for: Developers who prefer OpenAI models (GPT-4.1, o3, o4-mini).
```shell
export OPENAI_API_KEY="YOUR_API_KEY"
didim
```

✨ Best for: Enterprise teams and production workloads.
```shell
export GOOGLE_API_KEY="YOUR_API_KEY"
export GOOGLE_GENAI_USE_VERTEXAI=true
didim
```

✨ Best for: Local/self-hosted models and privacy-sensitive environments.
Using `/auth login` (Recommended):

```shell
didim
# Run /auth login, select "sLM (OpenAI-compatible endpoint)"
# Follow the 4-step wizard: URL → Server Type → Credentials → Advanced
```

Using environment variables:

```shell
export ENABLE_MULTI_PROVIDER=true
export LLM_PROVIDER=openai-compatible
export LLM_BASE_URL="http://localhost:8000/v1"
export LLM_MODEL="your-model-name"
didim
```

Limiting tools for context-constrained sLMs:
Add to `~/.didim/settings.json`:
```json
{
  "tools": {
    "core": [
      "read_file",
      "search_file_content",
      "glob",
      "replace",
      "write_file",
      "run_shell_command"
    ]
  }
}
```

For detailed setup for each provider, see the authentication guide and provider guide.
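For context, the `openai-compatible` provider targets servers that speak the standard chat-completions wire format. A minimal sketch of the request body such an adapter might send (illustrative only; this is not the CLI's internal adapter code):

```typescript
// Illustrative request body for an OpenAI-compatible server (vLLM, Ollama,
// LM Studio). This is the standard chat-completions shape, not the CLI's
// internal adapter implementation.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
  stream: boolean;
}

function buildChatRequest(model: string, prompt: string): ChatRequest {
  return {
    model, // e.g. the value of LLM_MODEL
    messages: [{ role: "user", content: prompt }],
    stream: true, // responses are streamed token by token
  };
}

// Such a request would be POSTed to `${LLM_BASE_URL}/chat/completions`.
const req = buildChatRequest("your-model-name", "Explain this codebase");
console.log(req.model); // "your-model-name"
```

Any server that accepts this shape at `/chat/completions` should work with the `LLM_BASE_URL` configuration above.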
Start the CLI in your project directory:

```shell
didim
```

Include additional directories:

```shell
didim --include-directories ../lib,../docs
```

Select a specific model:

```shell
didim -m gemini-2.5-flash           # Gemini
didim -m claude-sonnet-4-5-20250929 # Claude
didim -m gpt-4.1                    # OpenAI
```

Get a simple text response:
```shell
didim -p "Explain the architecture of this codebase"
```

For more advanced scripting, including how to parse JSON and handle errors, use the `--output-format json` flag to get structured output:

```shell
didim -p "Explain the architecture of this codebase" --output-format json
```

For real-time event streaming (useful for monitoring long-running operations), use `--output-format stream-json` to get newline-delimited JSON events:
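Each line of that stream is a single JSON event. A consumer might be sketched like this (the event fields used here are assumptions for illustration, not the CLI's documented stream-json schema):

```typescript
// Hypothetical consumer of newline-delimited JSON events. The fields used
// here ("type", "name", "status") are illustrative assumptions; consult the
// Headless Mode guide for the actual stream-json event schema.
interface StreamEvent {
  type: string;
  [key: string]: unknown;
}

function parseEvents(ndjson: string): StreamEvent[] {
  return ndjson
    .split("\n")
    .filter((line) => line.trim() !== "") // skip blank lines
    .map((line) => JSON.parse(line) as StreamEvent);
}

// Simulated two-event stream:
const sample =
  '{"type":"tool_call","name":"run_shell_command"}\n' +
  '{"type":"result","status":"ok"}\n';

for (const event of parseEvents(sample)) {
  console.log(event.type); // react to events as they arrive
}
```

A real consumer would read the CLI's stdout line by line rather than a string, e.g. by piping `didim -p "..." --output-format stream-json` into it.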
```shell
didim -p "Run tests and deploy" --output-format stream-json
```

Start a new project:

```shell
cd new-project/
didim
> Write me a Discord bot that answers questions using a FAQ.md file I will provide
```

Work with an existing project:

```shell
git clone https://github.com/user/project
cd project
didim
> Give me a summary of all of the changes that went in yesterday
```

- Quickstart Guide - Get up and running quickly.
- Authentication Setup - Detailed auth configuration.
- Configuration Guide - Settings and customization.
- Keyboard Shortcuts - Productivity tips.
- Commands Reference - All slash commands (`/help`, `/chat`, etc.).
- Custom Commands - Create your own reusable commands.
- Context Files (AGENTS.md) - Provide persistent context to the CLI.
- Checkpointing - Save and resume conversations.
- Token Caching - Optimize token usage.
- Built-in Tools Overview
- MCP Server Integration - Extend with custom tools.
- Custom Extensions - Build and share your own commands.
- Headless Mode (Scripting) - Use Gemini CLI in automated workflows.
- Provider Guide - Multi-provider runtime usage (`gemini`, `claude`, `openai`, `openai-compatible`, including vLLM).
- Multi-Provider Configuration - Environment variables and precedence for provider/model resolution.
- Migration Guide - Move from Gemini-only to provider-independent (`llm*`) call paths.
- Provider Adapter API - Adapter contract and streaming/event types overview.
- Architecture Overview - How Gemini CLI works.
- IDE Integration - VS Code companion.
- Sandboxing & Security - Safe execution environments.
- Trusted Folders - Control execution policies by folder.
- Enterprise Guide - Deploy and manage in a corporate environment.
- Telemetry & Monitoring - Usage tracking.
- Tools API Development - Create custom tools.
- Local development - Local development tooling.
For example, to use a local vLLM server:

```shell
export ENABLE_MULTI_PROVIDER=true
export LLM_PROVIDER=openai-compatible
export LLM_BASE_URL="http://localhost:8000/v1"
export LLM_MODEL="Qwen/Qwen2.5-7B-Instruct"
didim
```

Or use the interactive wizard:
```shell
didim
# /auth login → sLM → Enter URL → Select vLLM → Enter model name
```

- Troubleshooting Guide - Common issues and solutions.
- FAQ - Frequently asked questions.
- Use the `/bug` command to report issues directly from the CLI.
Configure MCP servers in `~/.didim/settings.json` (or `~/.gemini/settings.json` for backward compatibility) to extend the CLI with custom tools:
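For example (a sketch that assumes the same `mcpServers` settings schema as upstream Gemini CLI; the server name and command below are placeholders for whichever MCP server you run):

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
```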
```text
> @github List my open pull requests
> @slack Send a summary of today's commits to #dev channel
> @database Run a query to find inactive users
```
MCP tool naming is deterministic — tools are registered with consistent names regardless of server discovery order. Tool parameters are automatically normalized via schema-based coercion, with enhanced tolerance for sLM (small Language Model) tool call formatting.
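As an illustration of what schema-based coercion means in practice (hypothetical code; the CLI's actual normalization logic is not shown here): sLMs often emit numbers or booleans as strings, and values are nudged toward the type declared in the tool's JSON Schema.

```typescript
// Hypothetical sketch of schema-based parameter coercion. Not the CLI's
// actual implementation: it only illustrates the idea of coercing loosely
// formatted sLM tool-call values toward the declared JSON Schema type.
type SchemaType = "string" | "number" | "boolean";

function coerceParam(value: unknown, type: SchemaType): unknown {
  if (type === "number" && typeof value === "string" && value.trim() !== "" && !Number.isNaN(Number(value))) {
    return Number(value); // "5" -> 5
  }
  if (type === "boolean" && typeof value === "string") {
    if (value === "true") return true;   // "true" -> true
    if (value === "false") return false; // "false" -> false
  }
  if (type === "string" && typeof value === "number") {
    return String(value); // 5 -> "5"
  }
  return value; // already the right type, or not safely coercible
}

console.log(coerceParam("5", "number"));     // 5
console.log(coerceParam("true", "boolean")); // true
console.log(coerceParam("hello", "string")); // "hello"
```

Values that cannot be coerced safely are passed through unchanged, so well-formed tool calls from larger models are unaffected.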
See the MCP Server Integration guide for setup instructions.
We welcome contributions! Gemini CLI is fully open source (Apache 2.0), and we encourage the community to:
- Report bugs and suggest features.
- Improve documentation.
- Submit code improvements.
- Share your MCP servers and extensions.
See our Contributing Guide for development setup, coding standards, and how to submit pull requests.
Check our Official Roadmap for planned features and priorities.
- Official Roadmap - See what's coming next.
- Changelog - See recent notable updates.
- NPM Package - Package registry.
- GitHub Issues - Report bugs or request features.
- Security Advisories - Security updates.
See the Uninstall Guide for removal instructions.
- License: Apache License 2.0
- Terms of Service: Terms & Privacy
- Security: Security Policy
Built on Gemini CLI by Google — extended by Didim365
