A terminal-based autonomous coding agent with natural conversation and MCP-powered file operations. Works completely free with Ollama and the standard filesystem MCP server, or use Morph (paid) for faster performance.
- 🤖 Autonomous Agent: Multi-step tool calling - AI autonomously explores, gathers information, and completes tasks
- 💬 Natural Conversation: Chat naturally with an AI coding assistant
- 📁 MCP-Based File Operations: All file operations delegated to MCP servers (read, write, edit, create, delete)
- 🆓 100% Free Option: Works with Ollama (free) + standard filesystem MCP server (free, no API keys)
- 🗂️ Smart Context Management: Add files to conversation context for analysis and modification
- 🔒 Read-Only References: Mark files as read-only to prevent accidental modifications
- 🔌 Multi-Server Support: Connect to multiple MCP servers for extended functionality
- 🦙 Ollama Support: Works with Ollama cloud models and local models
- 🔄 OpenAI Compatible: Works with any OpenAI-compatible API
- 🔄 Command History: Navigate previous commands with arrow keys (↑/↓)
- 🚀 Auto-Configuration: Automatically detects your project directory
```bash
pip install a-coder-cli
cp config.json.example config.json
# Edit config.json with your API key and model settings
```

```bash
# Navigate to your project
cd /path/to/your/project

# Start A-Coder
a-coder --config ~/config.json

# Or use the short form:
a-coder -c ~/config.json
```

That's it! The filesystem server automatically configures itself to your current directory.
Use the up/down arrow keys to cycle through your command history. Your command history persists between sessions, making it easy to reuse complex commands.
- Add files: `/add path/to/file`
- Read-only mode: `/readonly path/to/file`
- List context: `/context`
- Clear context: `/clear`
```bash
pip install a-coder-cli
```

Or install from source:

```bash
git clone https://github.com/morph-llm/a-coder-cli.git
cd a-coder-cli
pip install -e .
```

Copy the example configuration and customize it:

```bash
cp config.json.example config.json
```

Then edit config.json with your settings:
```json
{
  "openai": {
    "api_key": "your-api-key",
    "base_url": "https://api.openai.com/v1",
    "model": "gpt-4"
  },
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/your/project"
      ]
    }
  }
}
```

Set the A_CODER_CONFIG_PATH environment variable to avoid passing --config every time:
```bash
export A_CODER_CONFIG_PATH="$HOME/config.json"
```

ACODER_CONFIG_PATH and ACODER_CONFIG remain as legacy aliases for backwards compatibility.
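A loader supporting these variables might resolve the config path in priority order: explicit CLI flag first, then the current variable, then the legacy aliases. This is an illustrative sketch; the function name and exact precedence are assumptions, not the actual implementation:

```python
import os
from typing import Optional

def resolve_config_path(cli_arg: Optional[str] = None) -> Optional[str]:
    """Pick the config path: CLI flag wins, then A_CODER_CONFIG_PATH,
    then the legacy aliases, checked in order."""
    if cli_arg:
        return cli_arg
    for var in ("A_CODER_CONFIG_PATH", "ACODER_CONFIG_PATH", "ACODER_CONFIG"):
        path = os.environ.get(var)
        if path:
            return path
    return None  # caller falls back to defaults or errors out
```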
Option 1: Standard Filesystem Server (Free, Recommended)
```json
"filesystem": {
  "command": "npx",
  "args": [
    "-y",
    "@modelcontextprotocol/server-filesystem",
    "/path/to/your/project"
  ]
}
```

Note: The path /path/to/your/project will be automatically replaced with your current working directory when you run a-coder. You can also specify absolute paths for multiple directories:
```json
"filesystem": {
  "command": "npx",
  "args": [
    "-y",
    "@modelcontextprotocol/server-filesystem",
    "/Users/username/projects/my-app",
    "/Users/username/projects/shared-lib"
  ]
}
```

Option 2: Morph Filesystem Server (Requires API Key)
```json
"filesystem-with-morph": {
  "command": "npx",
  "args": ["@morph-llm/morph-fast-apply"],
  "env": {
    "MORPH_API_KEY": "your-morph-api-key",
    "ALL_TOOLS": "true"
  }
}
```

Option 3: Context7 (Optional, for library documentation)
```json
"context7": {
  "command": "npx",
  "args": ["-y", "@upstash/context-sdk"]
}
```

Navigate to your project directory and run:
```bash
cd /path/to/your/project
a-coder --config config.json
```

The filesystem server will automatically use the current directory. Or with command-line options:

```bash
a-coder --openai-key "your-key" --config config.json
```

Example:

```bash
cd ~/projects/my-web-app
a-coder --config ~/config.json
# Filesystem server will automatically access ~/projects/my-web-app
```

- `/add <filepath>` - Add file to conversation context (editable)
- `/add-ro <filepath>` - Add file as read-only reference
- `/files` - List all added files with status
- `/remove <filepath>` - Remove file from context
- `/clear-files` - Clear all added files
- `/mcp-list` - List connected MCP servers
- `/mcp-tools <server>` - List available tools from a server
- `/mcp-call <server> <tool> [args]` - Call an MCP tool directly
- `/help` - Show detailed help
- `/clear` - Clear screen
- `/exit` or `/quit` - Exit application
```text
You> /add src/app.py
✓ Added src/app.py

You> /add-ro docs/architecture.md
✓ Added docs/architecture.md (read-only)

You> /files
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Added Files                                        ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩
│ File              │ Status    │ Size              │
├───────────────────┼───────────┼───────────────────┤
│ src/app.py        │ editable  │ 2,345 bytes       │
│ docs/architecture │ read-only │ 5,678 bytes       │
└───────────────────┴───────────┴───────────────────┘

You> Can you refactor the main function to be more modular?

A-Coder> I'll analyze the code and suggest improvements...
```
- ACoderCLI: Main application class managing conversation and file context
- MCP Integration: Delegates all file operations to MCP servers
- OpenAI Integration: Uses your configured model (GPT-4 in the example config, or any OpenAI-compatible model) for intelligent responses
- File Context: Maintains added files and includes them in conversation
1. User requests a file modification
2. AI analyzes the request and the added files
3. AI calls the appropriate MCP tool (edit_file, write_file, etc.)
4. MCP server performs the operation
5. Result is returned to the user
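The flow above can be sketched in Python. Everything here is illustrative: `call_model` stands in for the chat-completion request, the stub tools mirror the filesystem server's tool names, and none of these identifiers come from the actual codebase:

```python
def run_agent(messages, call_model, tools):
    """Minimal autonomous tool-calling loop: keep asking the model
    until it answers without requesting another tool."""
    while True:
        reply = call_model(messages)           # model decides: answer or tool call
        if "tool_call" not in reply:
            return reply["content"]            # final answer for the user
        name, args = reply["tool_call"]
        result = tools[name](**args)           # delegate the operation to the MCP server
        messages.append({"role": "tool", "name": name, "content": result})

# Stubbed model and tool registry, just to show the control flow:
def fake_model(messages):
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": ("read_text_file", {"path": "app.py"})}
    return {"content": "Done: refactored main()."}

tools = {"read_text_file": lambda path: f"<contents of {path}>"}
print(run_agent([{"role": "user", "content": "Refactor main()"}], fake_model, tools))
```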
The CLI supports any MCP server. Popular options:
The standard MCP filesystem server provides comprehensive file operations:
Available Tools:
- `read_text_file` - Read file contents with optional head/tail limits
- `write_file` - Create or overwrite files
- `edit_file` - Make selective edits with pattern matching
- `list_directory` - List directory contents
- `create_directory` - Create new directories
- `move_file` - Move or rename files/directories
- `search_files` - Search for files matching patterns
- `directory_tree` - Get recursive directory structure
- `get_file_info` - Get file metadata
- `list_allowed_directories` - List accessible directories
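To illustrate how `edit_file`'s selective edits work, here is a hedged toy version: the server matches an `oldText` pattern and substitutes `newText` for each edit in the request. This sketch does literal replacement only; the real server's matching rules (e.g. whitespace tolerance) are its own:

```python
def apply_edits(content: str, edits: list) -> str:
    """Apply a list of {oldText, newText} edits in order,
    mimicking the shape of an edit_file request."""
    for edit in edits:
        old, new = edit["oldText"], edit["newText"]
        if old not in content:
            raise ValueError(f"pattern not found: {old!r}")
        content = content.replace(old, new, 1)  # replace first occurrence only
    return content

source = "def main():\n    print('hi')\n"
patched = apply_edits(source, [
    {"oldText": "print('hi')", "newText": "print('hello')"},
])
print(patched)
```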
Setup:
```json
"filesystem": {
  "command": "npx",
  "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
}
```

Fast file editing via Morph Apply (10,500+ tokens/sec). Same tools as above plus additional optimizations.
Setup:
```json
"filesystem-with-morph": {
  "command": "npx",
  "args": ["@morph-llm/morph-fast-apply"],
  "env": {
    "MORPH_API_KEY": "your-morph-api-key",
    "ALL_TOOLS": "true"
  }
}
```

When to use Morph:
- ⚡ Need maximum performance (10,500+ tokens/sec)
- 🏢 Enterprise/production environments
- 💼 Professional development workflows
When to use free filesystem server:
- 🆓 Personal projects
- 🎓 Learning and experimentation
- 💻 Standard development workflows
Documentation and code context for libraries and frameworks.
Any MCP-compatible server can be added to the configuration.
- Use `/add-ro` for reference files: Prevents accidental modifications
- Keep context focused: Only add relevant files to reduce token usage
- Use `edit_file` over `write_file`: Much faster for modifications
- Batch related changes: Make multiple edits in one request
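For example, two related changes can travel in one hypothetical `edit_file` call rather than two round trips (the argument shape shown here is assumed from the tool list above, not taken from the server's schema):

```json
{
  "path": "src/app.py",
  "edits": [
    { "oldText": "DEBUG = True", "newText": "DEBUG = False" },
    { "oldText": "PORT = 8000",  "newText": "PORT = 8080" }
  ]
}
```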
Set via environment variable or config file:
```bash
export OPENAI_API_KEY="your-key"
```

Verify server configuration in config.json and check server logs.
Use absolute paths or ensure files are relative to current working directory.
```
a-coder-cli/
├── a_coder_cli.py    # Main application
├── config.py         # Configuration management
├── config.json       # Configuration file
├── requirements.txt  # Python dependencies
├── setup.py          # Package setup
└── README.md         # This file
```
```bash
python -m pytest tests/
```

MIT
For issues and feature requests, visit the GitHub repository.