Your AI-powered coding companion, right inside your terminal.
ShellMate is a lightweight, open-source terminal AI assistant that brings the power of modern AI coding tools — like Claude Code, GitHub Copilot CLI, and Warp AI — directly to your command line. Built with TypeScript and React Ink, it provides an interactive REPL with streaming responses, an extensible tool system, and seamless integration with any AI model via OpenRouter.
Whether you're navigating a codebase, writing scripts, refactoring code, or just exploring — ShellMate has you covered.
ShellMate is a TypeScript-based terminal AI assistant that demonstrates the fundamental architecture of AI-powered coding tools. It connects to any LLM available on OpenRouter (Claude, GPT-4, Gemini, LLaMA, and more), executes tools locally on your machine, and streams responses token-by-token for a snappy, real-time experience.
On first run, ShellMate interactively prompts you to configure your AI model and API key — no manual .env setup required.
- 💬 Interactive REPL — Real-time streaming chat interface built with React and Ink
- 🛠️ 8 Built-in Tools — File read/write/edit, shell commands, glob, grep, AST analyzer, and ask user — all callable by the AI
- 🔄 Streaming Responses — Token-by-token output for a responsive, conversational feel
- 🎨 Beautiful Terminal UI — Colorized output, spinners, and a friendly welcome screen
- 🔌 Multi-Model Support — Use any model on OpenRouter: Claude, GPT-4, Gemini, LLaMA, and more
- 🔀 Live Model Switching — Change your AI model and API key on the fly with `/change-model`
- 🎛️ Configurable Max Tokens — Control response length per request with `/maxtokens`
- 🌲 AST Code Analyzer — Understand code structure (functions, classes, imports, exports) before making changes
- ⚡ Zero-Config Setup — Interactive first-run wizard configures your model and API key automatically
- 📁 Persistent Config — All configuration stored in a local `.shellmate/` directory
- 📦 Fully Typed — End-to-end TypeScript with Zod schema validation
- Node.js 18+
- An OpenRouter API key (available at https://openrouter.ai)
Install globally from npm:

```bash
npm install -g shellmate-cli
```

Or build from source:

```bash
git clone https://github.com/imramkrishna/shellmate.git
cd shellmate
npm install
npm run build
```

When you launch ShellMate for the first time, it will interactively prompt you to configure:
- AI Model — The model identifier from OpenRouter (e.g., `anthropic/claude-sonnet-4`)
- API Key — Your OpenRouter API key
This configuration is saved locally in a .shellmate/ directory so you only need to do it once.
```bash
# Launch ShellMate
shellmate

# Or use the short alias
cc
```

```bash
# Start with your configured default model
shellmate

# Override the model for a single session
shellmate --model anthropic/claude-3.5-sonnet
```
Example queries once inside the REPL:

```
> List all TypeScript files in the current directory
> Read the package.json and explain the dependencies
> Create a hello world script in Python
> Search for TODO comments in the codebase
> Refactor the login function to use async/await
```

ShellMate supports special commands inside the REPL:
| Command | Description |
|---|---|
| `/change-model` | Interactively update your AI model and API key |
| `/maxtokens` | Set the maximum number of tokens per response |
ShellMate provides 8 built-in tools that the AI can use to interact with your local environment:
| Tool | Description | Example Use Cases |
|---|---|---|
| bash | Execute shell commands with timeout support (30s default, 10MB buffer) | Running scripts, installing packages, git operations |
| read | Read file contents with line numbers and pagination | Viewing source code, inspecting config files |
| write | Create or overwrite files (auto-creates directories) | Generating new files, scripts, configurations |
| edit | Replace exact string matches in files (supports replace_all) |
Precise code modifications, refactoring |
| glob | Find files matching glob patterns | File discovery, listing source files |
| grep | Search file contents using regex (uses ripgrep if available) | Finding code patterns, TODOs, function definitions |
| analyze_file_structure | AST-based code structure analysis for 30+ languages | Understanding codebase structure before making changes |
| ask_user | Ask the user questions during tool execution | Clarifying ambiguous requests, confirming destructive actions |
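Internally, each tool in this style of architecture pairs an OpenAI-compatible JSON Schema definition with a local executor. A minimal sketch in TypeScript — names like `ToolDef`, `registry`, and the stubbed `glob` executor are illustrative, not ShellMate's actual API:

```typescript
// Hypothetical tool definition: an OpenAI-compatible schema plus a local executor.
type ToolDef = {
  name: string;
  description: string;
  parameters: Record<string, unknown>; // JSON Schema for the tool's arguments
  execute: (args: Record<string, unknown>) => Promise<string>;
};

const registry = new Map<string, ToolDef>();

registry.set("glob", {
  name: "glob",
  description: "Find files matching a glob pattern",
  parameters: {
    type: "object",
    properties: { pattern: { type: "string" } },
    required: ["pattern"],
  },
  // Stubbed here; the real tool would hit the filesystem.
  execute: async (args) => `matched: ${args.pattern}`,
});

// When the model emits a tool call, look it up and execute it locally.
async function dispatch(name: string, args: Record<string, unknown>): Promise<string> {
  const tool = registry.get(name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.execute(args);
}

dispatch("glob", { pattern: "src/**/*.ts" }).then(console.log); // prints: matched: src/**/*.ts
```

The `parameters` schema is what gets sent to the model alongside each request, so the AI knows how to call the tool; `execute` never leaves your machine.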
The analyze_file_structure tool uses ts-morph to parse source files and extract structural information via Abstract Syntax Trees. It works on individual files or entire directories.
Extracted Information:
- Functions — Name, parameters, return type, and line number
- Classes — Name, methods (with params, return types, line numbers), and properties
- Imports — Module source, named specifiers, default imports
- Exports — Exported declaration names
- Variable Declarations — Top-level constants and variables
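The extracted information above can be modeled as a typed result. The following TypeScript shape is an illustration based on that list — the field names are assumptions, not ShellMate's exact output format:

```typescript
// Illustrative shape of the analyzer's output; field names are assumptions.
interface FunctionInfo { name: string; params: string[]; returnType: string; line: number }
interface ClassInfo { name: string; methods: FunctionInfo[]; properties: string[] }
interface ImportInfo { source: string; named: string[]; defaultImport?: string }

interface FileStructure {
  functions: FunctionInfo[];
  classes: ClassInfo[];
  imports: ImportInfo[];
  exports: string[];
  variables: { name: string; line: number }[];
}

// A compact summary the AI could read before deciding how to edit a file.
function summarize(s: FileStructure): string {
  return `${s.functions.length} function(s), ${s.classes.length} class(es), ` +
         `${s.imports.length} import(s), ${s.exports.length} export(s)`;
}
```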
Supported Languages:
| Category | Extensions |
|---|---|
| JavaScript / TypeScript | .ts, .tsx, .js, .jsx, .mts, .cts, .mjs, .cjs |
| Python | .py, .pyw, .pyi |
| Java / Kotlin / Scala | .java, .kt, .kts, .scala |
| C / C++ / Objective-C | .c, .h, .cpp, .cxx, .cc, .hpp, .hxx, .m, .mm |
| C# | .cs |
| Go | .go |
| Rust | .rs |
| Ruby | .rb, .erb |
| PHP | .php |
| Swift | .swift |
| Dart | .dart |
| Shell | .sh, .bash, .zsh, .fish |
| Elixir / Erlang | .ex, .exs, .erl |
| Haskell | .hs |
| Lua | .lua |
| R | .r, .R |
| Perl | .pl, .pm |
| Zig / Nim / Julia | .zig, .nim, .jl |
| Web / Markup | .html, .css, .scss, .sass, .less, .vue, .svelte |
| Config / Data | .json, .yaml, .yml, .toml, .xml, .graphql, .gql |
| SQL | .sql |
| Markdown | .md, .mdx |
The analyzer automatically skips common non-source directories (node_modules, dist, build, .git, target, .venv, coverage, etc.).
The ask_user tool allows the AI to interactively ask you questions during tool execution. It supports three input types:
- `text` — Free-form text input
- `select` — Choose from a list of options
- `confirm` — Yes/no confirmation
This enables the AI to request clarification or confirmation before performing ambiguous or destructive operations.
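The three input types can be modeled as a discriminated union, which makes it easy to validate the user's answer before handing it back to the model. A sketch with illustrative names (`AskUserRequest`, `validateAnswer` are assumptions, not ShellMate's actual API):

```typescript
// Hypothetical shape of an ask_user request, one variant per input type.
type AskUserRequest =
  | { type: "text"; question: string }
  | { type: "select"; question: string; options: string[] }
  | { type: "confirm"; question: string };

// Validate an answer against the request type before returning it to the model.
function validateAnswer(req: AskUserRequest, answer: string): boolean {
  switch (req.type) {
    case "text":
      return answer.length > 0;
    case "select":
      return req.options.includes(answer); // only listed options are valid
    case "confirm":
      return answer === "yes" || answer === "no";
  }
}

console.log(validateAnswer({ type: "confirm", question: "Overwrite?" }, "yes")); // true
```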
```
src/
├── api/     # OpenRouter API client and types
├── core/    # Core conversation logic and message loop
├── lib/     # Shared libraries (AST file analysis)
├── tools/   # Tool implementations (bash, read, write, edit, glob, grep, analyze, ask_user)
├── ui/      # React/Ink UI components (REPL, MessageList, TextInput)
└── utils/   # Utility functions (colors, config, max tokens)
```
- API Layer (`api/`)
  - `client.ts` — Streaming chat completion client for OpenRouter
  - `types.ts` — TypeScript definitions for chat messages and tool calls
- Core Logic (`core/`)
  - `messageLoop.ts` — Orchestrates multi-turn conversations with tool execution
  - `systemPrompt.ts` — Generates context-aware system prompts
  - `toolExecutor.ts` — Executes tool calls and handles results
  - `query.ts` — Conversation state management
- Tool System (`tools/`)
  - `bash.ts` — Execute shell commands with timeout support
  - `read.ts` — Read files with line numbers and pagination
  - `write.ts` — Create or overwrite files with directory creation
  - `edit.ts` — Replace exact string matches in files
  - `glob.ts` — Find files matching glob patterns
  - `grep.ts` — Search file contents using regex (ripgrep/grep)
  - `analyzeFile.ts` — AST-based code structure analysis
  - `askUser.ts` — Interactive user prompts during tool execution
- Shared Libraries (`lib/`)
  - `files.ts` — AST parsing and file structure analysis using ts-morph
- UI Layer (`ui/`)
  - `REPL.tsx` — Main REPL component with state management
  - `MessageList.tsx` — Displays conversation history with streaming
  - `TextInput.tsx` — User input component
  - `ToolResult.tsx` — Formatted tool execution results
  - `AskUserPrompt.tsx` — Interactive prompt component for AI-initiated questions
  - `AskSystemMessage.tsx` — System message prompts (model change, max tokens)
  - `SelectInput.tsx` — Selection input for multi-choice prompts
- Utilities (`utils/`)
  - `colors.ts` — Color formatting helpers
  - `generateConfig.ts` — Interactive first-run configuration wizard
  - `getConfig.ts` — Reads saved configuration from `.shellmate/keys.txt`
  - `getMaxTokensConfig.ts` — Reads max tokens setting from `.shellmate/maxtokens.txt`
  - `askUserBridge.ts` — Bridge between ask_user tool and UI layer
```
User Message → AI Response (with tool calls)
                     ↓
         Execute Tools Locally
                     ↓
         Results → AI (next turn)
                     ↓
         Final Response to User
```
- User Input — You type a message in the terminal REPL
- API Request — The message is sent to OpenRouter with available tools and system context
- Streaming Response — The AI response streams token-by-token to the UI
- Tool Execution — If the AI calls tools, they execute locally and results are displayed
- Continuation — Results are fed back to the AI for additional turns until completion
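The loop described above can be sketched in a few lines of TypeScript. This is a minimal illustration, not ShellMate's actual `messageLoop.ts`; the `callModel` and `runTool` callbacks and the `ModelTurn` type are assumptions:

```typescript
// Each model turn either finishes with text or requests tool calls.
type ModelTurn =
  | { kind: "text"; content: string }
  | { kind: "tool_calls"; calls: { name: string; args: string }[] };

// Minimal multi-turn loop: execute tools locally, feed results back,
// repeat until the model produces a final text answer.
async function messageLoop(
  userMessage: string,
  callModel: (history: string[]) => Promise<ModelTurn>,
  runTool: (name: string, args: string) => Promise<string>
): Promise<string> {
  const history = [userMessage];
  for (let turn = 0; turn < 10; turn++) { // cap turns to avoid infinite loops
    const reply = await callModel(history);
    if (reply.kind === "text") return reply.content; // final answer
    for (const call of reply.calls) {
      history.push(await runTool(call.name, call.args)); // results become context
    }
  }
  throw new Error("Exceeded max turns");
}
```

The turn cap is a common safety valve in agent loops: without it, a model that keeps requesting tools would never terminate.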
All ShellMate configuration is stored in a .shellmate/ directory in your project root. This directory is created automatically on first run.
```
.shellmate/
├── keys.txt       # AI model and API key
└── maxtokens.txt  # Max tokens per response
```
`keys.txt` stores your AI model and OpenRouter API key:

```
AI_MODEL=anthropic/claude-sonnet-4
API_KEY=sk-or-v1-your-api-key-here
```
`maxtokens.txt` controls the maximum number of tokens the AI can generate per response:

```
MAX_TOKENS=2000
```
If this file doesn't exist, the default is 2000 tokens.
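Reading these `KEY=value` files is straightforward. A sketch of a parser with the 2000-token fallback, assuming the exact format shown above (the `parseConfig` name is illustrative):

```typescript
// Parse a simple KEY=value config file into a string map.
function parseConfig(text: string): Record<string, string> {
  const config: Record<string, string> = {};
  for (const line of text.split("\n")) {
    const eq = line.indexOf("=");
    if (eq === -1) continue; // skip blank or malformed lines
    config[line.slice(0, eq).trim()] = line.slice(eq + 1).trim();
  }
  return config;
}

// Fall back to 2000 tokens when the key (or file) is missing.
const maxTokens = Number(parseConfig("MAX_TOKENS=2000").MAX_TOKENS ?? "2000");
console.log(maxTokens); // 2000
```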
ShellMate supports any model available on OpenRouter.
Override per-session via CLI:
```bash
# Claude models
shellmate --model anthropic/claude-sonnet-4
shellmate --model anthropic/claude-3.5-sonnet

# OpenAI models
shellmate --model openai/gpt-4-turbo
shellmate --model openai/gpt-4o

# Other models
shellmate --model google/gemini-pro
shellmate --model meta-llama/llama-3-70b
```

You can switch your AI model and API key without leaving the REPL:
1. Type `/change-model` in the REPL
2. Enter the new model identifier (e.g., `openai/gpt-4o`)
3. Enter your OpenRouter API key
4. The new configuration is saved to `.shellmate/keys.txt`
Note: The model change takes effect the next time you start ShellMate.
Control how long responses can be:
1. Type `/maxtokens` in the REPL
2. Enter the maximum number of tokens per response
3. The setting is saved to `.shellmate/maxtokens.txt`
This is useful for controlling cost and response length — lower values produce shorter, cheaper responses; higher values allow for more detailed output.
To fully reset your configuration, delete the `.shellmate/` directory and run ShellMate again:

```bash
rm -rf .shellmate
shellmate
```

Or use the REPL commands `/change-model` and `/maxtokens` to update individual settings.
- Runtime: Node.js with ES Modules
- Language: TypeScript
- UI Framework: React with Ink (terminal UI)
- CLI Framework: Commander.js
- Validation: Zod with JSON Schema generation
- AST Parsing: ts-morph (TypeScript Compiler API)
- API: OpenRouter (streaming chat completions)
- Utilities: Chalk, glob
```bash
# Run in development mode (hot reload)
npm run dev

# Build TypeScript
npm run build

# Type checking
npx tsc --noEmit
```

```
shellmate/
├── src/
│   ├── api/                      # API client and types
│   │   ├── client.ts             # OpenRouter streaming client
│   │   └── types.ts              # Chat and tool types
│   ├── core/                     # Core conversation logic
│   │   ├── messageLoop.ts        # Multi-turn conversation orchestration
│   │   ├── systemPrompt.ts       # System prompt generation
│   │   ├── toolExecutor.ts       # Tool execution engine
│   │   └── query.ts              # Conversation state
│   ├── lib/                      # Shared libraries
│   │   └── files.ts              # AST file analysis engine
│   ├── tools/                    # Tool implementations
│   │   ├── analyzeFile.ts        # AST code structure analyzer
│   │   ├── askUser.ts            # Interactive user prompts
│   │   ├── bash.ts               # Shell command execution
│   │   ├── edit.ts               # String replacement editing
│   │   ├── glob.ts               # File pattern matching
│   │   ├── grep.ts               # Content search
│   │   ├── read.ts               # File reading
│   │   ├── write.ts              # File creation/overwriting
│   │   ├── types.ts              # Tool type definitions
│   │   └── index.ts              # Tool registry
│   ├── ui/                       # Terminal UI components
│   │   ├── App.tsx               # Main app component
│   │   ├── REPL.tsx              # REPL logic and state
│   │   ├── components/           # UI subcomponents
│   │   │   ├── AskSystemMessage.tsx  # System prompts (model/tokens)
│   │   │   ├── AskUserPrompt.tsx     # AI-initiated user prompts
│   │   │   ├── MessageList.tsx       # Conversation display
│   │   │   ├── SelectInput.tsx       # Multi-choice selection
│   │   │   ├── TextInput.tsx         # Text input component
│   │   │   ├── ToolResult.tsx        # Tool result formatting
│   │   │   └── WelcomeScreen.tsx     # Welcome display
│   │   └── lib/
│   │       └── shortcuts.ts      # REPL command handlers
│   ├── utils/                    # Utilities
│   │   ├── askUserBridge.ts      # Tool-to-UI bridge for ask_user
│   │   ├── colors.ts             # Color formatting
│   │   ├── generateConfig.ts     # First-run config wizard
│   │   ├── getConfig.ts          # Config reader (.shellmate/keys.txt)
│   │   └── getMaxTokensConfig.ts # Max tokens reader
│   ├── index.ts                  # CLI entry point
│   └── main.ts                   # Commander CLI setup
├── .shellmate/                   # Configuration directory (auto-created)
│   ├── keys.txt                  # AI_MODEL and API_KEY
│   └── maxtokens.txt             # MAX_TOKENS setting
├── package.json
└── tsconfig.json
```
- API keys are configured on first run and stored locally in `.shellmate/keys.txt`
- The `.shellmate/` directory should be added to your `.gitignore`
- No keys are hardcoded or committed to version control
- Tool execution happens in the current directory context
- The `write` tool prompts for confirmation before overwriting existing files
- No automatic file uploads or external data sharing
Contributions are welcome! Feel free to:
- Add new tools to the `src/tools/` directory
- Enhance the UI components
- Improve the system prompt
- Add support for different AI providers
- Experiment with different conversation strategies
This project demonstrates several key concepts in building AI-powered CLI tools:
- Streaming AI Responses — Server-Sent Events (SSE) parsing and token streaming
- Tool/Function Calling — OpenAI-compatible tool definitions and execution
- Terminal UI — Building interactive CLIs with React and Ink
- Conversation Management — Multi-turn conversations with tool integration
- AST Parsing — Using ts-morph to analyze code structure programmatically
- Tool-to-UI Communication — Bridge pattern for interactive tool prompts
- Type Safety — End-to-end TypeScript with Zod validation
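The first concept on this list — SSE parsing for streamed responses — boils down to reading `data: {...}` lines off the wire and extracting each content delta. A sketch assuming the OpenAI-compatible chunk format that OpenRouter streams (the `parseSSE` name is illustrative):

```typescript
// Yield content tokens from a buffer of Server-Sent Events lines in the
// OpenAI-compatible "data: {json}" format, stopping at the [DONE] sentinel.
function* parseSSE(buffer: string): Generator<string> {
  for (const line of buffer.split("\n")) {
    if (!line.startsWith("data: ")) continue;
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") return; // end-of-stream sentinel
    const delta = JSON.parse(payload)?.choices?.[0]?.delta?.content;
    if (delta) yield delta; // emit each token as it arrives
  }
}

const chunk =
  'data: {"choices":[{"delta":{"content":"Hello"}}]}\n' +
  "data: [DONE]\n";
console.log([...parseSSE(chunk)].join("")); // "Hello"
```

In a real client the buffer arrives incrementally over `fetch`, so partial lines must be carried over between reads; that bookkeeping is omitted here.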
- No async/background command support
- No file watching or hot reload
- Limited context window management
- No multi-session/multi-file diffs
- Basic error handling without retries
- No conversation history persistence
- Single-threaded tool execution
- Model change via `/change-model` requires a REPL restart to take effect
See FEATURES_COMPARISON.md for a detailed feature comparison.
MIT License — Feel free to use this for learning, experimentation, and building your own tools.
Inspired by the architectures of:
- Claude Code — Anthropic's AI coding assistant
- GitHub Copilot CLI — GitHub's terminal AI helper
- Warp AI — AI-integrated terminal emulator
Built by Ram Krishna Yadav — GitHub
Note: This is an educational project. For production use, consider the official tools like GitHub Copilot CLI or Claude Code.
