A CLI chat client for multiple LLM providers with a plugin system powered by Yaegi.
Tools are written as plain Go source files and loaded at runtime — no recompilation needed.
- Multiple LLM Providers: Support for OpenAI, Gemini, Groq, GLM, and more
- Dynamic Plugin System: Write tools in Go without recompilation
- Persistent Memory: AI can learn and remember information across conversations
- Skills System: Load specialized prompts for different tasks
- Identity/Persona: Customize AI behavior with IDENTITY.md
- Session Resumption: Resume previous conversations per directory with `-resume`
- Interactive & One-shot Modes: Use interactively or pipe commands
```sh
go install github.com/yagi-agent/yagi@latest
```

```sh
yagi [options] [prompt]
```
| Flag | Description | Default |
|---|---|---|
| `-model` | `provider/model` format (e.g. `google/gemini-2.5-pro`) | `openai/gpt-4.1-nano` |
| `-key` | API key (overrides environment variable) | |
| `-quiet` | Suppress informational messages | |
| `-verbose` | Show verbose output including plugin loading | |
| `-yes` | Skip plugin approval prompts (use with caution) | |
| `-list` | List available providers and models | |
| `-resume` | Resume previous session for the current directory | |
| `-skill` | Use a specific skill (e.g., `explain`, `refactor`, `debug`) | |
| `-stdio` | Run in STDIO mode for editor integration | |
| `-v` | Show version | |
The default model can be overridden with the YAGI_MODEL environment variable.
When run without arguments, yagi starts in interactive mode. Pass a prompt as arguments or via pipe for one-shot mode.
In interactive mode, the following slash commands are available:
| Command | Description |
|---|---|
| `/model [name]` | Show or change the current model |
| `/agent [on\|off]` | Toggle autonomous mode (auto-execute tools without approval) |
| `/plan [on\|off]` | Toggle planning mode (show execution plan before acting) |
| `/mode` | Show current mode settings |
| `/clear` | Clear conversation history |
| `/revoke [name]` | Revoke plugin approval (`all` to revoke all) |
| `/exit` | Exit yagi |
| `/help` | Show available commands |
When enabled (/agent on), tools are executed automatically without requiring user approval. This is useful for hands-free operation. The maximum number of autonomous iterations per turn is 20.
When enabled (/plan on), yagi asks the AI to generate a step-by-step execution plan before acting. You can review and confirm or cancel the plan.
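The 20-iteration cap on autonomous mode can be pictured as a simple bounded loop. The sketch below is purely conceptual (the function names are illustrative, not yagi's actual API): each iteration represents one automatic tool execution, and the turn ends early if the model produces a final answer.

```go
package main

import "fmt"

// maxIterations mirrors the documented cap of 20 autonomous
// iterations per turn.
const maxIterations = 20

// runAgentTurn is a conceptual sketch: next reports whether the model
// produced a final answer (true) instead of another tool call (false).
// It returns how many iterations the turn consumed.
func runAgentTurn(next func(step int) bool) int {
	steps := 0
	for i := 0; i < maxIterations; i++ {
		steps++
		if next(i) { // final answer: stop before hitting the cap
			break
		}
	}
	return steps
}

func main() {
	// A model that answers on its fourth step stops early...
	fmt.Println(runAgentTurn(func(i int) bool { return i >= 3 }))
	// ...while one that never finishes is cut off at the cap.
	fmt.Println(runAgentTurn(func(i int) bool { return false }))
}
```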
```sh
# Start interactive chat with the default model
yagi

# Start interactive chat with Gemini
yagi -model google/gemini-2.5-flash

# Resume the previous session for the current directory
yagi -resume
```

```sh
# Pass a prompt as arguments
yagi "Hello"

# Pipe input as a prompt
echo "Write FizzBuzz in Go" | yagi

# Specify a model for one-shot
yagi -model google/gemini-2.5-flash "Explain this error: segmentation fault"

# Pass file contents
cat main.go | yagi "Review this code"

# Pass command output
git diff | yagi "Summarize this diff"
```

```sh
# List all available providers and models
yagi -list

# Filter models by keyword
yagi -list gemini

# Use a specific model
yagi -model google/gemini-2.5-pro "Hello"
```

Models are specified in `provider/model` format. The following providers are supported:
| Provider | Env Variable |
|---|---|
| `openai` | `OPENAI_API_KEY` |
| `google` | `GEMINI_API_KEY` |
| `anthropic` | `ANTHROPIC_API_KEY` |
| `deepseek` | `DEEPSEEK_API_KEY` |
| `mistral` | `MISTRAL_API_KEY` |
| `groq` | `GROQ_API_KEY` |
| `xai` | `XAI_API_KEY` |
| `perplexity` | `PERPLEXITY_API_KEY` |
| `together` | `TOGETHER_API_KEY` |
| `fireworks` | `FIREWORKS_API_KEY` |
| `cerebras` | `CEREBRAS_API_KEY` |
| `cohere` | `COHERE_API_KEY` |
| `openrouter` | `OPENROUTER_API_KEY` |
| `qwen` | `QWEN_API_KEY` |
| `sambanova` | `SAMBANOVA_API_KEY` |
| `zai` | `Z_AI_API_KEY` |
Set the corresponding environment variable before running:

```sh
export GEMINI_API_KEY="your-api-key"
yagi -model google/gemini-2.5-flash
```

Use `yagi -list` to see all available models, or `yagi -list <keyword>` to filter.
Yagi can use not only cloud-based LLM models but also locally running models. The following explains how to use Qwen2.5-7B-Instruct running on llama-server.

1. Add the locally running llama-server provider to `~/.config/yagi/providers.json`:

   ```json
   [
     {
       "name": "llama-server",
       "apiurl": "http://localhost:8080/v1"
     }
   ]
   ```

   By changing the port number, this also works with ollama (11434) and LM Studio (1234). By changing the hostname (IP address), you can connect to an AI server running on another machine on your local LAN. You can add multiple providers by changing the name.

2. Start llama-server:

   ```sh
   llama-server -hf Qwen/Qwen2.5-7B-Instruct-GGUF:qwen2.5-7b-instruct-q4_k_m -c 0 -fa on --jinja
   ```

   Don't forget to specify the `--jinja` option.

3. Start yagi:

   ```sh
   yagi -model llama-server/
   ```

   `llama-server` is the name of the provider you added in Step 1. The model name can be empty, but the `/` separator is required.
Yagi automatically saves conversation history per working directory. Use `-resume` to continue where you left off.

```sh
# Work on a project
cd ~/myproject
yagi "Add error handling to main.go"

# Later, resume the conversation
cd ~/myproject
yagi -resume
# [resumed 4 messages from previous session]
```

Sessions are stored in `~/.config/yagi/sessions/` and are keyed by directory path. The last 100 messages (excluding system prompts) are retained. Tool call history is preserved so the AI retains full context.
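Keying sessions by directory path means the same directory always maps back to the same session file. The snippet below is only a conceptual sketch of that idea, using a hash of the path as a stable filename; yagi's actual key format is not documented here.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// sessionFile derives a stable per-directory filename by hashing the
// working directory path. Illustrative only: yagi's real scheme under
// ~/.config/yagi/sessions/ may differ.
func sessionFile(dir string) string {
	sum := sha256.Sum256([]byte(dir))
	return hex.EncodeToString(sum[:8]) + ".json"
}

func main() {
	// The same directory always maps to the same session file.
	fmt.Println(sessionFile("/home/user/myproject") == sessionFile("/home/user/myproject"))
	// Different directories get different files.
	fmt.Println(sessionFile("/home/user/myproject") != sessionFile("/tmp"))
}
```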
You can customize the AI's behavior by specifying a custom identity file. The identity file path can be configured in three ways (in order of priority):

1. Environment Variable (recommended for GitHub Actions):

   ```sh
   export YAGI_IDENTITY_FILE=/path/to/custom-identity.md
   yagi
   ```

2. Config File (`~/.config/yagi/config.json`):

   ```json
   {
     "prompt": "> ",
     "identity_file": "custom-identity.md"
   }
   ```

   Relative paths are resolved from the config directory (`~/.config/yagi/`).

3. Default: `~/.config/yagi/IDENTITY.md`
```yaml
- name: Run yagi with custom identity
  env:
    YAGI_IDENTITY_FILE: ${{ github.workspace }}/.github/yagi-identity.md
    GEMINI_API_KEY: ${{ secrets.GEMINI_API_KEY }}
  run: |
    yagi "Review the latest commit"
```

Yagi can learn and remember information across conversations using the built-in memory system. Learned information is stored in `~/.config/yagi/memory.json` and automatically included in the AI's context.
Three memory management tools are included by default:
- remember: Save information for future recall
- recall: Retrieve previously saved information
- list_memories: View all stored memories
```sh
$ yagi "My name is Taro"
# AI uses the 'remember' tool to save: user_name = Taro

$ yagi "What's my name?"
# AI retrieves from memory: "Your name is Taro"

$ yagi "I prefer Go over Python"
# AI remembers: favorite_language = Go
```

The AI automatically uses these tools when appropriate. Memory is persistent across sessions and tied to the current config directory.
Tools are Go source files placed in ~/.config/yagi/tools/. Each file is interpreted by Yaegi at startup — no compilation required.
Define a Tool struct with the following fields:
| Field | Type | Description |
|---|---|---|
| `Name` | `string` | Tool name (used in function calling) |
| `Description` | `string` | Description shown to the LLM |
| `Parameters` | `string` | JSON Schema for the tool's parameters |
| `Run` | `func(context.Context, string) (string, error)` | Function that receives a context and a JSON arguments string, and returns the result and an error |
The package name must be `tool`.

```go
package tool

import (
	"context"
	"encoding/json"
)

var Tool = struct {
	Name        string
	Description string
	Parameters  string
	Run         func(context.Context, string) (string, error)
}{
	Name:        "reverse",
	Description: "Reverse the input string",
	Parameters: `{
		"type": "object",
		"properties": {
			"text": {
				"type": "string",
				"description": "The text to reverse"
			}
		},
		"required": ["text"]
	}`,
	Run: func(ctx context.Context, args string) (string, error) {
		var params struct {
			Text string `json:"text"`
		}
		if err := json.Unmarshal([]byte(args), &params); err != nil {
			return "", err
		}
		runes := []rune(params.Text)
		for i, j := 0, len(runes)-1; i < j; i, j = i+1, j-1 {
			runes[i], runes[j] = runes[j], runes[i]
		}
		return string(runes), nil
	},
}
```

Tools can import `"hostapi"` to access host-provided functions that require dependencies not available in the Yaegi sandbox.
| Function | Signature | Description |
|---|---|---|
| `FetchURL` | `func(ctx context.Context, url string, headers map[string]string) string` | Fetch URL content as raw body with optional HTTP headers |
| `HTMLToText` | `func(ctx context.Context, html string) string` | Convert HTML to plain text with links preserved |
| `WebSocketSend` | `func(ctx context.Context, url, message string, maxMessages, timeoutSec int) string` | Send a WebSocket message and collect responses as a JSON array |
| `SaveMemory` | `func(ctx context.Context, key, value string) string` | Save a key-value pair to memory.json (returns "Saved" or an error message) |
| `GetMemory` | `func(ctx context.Context, key string) string` | Retrieve a value from memory by key (returns an empty string if not found) |
| `DeleteMemory` | `func(ctx context.Context, key string) string` | Delete a key from memory (returns "Deleted" or an error message) |
| `ListMemory` | `func(ctx context.Context) string` | List all memory entries as JSON |
```go
package tool

import (
	"context"
	"encoding/json"
	"hostapi"
)

var Tool = struct {
	Name        string
	Description string
	Parameters  string
	Run         func(context.Context, string) (string, error)
}{
	Name:        "fetch_url",
	Description: "Fetch the content of a URL and return it as text",
	Parameters: `{
		"type": "object",
		"properties": {
			"url": {
				"type": "string",
				"description": "The URL to fetch"
			}
		},
		"required": ["url"]
	}`,
	Run: func(ctx context.Context, args string) (string, error) {
		var params struct {
			URL string `json:"url"`
		}
		if err := json.Unmarshal([]byte(args), &params); err != nil {
			return "", err
		}
		return hostapi.FetchURL(ctx, params.URL, nil), nil
	},
}
```

```go
package tool

import (
	"context"
	"encoding/json"
	"hostapi"
)

var Tool = struct {
	Name        string
	Description string
	Parameters  string
	Run         func(context.Context, string) (string, error)
}{
	Name:        "remember",
	Description: "Remember information for future conversations",
	Parameters: `{
		"type": "object",
		"properties": {
			"key": {"type": "string", "description": "Identifier (e.g., 'user_name')"},
			"value": {"type": "string", "description": "Information to remember"}
		},
		"required": ["key", "value"]
	}`,
	Run: func(ctx context.Context, args string) (string, error) {
		var params struct {
			Key   string `json:"key"`
			Value string `json:"value"`
		}
		if err := json.Unmarshal([]byte(args), &params); err != nil {
			return "", err
		}
		return hostapi.SaveMemory(ctx, params.Key, params.Value), nil
	},
}
```

Tools can use any Go standard library package. For third-party functionality, use the host API described above.
MIT
Yasuhiro Matsumoto (a.k.a. mattn)