`ask` is a Go CLI for querying LLMs from your terminal using SDK-free HTTP clients.
Build it:

```sh
go build -o ask .
```

- Set a provider key. If missing, `config.json` and `config.template.json` are created automatically on the first config-using command:

  ```sh
  ask key set openai --value sk-...
  ```

- Set the default provider and model:

  ```sh
  ask provider set openai
  ask models select --provider openai --search mini
  ```

- Ask:

  ```sh
  ask "command to remove a commit from git"
  ```

Or start without a question and type it interactively:

```sh
ask
# ask> command to remove a commit from git
```

You can also pipe the question via stdin:

```sh
echo "command to remove a commit from git" | ask
```

If the model returns a command, `ask` opens an editable terminal prompt with that command prefilled.

- Enter: run the command
- Edit the text first, then Enter: run the edited command
- Ctrl+C: copy the suggested command to the clipboard and exit the prompt
- Ctrl+D: exit the prompt without running
```sh
ask "question" [options]
ask models list|select|set|current
ask provider list|current|set|show|add|remove
ask key set|show|clear
ask config show|path|template
ask markdown on|off|status
ask help ask|models|provider|key|config|markdown
```

Options:

- `-p, --provider <name>`
- `-m, --model <id>`
- `--timeout <dur|sec>` (default: `90s`)
- `--no-markdown`
- `--no-run`
- `--json`

If your question starts with `-`, use:

```sh
ask -- "-question that starts with dash"
```

Built-in providers:

- `openai`
- `anthropic`
- `gemini`
- `ollama`
- `openrouter`
Add a custom OpenAI-compatible provider:

```sh
ask provider add myproxy \
  --base-url https://llm.example.com/v1 \
  --api-key-env MYPROXY_API_KEY
```

Default config path:
- macOS/Linux: `~/.ask/config.json`
- Windows: `%USERPROFILE%\.ask\config.json`

Overrides:

- `ASK_CONFIG=/path/to/config.json`
- `ASK_CONFIG_DIR=/path/to/config/dir`
- `ask --config /path/to/config.json ...`
Security defaults:

- config directory mode: `0700`
- config file mode: `0600`
API key resolution order:

1. Environment variable named by `api_key_env` (or the provider's built-in default env var)
2. `api_key` in `config.json`
Show active paths:

```sh
ask config path
ask config template
```

Show raw config content:
```sh
ask config show
```

```json
{
  "version": 1,
  "current_provider": "",
  "providers": {
    "openai": {
      "api_key": "",
      "model": "gpt-5-nano",
      "api_key_env": "OPENAI_API_KEY"
    },
    "anthropic": {
      "api_key": "",
      "model": "",
      "api_key_env": "ANTHROPIC_API_KEY"
    },
    "gemini": {
      "api_key": "",
      "model": "",
      "api_key_env": "GEMINI_API_KEY"
    },
    "ollama": {
      "api_key": "",
      "model": "",
      "base_url": "http://127.0.0.1:11434"
    },
    "openrouter": {
      "api_key": "",
      "model": "",
      "api_key_env": "OPENROUTER_API_KEY"
    }
  },
  "custom_providers": {
    "myproxy": {
      "base_url": "https://llm.example.com/v1",
      "api_key": "",
      "model": "",
      "api_key_env": "MYPROXY_API_KEY",
      "headers": {
        "X-Client-Name": "ask"
      }
    }
  },
  "render_markdown": true
}
```

- Model lists are fetched from provider APIs.
- Responses are requested in structured JSON (`answer`, `command`), with fallback parsing.
- Markdown rendering uses `charmbracelet/glamour`.
```sh
go test ./...
go vet ./...
go build ./...
```