aga (short for Agent Gate) is a lightweight LLM hub designed for agentic workflows (human usage welcome too). It comes with built-in concurrency, tool execution, rate limiting, and conversation persistence.
```sh
curl -fsSL https://raw.githubusercontent.com/TolgaOk/agentgate/main/scripts/install.sh | bash
```

Or with Go:

```sh
go install github.com/TolgaOk/agentgate/cmd/aga@latest
```
```sh
aga ask "What files are in this directory?"        # create a conversation
aga ask --json "summarize this project"            # JSON output (for agentic calls)
aga ask --context /tmp/session.json "now fix it"   # continue a conversation (or start a new one)
aga ask --skill ./skills "say hi in whatsapp"      # with a custom skill directory

aga auth openai                                    # sign in with ChatGPT (subscription)
aga auth status                                    # check token status

aga metrics                                        # LLM usage summary
```

See `aga ask --help` for more options.
Agent Loop
```
┌──────────┐
┌──────────┐│      ┌────────┐      ┌──────────┐      ┌──────────────┐
┌──────────┐││     │        │─────▶│ LLM API  │─────▶│ text,        │
│ aga ask  │││     │        │      │(+context)│      │ context.json │
│ "prompt" ││┘────▶│ SQLite │      └────┬─────┘      └──────────────┘
└──────────┘┘      │        │     tool? │
                   │        │           ▼
                   │        │      ┌──────────┐
                   │        │◀─────┤ tool exec│
                   │        │      │(+context)│
                   └────────┘      └──────────┘
                       ▲
                       └── limit concurrent requests, usage tracking
```
Persistence: Each conversation is saved.
aga preserves the chat history of each conversation in `context.json`, updated incrementally after every LLM and tool call.
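The incremental-update idea can be sketched as follows. This is an illustrative Python sketch, not aga's actual Go code, and the message layout shown is hypothetical (aga's real `context.json` schema may differ):

```python
# Sketch: incrementally append messages to a JSON conversation file.
# The {"role", "content"} layout is a hypothetical schema for illustration.
import json
import os
import tempfile

def append_message(path: str, role: str, content: str) -> None:
    history = []
    if os.path.exists(path):
        with open(path) as f:
            history = json.load(f)
    history.append({"role": role, "content": content})
    tmp = path + ".tmp"
    with open(tmp, "w") as f:
        json.dump(history, f)
    os.replace(tmp, path)  # atomic rename: a crash mid-write cannot corrupt the file

path = os.path.join(tempfile.mkdtemp(), "context.json")
append_message(path, "user", "what files are here?")
append_message(path, "assistant", "README.md, main.go")
```

Writing to a temp file and renaming is one common way to keep the history consistent even if a call is interrupted.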
Concurrency: Limit parallel API calls.
Concurrent LLM calls are queued and executed under a per-provider (or global) limit coordinated through SQLite. A heartbeat mechanism automatically frees slots held by crashed processes.
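One way such a SQLite-backed limiter with heartbeats can work is sketched below. This is illustrative Python, not aga's actual implementation; table and function names are invented for the example:

```python
# Sketch: a SQLite table as a concurrency limiter with heartbeats, so slots
# held by crashed processes are reclaimed after they stop heartbeating.
import sqlite3
import time
import uuid

STALE_AFTER = 5.0  # seconds without a heartbeat before a slot is reclaimed

def acquire(db: sqlite3.Connection, limit: int):
    holder = str(uuid.uuid4())
    now = time.time()
    with db:  # one transaction: reap stale slots, then try to claim one
        db.execute("DELETE FROM slots WHERE heartbeat < ?", (now - STALE_AFTER,))
        (held,) = db.execute("SELECT COUNT(*) FROM slots").fetchone()
        if held >= limit:
            return None  # no free slot; caller waits and retries
        db.execute("INSERT INTO slots VALUES (?, ?)", (holder, now))
    return holder

def heartbeat(db: sqlite3.Connection, holder: str) -> None:
    # called periodically while an LLM call is in flight
    with db:
        db.execute("UPDATE slots SET heartbeat = ? WHERE holder = ?",
                   (time.time(), holder))

def release(db: sqlite3.Connection, holder: str) -> None:
    with db:
        db.execute("DELETE FROM slots WHERE holder = ?", (holder,))

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE slots (holder TEXT PRIMARY KEY, heartbeat REAL)")
```

With a limit of 2, a third `acquire` returns `None` until a slot is released or a holder's heartbeat goes stale.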
Tools: Extend the available tools by adding a `skill.md` that describes a CLI tool.
Tools are registered as special skills whose frontmatter describes the CLI tool in the `metadata` attribute. For example:
`skills/ls.md`:

```markdown
---
name: ls
description: List directory contents
metadata:
  command: ls
  args:
    path:
      type: string
      position: 1
      desc: directory to list
    all:
      type: boolean
      flag: "-a"
      desc: show hidden files
---
<!-- No body is needed for tool skills -->
```

Skills with `metadata.command` in the frontmatter are registered as tools.
Note: aga ships barebones, with no tools or skills. You can add a `bash` tool to cover most use cases, or register each CLI tool individually (including aga itself).
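Following the frontmatter schema above, a `bash` tool could be registered with a skill file like the sketch below. The argument name and flag here are illustrative, not an official skill shipped with aga:

`skills/bash.md`:

```markdown
---
name: bash
description: Run a shell command and return its output
metadata:
  command: bash
  args:
    script:
      type: string
      flag: "-c"
      desc: the command line to execute
---
```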
- `openai`: use a subscription via `aga auth openai` (primary), or the `OPENAI_API_KEY` API key (fallback)
- `anthropic`: requires the `ANTHROPIC_API_KEY` API key (subscription is not supported)
- `openrouter`: requires the `OPENROUTER_API_KEY` API key
`AG_PROVIDER` overrides the provider (e.g. `AG_PROVIDER=anthropic`).
`AG_MODEL` overrides the model (e.g. `AG_MODEL=claude-sonnet-4-20250514`).
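The override precedence (environment variable beats `config.toml`) can be sketched as follows; this is illustrative Python, not aga's actual Go code, and the `resolve` helper is invented for the example:

```python
# Sketch: AG_PROVIDER / AG_MODEL, when set, override config.toml values.
import os

def resolve(env_var: str, config: dict, key: str):
    # Environment variable first, then the config file value.
    return os.environ.get(env_var) or config.get(key)

config = {"provider": "openai", "model": "gpt-5.2"}
os.environ["AG_PROVIDER"] = "anthropic"
print(resolve("AG_PROVIDER", config, "provider"))  # anthropic (env wins)
print(resolve("AG_MODEL", config, "model"))        # gpt-5.2 (config fallback)
```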
```
~/.config/agentgate/
  config.toml
  system.md       # system prompt
  tokens.json     # OAuth tokens (auto-managed)
  skills/         # skill folder

~/.agentgate/
  aga.db          # SQLite database
```
`config.toml`:

```toml
provider = "openai"                # openai, anthropic, openrouter
model = "gpt-5.2"                  # model name for the provider
max_tokens = 8192                  # max output tokens per LLM call
max_steps = 20                     # max agent loop iterations
concurrent_global_limit = 3        # max concurrent LLM calls across all providers
concurrent_per_provider_limit = 1  # max concurrent LLM calls per provider

[policy]                           # tool execution policy
timeout = "30s"                    # default timeout for tool execution
allowed = ["ls"]                   # auto accept
blocked = ["sudo"]                 # always reject
```

Policy: Tools not in `allowed` require y/N confirmation. Tools in `blocked` are always rejected.
Use `--auto-accept` to skip confirmations for agentic use (blocked tools are still rejected).
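The resulting decision logic can be sketched as follows (illustrative Python, not aga's implementation; the `decide` function and its return values are invented for the example):

```python
# Sketch: tool-execution policy. Blocked beats everything, including --auto-accept.
def decide(tool: str, allowed: list, blocked: list, auto_accept: bool = False) -> str:
    if tool in blocked:
        return "reject"   # blocked tools are always rejected, even with --auto-accept
    if tool in allowed or auto_accept:
        return "accept"   # allowed tools (or --auto-accept) run without prompting
    return "confirm"      # everything else asks for y/N confirmation

print(decide("sudo", allowed=["ls"], blocked=["sudo"], auto_accept=True))  # reject
```

Checking `blocked` before `auto_accept` is what makes the blocklist a hard guarantee in agentic runs.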