Five local AI model families. One CLI.
infrctl is a local-first command-line tool for running AI models from your terminal through Ollama. It keeps the user-facing model surface intentionally small: Qwen, DeepSeek, Llama, Gemma, and Phi.
CA: 0x22d0a50bb8b5789655ae1c84b34dc7804b0378e2
Dexscreener: https://dexscreener.com/base/0xfea8fb6981c153cfc66c855687290f441f94018db40627aa09c8fb79a631ce7b

Standalone install, no Node.js required:
```bash
curl -fsSL https://raw.githubusercontent.com/infrctl/infrctl-cli/main/install.sh | sh
```

The installer downloads the right binary from GitHub Releases into `~/.local/bin`, verifies the release checksum, and installs Ollama automatically if it is missing on Linux or macOS.
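If your shell cannot find `infrctl` after installation, `~/.local/bin` is probably not on your `PATH`. A generic shell fix, not specific to this installer:

```bash
# Add ~/.local/bin to PATH for the current session;
# add the same line to ~/.bashrc or ~/.zshrc to make it permanent.
export PATH="$HOME/.local/bin:$PATH"
```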
Or, for Node.js users, install via npm:
```bash
npm install -g infrctl
```

To inspect the installer first:
```bash
curl -fsSL https://raw.githubusercontent.com/infrctl/infrctl-cli/main/install.sh -o install.sh
sh install.sh
```

Requirements:
- Linux or macOS for the curl installer.
- `curl` or `wget`.
- Node.js 18+, required only for npm installs or local development.
If you already manage Ollama yourself, skip the bundled Ollama install:
```bash
curl -fsSL https://raw.githubusercontent.com/infrctl/infrctl-cli/main/install.sh \
  | INFRCTL_SKIP_OLLAMA=1 sh
```

Quick start:

```bash
infrctl setup --starter
infrctl
infrctl -p "say hello from infrctl"
infrctl smith "explain this repo"
```

The starter setup installs Ollama if needed, starts it when possible, and pulls a lighter first-run model set: Phi and Qwen. Add more models later:
```bash
infrctl pull deepseek
infrctl pull llama
infrctl pull gemma
```

infrctl is designed to feel like a modern interactive terminal assistant:
```bash
infrctl                          # start interactive chat
infrctl qwen                     # start chat with Qwen
infrctl -p "explain this log"    # print a one-shot answer
infrctl -p "continue" --continue # continue the most recent session
infrctl --resume <session-id>    # resume a specific session
infrctl sessions                 # list saved sessions
```

Use JSON output for scripts:
infrctl -p "say hello" --output-format jsonPipe stdin into one-shot mode:
Pipe stdin into one-shot mode:

```bash
cat README.md | infrctl -p "summarize this"
```
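Any command output can be piped the same way. For example, assuming you are inside a git repository:

```bash
# Ask for a review of the current uncommitted changes.
git diff | infrctl -p "review this diff"
```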
Interactive chat sessions autosave locally:

```
~/.infrctl/sessions/
~/.infrctl/chats/
```
Inside chat:
```
/help
/status
/transcript
/switch qwen
/system <prompt>
/temp 0.4
/clear
/exit
```
Smith is the local coding agent inside infrctl:
infrctl smith "find the config loading code"
infrctl smith "add tests for the status command"
infrctl smith --model qwen "refactor this function"
infrctl smith --dry-run "show the patch you would make"
infrctl smith --json "explain this repo"Run it without a task to open an interactive Smith session:
infrctl smithSmith works inside your current directory by default. To point it at another repo:
infrctl smith --cwd /path/to/repo "summarize the command structure"Safety defaults are intentionally conservative:
- Smith can scan, read, and search your repo.
- Smith drafts unified diff patches and asks before applying them.
- Smith asks before running shell commands.
- Smith saves reversible patch backups for undo.
- Smith reads key repo files like `AGENTS.md`, `README.md`, `package.json`, and framework config.
- Smith blocks destructive, publishing, deployment, and private-key-related commands in V1.
Permission profiles:
```bash
infrctl smith --profile safe "explain this code"   # read/search only
infrctl smith --profile normal "fix this test"     # default approvals
infrctl smith --profile fast "fix and run checks"  # auto-edit safe patches
infrctl smith --profile danger "move quickly"      # still blocks destructive commands
```

Approval modes:
```bash
infrctl smith --approval ask "add a small test"   # default: approve patches
infrctl smith --approval step "inspect this bug"  # approve each action
infrctl smith --approval auto-edit "fix a typo"   # apply safe validated patches
```

Shell modes:
```bash
infrctl smith --shell ask "run tests after the fix"  # default: ask first
infrctl smith --shell safe "fix and check the build" # auto-run safe checks
infrctl smith --shell off "only propose file edits"  # never run commands
```

Undo the latest Smith-applied patch for the current workspace:

```bash
infrctl smith --undo
```

Inside interactive Smith:
```
/status
/diff
/apply
/reject
/undo
/run npm test
/test full
/model qwen
/compact
/exit
```
Smith sessions are saved locally:
```
~/.infrctl/agent-sessions/
~/.infrctl/agent-patches/
~/.infrctl/repo-memory/
```
For coding tasks, Qwen and DeepSeek are usually the strongest defaults. Phi is useful for tiny machines and quick edits.
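For example, to point Smith at DeepSeek for a heavier refactor, using the documented `--model` flag (the task string here is just a placeholder):

```bash
infrctl smith --model deepseek "refactor the session storage module"
```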
Command overview:

```bash
infrctl setup
infrctl setup --starter
infrctl models
infrctl pull qwen
infrctl pull all
infrctl chat qwen
infrctl ask qwen "hello"
infrctl serve qwen
infrctl serve qwen --auto-port
infrctl status
infrctl smith "add tests for config loading"
infrctl doctor
infrctl config show
infrctl sessions
infrctl resume
infrctl completion zsh
infrctl update
```

If a selected model is missing, `ask`, `chat`, and `serve` offer to pull it. Use `--yes` to skip the prompt:
infrctl -p "hello" --yesV1 exposes exactly five model families:
- Qwen
- DeepSeek
- Llama
- Gemma
- Phi
The real Ollama tags are selected through a local registry and hardware-aware recommendations.
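To see how those family names resolve on your machine, the models listing from the command overview is the natural starting point (the exact output it prints is not specified here):

```bash
# List the model families; details depend on the local registry
# and on what is already installed.
infrctl models
```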
Start the local API server:
```bash
infrctl serve qwen
```

If the default port is busy:

```bash
infrctl serve qwen --auto-port
```

Chat completions:
```bash
curl http://127.0.0.1:8787/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen",
    "messages": [
      { "role": "user", "content": "Hello" }
    ]
  }'
```
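The endpoint path mirrors the OpenAI-style chat completions API. Assuming the response follows that schema (an assumption worth verifying against your local server), you can extract just the reply text with jq:

```bash
# Assumes an OpenAI-style response shape; run with `| jq '.'` first if unsure.
curl -s http://127.0.0.1:8787/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "qwen", "messages": [{ "role": "user", "content": "Hello" }]}' \
  | jq -r '.choices[0].message.content'
```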
Health check:

```bash
curl http://127.0.0.1:8787/health
```

Shell completions:

```bash
infrctl completion bash
infrctl completion zsh
infrctl completion fish
```
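To load completions automatically, write the script somewhere your shell picks it up at startup. A minimal sketch for zsh, assuming you keep completion functions in `~/.zfunc`:

```bash
# Generate the completion script into a directory on your zsh fpath.
mkdir -p ~/.zfunc
infrctl completion zsh > ~/.zfunc/_infrctl
# In ~/.zshrc, before compinit: fpath=(~/.zfunc $fpath)
```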
Check your environment:

```bash
infrctl doctor
infrctl status
```

If Ollama is missing:

```bash
infrctl setup
```

If Ollama is installed but not running:

```bash
ollama serve
```

If a server port is busy:

```bash
infrctl serve phi --auto-port
```

If a model is missing:

```bash
infrctl pull phi
```

Standalone users can rerun the installer:

```bash
curl -fsSL https://raw.githubusercontent.com/infrctl/infrctl-cli/main/install.sh | sh
```

Node.js users can update with npm:

```bash
npm install -g infrctl@latest
```

From inside the CLI:

```bash
infrctl update
```

infrctl runs locally through Ollama. It does not send prompts to cloud APIs. It does not collect telemetry.
Local development:

```bash
cd infrctl
npm ci
npm run check
npm run build:binary
```

Release checklist:
`docs/release-checklist.md`
License: MIT