OpenClose

A local-first Python AI coding assistant that works with any OpenAI-compatible API.

OpenClose runs entirely on your machine. Connect it to vLLM, llama.cpp, Ollama, or OpenAI — all conversations and data stay in a local SQLite database. The web UI streams responses in real time via SSE, and a headless CLI mode lets you script it into workflows.

Features

  • Local-first — SQLite storage, no cloud, no remote auth, no telemetry
  • Provider-agnostic — works with any OpenAI-compatible endpoint (vLLM, llama.cpp, Ollama, OpenAI)
  • Web UI — three-column layout with session sidebar, chat, and context info panel
  • Real-time streaming — Server-Sent Events for live token-by-token output
  • Two built-in agents — build (full tool access) and plan (read-only analysis)
  • Custom agents — define your own in config.toml with inheritance, custom prompts, and tool restrictions
  • 13 built-in tools — read, write, edit, glob, grep, ls, bash, patch, webfetch, todo, question, lint, multiedit
  • Permission system — per-tool allow/deny/ask rules with last-match-wins semantics
  • Auto-formatting — ruff, black, gofmt, rustfmt, prettier, shfmt, clang-format
  • Linting — ruff, mypy, eslint, golangci-lint, clippy, clang-tidy, cppcheck
  • Context management — automatic compaction when approaching the context window limit
  • CLI mode — openclose run -p "..." for non-interactive scripting with optional JSON output

Installation

Requires Python 3.12+.

From PyPI

pip install openclose

With uv (recommended)

uv pip install openclose

From source

git clone https://github.com/leflakk/openclose.git
cd openclose
uv sync

Quickstart

1. Configure a provider

Create ~/.config/openclose/config.toml (Linux) or ~/Library/Application Support/openclose/config.toml (macOS):

[[providers]]
name = "default"
base_url = "http://localhost:8000/v1"

If your provider requires an API key, set it via environment variable or config:

export OPENCLOSE_API_KEY="sk-..."
# or
export OPENAI_API_KEY="sk-..."

Key resolution order: OPENCLOSE_API_KEY env var > OPENAI_API_KEY env var > config file api_key field.
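This resolution order can be sketched with plain shell fallback expansion; here `config_api_key` is just a stand-in for the config file's `api_key` field, not a variable the tool actually reads:

```shell
# Mimic the documented key-resolution order with shell fallbacks.
# config_api_key stands in for the config file's api_key field.
config_api_key="sk-from-config"
OPENAI_API_KEY="sk-openai"
OPENCLOSE_API_KEY="sk-openclose"

# OPENCLOSE_API_KEY wins over OPENAI_API_KEY, which wins over the config value.
resolved="${OPENCLOSE_API_KEY:-${OPENAI_API_KEY:-$config_api_key}}"
echo "$resolved"   # prints sk-openclose
```

Unset OPENCLOSE_API_KEY and the next candidate in the chain is used.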

2. Start the web UI

openclose serve

Opens your browser to http://127.0.0.1:9876. Use --host, --port, --no-browser, or --project-dir to customize.

3. Or run headless

openclose run -p "Explain the main function in src/main.py"
openclose run -p "Add error handling to the parser" --agent build
openclose run -p "Analyze the test coverage gaps" --agent plan --json

4. List sessions

openclose sessions

Configuration

Configuration is loaded in priority order (highest wins):

  1. Environment variables (OPENCLOSE_*)
  2. Project config (.openclose/config.toml in your project directory)
  3. User config (~/.config/openclose/config.toml)
  4. Defaults
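Because project config outranks user config, a project-level `.openclose/config.toml` can override a single key for one repository. A hypothetical fragment pinning the read-only `plan` agent as that project's default:

```toml
# .openclose/config.toml — project-level override (hypothetical example)
# Everything not set here falls through to the user config, then defaults.
default_agent = "plan"
```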

Example config.toml:

# Provider — any OpenAI-compatible endpoint
[[providers]]
name = "default"
base_url = "http://localhost:8000/v1"
api_key = ""          # optional; OPENCLOSE_API_KEY / OPENAI_API_KEY env vars take precedence
default_model = ""

# Session defaults
default_agent = "build"
max_context_tokens = 128000
compaction_threshold = 0.9   # compact when context usage reaches 90% of max_context_tokens

# Override built-in agent settings
[[agents]]
name = "build"
model = "qwen3-30b-a3b"

# Custom agent with inheritance
[[agents]]
name = "review"
extends = "plan"
description = "Code reviewer"
system_prompt = "You are a code reviewer. Focus on bugs, security, and clarity."

# Permission rules (last match wins)
[[permissions]]
tool = "*"
action = "ask"

[[permissions]]
tool = "read"
action = "allow"

[[permissions]]
tool = "bash"
path = "/tmp/*"
action = "deny"

See docs/guide-agents-customization.md for the full agent customization guide.

Agents

| Agent | Description | Tool restrictions |
| ----- | ----------- | ----------------- |
| build | Primary agent with full tool access for code writing and execution | None |
| plan  | Read-only analysis agent (extends build) | Denied: write, edit, bash, patch |

Custom agents support:

  • Inheritance via extends — create variants without repeating config
  • Semantic traits — e.g. ["readonly"] adjusts the system prompt automatically
  • Tool filtering — allowed_tools and denied_tools control what the LLM can call
  • Custom system prompts with $variable substitution
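As a sketch, a custom agent combining these features might look like the following. The `extends`, `description`, `system_prompt`, and `allowed_tools` keys appear in this README; the `traits` key name and the `$language` variable are illustrative assumptions:

```toml
# Hypothetical custom agent combining inheritance, traits, tool
# filtering, and prompt variables.
[[agents]]
name = "security"
extends = "plan"                      # inherit the read-only analysis setup
description = "Security auditor"
traits = ["readonly"]                 # assumed key name for semantic traits
allowed_tools = ["read", "grep", "glob", "ls"]
system_prompt = "You audit $language code. Report risky patterns only."
```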

Web UI

The web UI has a three-column layout: sessions sidebar, chat panel, and context/files sidebar.

Slash commands

| Command | Description |
| ------- | ----------- |
| /new | Start a new session |
| /sessions | Switch to another session |
| /agents | Switch agent |
| /compact | Compress context window |
| /undo | Remove last message pair |
| /export | Export session as JSON |
| /copy | Copy last response |
| /skip | Toggle auto-approve permissions |
| /help | Show available commands |

Sessions can be forked to continue a conversation with a different agent.

Architecture

See ARCHITECTURE.md for the full package structure, design decisions, and data flow diagrams.

Contributing

git clone https://github.com/leflakk/openclose.git
cd openclose
uv sync
uv run pytest tests/

Code quality requirements:

  • Linting — uv run ruff check src/ tests/
  • Type checking — uv run mypy --strict src/ tests/
  • Tests — uv run pytest tests/ --cov=openclose --cov-fail-under=80

CI runs all three checks on every push and pull request via GitHub Actions.

License

MIT
