
docs: add BYOCLI guide (docs/byocli.md) #129

Open

BlakeHung wants to merge 7 commits into openabdev:main from BlakeHung:feat/ollama-acp-adapter

Conversation


@BlakeHung BlakeHung commented Apr 7, 2026

Summary

Add docs/byocli.md — a guide for bringing your own ACP-compatible CLI to openab.

Contents

  • ACP protocol requirements — JSON-RPC methods (initialize, session/new, session/prompt), notification types, message format
  • Configuration — how to wire a custom CLI into config.toml
  • Testing guide — manual JSON-RPC testing + end-to-end with openab
  • Session lifecycle — spawn → init → session → prompt → cleanup
  • Ollama example — using local-ai-acp as a reference implementation, with resource requirements table
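The JSON-RPC methods listed above can be sketched as wire messages. Below is a minimal Python sketch; the method names come from this PR, but the parameter shapes (`protocolVersion`, `cwd`, `sessionId`, the prompt content list) are illustrative assumptions, not a verified rendering of the ACP spec:

```python
import json

def rpc(req_id, method, params=None):
    """Build a newline-delimited JSON-RPC 2.0 request for a stdio transport."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg) + "\n"

# Method names are from the guide; parameter shapes are illustrative only.
init = rpc(1, "initialize", {"protocolVersion": 1})
new_sess = rpc(2, "session/new", {"cwd": "/tmp"})
prompt = rpc(3, "session/prompt",
             {"sessionId": "sess-1",
              "prompt": [{"type": "text", "text": "hello"}]})
```

Each message is written to the adapter's stdin as one line; responses come back on stdout keyed by the same `id`.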

Changes

  • docs/byocli.md — new file
  • README.md — reverted to upstream (no changes)
  • config.toml.example — reverted to upstream (no changes)

Per maintainer feedback: the README stays focused on official backends, and community adapters are documented in the BYOCLI guide.

- Add ollama-acp adapter written in Rust, bridging ACP JSON-RPC to
  Ollama's OpenAI-compatible chat completions API with SSE streaming
- Add Dockerfile.ollama for containerized deployment (pure Rust, no
  Node.js runtime needed)
- Add Ollama preset to config.toml.example
- Supports any Ollama model (default: gemma4:26b)
- Full ACP compliance: initialize, session/new, session/prompt
- Streaming notifications: agent_message_chunk, tool_call, etc.
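The streaming notifications mentioned above differ from requests in one key way: they carry no `id` and expect no response. A hedged sketch of what an `agent_message_chunk` notification might look like on the wire; the `session/update` envelope and the field names inside `params` are assumptions for illustration, not taken verbatim from the ACP spec:

```python
import json

# A notification has "method" and "params" but no "id": the peer must not reply.
chunk_notification = {
    "jsonrpc": "2.0",
    "method": "session/update",  # assumed envelope method for streamed updates
    "params": {
        "sessionId": "sess-1",
        "update": {
            "sessionUpdate": "agent_message_chunk",  # update type named in this PR
            "content": {"type": "text", "text": "Hel"},  # partial model output
        },
    },
}
wire_line = json.dumps(chunk_notification) + "\n"
```

An adapter bridging an SSE stream would emit one such line per received chunk.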
@BlakeHung BlakeHung requested a review from thepagent as a code owner April 7, 2026 17:14
@BlakeHung BlakeHung changed the title feat: add Ollama ACP adapter (Rust) for local AI support docs: add local-ai-acp reference to config.toml.example Apr 8, 2026
Add local-ai-acp (https://github.com/BlakeHung/local-ai-acp) to:
- Pluggable Agent Backends table in README
- Manual config.toml examples with setup instructions
- config.toml.example with commented preset

local-ai-acp is a standalone Rust binary that bridges any
OpenAI-compatible API (Ollama, LocalAI, vLLM, llama.cpp, LM Studio)
to ACP. No runtime dependencies.
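For reference, wiring such a standalone binary into config.toml might look roughly like the fragment below. The table name, keys, and flags are hypothetical, since openab's actual schema lives in config.toml.example:

```toml
# Hypothetical preset: table and key names are illustrative, not openab's schema.
[agents.local-ai]
command = "local-ai-acp"
args = ["--base-url", "http://localhost:11434/v1", "--model", "gemma4:26b"]
```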
@BlakeHung BlakeHung changed the title docs: add local-ai-acp reference to config.toml.example docs: add local-ai-acp BYOCLI for local AI support Apr 8, 2026

@thepagent
Collaborator

We've been thinking about this more — rather than adding individual third-party adapters to the README, we think the right move is:

  1. Create a BYOCLI guide (docs/byocli.md) that documents how to bring your own CLI — what the ACP protocol expects over stdio, how to configure config.toml, and how to test it end-to-end.
  2. Include an Ollama example in that guide as a reference implementation.

This way the README stays focused on officially bundled backends, and community adapters like local-ai-acp have a clear path to plug in without needing to be listed in the main docs.

We'll get that guide written up. Once it's in place, would you be open to restructuring this PR to reference the BYOCLI guide instead of adding directly to the README table?

@thepagent
Collaborator

To clarify our thinking on this:

  README.md — Official backends only
  ┌────────────────────────────────────────────────┐
  │  Pluggable Agent Backends                      │
  │                                                │
  │  Official CLIs (bundled/tested by openab)      │
  │  ┌──────────────────────────────────────────┐  │
  │  │  • kiro-cli                              │  │
  │  │  • codex-acp                             │  │
  │  │  • claude-agent-acp                      │  │
  │  │  • gemini                                │  │
  │  │  • cursor, copilot, qwen ... (planned)   │  │
  │  └──────────────────────────────────────────┘  │
  │                                                │
  │  Want to bring your own CLI?                   │
  │  → See docs/byocli.md                          │
  └────────────────────────────────────────────────┘

  docs/byocli.md — BYOCLI guide
  ┌────────────────────────────────────────────────┐
  │                                                │
  │  For any 3rd-party or custom CLI/adapter:      │
  │                                                │
  │  1. ACP protocol requirements (stdio JSON-RPC) │
  │  2. config.toml setup                          │
  │  3. Testing your adapter                       │
  │  4. Example: Ollama with local-ai-acp          │
  │                                                │
  │  You configure it yourself — openab doesn't    │
  │  need to know about your CLI; it just has to   │
  │  speak ACP over stdio.                         │
  └────────────────────────────────────────────────┘

The README table is reserved for official/tested backends. Third-party adapters like local-ai-acp fall under BYOCLI — users can plug in anything that speaks ACP over stdio by following the guide. No code changes to openab needed.

We'll get docs/byocli.md written up with an Ollama example. Once that's in, this PR can be restructured to contribute to the BYOCLI guide instead of the README table.

@BlakeHung
Author

@thepagent
Thanks for the direction! This makes a lot of sense — keeping the README focused on official backends and having a dedicated BYOCLI guide is a cleaner approach.

I'm happy to restructure this PR to contribute a docs/byocli.md instead. I'll include:

  1. ACP protocol requirements — what a BYOCLI needs to implement over stdio (initialize, session/new, session/prompt, notifications)
  2. config.toml setup — how to wire a custom CLI into openab
  3. Testing guide — how to test a BYOCLI end-to-end with manual JSON-RPC
  4. Ollama example — using local-ai-acp as a reference implementation

I'll revert the README changes and push the new guide shortly. Let me know if there's a preferred structure or anything else you'd like covered in the doc!
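As a rough illustration of that end-to-end flow, the spawn → init → prompt → cleanup lifecycle can be driven from a short script. The child process below is a stand-in echo server, not a real ACP adapter; when actually testing, you would spawn your BYOCLI binary instead and use real parameters:

```python
import json
import subprocess
import sys

# Stand-in "adapter": replies with an empty result to every request that has an
# id, and ignores notifications. A real test would spawn your BYOCLI binary.
FAKE_ADAPTER = """\
import sys, json
for line in sys.stdin:
    req = json.loads(line)
    if "id" in req:
        resp = {"jsonrpc": "2.0", "id": req["id"], "result": {}}
        sys.stdout.write(json.dumps(resp) + "\\n")
        sys.stdout.flush()
"""

# Spawn: start the adapter with stdio pipes.
proc = subprocess.Popen([sys.executable, "-c", FAKE_ADAPTER],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)

def call(req_id, method, params):
    """Send one JSON-RPC request on stdin and read one response line back."""
    req = {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}
    proc.stdin.write(json.dumps(req) + "\n")
    proc.stdin.flush()
    return json.loads(proc.stdout.readline())

# Init, then prompt (parameter shapes are illustrative assumptions).
init_resp = call(1, "initialize", {"protocolVersion": 1})
prompt_resp = call(2, "session/prompt",
                   {"sessionId": "sess-1",
                    "prompt": [{"type": "text", "text": "hi"}]})

# Cleanup: closing stdin sends EOF so the child exits.
proc.stdin.close()
proc.wait(timeout=5)
```

Matching each response to its request by `id` is what makes manual testing over stdio tractable without any client library.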

- Add docs/byocli.md covering ACP protocol requirements, config setup,
  testing guide, and session lifecycle
- Include local-ai-acp as reference BYOCLI implementation
- Revert README.md and config.toml.example to upstream (no changes)

Per maintainer feedback: README stays focused on official backends,
community adapters are documented in the BYOCLI guide.
@BlakeHung BlakeHung changed the title docs: add local-ai-acp BYOCLI for local AI support docs: add BYOCLI guide (docs/byocli.md) Apr 8, 2026