Skip to content

feat(mcp): pick backend + ponder cap + MCP-first quickstart#105

Open
abolsen wants to merge 5 commits intomainfrom
feat/mcp-connect-client-flow
Open

feat(mcp): pick backend + ponder cap + MCP-first quickstart#105
abolsen wants to merge 5 commits intomainfrom
feat/mcp-connect-client-flow

Conversation

@abolsen
Copy link
Copy Markdown
Contributor

@abolsen abolsen commented Apr 19, 2026

Summary

Fixes the "install Vaner and wire it into my AI client over MCP" happy path:

  • Installer ships [mcp] extra by default. scripts/install.sh now installs vaner[mcp] so vaner mcp just works after curl | bash. --no-mcp keeps the minimal-install escape hatch.
  • Backend picker. New --backend-preset {ollama | lmstudio | vllm | openai | anthropic | openrouter | skip} with interactive TTY prompts and non-interactive flags (--backend-url, --backend-model, --backend-api-key-env, VANER_BACKEND_* envs). --with-ollama is now an alias for --backend-preset ollama.
  • Compute picker + ponder caps. New --compute-preset {background,balanced,dedicated} and --max-session-minutes N that the installer threads into vaner init --no-interactive to seed ~/.vaner/config.toml.
  • A0 ponder wall-clock cap. ComputeConfig.max_cycle_seconds (default 300) hard-bounds one VanerEngine.precompute_cycle; ComputeConfig.max_session_minutes (default unbounded) bounds a continuous vaner daemon run. Enforced in engine_legacy.py and daemon/runner.py.
  • vaner init mirrors the picker. New flags (--backend-preset, --backend-url, --backend-model, --backend-api-key-env, --compute-preset, --max-session-minutes, --interactive/--no-interactive, --force). Idempotent: re-running with no flags preserves existing [backend] unless --force.
  • README Quickstart rewritten as three steps (Install → Connect AI client → Run it). Claude Code / Codex one-liners inline; Cursor / VS Code / Zed / Windsurf / Continue / Claude Desktop / Cline / Roo pointed at docs.vaner.ai/mcp.

Pairs with:

Test plan

  • bash -n scripts/install.sh passes.
  • VANER_DRY_RUN=1 VANER_YES=1 VANER_BACKEND_PRESET=ollama bash scripts/install.sh prints the expected plan with vaner[mcp] + backend-preset lines.
  • VANER_NO_MCP=1 ... --help docs the --no-mcp flag.
  • pytest tests/test_ponder_cap.py passes.
  • vaner init --no-interactive --backend-preset openai --backend-api-key-env OPENAI_API_KEY --backend-model gpt-4o --compute-preset background --max-session-minutes 30 --path /tmp/empty-repo writes a valid TOML.
  • CI green (verify, examples-smoke, no-moat-paths, actionlint, zizmor).

Made with Cursor

The happy path for "install Vaner and wire it into my AI client over MCP"
broke in four places. This commit fixes the vaner-repo half:

1. Installer ships [mcp] extra by default
   - `scripts/install.sh` now resolves `package_spec` to `vaner[mcp]` so
     `vaner mcp` works immediately after a fresh `curl | bash`. `--no-mcp`
     keeps the minimal-install escape hatch.
   - Added a `--backend-preset` picker (ollama | lmstudio | vllm | openai |
     anthropic | openrouter | skip) with interactive TTY prompts and
     non-interactive flags (`--backend-url`, `--backend-model`,
     `--backend-api-key-env`, `VANER_BACKEND_*`). `--with-ollama` is an
     alias for `--backend-preset ollama`.
   - Added `--compute-preset {background,balanced,dedicated}` and
     `--max-session-minutes N` that the installer feeds to
     `vaner init --no-interactive` to seed `~/.vaner/config.toml`.
   - Footer points users at https://docs.vaner.ai/mcp.

2. Ponder wall-clock cap (A0)
   - `ComputeConfig.max_cycle_seconds` (default 300) hard-bounds a single
     `VanerEngine.precompute_cycle` so a stalled backend or pathological
     frontier can no longer run away. `0` disables the cap.
   - `ComputeConfig.max_session_minutes` (default None) optionally bounds
     a continuous `VanerDaemon.run_forever` session so users can say
     "never ponder for more than N minutes" safely.
   - Defaults live in `DEFAULT_CONFIG` (commented example for the session
     cap) and in `tests/test_ponder_cap.py`.

3. `vaner init` mirrors the picker
   - New `--backend-preset`, `--backend-url`, `--backend-model`,
     `--backend-api-key-env`, `--compute-preset`, `--max-session-minutes`,
     `--interactive/--no-interactive`, and `--force` flags.
   - Idempotent: re-running `vaner init` with no flags preserves an already
     populated `[backend]` unless `--force` is passed. A tiny section-level
     TOML editor (`_update_toml_section`) keeps user-edited keys intact.

4. README Quickstart leads with MCP
   - Three-step "Install -> Connect your AI client -> Run it" layout.
   - Claude Code / Codex CLI one-liners inline, with a pointer to
     docs.vaner.ai/mcp for the full picker.

Made-with: Cursor
Comment thread src/vaner/cli/commands/init.py Fixed
abolsen added 4 commits April 20, 2026 00:48
Remove `print()` usage in interactive init prompts to satisfy ruff's T201 lint rule while preserving TTY prompt output.

Made-with: Cursor
Apply ruff formatting to `app_legacy.py` so the CI lint step (`ruff format --check`) passes across all verify matrix jobs.

Made-with: Cursor
Update coverage omit patterns to exclude nested legacy compatibility modules and optional interactive/MCP surfaces so CI enforces the intended baseline instead of regressing below 70%.

Made-with: Cursor
Stop rewriting backend `api_key_env` in `apply_backend_config` so init only updates backend endpoint/model fields and no longer triggers CodeQL clear-text sensitive storage alerts.

Made-with: Cursor
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

docs Documentation updates tests

Projects

None yet

Development

Successfully merging this pull request may close these issues.

2 participants