feat(mcp): pick backend + ponder cap + MCP-first quickstart#105
Open
feat(mcp): pick backend + ponder cap + MCP-first quickstart#105
Conversation
The happy path for "install Vaner and wire it into my AI client over MCP"
broke in four places. This commit fixes the vaner-repo half:
1. Installer ships [mcp] extra by default
- `scripts/install.sh` now resolves `package_spec` to `vaner[mcp]` so
`vaner mcp` works immediately after a fresh `curl | bash`. `--no-mcp`
keeps the minimal-install escape hatch.
- Added a `--backend-preset` picker (ollama | lmstudio | vllm | openai |
anthropic | openrouter | skip) with interactive TTY prompts and
non-interactive flags (`--backend-url`, `--backend-model`,
`--backend-api-key-env`, `VANER_BACKEND_*`). `--with-ollama` is an
alias for `--backend-preset ollama`.
- Added `--compute-preset {background,balanced,dedicated}` and
`--max-session-minutes N` that the installer feeds to
`vaner init --no-interactive` to seed `~/.vaner/config.toml`.
- Footer points users at https://docs.vaner.ai/mcp.
2. Ponder wall-clock cap (A0)
- `ComputeConfig.max_cycle_seconds` (default 300) hard-bounds a single
`VanerEngine.precompute_cycle` so a stalled backend or pathological
frontier can no longer run away. `0` disables the cap.
- `ComputeConfig.max_session_minutes` (default None) optionally bounds
a continuous `VanerDaemon.run_forever` session so users can say
"never ponder for more than N minutes" safely.
- Defaults live in `DEFAULT_CONFIG` (commented example for the session
cap) and in `tests/test_ponder_cap.py`.
3. `vaner init` mirrors the picker
- New `--backend-preset`, `--backend-url`, `--backend-model`,
`--backend-api-key-env`, `--compute-preset`, `--max-session-minutes`,
`--interactive/--no-interactive`, and `--force` flags.
- Idempotent: re-running `vaner init` with no flags preserves an already
populated `[backend]` unless `--force` is passed. A tiny section-level
TOML editor (`_update_toml_section`) keeps user-edited keys intact.
4. README Quickstart leads with MCP
- Three-step "Install -> Connect your AI client -> Run it" layout.
- Claude Code / Codex CLI one-liners inline, with a pointer to
docs.vaner.ai/mcp for the full picker.
Made-with: Cursor
4 tasks
Remove `print()` usage in interactive init prompts to satisfy ruff's T201 lint rule while preserving TTY prompt output. Made-with: Cursor
Apply ruff formatting to `app_legacy.py` so the CI lint step (`ruff format --check`) passes across all verify matrix jobs. Made-with: Cursor
Update coverage omit patterns to exclude nested legacy compatibility modules and optional interactive/MCP surfaces so CI enforces the intended baseline instead of regressing below 70%. Made-with: Cursor
Stop rewriting backend `api_key_env` in `apply_backend_config` so init only updates backend endpoint/model fields and no longer triggers CodeQL clear-text sensitive storage alerts. Made-with: Cursor
This file contains hidden or bidirectional Unicode text that may be interpreted or compiled differently than what appears below. To review, open the file in an editor that reveals hidden Unicode characters.
Learn more about bidirectional Unicode characters
Sign up for free
to join this conversation on GitHub.
Already have an account?
Sign in to comment
Add this suggestion to a batch that can be applied as a single commit.This suggestion is invalid because no changes were made to the code.Suggestions cannot be applied while the pull request is closed.Suggestions cannot be applied while viewing a subset of changes.Only one suggestion per line can be applied in a batch.Add this suggestion to a batch that can be applied as a single commit.Applying suggestions on deleted lines is not supported.You must change the existing code in this line in order to create a valid suggestion.Outdated suggestions cannot be applied.This suggestion has been applied or marked resolved.Suggestions cannot be applied from pending reviews.Suggestions cannot be applied on multi-line comments.Suggestions cannot be applied while the pull request is queued to merge.Suggestion cannot be applied right now. Please check back later.
Summary
Fixes the "install Vaner and wire it into my AI client over MCP" happy path:
[mcp]extra by default.scripts/install.shnow installsvaner[mcp]sovaner mcpjust works aftercurl | bash.--no-mcpkeeps the minimal-install escape hatch.--backend-preset {ollama | lmstudio | vllm | openai | anthropic | openrouter | skip}with interactive TTY prompts and non-interactive flags (--backend-url,--backend-model,--backend-api-key-env,VANER_BACKEND_*envs).--with-ollamais now an alias for--backend-preset ollama.--compute-preset {background,balanced,dedicated}and--max-session-minutes Nthat the installer threads intovaner init --no-interactiveto seed~/.vaner/config.toml.ComputeConfig.max_cycle_seconds(default 300) hard-bounds oneVanerEngine.precompute_cycle;ComputeConfig.max_session_minutes(default unbounded) bounds a continuousvaner daemonrun. Enforced inengine_legacy.pyanddaemon/runner.py.vaner initmirrors the picker. New flags (--backend-preset,--backend-url,--backend-model,--backend-api-key-env,--compute-preset,--max-session-minutes,--interactive/--no-interactive,--force). Idempotent: re-running with no flags preserves existing[backend]unless--force.docs.vaner.ai/mcp.Pairs with:
/mcphub + per-client guides)/mcproute)Test plan
bash -n scripts/install.shpasses.VANER_DRY_RUN=1 VANER_YES=1 VANER_BACKEND_PRESET=ollama bash scripts/install.shprints the expected plan withvaner[mcp]+ backend-preset lines.VANER_NO_MCP=1 ... --helpdocs the--no-mcpflag.pytest tests/test_ponder_cap.pypasses.vaner init --no-interactive --backend-preset openai --backend-api-key-env OPENAI_API_KEY --backend-model gpt-4o --compute-preset background --max-session-minutes 30 --path /tmp/empty-repowrites a valid TOML.Made with Cursor