refactor(commands,#1198): migrate recall-engrams to RustBackedCommand (#1265)
Merged
Sister command of cognition/admit-inbox-message (refactored in #1256). Same shape — validate, call mixin, wrap result — now expressed as `RustBackedCommand` subclass declarations rather than re-implemented boilerplate.

- `requiredParams = ['personaId']`
- `validateParams()` override adds the kind-companion required-field checks (by_id needs id, by_keyword needs keyword, by_origin needs origin); calls super first
- `callRust` delegates to the typed mixin
- `toResult` shapes the snake_case Rust response into the camelCase result via the existing factory

Behavior preserved: every original validation message + return shape matches. Net: 86 → 100 LOC, but most new lines are typed declarations and the explicit per-field error messages — boilerplate is gone. `npm run build:ts` clean.
joelteply added a commit that referenced this pull request on May 16, 2026
…ion/vision-describe (#1292)

* feat(cognition,#1276): migrate VisionInferenceProvider to Rust cognition/vision-describe

Per Joel 2026-05-15 ("mission to eliminate slop and slowly oxidize this project") and the #1248 oxidizer umbrella, move TS-side vision inference orchestration to Rust. TS becomes a thin shim. Outlier-validation pair with codex's #1284 (AIDecisionService.evaluateGating → cognition/should-respond, structured-decision shape); this card is the freeform-shape outlier. Same Rust+thin-TS-shim pattern as recall-engrams (#1265).

## Rust side (new)

`workers/continuum-core/src/cognition/vision_describe.rs` — 337 LOC. Owns:

1. Vision-capable model selection (filter `model_registry` by `Capability::Vision`, prefer local providers). Single source of truth — no more `process.env.*_API_KEY` checks scattered in TS.
2. Prompt construction from option flags (detectObjects/Colors/Text, maxLength). Pure function; unit-tested.
3. Multimodal request assembly (text + base64 image content parts).
4. Inference dispatch via `runtime::execute_command_json("ai/generate", ...)` so the existing Rust adapters (Anthropic / OpenAI / LlamaCpp) shape the multimodal payload per their own native API contracts.
5. Response parsing into `VisionDescription`. Pure function; unit-tested.

ts-rs auto-emits `VisionDescribeRequest`, `VisionDescribeOptions`, `VisionDescription` to `shared/generated/cognition/`.

## IPC wiring

`modules/cognition.rs` — adds the `cognition/vision-describe` handler that parses params into `VisionDescribeRequest` and calls `describe_image`. `bindings/modules/cognition.ts` adds the `cognitionVisionDescribe` mixin entry on the RustCoreIPC client.

## TS side (collapsed)

`system/vision/VisionInferenceProvider.ts` — 176 LOC → 86 LOC. Every method is now a single `Commands.execute('cognition/vision-describe', ...)` call. The four pieces of logic (selectModel, buildPrompt, generateText dispatch, parseResponse) are gone TS-side.
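The prompt construction described in point 2 above lives in Rust; as a rough illustration only, a TypeScript stand-in might look like the following. The flag names come from the commit message; the exact prompt wording is an assumption, not the repo's actual strings.

```typescript
// Illustrative sketch only; the real pure function lives in
// vision_describe.rs, and the prompt wording here is invented.
interface VisionPromptOptions {
  detectObjects?: boolean;
  detectColors?: boolean;
  detectText?: boolean;
  maxLength?: number;
}

// Builds the instruction text from option flags, one clause per flag.
function buildVisionPrompt(options: VisionPromptOptions): string {
  const parts = ['Describe this image.'];
  if (options.detectObjects) parts.push('List the main objects you see.');
  if (options.detectColors) parts.push('Note the dominant colors.');
  if (options.detectText) parts.push('Transcribe any visible text.');
  if (options.maxLength) {
    parts.push(`Keep the description under ${options.maxLength} characters.`);
  }
  return parts.join(' ');
}
```

Because it is a pure function of the options, it can be unit-tested without any model or IPC in the loop, which is what makes the Rust-side unit tests cheap.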
`commands/cognition/vision-describe/` — generated via `scripts/cli.ts command generator/specs/cognition-vision-describe.json --force`, then the server command is refactored to extend `RustBackedCommand` (same shape as `recall-engrams`). All scaffolding (README, package.json, browser/server/shared/test) lands together so the command is discoverable + ./jtag-callable + persona-tool-callable.

## Verified

- `cargo check --features metal` — clean (0 errors)
- `cargo test cognition::vision_describe --lib --features metal` — 7 passed (4 unit tests + 3 ts-rs export tests)
- `npx tsc --noEmit -p tsconfig.json` — vision-describe surface clean

## Phase 2 (deferred)

Delete the orphaned `availableModels()` method on the TS shim once all callers move to a dedicated `ai/providers/list` Rust IPC with capability filter. Today the shim returns `[]` (legacy diagnostics surface only).

Mission: Joel 2026-05-15 — "eliminate slop and slowly oxidize this project"

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* chore(eslint-baseline): ratchet -3 after #1276 vision-describe migration

VisionInferenceProvider.ts collapse from 176 LOC to 86 LOC + new thin imports cleared 3 ESLint errors. Mac local: 5452→5449. Linux baseline mirrored at +1 (the standard platform skew).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* chore(eslint-baseline): linux ratchet -1 to match #1276 Mac baseline

Linux CI counted 5449 errors while baseline was 5450. Mac local was already at 5449 (matching reality). Sync linux to lock the win.

* fix(cognition,#1276): address review on vision-describe + linux baseline

Review fixes:

Block-merge (1) — `availableModels()` returned `[]` silently:
- Delete VisionInferenceProvider.availableModels() + the matching VisionDescriptionService.getAvailableModels() accessor (no production callers; deletion is per the no-silent-fallback rule).
- For the human-readable "what vision models do we have?" surface, the upcoming `ai/providers/list` IPC with capability filter is the right home.

Block-merge (2) — no test coverage on select_vision_model 4-branch:
- Factor priority logic into pure helper `pick_vision_candidate(&[VisionCandidate], &VisionDescribeOptions)` + add 7 unit tests covering: empty input, priority 1 (preferred_model), priority 2 (preferred_provider), priority 3 (local), priority 4 (first), unknown preferred_model fallthrough, unknown preferred_provider fallthrough.

Nits:
- finish_reason: deserialize the wire string back into the typed `crate::ai::types::FinishReason` enum + pattern-match. Catches any future variant rename at compile time on both sides.
- max_tokens: switch to `len.div_ceil(4)` (was `(len + 3) / 4` — same value, clearer intent).
- describe_image: log substitution when preferred_model wasn't honored, so the call site can audit which provider actually ran.

Preserved (with explicit doc):
- VisionInferenceProvider.isAvailable() and VisionDescriptionService.isAvailable() — three production callers use them as `if (!isAvailable()) skip-this-work` guards (MediaPrewarmServerCommand, LiveRoomSnapshotService, MediaArtifactSource). Migration shim returns true synchronously with explicit doc that "true is best-effort; describe() returning null is the real signal." Future card replaces with async ai/providers/list-backed check.

Linux baseline:
- `eslint-baseline.linux.txt` ratcheted -1 to 5449 to match the Mac baseline + the actual count post-#1276 deletions.

Verified:
- `cargo test cognition::vision_describe --lib --features metal`: 14 passed (was 7 — added 7 priority-logic tests)
- `npx tsc --noEmit -p tsconfig.json`: clean

---------

Co-authored-by: Test <test@test.com>
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
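The 4-branch priority logic that the review asked to factor into `pick_vision_candidate` can be sketched as follows. This is a TypeScript stand-in for the Rust helper; the field names (`isLocal`, `preferredModel`, `preferredProvider`) are assumptions for illustration, not the generated ts-rs types.

```typescript
// TypeScript stand-in for the Rust helper pick_vision_candidate.
// Field names are assumed for illustration; the real types are
// the ts-rs-generated VisionCandidate / VisionDescribeOptions.
interface VisionCandidate {
  model: string;
  provider: string;
  isLocal: boolean;
}

interface VisionDescribeOptions {
  preferredModel?: string;
  preferredProvider?: string;
}

function pickVisionCandidate(
  candidates: VisionCandidate[],
  options: VisionDescribeOptions,
): VisionCandidate | null {
  if (candidates.length === 0) return null;
  // Priority 1: exact preferred_model match.
  const byModel = candidates.find((c) => c.model === options.preferredModel);
  if (byModel) return byModel;
  // Priority 2: preferred_provider match (unknown values fall through).
  const byProvider = candidates.find(
    (c) => c.provider === options.preferredProvider,
  );
  if (byProvider) return byProvider;
  // Priority 3: prefer local providers.
  const local = candidates.find((c) => c.isLocal);
  if (local) return local;
  // Priority 4: first candidate.
  return candidates[0];
}
```

Keeping this as a pure function over a candidate slice is what makes the 7 review-requested unit tests possible without touching the model registry.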
Follow-on to #1256 (which shipped `RustBackedCommand` + first migration of cognition/admit-inbox-message). Same shape — same migration.
## What
`commands/cognition/recall-engrams/server/CognitionRecallEngramsServerCommand.ts` was the second-largest pre-#1256 example of the "validate → call mixin → wrap result" boilerplate. Now expressed declaratively:
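As a minimal sketch of that declarative shape — the base-class signatures, parameter types, and error-message wording below are assumptions for illustration, not the repo's actual `RustBackedCommand` API:

```typescript
// Hypothetical sketch of the validate → call mixin → wrap result
// pattern as RustBackedCommand subclass declarations. Signatures
// and message strings are invented for illustration.
type RecallKind = 'by_id' | 'by_keyword' | 'by_origin';

interface RecallEngramsParams {
  personaId: string;
  kind: RecallKind;
  id?: string;
  keyword?: string;
  origin?: string;
}

// Stand-in for the real base class shipped in #1256.
abstract class RustBackedCommand<P, R> {
  protected requiredParams: (keyof P)[] = [];

  // Base validation: every declared required param must be present.
  protected validateParams(params: P): string[] {
    return this.requiredParams
      .filter((key) => params[key] == null)
      .map((key) => `Missing required param: ${String(key)}`);
  }

  protected abstract callRust(params: P): Promise<unknown>;
  protected abstract toResult(raw: unknown): R;
}

class RecallEngramsCommand extends RustBackedCommand<RecallEngramsParams, object> {
  protected requiredParams: (keyof RecallEngramsParams)[] = ['personaId'];

  // Kind-companion checks layered on top of the base validation;
  // calls super first, as the PR describes.
  protected validateParams(params: RecallEngramsParams): string[] {
    const errors = super.validateParams(params);
    const companion: Record<RecallKind, keyof RecallEngramsParams> = {
      by_id: 'id',
      by_keyword: 'keyword',
      by_origin: 'origin',
    };
    const field = companion[params.kind];
    if (field && params[field] == null) {
      errors.push(`${params.kind} requires ${String(field)}`);
    }
    return errors;
  }

  protected async callRust(params: RecallEngramsParams): Promise<unknown> {
    // Delegates to the typed IPC mixin in the real command.
    return {};
  }

  protected toResult(raw: unknown): object {
    // Shapes the snake_case Rust response into the camelCase
    // result via the existing factory in the real command.
    return raw as object;
  }
}
```

The point of the pattern is that the subclass carries only declarations (required params, companion checks, result shaping) while the envelope — dispatch, error wrapping, response plumbing — lives once in the base class.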
## Behavior preserved
Every original validation message + return shape matches. Net: 86 → 100 LOC, but most new lines are typed declarations + the explicit per-field error messages — boilerplate is gone.
## Verification

`npm run build:ts` — clean.
## Why this PR shape
Per #1198: incremental migration, one command per PR. ~50 Rust-backed commands remain; each adopts the base class in its own PR and sheds ~30 LOC of envelope.