fix(observability): skip Sentry for local-AI "<x> binary not found" errors (OPENHUMAN-TAURI-9N) #1669
`local_ai_tts` / `local_ai_stt` / Ollama admin RPCs return deliberate
user-state errors when the required local binary isn't installed on the
host. The wire shapes today are:
- "piper binary not found. Set PIPER_BIN or install piper."
(service/speech.rs:199, hit by `local_ai_tts`)
- "whisper.cpp binary not found. Set WHISPER_BIN or install whisper-cli."
(service/speech.rs:164, hit by `local_ai_stt`)
- "Ollama binary not found at '<path>'. Provide a valid path to the
ollama executable."
(schemas.rs:828)
- "Ollama installed but binary not found on system"
(service/ollama_admin.rs:50)
All four are user-environment conditions — the error message itself is
the user-facing remediation ("Set PIPER_BIN or install piper.") — and
carry no remediable signal for Sentry. Today they flow through
`rpc.invoke_method` → `report_error_or_expected` and get captured as
error events (OPENHUMAN-TAURI-9N: 2 events on `0.53.22`, Windows user
in Saratov, RU, elapsed_ms=1 — synchronous binary-presence check).
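The wrapper only prefixes the message, so a case-insensitive substring anchor matches the wrapped form as well as the bare one. A minimal sketch of that property; `wrap_rpc_error` and `matches_anchor` are hypothetical stand-ins, since this PR shows only the resulting message shape, not the RPC layer itself:

```rust
// Hypothetical stand-in for the RPC layer's error wrapping; the real code in
// rpc.invoke_method is not shown in this PR, only the message shape it emits.
fn wrap_rpc_error(inner: &str) -> String {
    format!("rpc.invoke_method failed: {inner}")
}

// A case-insensitive substring anchor is indifferent to the prefix, so the
// classifier sees "binary not found" whether or not the error was wrapped.
fn matches_anchor(message: &str) -> bool {
    message.to_lowercase().contains("binary not found")
}
```

Both the bare and wrapped shapes hit the same branch, which is why the wrapped `rpc.invoke_method failed: …` form needs no special handling.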
Extend `expected_error_kind` with a new `LocalAiBinaryMissing` variant,
anchored on the shared substring `"binary not found"`. Logged at
`info!` via `report_expected_message` so the sentry-tracing layer
records at most a breadcrumb — same pattern as `LocalAiDisabled`.
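A sketch of what the extended classifier could look like: the variant names, the `binary not found` anchor, and the `local ai is disabled` sibling come from this PR, while the exact match structure in `src/core/observability.rs` is assumed rather than copied:

```rust
// Sketch only: ExpectedErrorKind, LocalAiDisabled, LocalAiBinaryMissing, and
// the "binary not found" anchor are from the PR; everything else is assumed.
#[derive(Debug, PartialEq)]
enum ExpectedErrorKind {
    LocalAiDisabled,
    LocalAiBinaryMissing,
}

fn expected_error_kind(message: &str) -> Option<ExpectedErrorKind> {
    let lower = message.to_lowercase();
    // Existing skip for the disabled-feature path (message wording assumed).
    if lower.contains("local ai is disabled") {
        return Some(ExpectedErrorKind::LocalAiDisabled);
    }
    // New branch: all four binary-missing wire shapes share this substring.
    if lower.contains("binary not found") {
        return Some(ExpectedErrorKind::LocalAiBinaryMissing);
    }
    None
}
```

A `Some(..)` result routes through `report_expected_message` at `info!`, so the sentry-tracing layer records at most a breadcrumb instead of an error event.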
Tests:
- `classifies_local_ai_binary_missing_errors` exercises all four
canonical wire shapes plus the wrapped `rpc.invoke_method failed: …`
form that reaches the classifier in production.
- `does_not_classify_unrelated_messages_as_binary_missing` pins the
anchor against false positives (spawn failures, empty transcripts,
download failures) that mention binaries in unrelated contexts and
should still surface as real Sentry events.
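The two tests might look roughly like this. The positive messages follow the four wire shapes listed above (with an illustrative value substituted for the `<path>` placeholder), the negative messages are paraphrased from the test description, and `is_binary_missing` is a standalone stand-in for the real classifier so the sketch compiles on its own:

```rust
// Stand-in for the classifier's anchor check; the real function returns an
// ExpectedErrorKind, this sketch only models the boolean decision.
fn is_binary_missing(message: &str) -> bool {
    message.to_lowercase().contains("binary not found")
}

fn classifies_local_ai_binary_missing_errors() {
    for msg in [
        "piper binary not found. Set PIPER_BIN or install piper.",
        "whisper.cpp binary not found. Set WHISPER_BIN or install whisper-cli.",
        // '<path>' in the wire shape replaced with an illustrative value.
        "Ollama binary not found at '/usr/local/bin/ollama'. Provide a valid path to the ollama executable.",
        "Ollama installed but binary not found on system",
        // Wrapped form that actually reaches the classifier in production.
        "rpc.invoke_method failed: piper binary not found. Set PIPER_BIN or install piper.",
    ] {
        assert!(is_binary_missing(msg), "expected to classify: {msg}");
    }
}

fn does_not_classify_unrelated_messages_as_binary_missing() {
    // These mention binaries but are real failures that should reach Sentry.
    for msg in [
        "failed to spawn piper: permission denied",
        "whisper.cpp produced an empty transcript",
        "ollama model download failed: connection reset by peer",
    ] {
        assert!(!is_binary_missing(msg), "expected NOT to classify: {msg}");
    }
}
```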
📝 Walkthrough: Adds an `ExpectedErrorKind::LocalAiBinaryMissing` variant, classifies "binary not found" messages as that kind, logs an info breadcrumb when encountered, and adds tests for positive and negative classification cases.
Actionable comments posted: 1
In `src/core/observability.rs`, around lines 53-55:

```rust
if lower.contains("binary not found") {
    return Some(ExpectedErrorKind::LocalAiBinaryMissing);
}
```
Narrow the binary-missing classifier to local-AI providers only.
Line 53 currently matches any error containing "binary not found", which can suppress Sentry for unrelated failures outside local-AI flows. Restrict this branch with local-AI markers (e.g., piper, whisper.cpp, ollama) to avoid observability blind spots.
Suggested fix:

```diff
-    if lower.contains("binary not found") {
+    if lower.contains("binary not found")
+        && (lower.contains("piper")
+            || lower.contains("whisper.cpp")
+            || lower.contains("ollama"))
+    {
         return Some(ExpectedErrorKind::LocalAiBinaryMissing);
     }
```
…ng-9n
# Conflicts:
#	src/core/observability.rs
Summary

- Adds `ExpectedErrorKind::LocalAiBinaryMissing` to the observability classifier so local-AI RPCs that fail because piper / whisper.cpp / Ollama isn't installed stop being captured as Sentry errors.
- Anchored on `"binary not found"`, the shared marker emitted from four call sites in `local_ai/`.
- Logged at `info!` in `report_expected_message`, mirroring the existing `LocalAiDisabled` path — sentry-tracing records a breadcrumb, not an error.
- Fixes OPENHUMAN-TAURI-9N (2 events on `0.53.22`, Windows user in Saratov, RU, `elapsed_ms=1` — synchronous binary-presence check).

Why this is noise, not signal

The local-AI service deliberately returns these errors so the UI can prompt the user to install/configure the binary. The error message itself is the user-facing remediation ("Set PIPER_BIN or install piper."). It's a pure user-environment condition — we can't install the binary for them, and Sentry has no remediable signal.

Same category as the existing `local ai is disabled` skip.

Wire shapes covered

All four contain `"binary not found"`:

- `"piper binary not found. Set PIPER_BIN or install piper."` — `service/speech.rs:199`, hit by `local_ai_tts`
- `"whisper.cpp binary not found. Set WHISPER_BIN or install whisper-cli."` — `service/speech.rs:164`, hit by `local_ai_stt`
- `"Ollama binary not found at '<path>'. Provide a valid path to the ollama executable."` — `schemas.rs:828`
- `"Ollama installed but binary not found on system"` — `service/ollama_admin.rs:50`

Test plan

- `cargo test --lib core::observability::tests` — 13 tests pass, including the two new classifier tests.
- `classifies_local_ai_binary_missing_errors` covers all four canonical wire shapes plus the wrapped `rpc.invoke_method failed: …` form that hits the classifier in production.
- `does_not_classify_unrelated_messages_as_binary_missing` pins the anchor against false positives (spawn failures, empty transcripts) that mention binaries in unrelated contexts.
- `cargo check --lib` — no new warnings.
- `cargo fmt`.