ci(carl-smoke): extend probe to actually exercise chat → AI reply E2E #1000
Merged
Per Joel's "100% free OOTB on MacBook Air on up, canary e2e working
from curl, Carl's case" — the existing smoke probe only validates that
the page renders, not that a chat actually gets an AI reply. That's the
true Carl-impact gate: if Carl types "hello" and gets nothing, the
install isn't shippable, regardless of whether /health returned 200.
This extends the smoke script with a 4th phase:
4. End-to-end chat:
- Locate jtag binary (3 search paths)
- Send a unique probe message to #general
- Detect #994's "no listener" warning → exit 6 (distinct failure)
- Poll chat/export for an AI reply (default 90s timeout)
- On reply: report latency in PASS banner
- On timeout: list root-cause diagnostic commands per #964/#980 series
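The send-then-poll core of Phase 4 can be sketched roughly as below. This is a minimal illustration, not the script itself: `poll_for_marker` is a hypothetical helper name, and the `jtag chat/export` invocation shown in the usage note assumes the CLI shape described above.

```shell
# poll_for_marker CMD MARKER TIMEOUT_SEC [INTERVAL_SEC]
# Re-runs CMD until its output contains MARKER or TIMEOUT_SEC elapses.
# Returns 0 (and prints the latency) on a match, 1 on timeout.
# CMD is deliberately left unquoted so a simple "binary subcommand args"
# string word-splits into a command; good enough for a sketch.
poll_for_marker() {
  cmd=$1 marker=$2 timeout=$3 interval=${4:-3}
  elapsed=0
  while [ "$elapsed" -lt "$timeout" ]; do
    if $cmd | grep -q "$marker"; then
      echo "reply after ${elapsed}s"
      return 0
    fi
    sleep "$interval"
    elapsed=$((elapsed + interval))
  done
  return 1
}
```

Usage in this shape would look something like `poll_for_marker "jtag chat/export --channel general" "$PROBE_ID" 90`, where `$PROBE_ID` is the unique token embedded in the probe message so an AI reply that quotes it can be detected.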
Exit codes (extending the existing 0-3):
4 — chat/send command failed (system not ready for chat at all)
5 — no AI reply within timeout (the main Carl-blocker shape — silent AI)
6 — chat/send accepted but reported NO PERSONAS (#994 warning)
— distinct from 5: "no AI" vs "AI didn't respond"
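A downstream CI step can dispatch on these codes to label the failure mode. A sketch of such a mapping (the `describe_rc` function name is made up for illustration; only the codes and their meanings come from the list above):

```shell
# Map the smoke script's exit code to a human-readable failure mode.
describe_rc() {
  case "$1" in
    0) echo "PASS: OOTB chat chain verified" ;;
    4) echo "FAIL: chat/send failed (system not ready for chat)" ;;
    5) echo "FAIL: no AI reply within timeout (silent AI)" ;;
    6) echo "FAIL: send accepted but no persona listener" ;;
    *) echo "FAIL: pre-chat smoke failure (rc=$1)" ;;
  esac
}
```

Keeping 5 and 6 distinct matters here: 6 means the chat bus never had an AI attached, while 5 means an AI was attached but stayed silent, which point at different root causes.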
CARL_CHAT_TIMEOUT_SEC env override (default 90s) for slow first-runs
where DMR is cold-loading the persona model.
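The override presumably uses the standard default-with-fallback expansion, along these lines (the `CHAT_TIMEOUT` variable name is an assumption; the env var and 90s default are from above):

```shell
# 90s default unless the caller exports CARL_CHAT_TIMEOUT_SEC,
# e.g. for a cold first run while the persona model loads.
CHAT_TIMEOUT="${CARL_CHAT_TIMEOUT_SEC:-90}"
echo "Polling for AI reply for up to ${CHAT_TIMEOUT}s"
```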
The diagnostic message on exit 5 lists the post-#980 fix points so a
future regression has an obvious starting checklist:
- #997's 'local' default routing (cloud fallback dropped)
- DMR running (Docker Desktop 4.62+ check from install.sh)
- GPU EP cfg (#985/#991 fixed broken cfg gates)
- Persona model pulled into DMR
- NEW-A SIGABRT (tracked upstream as ggml-org/llama.cpp#22593)
Now CI's carl-install-smoke gate proves the OOTB chain works
end-to-end, not just up to the page render.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Per Joel: "100% free OOTB on MacBook Air on up, canary e2e working from curl, Carl case." Page-render gate is necessary but not sufficient — the actual Carl-impact target is "chat with AI works." Adds Phase 4 to carl-install-smoke.sh: send a probe message, poll chat/export for an AI reply, fail loudly with diagnostic root-cause checklist if silent.
Three new exit codes (4/5/6) distinguish failure modes (no chat-send / no AI reply / no persona listener).
Now CI proves the OOTB chain works end-to-end, not just up to page render.
🤖 Generated with Claude Code