Context
FR-005 landed (closed by #19 / PR #21). The workflows now consume LLM_JUDGE_URL / LLM_JUDGE_MODEL / TL_DEV_KEY from repo secrets/vars, but all workflows still target `runs-on: ubuntu-latest`. Dual-mode LLM judging from a GitHub-hosted runner can't reach an Ollama instance on a private LAN without exposing that endpoint publicly.
FR-005's Proposed solution option 1 — a self-hosted runner on the same network as the existing Ollama instance at `192.168.2.103:11434` — is the path this ticket picks up.
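Routing jobs to that runner is a one-line change per job. A minimal sketch, assuming the runner registers with a hypothetical `llm-judge` label (the actual label set is an output of this ticket and should end up documented in `cicd/CI_SETUP.md`):

```yaml
jobs:
  test-pipeline:
    # Route to the LAN-attached self-hosted runner instead of ubuntu-latest.
    # "llm-judge" is an assumed label; substitute whatever labels the runner
    # actually registers with.
    runs-on: [self-hosted, linux, llm-judge]
```

Keeping a distinct label (rather than bare `self-hosted`) leaves room for adding more self-hosted runners later without every job landing on the LLM-judge box.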
Must
Nice to have
Out of scope
- Switching to a hosted LLM API — separate decision, separate ticket.
- Re-enabling automatic push/PR triggers — downstream of this, decide once the self-hosted path is stable.
Acceptance criteria
- A dispatch of `test-pipeline.yml` with `judge_mode=dual` on the self-hosted runner completes, with the LLM judge reaching the LAN Ollama instance and scoring normally.
- Runner survives a host reboot without manual intervention.
- `cicd/CI_SETUP.md` names the runner's expected labels, restart policy, and operator checklist.
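For the dispatch criterion, a run can be kicked off from a workstation with `gh workflow run test-pipeline.yml -f judge_mode=dual`. For reboot survival, the stock runner package ships a `svc.sh` helper (`sudo ./svc.sh install && sudo ./svc.sh start`) that generates a systemd unit; if the unit is instead managed by hand, a minimal sketch along these lines (unit name, user, and paths are assumptions, not what `svc.sh` actually emits):

```ini
# /etc/systemd/system/actions-runner.service -- hypothetical hand-rolled unit;
# svc.sh generates the real one with repo-specific naming.
[Unit]
Description=GitHub Actions self-hosted runner
After=network-online.target
Wants=network-online.target

[Service]
User=runner
WorkingDirectory=/home/runner/actions-runner
ExecStart=/home/runner/actions-runner/run.sh
# Restart on crash so the runner also comes back after transient failures,
# not just clean reboots.
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
```

Whichever path is chosen, the restart policy belongs in the `cicd/CI_SETUP.md` operator checklist called out above.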
Blocks / unblocks
- Unblocks the follow-up decision on re-enabling push/PR triggers (FR-005 open question).
- Does not block the hosted-API-alternative ticket; they solve the same problem differently, so pick whichever lands first.