Conversation
…r for RAPS coverage (#2104) Previously, the local `.local/config/testing.toml` used `provider="openai"`, which caused the bootstrap to create an `OpenAIProvider` directly, silently ignoring `[llm.router]` and `[llm.router.reputation]` (RAPS). With `provider="router"` and `chain=["openai"]`, LLM behavior is identical, but `ReputationTracker` is active and `record_quality_outcome` is called. This PR adds `config/testing.toml` as a tracked canonical reference with the correct configuration; developers should copy it to `.local/config/testing.toml` before running CI sessions.
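The intended shape of the fixed config can be sketched as below. Only the keys quoted above (`provider`, `[llm.router]`, `chain`, `[llm.router.reputation]`) are known from the PR; the `[llm]` table placement and the keys inside the reputation table are illustrative assumptions, not the file's actual contents.

```toml
# Sketch of the relevant fragment of config/testing.toml after this change.
# Keys marked "illustrative" are placeholders, not confirmed project keys.

[llm]
provider = "router"   # was "openai": bootstrap built OpenAIProvider directly

[llm.router]
chain = ["openai"]    # same model/endpoint, now routed so RAPS is active

[llm.router.reputation]
# This table was silently ignored with provider = "openai";
# with provider = "router" the ReputationTracker reads it.
enabled = true        # illustrative key
```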
Summary
- Add `config/testing.toml` as a tracked canonical reference config for CI sessions
- Change `provider = "openai"` → `provider = "router"` so the `[llm.router.reputation]` RAPS config is active
- `[llm.router]` chain is `["openai"]`: same model/endpoint
- Copy it into place with `cp config/testing.toml .local/config/testing.toml`

**Root cause:** With `provider = "openai"`, the bootstrap creates `OpenAIProvider` directly, skipping `[llm.router]` and `[llm.router.reputation]` entirely. `ReputationTracker` is never activated; `record_quality_outcome` is never called.

**Evidence from CI-62:** zero change to `router_reputation_state.json` with `provider = "openai"`. With `provider = "router"`, alpha incremented from 5.43 → 6.21 → 6.95 across sessions.

Closes #2104
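The sub-unit alpha increments above are consistent with a decayed Beta-style reputation update, where old evidence is discounted before each new outcome is added. The sketch below is a hypothetical illustration of that pattern only; the `Reputation` type, field names, decay constant, and reward scheme are invented here and are not the project's actual RAPS implementation.

```rust
// Hypothetical decayed Beta-reputation update (illustration only, not the
// project's RAPS code). alpha/beta act as pseudo-counts of good/bad outcomes.
struct Reputation {
    alpha: f64, // pseudo-count of successful quality outcomes
    beta: f64,  // pseudo-count of failed quality outcomes
}

impl Reputation {
    fn new() -> Self {
        // Uniform Beta(1, 1) prior.
        Reputation { alpha: 1.0, beta: 1.0 }
    }

    // Record one quality outcome: decay old evidence, then add the new one.
    // Decay keeps increments below 1.0, similar to the deltas seen in CI-62.
    fn record_quality_outcome(&mut self, success: bool, decay: f64) {
        self.alpha *= decay;
        self.beta *= decay;
        if success {
            self.alpha += 1.0;
        } else {
            self.beta += 1.0;
        }
    }

    // Posterior mean success rate, usable as a routing score.
    fn score(&self) -> f64 {
        self.alpha / (self.alpha + self.beta)
    }
}

fn main() {
    let mut rep = Reputation::new();
    for _ in 0..3 {
        rep.record_quality_outcome(true, 0.95);
    }
    // prints: alpha = 3.71, score = 0.81
    println!("alpha = {:.2}, score = {:.2}", rep.alpha, rep.score());
}
```

A state file like `router_reputation_state.json` would persist `alpha`/`beta` between sessions, which is why the value only moves when the router path (and hence `record_quality_outcome`) is actually exercised.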
Test plan
- `cargo +nightly fmt --check` — pass
- `cargo clippy --workspace --features full -- -D warnings` — pass (0 warnings)
- `cargo nextest run --workspace --features full --lib --bins` — 6363 passed
- Copy `config/testing.toml` to `.local/config/testing.toml`, run a live session, verify `router_reputation_state.json` updates after tool calls