chore(testing): testing.toml uses provider="openai" which silently ignores [llm.router.reputation] RAPS config #2104
Description
Summary
testing.toml contains [llm.router.reputation] RAPS configuration but sets provider = "openai" at the top level. With provider = "openai", the bootstrap creates an OpenAIProvider directly — the [llm.router] and [llm.router.reputation] sections are silently ignored.
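For illustration, the mismatch looks roughly like this. Only the top-level `provider` key and the `[llm.router]` / `[llm.router.reputation]` tables are confirmed by this issue; any other keys shown are placeholders:

```toml
# Hypothetical excerpt of testing.toml
provider = "openai"   # bootstrap constructs an OpenAIProvider directly

[llm.router]          # never read: the router is not constructed
# ...

[llm.router.reputation]   # RAPS config below this is silently ignored
# ...
```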
Evidence (CI-62)
Running with `testing.toml` and saving via `memory_save`: zero change to `router_reputation_state.json`, no "reputation scoring enabled" log, and `record_quality_outcome` is never called.
Running the same chain with `provider = "router"`: "reputation scoring enabled" is logged, and alpha increments from 5.43 → 6.21 → 6.95 across 2 sessions ✅.
Impact
- RAPS (`[llm.router.reputation]`) is never exercised in standard CI sessions using `testing.toml`
- The `reputation_state.json` values only come from manual sessions using router configs
- Future CI sessions may incorrectly conclude RAPS is tested when it's not
Fix
Two options:
- Change `testing.toml` to `provider = "router"` with `chain = ["openai"]` — enables RAPS and keeps identical LLM behavior
- Create a separate `testing-router.toml` with `provider = "router"` for RAPS-specific tests
Option 1 is preferred for consistent coverage. The already-created `.local/testing/testing-raps.toml` can serve as the template.
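A minimal sketch of the Option 1 change to `testing.toml`. Only the `provider` and `chain` values are cited in this issue; the reputation keys are stand-ins for whatever RAPS settings the file already contains:

```toml
provider = "router"

[llm.router]
chain = ["openai"]   # same underlying LLM, so session behavior stays identical

[llm.router.reputation]
# existing RAPS settings remain unchanged — with provider = "router"
# they are now actually loaded instead of being silently ignored
```

With this layout, the bootstrap constructs the router, which in turn wraps the OpenAI provider, so the "reputation scoring enabled" log and `record_quality_outcome` calls observed in the CI-62 evidence should appear in standard CI sessions as well.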
Severity
LOW — RAPS code is correct, just not tested in standard CI sessions.