chore(testing): testing.toml uses provider="openai" which silently ignores [llm.router.reputation] RAPS config #2104

@bug-ops

Description

Summary

testing.toml contains [llm.router.reputation] RAPS configuration but sets provider = "openai" at the top level. With provider = "openai", the bootstrap creates an OpenAIProvider directly — the [llm.router] and [llm.router.reputation] sections are silently ignored.
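For reference, the problematic shape looks roughly like the following (a paraphrase of testing.toml; the exact key names under `[llm.router]` such as `chain` and `enabled` are assumptions, not verified against the repo):

```toml
[llm]
provider = "openai"        # bootstrap builds an OpenAIProvider directly

# Everything below is silently ignored when provider != "router":
[llm.router]
chain = ["openai"]

[llm.router.reputation]
enabled = true
```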

Evidence (CI-62)

Running with testing.toml and invoking memory_save: no change to router_reputation_state.json, no "reputation scoring enabled" log, and record_quality_outcome is never called.

Running with provider = "router" and the same chain: "reputation scoring enabled" is logged, and alpha increments from 5.43 → 6.21 → 6.95 across two sessions ✅.
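For context, the alpha values above are consistent with a Beta-distribution-style reputation update, where each quality outcome adds a fractional success count. The sketch below is illustrative only, not the actual RAPS implementation: the name record_quality_outcome is borrowed from the log above, and the update rule is an assumption.

```python
import json


class ReputationState:
    """Toy Beta-style reputation tracker: alpha accumulates weighted
    successes, beta accumulates failures. NOT the real RAPS code."""

    def __init__(self, alpha: float = 1.0, beta: float = 1.0):
        self.alpha = alpha
        self.beta = beta

    def record_quality_outcome(self, quality: float) -> None:
        # quality in [0, 1]: a good outcome mostly bumps alpha,
        # a bad one mostly bumps beta.
        self.alpha += quality
        self.beta += 1.0 - quality

    def score(self) -> float:
        # Posterior mean of the Beta distribution.
        return self.alpha / (self.alpha + self.beta)

    def to_json(self) -> str:
        # Shape of a hypothetical router_reputation_state.json entry.
        return json.dumps({"alpha": self.alpha, "beta": self.beta})


state = ReputationState(alpha=5.43)
state.record_quality_outcome(0.78)  # alpha rises, as in the CI log
print(round(state.alpha, 2))        # 6.21
```

The point of the bug is that under provider = "openai", nothing ever constructs such a state or calls record_quality_outcome, so the persisted alpha never moves.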

Impact

  • RAPS ([llm.router.reputation]) is never exercised in standard CI sessions using testing.toml
  • The router_reputation_state.json values only come from manual sessions using router configs
  • Future CI sessions may incorrectly conclude RAPS is tested when it's not

Fix

Two options:

  1. Change testing.toml to provider = "router" with chain = ["openai"] — enables RAPS and keeps identical LLM behavior
  2. Create a separate testing-router.toml with provider = "router" for RAPS-specific tests

Option 1 is preferred for consistent coverage. The existing .local/testing/testing-raps.toml can serve as a template.
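Option 1 would change only the provider line in testing.toml; a single-entry chain still resolves every call to the same OpenAI backend, but now through the router layer so RAPS runs (key names such as `chain` and `enabled` are assumed from the existing router configs):

```toml
[llm]
provider = "router"          # go through the router layer instead of OpenAIProvider directly

[llm.router]
chain = ["openai"]           # single-entry chain: identical LLM behavior

[llm.router.reputation]
enabled = true               # RAPS now actually exercised in CI sessions
```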

Severity

LOW — RAPS code is correct, just not tested in standard CI sessions.

Metadata


Labels

  • chore — Maintenance tasks
  • llm — zeph-llm crate (Ollama, Claude)
  • size/S — Small PR (11-50 lines)
  • testing — Tests and quality
