
fix(llm): return best_seen when escalations_remaining==0 and should_escalate #1759

Merged
bug-ops merged 3 commits into main from cascade-best-seen-escalation
Mar 14, 2026
Conversation

Owner

@bug-ops bug-ops commented Mar 14, 2026

Summary

  • In cascade_chat and cascade_chat_stream, when escalations_remaining == 0 && should_escalate, the current provider's response was returned instead of best_seen — discarding a potentially higher-quality earlier response
  • Applied the same pattern already used for the token-budget-exhaustion branch: best_seen.take().map_or(current, |(r, _)| r)
  • Two new tests (cascade_escalations_exhausted_returns_best_seen_not_current and cascade_stream_escalations_exhausted_returns_best_seen_not_current) verify the corrected behaviour
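The corrected branch can be sketched as follows. This is a minimal, self-contained illustration of the `best_seen.take().map_or(current, |(r, _)| r)` pattern described above; the `Response` type, the quality score in the tuple, and the helper name `resolve_exhausted_escalation` are simplified stand-ins, not the actual zeph-llm API.

```rust
// Hypothetical stand-in for the provider response type.
#[derive(Debug, PartialEq)]
struct Response(String);

// Sketch of the escalations-exhausted branch. Previously this path
// returned `current` unconditionally; the fix mirrors the
// token-budget-exhaustion branch and prefers the best response
// recorded so far, falling back to `current` if none was kept.
fn resolve_exhausted_escalation(
    best_seen: &mut Option<(Response, f64)>, // (response, quality score)
    current: Response,
) -> Response {
    best_seen.take().map_or(current, |(r, _)| r)
}

fn main() {
    // A higher-quality earlier response wins over the current one.
    let mut best = Some((Response("earlier, better".into()), 0.9));
    let out = resolve_exhausted_escalation(&mut best, Response("current".into()));
    assert_eq!(out, Response("earlier, better".into()));
    assert!(best.is_none()); // take() consumed the stored response

    // With no earlier response recorded, fall back to `current`.
    let mut empty: Option<(Response, f64)> = None;
    let out = resolve_exhausted_escalation(&mut empty, Response("current".into()));
    assert_eq!(out, Response("current".into()));
}
```

`map_or` keeps the fallback explicit: the first argument is the value used when `best_seen` is `None`, matching the pre-existing budget-exhaustion branch so both exhaustion paths behave identically.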

Test plan

  • cargo nextest run --config-file .github/nextest.toml -p zeph-llm — all cascade tests pass, including the two new regression tests
  • cargo clippy --workspace --features full -- -D warnings — clean
  • cargo +nightly fmt --check — clean

Closes #1755

…scalate

In both cascade_chat and cascade_chat_stream, the escalations_remaining==0
branch returned the current provider's response even when should_escalate
was true. This ignored best_seen, potentially giving the caller a lower-
quality response than an earlier provider had produced.

Apply the same pattern already used for the budget-exhaustion branch:
return best_seen.take().map_or(current, |(r, _)| r).

Closes #1755.
@github-actions github-actions bot added labels: documentation (Improvements or additions to documentation), llm (zeph-llm crate (Ollama, Claude)), rust (Rust code changes), bug (Something isn't working), size/M (Medium PR (51-200 lines)) on Mar 14, 2026
@bug-ops bug-ops enabled auto-merge (squash) March 14, 2026 16:21
@bug-ops bug-ops merged commit c05dddf into main Mar 14, 2026
15 checks passed
@bug-ops bug-ops deleted the cascade-best-seen-escalation branch March 14, 2026 16:32

Development

Successfully merging this pull request may close these issues.

feat(llm): cascade_chat_stream escalations_remaining==0 path ignores best_seen

1 participant