openai-codex/gpt-5.4 forward-compat fallback still uses 272k context #37875
Description
Summary
openai-codex/gpt-5.4 was added in #36590, but the current forward-compat path still resolves it with the legacy Codex context window.
On current main, the openai-codex/gpt-5.4 fallback clones gpt-5.3-codex / gpt-5.2-codex templates without overriding contextWindow, so OpenClaw continues to treat it as a ~272k model locally.
Repro
- On current main, `openclaw models list` shows `openai-codex/gpt-5.4` at roughly 266k context.
- `src/agents/model-compat.test.ts` currently expects `resolveForwardCompatModel("openai-codex", "gpt-5.4", ...)` to return `contextWindow = 272_000`.
- In `src/agents/model-forward-compat.ts`, the `openai-codex/gpt-5.4` forward-compat branch inherits template metadata instead of patching the GPT-5.4 window explicitly.
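A minimal sketch of the inheritance pattern described above; the names, shapes, and template values here are illustrative assumptions, not the actual OpenClaw source:

```typescript
// Hypothetical model-metadata shape for illustration only.
interface ModelInfo {
  id: string;
  contextWindow: number;
  maxTokens: number;
}

// Legacy Codex template that the fallback clones (values assumed).
const codexTemplate: ModelInfo = {
  id: "openai-codex/gpt-5.3-codex",
  contextWindow: 272_000,
  maxTokens: 128_000,
};

// Current behavior: the forward-compat path clones the template and only
// rewrites the id, so gpt-5.4 inherits the stale 272k context window.
function resolveForwardCompatModel(template: ModelInfo, newId: string): ModelInfo {
  return { ...template, id: newId };
}

const gpt54 = resolveForwardCompatModel(codexTemplate, "openai-codex/gpt-5.4");
console.log(gpt54.contextWindow); // 272000 — stale Codex window carried over
```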
Impact
OpenClaw's local context accounting, model list output, and compaction decisions still use the old Codex window for openai-codex/gpt-5.4.
That can trigger unnecessary local compaction / retry behavior long before the backend request is actually at the GPT-5.4 limit.
Expected
openai-codex/gpt-5.4 should resolve with the GPT-5.4 window and max output budget instead of inheriting stale template values.
Proposed fix
When building the openai-codex/gpt-5.4 forward-compat model, explicitly override:
- `contextWindow = 1_050_000`
- `maxTokens = 128_000`
and update the existing forward-compat/list tests accordingly.
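A sketch of what the override could look like; this is a hypothetical shape, not the actual code in `src/agents/model-forward-compat.ts`, and the template values are assumptions:

```typescript
// Hypothetical model-metadata shape for illustration only.
interface ModelInfo {
  id: string;
  contextWindow: number;
  maxTokens: number;
}

// Legacy Codex template the fallback starts from (values assumed).
const codexTemplate: ModelInfo = {
  id: "openai-codex/gpt-5.2-codex",
  contextWindow: 272_000,
  maxTokens: 128_000,
};

// Proposed behavior: clone the template, then patch the GPT-5.4 limits
// explicitly instead of inheriting the stale Codex values.
const gpt54: ModelInfo = {
  ...codexTemplate,
  id: "openai-codex/gpt-5.4",
  contextWindow: 1_050_000,
  maxTokens: 128_000,
};

console.log(gpt54.contextWindow); // 1050000
```

With an explicit override, the existing forward-compat and list tests would assert `contextWindow === 1_050_000` for this model rather than the inherited `272_000`.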