openai-codex/gpt-5.4 forward-compat fallback still uses 272k context #37875

@yuweuii

Description

Summary

openai-codex/gpt-5.4 was added in #36590, but the current forward-compat path still resolves it with the legacy Codex context window.

On current main, the openai-codex/gpt-5.4 fallback clones gpt-5.3-codex / gpt-5.2-codex templates without overriding contextWindow, so OpenClaw continues to treat it as a ~272k model locally.

Repro

  • On current main, openclaw models list shows openai-codex/gpt-5.4 with the legacy ~272k context.
  • src/agents/model-compat.test.ts currently expects resolveForwardCompatModel("openai-codex", "gpt-5.4", ...) to return contextWindow = 272_000.
  • In src/agents/model-forward-compat.ts, the openai-codex/gpt-5.4 forward-compat branch inherits template metadata instead of patching the GPT-5.4 window explicitly.
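To make the bug concrete, here is a minimal sketch of the inheritance behavior the repro describes. The `ModelInfo` shape and the body of `resolveForwardCompatModel` are illustrative assumptions, not the actual OpenClaw source; only the function name and the 272_000 value come from the issue.

```typescript
// Hypothetical sketch of the current forward-compat path: the gpt-5.4
// branch spreads a legacy Codex template without patching contextWindow.
interface ModelInfo {
  id: string;
  contextWindow: number;
  maxTokens: number;
}

const templates: Record<string, ModelInfo> = {
  "gpt-5.3-codex": { id: "gpt-5.3-codex", contextWindow: 272_000, maxTokens: 128_000 },
};

function resolveForwardCompatModel(provider: string, model: string): ModelInfo {
  if (provider === "openai-codex" && model === "gpt-5.4") {
    // Clones the template as-is, so the stale 272k window is inherited.
    return { ...templates["gpt-5.3-codex"], id: model };
  }
  throw new Error(`unknown forward-compat model ${provider}/${model}`);
}

const resolved = resolveForwardCompatModel("openai-codex", "gpt-5.4");
console.log(resolved.contextWindow); // 272000 — the legacy window, not GPT-5.4's
```

This matches what the current test in src/agents/model-compat.test.ts asserts, which is why the test passes while the metadata is wrong.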

Impact

OpenClaw's local context accounting, model list output, and compaction decisions still use the old Codex window for openai-codex/gpt-5.4.

That can trigger unnecessary local compaction / retry behavior long before the backend request is actually at the GPT-5.4 limit.
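Rough arithmetic on the gap, assuming (purely for illustration) a compaction trigger at 90% of the locally believed window — the actual OpenClaw threshold may differ:

```typescript
// Illustrative only: how early compaction fires with the stale window.
const staleWindow = 272_000;    // what OpenClaw currently believes
const actualWindow = 1_050_000; // the GPT-5.4 window per this issue
const threshold = 0.9;          // assumed compaction trigger point

const compactsAt = Math.floor(staleWindow * threshold);
const couldRunTo = Math.floor(actualWindow * threshold);
console.log(compactsAt, couldRunTo); // 244800 945000
```

Under that assumption, sessions start compacting around 245k tokens when the backend could comfortably accept several times that.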

Expected

openai-codex/gpt-5.4 should resolve with the GPT-5.4 window and max output budget instead of inheriting stale template values.

Proposed fix

When building the openai-codex/gpt-5.4 forward-compat model, explicitly override:

  • contextWindow = 1_050_000
  • maxTokens = 128_000

and update the existing forward-compat/list tests accordingly.
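A sketch of what the override could look like in the forward-compat branch. The `ModelInfo` shape and the template variable are hypothetical stand-ins for whatever model-forward-compat.ts actually uses; the two numeric values are the ones proposed above.

```typescript
// Hypothetical shape of the fix: clone the template but pin the
// GPT-5.4 window and output budget explicitly.
interface ModelInfo {
  id: string;
  contextWindow: number;
  maxTokens: number;
}

const template: ModelInfo = {
  id: "gpt-5.3-codex",
  contextWindow: 272_000,
  maxTokens: 128_000,
};

const gpt54: ModelInfo = {
  ...template,
  id: "gpt-5.4",
  contextWindow: 1_050_000, // no longer inherited from the Codex template
  maxTokens: 128_000,
};

console.log(gpt54.contextWindow, gpt54.maxTokens); // 1050000 128000
```

The corresponding test update would then expect resolveForwardCompatModel("openai-codex", "gpt-5.4", ...) to return contextWindow = 1_050_000 rather than 272_000.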
