
openai-codex/gpt-5.4 is configurable but not actually supported in runtime (missing/Unknown model) #37623

@mane81

Description


Summary

On OpenClaw 2026.3.2, openai-codex/gpt-5.4 can be added to the config and appears in the models list, but it is still treated as missing and fails at runtime with Unknown model / HTTP 404 errors.

This makes it look like GPT-5.4 via openai-codex is supported when it is not yet wired through the runtime.

Environment

  • OpenClaw version: 2026.3.2
  • Install method: pnpm
  • Auth: openai-codex OAuth profile present and valid
  • Platform: macOS

Reproduction

  1. Configure the main agent or default model as:
    • openai-codex/gpt-5.4
    • or openai-codex/gpt-5.4-codex
  2. Restart gateway/node.
  3. Run:
    • openclaw models list
    • openclaw models status --json
    • start a fresh main session / normal agent turn

Actual behavior

  • openclaw models list shows:
    • openai-codex/gpt-5.4 ... configured,missing
    • openai-codex/gpt-5.4-codex ... configured,missing
  • runtime errors include:
    • FailoverError: Unknown model: openai-codex/gpt-5.4
    • FailoverError: HTTP 404: 404 page not found

Expected behavior

One of these should be true:

  1. openai-codex/gpt-5.4 is fully supported in runtime/catalog/forward-compat and works.
  2. Or OpenClaw should reject it early and clearly at validation time, instead of allowing it into the config and failing later at runtime.

Findings

This looks deeper than a simple allow-list issue.

The installed runtime still appears hard-wired around gpt-5.3-codex for openai-codex:

  • dist/model-picker-CGU6hX_z.js
    • OPENAI_CODEX_DEFAULT_MODEL = "openai-codex/gpt-5.3-codex"
  • dist/model-ZurrFOi9.js
    • Codex forward-compat/fallback logic is centered on gpt-5.3-codex
  • dist/model-catalog-qZGHxvcI.js
    • no equivalent generic handling for gpt-5.4

So the config layer accepts the model, but runtime/catalog resolution does not fully support it.
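The gap described above can be sketched roughly as follows. This is a minimal illustration, not OpenClaw's actual code: the function and constant names are hypothetical, and only the model ids come from the report.

```javascript
// Hypothetical sketch of the two-layer mismatch: the config layer accepts any
// syntactically valid "provider/model" id, while runtime resolution only knows
// ids in a hard-wired catalog (centered on gpt-5.3-codex in the installed dist).
const RUNTIME_CATALOG = new Set([
  "openai-codex/gpt-5.3-codex", // the only openai-codex entry wired through
]);

function configAccepts(modelId) {
  // Config layer: only checks the id's shape, so gpt-5.4 gets in.
  return /^[\w-]+\/[\w.-]+$/.test(modelId);
}

function runtimeResolves(modelId) {
  // Runtime layer: must find the id in the catalog, or the call fails later.
  return RUNTIME_CATALOG.has(modelId);
}

const id = "openai-codex/gpt-5.4";
console.log(configAccepts(id));   // true  -> shows up as "configured"
console.log(runtimeResolves(id)); // false -> "missing", Unknown model at runtime
```

The model ends up "configured,missing" exactly because the two checks disagree.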

Suggested fix

  • Add proper runtime/catalog/forward-compat support for openai-codex/gpt-5.4 (and possibly gpt-5.4-codex if that is the intended canonical id), or
  • fail validation early with a clear error if openai-codex/gpt-5.4 is not supported yet.
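For the second option, a fail-fast check could look like the sketch below. The names are illustrative assumptions, not OpenClaw APIs; the point is only that an unsupported id should be rejected at config load rather than surfacing as an HTTP 404 mid-session.

```javascript
// Hypothetical fail-fast validation: reject unsupported model ids at config
// load time with an actionable error, instead of deferring to a runtime 404.
const SUPPORTED_MODELS = new Set(["openai-codex/gpt-5.3-codex"]);

function validateModelConfig(modelId) {
  if (!SUPPORTED_MODELS.has(modelId)) {
    throw new Error(
      `Model "${modelId}" is not supported by this runtime. ` +
      `Supported models: ${[...SUPPORTED_MODELS].join(", ")}`
    );
  }
  return modelId;
}
```

With a check like this, configuring openai-codex/gpt-5.4 would fail immediately with a clear message instead of appearing to work until the first agent turn.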

Notes

I verified that falling back to openai-codex/gpt-5.3-codex restores normal operation, which makes this look specifically like missing GPT-5.4 integration rather than broken OAuth/auth.
