openai-codex/gpt-5.4 can resolve with api undefined when custom models.providers.openai-codex config shadows built-in model metadata #39682

@scrubbbie

Description

Summary

On OpenClaw 2026.3.7, openai-codex/gpt-5.4 can resolve to a model object with api: undefined when a custom models.providers.openai-codex block in ~/.openclaw/openclaw.json defines the model but omits api.

In my case this caused the gateway to crash-loop on startup when cron replayed missed isolated jobs, with:

Unhandled promise rejection: Error: No API provider registered for api: undefined

Environment

  • OpenClaw: 2026.3.7
  • Install type: non-Docker user install
  • Gateway launched via user systemd service
  • Provider: openai-codex
  • Model: gpt-5.4

Minimal config shape that triggers it

This custom provider block was present in ~/.openclaw/openclaw.json:

{
  "models": {
    "providers": {
      "openai-codex": {
        "baseUrl": "https://chatgpt.com/backend-api",
        "models": [
          {
            "id": "gpt-5.3-codex",
            "name": "gpt-5.3-codex"
          },
          {
            "id": "gpt-5.4",
            "name": "gpt-5.4"
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "openai-codex/gpt-5.4"
      }
    }
  }
}

Note that api is omitted from the custom provider block.

Repro

  1. Configure a custom models.providers.openai-codex block like the above, without api.
  2. Set agents.defaults.model.primary to openai-codex/gpt-5.4.
  3. Run an isolated cron agentTurn job, or restart the gateway so missed cron jobs replay.
  4. The gateway eventually crashes with:
No API provider registered for api: undefined

Expected

One of these should happen:

  • the built-in openai-codex/gpt-5.4 metadata should still win or be merged, preserving api: openai-codex-responses
  • or the config should be rejected/validated with a clear error because the custom provider/model definition is incomplete

Actual

The custom models.providers.openai-codex definition appears to shadow the built-in gpt-5.4 model metadata. That leaves gpt-5.4 with no resolved API, which later crashes in the streaming path with api: undefined.

Likely root cause

From local inspection of the built bundle in 2026.3.7:

  • gpt-5.4 is known to the built-in catalog / forward-compat path for openai-codex
  • but buildInlineProviderModels() creates inline models with:
    • api: model.api ?? entry.api
  • if the custom provider block omits both, the inline model has no api
  • and that custom provider/model path seems to take precedence over the built-in forward-compat model for openai-codex/gpt-5.4

So this looks like a precedence/merge issue between custom provider config and built-in model metadata.
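The fall-through can be sketched roughly as follows. Only the name buildInlineProviderModels() and the api: model.api ?? entry.api line come from inspecting the bundle; the surrounding types are illustrative assumptions, not the actual OpenClaw source:

```typescript
// Sketch of the suspected fall-through (simplified, assumed types).
interface InlineModel { id: string; name: string; api?: string }
interface ProviderEntry { baseUrl: string; api?: string; models: InlineModel[] }

function buildInlineProviderModels(entry: ProviderEntry) {
  return entry.models.map((model) => ({
    id: model.id,
    name: model.name,
    // If both the model and the provider entry omit `api`, `??` has
    // nothing to fall back to, so the inline model carries
    // api: undefined instead of the built-in "openai-codex-responses".
    api: model.api ?? entry.api,
  }));
}

// The config shape from this report, with `api` omitted everywhere:
const entry: ProviderEntry = {
  baseUrl: "https://chatgpt.com/backend-api",
  models: [{ id: "gpt-5.4", name: "gpt-5.4" }],
};

console.log(buildInlineProviderModels(entry)[0].api); // undefined
```

Since this inline model then shadows the built-in forward-compat entry, the undefined api only surfaces later, when the streaming path looks up the API provider.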

Local workaround

Adding this fixed it immediately for me:

"models": {
  "providers": {
    "openai-codex": {
      "api": "openai-codex-responses",
      "baseUrl": "https://chatgpt.com/backend-api",
      "models": [
        { "id": "gpt-5.4", "name": "gpt-5.4" }
      ]
    }
  }
}

After adding api: "openai-codex-responses", the gateway came up cleanly on openai-codex/gpt-5.4 and stopped crashing.

Suggestion

It would help if OpenClaw either:

  • merged built-in model metadata into custom provider models when provider/model matches a known built-in model, or
  • validated models.providers.* entries so required transport fields like api cannot be omitted silently for providers that need them
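Either option could look roughly like the sketch below. This is not the actual OpenClaw code: BUILTIN_MODELS and resolveInlineModel are hypothetical names standing in for whatever the real catalog and resolution path are called:

```typescript
// Hypothetical sketch of the two suggested fixes; names are illustrative.
const BUILTIN_MODELS: Record<string, { api: string }> = {
  "openai-codex/gpt-5.4": { api: "openai-codex-responses" },
};

function resolveInlineModel(
  providerId: string,
  model: { id: string; name: string; api?: string },
  entryApi?: string,
) {
  const builtin = BUILTIN_MODELS[`${providerId}/${model.id}`];
  // Fix 1: merge built-in metadata when the custom entry omits `api`.
  const api = model.api ?? entryApi ?? builtin?.api;
  // Fix 2: if there is still no `api`, reject the config with a clear
  // error instead of letting api: undefined reach the streaming path.
  if (api === undefined) {
    throw new Error(
      `models.providers.${providerId}: model "${model.id}" omits "api" and ` +
        `has no built-in fallback; set "api" on the provider or the model`,
    );
  }
  return { ...model, api };
}
```

With this shape, the config in this report would still resolve gpt-5.4 to api: "openai-codex-responses", and a genuinely unknown model without api would fail fast at config load rather than crash-looping the gateway on cron replay.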
