
fix(agents): resolve gpt-5.4 crash when custom openai-codex config omits api #39753

Merged
steipete merged 2 commits into openclaw:main from justinhuangcode:fix/codex-inline-model-api-fallback
Mar 8, 2026

Conversation

@justinhuangcode
Contributor

@justinhuangcode justinhuangcode commented Mar 8, 2026

Fixes #39682

Summary

When you configure `openai-codex` with a custom models list but don't set `api`:

```yaml
models:
  providers:
    openai-codex:
      models:
        - id: gpt-5.4
```

`buildInlineProviderModels` produces `api: undefined`, and the inline match returns it before forward-compat gets a chance to fill in `openai-codex-responses`. This crashes the agent.

The fix is small: only return the inline match early when it actually has an `api`. Otherwise let it fall through to the forward-compat path, which already handles user config (`baseUrl`, `headers`, etc.) via `applyConfiguredProviderOverrides`.
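The guarded early-return can be sketched in isolation. Only the `inlineMatch.api` check mirrors the actual one-line fix; the interfaces and the stand-in forward-compat resolver below are assumptions made so the snippet runs on its own, not the real code from `resolveModelWithRegistry`:

```typescript
// Hypothetical model shape; the real type lives in the agents runtime.
interface ResolvedModel {
  id: string;
  provider: string;
  api?: string;
}

// Stand-in for the forward-compat resolver, which knows the built-in
// default api for openai-codex models.
function resolveForwardCompat(provider: string, id: string): ResolvedModel {
  return { id, provider, api: "openai-codex-responses" };
}

function resolveModelSketch(
  inlineMatch: ResolvedModel | undefined,
  provider: string,
  id: string,
): ResolvedModel {
  // Before the fix, a bare `if (inlineMatch) return inlineMatch;` could
  // leak api: undefined to the caller. After the fix, the early return
  // only fires when the inline entry actually carries an api.
  if (inlineMatch && inlineMatch.api) {
    return inlineMatch;
  }
  return resolveForwardCompat(provider, id);
}

// Inline entry built from a config that omits `api`:
const incomplete: ResolvedModel = { id: "gpt-5.4", provider: "openai-codex" };
console.log(resolveModelSketch(incomplete, "openai-codex", "gpt-5.4").api);
// -> "openai-codex-responses"
```

With an inline entry that does set `api`, the early return still wins, so fully specified custom models behave exactly as before.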

Testing

Added a regression test for the config above; the 43 existing tests are unaffected.

When a user configures `models.providers.openai-codex` with a models
array but omits the `api` field, `buildInlineProviderModels` produces
an entry with `api: undefined`.  The inline-match early return then
hands this incomplete model straight to the caller, skipping the
forward-compat resolver that would supply the correct
`openai-codex-responses` api — causing a crash loop.

Let the inline match fall through to forward-compat when `api` is
absent so the resolver chain can fill it in.

Fixes openclaw#39682
@openclaw-barnacle bot added the `agents` (Agent runtime and tooling) and `size: XS` labels on Mar 8, 2026
@justinhuangcode changed the title from "fix(agents): let forward-compat resolve api when inline model omits it" to "fix(agents): resolve gpt-5.4 crash when custom openai-codex config omits api" on Mar 8, 2026
@greptile-apps
Contributor

greptile-apps bot commented Mar 8, 2026

Greptile Summary

This PR fixes a crash loop for openai-codex/gpt-5.4 (and similar models) caused by resolveModelWithRegistry returning an incomplete model with api: undefined when an inline provider model entry omits the api field. The fix guards the inline-match early-return on inlineMatch.api being truthy, allowing the existing forward-compat resolver to run and supply the correct api value (e.g. "openai-codex-responses"). User-supplied baseUrl and headers are preserved because the fall-through path already calls applyConfiguredProviderOverrides with the full providerConfig.

  • The one-line guard in resolveModelWithRegistry (if (inlineMatch.api)) is minimal, targeted, and correct.
  • Tracing through applyConfiguredProviderOverrides confirms all user-specified metadata from the inline entry (baseUrl, headers, reasoning, contextWindow, maxTokens) is still applied via the providerConfig / configuredModel lookup — no data loss on the fallthrough.
  • A regression test for the exact bug scenario was added and follows established test patterns.
  • Minor: the new test omits an assertion for baseUrl preservation even though the PR description highlights it as an important invariant of the fix.
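The no-data-loss claim in the second bullet can be illustrated with a small merge sketch. The real `applyConfiguredProviderOverrides` signature is not shown in this PR, so the shapes and the "configured fields win" merge rule below are assumptions for illustration:

```typescript
// Hypothetical model/config shapes for illustration only.
interface ModelEntry {
  id: string;
  provider: string;
  api?: string;
  baseUrl?: string;
  headers?: Record<string, string>;
}

// Assumed merge rule: user-specified fields override the resolved
// defaults, while unset fields keep whatever forward-compat supplied.
function applyOverridesSketch(
  resolved: ModelEntry,
  configured?: Partial<ModelEntry>,
): ModelEntry {
  if (!configured) return resolved;
  return {
    ...resolved,
    ...Object.fromEntries(
      Object.entries(configured).filter(([, v]) => v !== undefined),
    ),
  };
}

const fromForwardCompat: ModelEntry = {
  id: "gpt-5.4",
  provider: "openai-codex",
  api: "openai-codex-responses",
};
const merged = applyOverridesSketch(fromForwardCompat, {
  baseUrl: "https://custom.example.com",
});
console.log(merged.api, merged.baseUrl);
// -> openai-codex-responses https://custom.example.com
```

Under this rule the fallthrough path keeps both the resolver-supplied `api` and the user-configured `baseUrl`, which is the invariant the review comment below asks the test to assert.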

Confidence Score: 5/5

  • This PR is safe to merge — the change is a minimal, targeted guard that corrects a crash-inducing api: undefined return and has no negative side effects.
  • The fix is a single conditional around an existing early-return, well-understood in context, covered by a direct regression test, and the fallthrough path already handles all user-supplied overrides correctly through applyConfiguredProviderOverrides. No existing tests are broken and the logic change is easy to reason about end-to-end.
  • No files require special attention.

Last reviewed commit: 7be385c

Comment on lines +658 to +664
```ts
const result = resolveModel("openai-codex", "gpt-5.4", "/tmp/agent", cfg);
expect(result.error).toBeUndefined();
expect(result.model).toMatchObject({
  api: "openai-codex-responses",
  id: "gpt-5.4",
  provider: "openai-codex",
});
```
Contributor


**Test missing `baseUrl` preservation assertion**

The PR description specifically calls out that the fix "preserves any user-supplied baseUrl / headers". The test configures baseUrl: "https://custom.example.com" but never asserts it appears on the resolved model. Adding this assertion would guard against a future regression where the fall-through path drops the user-configured baseUrl.

Suggested change

```diff
 const result = resolveModel("openai-codex", "gpt-5.4", "/tmp/agent", cfg);
 expect(result.error).toBeUndefined();
 expect(result.model).toMatchObject({
   api: "openai-codex-responses",
   id: "gpt-5.4",
   provider: "openai-codex",
+  baseUrl: "https://custom.example.com",
 });
```

@justinhuangcode force-pushed the fix/codex-inline-model-api-fallback branch from 507cd3c to cb07d15 on March 8, 2026 at 11:29
@steipete steipete merged commit 6e086a5 into openclaw:main Mar 8, 2026
28 checks passed
@steipete
Contributor

steipete commented Mar 8, 2026

Landed via temp rebase onto main.

  • Gate: pnpm check && pnpm build && pnpm test
  • Land commit: 8a91af2
  • Merge commit: 6e086a5

Thanks @justinhuangcode!


Labels

agents (Agent runtime and tooling) · size: XS

Development

Successfully merging this pull request may close these issues.

openai-codex/gpt-5.4 can resolve with api undefined when custom models.providers.openai-codex config shadows built-in model metadata

2 participants