fix(agents): resolve gpt-5.4 crash when custom openai-codex config omits api (#39753)
Conversation
When a user configures `models.providers.openai-codex` with a models array but omits the `api` field, `buildInlineProviderModels` produces an entry with `api: undefined`. The inline-match early return then hands this incomplete model straight to the caller, skipping the forward-compat resolver that would supply the correct `openai-codex-responses` api — causing a crash loop. Let the inline match fall through to forward-compat when `api` is absent so the resolver chain can fill it in. Fixes openclaw#39682
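The guard described above can be sketched as follows. This is a hypothetical simplification (the interface shape, function name `resolveWithFallThrough`, and merge step are assumptions for illustration; the real resolver in `model.ts` differs), but it shows the core change: only short-circuit on the inline match when its `api` is present.

```typescript
// Assumed minimal shape of a resolved model entry (illustrative only).
interface InlineModel {
  id: string;
  provider: string;
  api?: string;
  baseUrl?: string;
}

// Hypothetical sketch of the fix. Before: `if (inlineMatch) return inlineMatch;`
// returned the incomplete entry built by buildInlineProviderModels.
// After: an entry with `api: undefined` falls through to forward-compat.
function resolveWithFallThrough(
  inlineMatch: InlineModel | undefined,
  forwardCompat: () => InlineModel,
): InlineModel {
  if (inlineMatch && inlineMatch.api !== undefined) {
    return inlineMatch; // inline entry is complete; safe to return early
  }
  // Forward-compat supplies the correct api ("openai-codex-responses")
  // while user overrides such as baseUrl are preserved (simplified merge
  // here; the real code routes through applyConfiguredProviderOverrides).
  const resolved = forwardCompat();
  return inlineMatch?.baseUrl
    ? { ...resolved, baseUrl: inlineMatch.baseUrl }
    : resolved;
}
```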
Greptile Summary: This PR fixes a crash loop that occurs when a custom `openai-codex` provider config supplies a models list but omits the `api` field.
Confidence Score: 5/5
Last reviewed commit: 7be385c
```ts
const result = resolveModel("openai-codex", "gpt-5.4", "/tmp/agent", cfg);
expect(result.error).toBeUndefined();
expect(result.model).toMatchObject({
  api: "openai-codex-responses",
  id: "gpt-5.4",
  provider: "openai-codex",
});
```
Test missing baseUrl preservation assertion
The PR description specifically calls out that the fix "preserves any user-supplied baseUrl / headers". The test configures baseUrl: "https://custom.example.com" but never asserts it appears on the resolved model. Adding this assertion would guard against a future regression where the fall-through path drops the user-configured baseUrl.
```diff
  const result = resolveModel("openai-codex", "gpt-5.4", "/tmp/agent", cfg);
  expect(result.error).toBeUndefined();
  expect(result.model).toMatchObject({
    api: "openai-codex-responses",
    id: "gpt-5.4",
    provider: "openai-codex",
+   baseUrl: "https://custom.example.com",
  });
```
Landed via temp rebase onto main. Thanks @justinhuangcode!
Fixes #39682
Summary
When you configure `openai-codex` with a custom models list but don't set `api`: `buildInlineProviderModels` produces `api: undefined`, and the inline match returns it before forward-compat gets a chance to fill in `openai-codex-responses`. This crashes the agent.

The fix is small: only return the inline match early when it actually has an `api`. Otherwise let it fall through to the forward-compat path, which already handles user config (`baseUrl`, headers, etc.) via `applyConfiguredProviderOverrides`.

Testing
Added a regression test for the config above. Existing 43 tests unaffected.
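For reference, a config along these lines reproduces the crash before the fix. This is a hypothetical minimal reproduction: the nesting under `models.providers` and the `baseUrl` value come from the PR description and review comment, but the exact key names of the real config schema are assumed.

```json
{
  "models": {
    "providers": {
      "openai-codex": {
        "baseUrl": "https://custom.example.com",
        "models": [{ "id": "gpt-5.4" }]
      }
    }
  }
}
```

Note that `api` is omitted; previously this produced an inline entry with `api: undefined` that was returned as-is, while after the fix the resolver fills in `openai-codex-responses` and keeps the user-supplied `baseUrl`.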