
fix(config): add azure-openai-responses to MODEL_APIS enum#52053

Closed
Cypherm wants to merge 7 commits into openclaw:main from Cypherm:fix/config-azure-openai-responses-api-enum

Conversation

@Cypherm (Contributor) commented on Mar 22, 2026

Summary

Problem: models.providers.*.api config validation rejects "azure-openai-responses" because it's missing from the MODEL_APIS enum, even though pi-ai fully registers it as an API provider with correct Azure-specific auth handling (AzureOpenAI client, api-key header, api-version query param).

Why it matters: Models not in the built-in catalog (e.g., newer Azure models like gpt-5.4-mini) cannot be resolved through config. Setting api: "openai-responses" as a workaround routes to the plain OpenAI SDK client, causing wrong auth headers and HTTP 404 from Azure endpoints. This also breaks heartbeat and subagent sessions silently (0 input/0 output tokens).

What changed:

  • Added "azure-openai-responses" to the MODEL_APIS array in src/config/types.models.ts

What did NOT change:

  • Zod schema (zod-schema.core.ts) automatically picks up the change via z.enum(MODEL_APIS) — no manual update needed
  • ModelApi type automatically widens via (typeof MODEL_APIS)[number]
  • No runtime behavior change — pi-ai already registers and handles azure-openai-responses correctly
  • No changes to OPENAI_MODEL_APIS in transcript-policy.ts, OPENAI_RESPONSES_PROVIDERS in openai-stream-wrappers.ts, or any provider implementation
  • Config help text and labels are generic enough to cover the new value without changes
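The propagation claimed above can be sketched in miniature. This is an illustrative stand-in, not the repo's code: the array contents mirror the enum values quoted in this PR's baseline diff, and `isModelApi` stands in for the `z.enum(MODEL_APIS)` Zod check.

```typescript
// Minimal stand-in for src/config/types.models.ts. Adding one string to the
// const array is the entire change; everything below it updates for free.
const MODEL_APIS = [
  "openai-completions",
  "openai-responses",
  "openai-codex-responses",
  "azure-openai-responses", // the one-line addition from this PR
  "anthropic-messages",
  "google-generative-ai",
  "github-copilot",
  "bedrock-converse-stream",
  "ollama",
] as const;

// Indexed-access type widens automatically:
// the union now includes "azure-openai-responses" with no manual edit.
type ModelApi = (typeof MODEL_APIS)[number];

// Hypothetical guard standing in for z.enum(MODEL_APIS): any validator
// derived from the array picks up the new member automatically.
function isModelApi(value: string): value is ModelApi {
  return (MODEL_APIS as readonly string[]).includes(value);
}

console.log(isModelApi("azure-openai-responses")); // true after the addition
console.log(isModelApi("azure-openai-completions")); // false: not in the enum
```

This is why the Zod schema and the `ModelApi` type need no separate updates: both are derived from the same `as const` array.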

Change Type

  • Bug fix (non-breaking)

Scope

Gateway (config schema validation)

Linked Issue

Closes #51735

Security Impact

  • New permissions requested: none
  • Secrets handling changes: none
  • New network calls: none
  • New command/tool execution: none
  • Data access changes: none

Human Verification

I personally verified:

  • pnpm tsgo passes — ModelApi type correctly includes "azure-openai-responses"
  • pnpm test -- src/config/ — 106 test files, 843 tests pass
  • pnpm format:fix — no formatting issues
  • Verified z.enum(MODEL_APIS) in zod-schema.core.ts automatically includes the new value (line 182)

Evidence

Test Files  106 passed (106)
     Tests  843 passed (843)
  Duration  26.74s

git diff --stat upstream/main...HEAD: 1 file changed, 1 insertion(+)

What I Did NOT Verify

  • Not verified: end-to-end with a real Azure OpenAI endpoint and a model not in the built-in catalog
  • Not verified: heartbeat/subagent sessions with azure-openai-responses config (tested validation layer only)

Failure Recovery

If this breaks in production:

  • Detection: Config validation errors for azure-openai-responses would reappear
  • Rollback: Remove the one-line addition from MODEL_APIS
  • Blast radius: Only affects users configuring Azure OpenAI Responses API in openclaw.json — zero impact on users who don't use this provider

Generated with Claude Code

@greptile-apps bot (Contributor) commented on Mar 22, 2026

Greptile Summary

This PR fixes a config validation gap by adding "azure-openai-responses" to the MODEL_APIS enum in src/config/types.models.ts. Without this entry, setting api: "azure-openai-responses" in openclaw.json was rejected by the Zod schema (z.enum(MODEL_APIS) at line 182 of zod-schema.core.ts), forcing users to use "openai-responses" as a workaround — which routes requests to the plain OpenAI SDK client and causes wrong auth headers and HTTP 404s against Azure endpoints.

  • The fix is minimal and correct: the ModelApi type ((typeof MODEL_APIS)[number]) and ModelApiSchema (z.enum(MODEL_APIS)) both update automatically, so no other schema changes are required.
  • pi-ai already fully registers and handles azure-openai-responses at the runtime level (visible in openai-stream-wrappers.ts OPENAI_RESPONSES_PROVIDERS), so this is purely a validation layer unlock.
  • The decision not to add "azure-openai-responses" to OPENAI_MODEL_APIS in transcript-policy.ts is correct: that set is only consulted as a fallback when no explicit provider is present, and azure-openai-responses is resolved at the plugin/provider layer by pi-ai when a provider is specified.
  • No other azure API variants (e.g., azure-openai-completions) appear in the codebase, confirming the PR is correctly scoped.

Confidence Score: 5/5

  • Safe to merge — minimal, correct, isolated change with clear validation evidence and zero blast radius for existing users.
  • Single-line enum addition with automatic downstream propagation through the type system and Zod schema. The underlying runtime support already exists in pi-ai. The PR description explains all intentionally-untouched files, and the reasoning is sound. 843 tests pass.
  • No files require special attention.

Last reviewed commit: "fix(config): add azu..."

@openclaw-barnacle bot added the docs (Improvements or additions to documentation) label on Mar 22, 2026
@Cypherm (Contributor, Author) commented on Mar 22, 2026

@jalehman — Config schema rejects azure-openai-responses for models.providers.*.api even though pi-ai registers it → added to MODEL_APIS enum. 1 file, +1 line. Closes #51735.

CI failures (channels, extension-fast (discord)) are pre-existing on main — same discord test regression in the latest main CI run.

@Cypherm force-pushed the fix/config-azure-openai-responses-api-enum branch from 7d7ac8c to 6735fe3 on March 23, 2026 00:21
@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 6735fe390c

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

{"recordType":"path","path":"models.providers","kind":"core","type":"object","required":false,"deprecated":false,"sensitive":false,"tags":["models"],"label":"Model Providers","help":"Provider map keyed by provider ID containing connection/auth settings and concrete model definitions. Use stable provider keys so references from agents and tooling remain portable across environments.","hasChildren":true}
{"recordType":"path","path":"models.providers.*","kind":"core","type":"object","required":false,"deprecated":false,"sensitive":false,"tags":[],"hasChildren":true}
{"recordType":"path","path":"models.providers.*.api","kind":"core","type":"string","required":false,"enumValues":["openai-completions","openai-responses","openai-codex-responses","anthropic-messages","google-generative-ai","github-copilot","bedrock-converse-stream","ollama"],"deprecated":false,"sensitive":false,"tags":["models"],"label":"Model Provider API Adapter","help":"Provider API adapter selection controlling request/response compatibility handling for model calls. Use the adapter that matches your upstream provider protocol to avoid feature mismatch.","hasChildren":false}
{"recordType":"path","path":"models.providers.*.api","kind":"core","type":"string","required":false,"enumValues":["openai-completions","openai-responses","openai-codex-responses","azure-openai-responses","anthropic-messages","google-generative-ai","github-copilot","bedrock-converse-stream","ollama"],"deprecated":false,"sensitive":false,"tags":["models"],"label":"Model Provider API Adapter","help":"Provider API adapter selection controlling request/response compatibility handling for model calls. Use the adapter that matches your upstream provider protocol to avoid feature mismatch.","hasChildren":false}

P1 Badge Regenerate served config schema before updating enum baseline

This baseline line advertises azure-openai-responses, but the runtime schema served by config.schema is still sourced from GENERATED_BASE_CONFIG_SCHEMA (src/config/schema.ts:408-414), and that generated file still omits this enum value for models.providers.*.api and models.providers.*.models.*.api (src/config/schema.base.generated.ts:1029-1038 and 1133-1142). The commit therefore introduces a schema/baseline mismatch: tooling that relies on config.schema.lookup will not see the newly documented option, and pnpm config:docs:check can flag drift.


Cypherm (Contributor, Author) replied:

Addressed in the follow-up commit (chore: regenerate base config schema after rebase). The generated file src/config/schema.base.generated.ts now includes azure-openai-responses in both models.providers.*.api and models.providers.*.models.*.api enum arrays.

"openai-completions",
"openai-responses",
"openai-codex-responses",
"azure-openai-responses",

P1 Badge Handle azure-openai-responses in Responses API guard paths

Allowing "azure-openai-responses" in MODEL_APIS makes this value reachable from user config, but multiple runtime guards still only treat "openai-responses"/"openai-codex-responses" as Responses APIs (for example sanitizeSessionHistory in src/agents/pi-embedded-runner/google.ts:566-583, stream wrapping in src/agents/pi-embedded-runner/run/attempt.ts:2320-2341, and OPENAI_RESPONSES_APIS in src/agents/pi-embedded-runner/openai-stream-wrappers.ts:11-159). Those guards apply transcript/payload downgrades required to avoid Responses API rejections (see src/agents/pi-embedded-helpers/openai.ts:85-209), so configs using the new enum can still fail on tool/reasoning replay flows.


Cypherm (Contributor, Author) replied:

Acknowledged. The Responses API guard paths (sanitizeSessionHistory, stream wrapping, etc.) are a pre-existing gap — they don't handle azure-openai-responses yet, but they also don't handle it on main today. This PR's scope is strictly the config validation layer (allowing the value through schema). The runtime guard paths are a separate concern tracked in the original issue discussion. Happy to address in a follow-up if maintainers want it bundled.
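The guard-list gap acknowledged here can be shown in miniature. A hedged sketch: the set contents are copied from the review comments above, but the function name and shape are illustrative, not the repo's actual code.

```typescript
// Illustrative only: mirrors the guard-list pattern the review describes.
// The real set is OPENAI_RESPONSES_APIS in
// src/agents/pi-embedded-runner/openai-stream-wrappers.ts.
const OPENAI_RESPONSES_APIS = new Set<string>([
  "openai-responses",
  "openai-codex-responses",
]);

// Transcript/payload downgrades run only when this returns true, so the
// newly valid "azure-openai-responses" value silently skips them.
function appliesResponsesDowngrades(api: string): boolean {
  return OPENAI_RESPONSES_APIS.has(api);
}

console.log(appliesResponsesDowngrades("openai-responses")); // true
console.log(appliesResponsesDowngrades("azure-openai-responses")); // false: the gap
```

If the guards were to be extended in a follow-up, the fix would presumably be one more entry in each such set; as noted, that gap also exists on main today.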

@Cypherm (Contributor, Author) commented on Mar 23, 2026

CI failure in check (tsgo TS1117) is unrelated to this PR:

  • Failed: tsgo reports TS1117: An object literal cannot have multiple properties with the same name in src/cli/acp-cli.option-collisions.test.ts, cron-cli.test.ts, devices-cli.test.ts, etc.
  • Our changes: exclusively in src/config/types.models.ts (+1 line) and src/config/schema.base.generated.ts (regen)
  • Same errors on main: confirmed identical TS1117 failures on main's latest check run
  • check-additional fails because it gates on check passing

Happy to rebase once upstream fixes the tsgo errors.

@Cypherm force-pushed the fix/config-azure-openai-responses-api-enum branch from 47f8f2e to f33cca1 on March 23, 2026 04:55
@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: f33cca1a65


"openai-completions",
"openai-responses",
"openai-codex-responses",
"azure-openai-responses",

P1 Badge Update Responses guard lists for azure-openai-responses

Adding "azure-openai-responses" to MODEL_APIS makes that value valid in user config, but the runtime still gates critical Responses handling on "openai-responses"/"openai-codex-responses" only (for example OPENAI_RESPONSES_APIS in src/agents/pi-embedded-runner/openai-stream-wrappers.ts:11, isOpenAIResponsesApi in src/agents/pi-embedded-runner/google.ts:566-567, and the downgrade wrapper branch in src/agents/pi-embedded-runner/run/attempt.ts:2320-2323). That means configs newly accepted by this commit can skip store/context and transcript downgrade logic used to prevent Responses API rejections, so Azure Responses runs can regress on tool/replay flows. Fresh evidence vs earlier review: this commit also updates src/config/schema.base.generated.ts to include the enum, so the value is now reachable through validated config paths.


Cypherm (Contributor, Author) replied:

Same concern as the prior reply (see comment above). This PR's scope is strictly the config validation layer — allowing azure-openai-responses through the schema. The runtime guard paths (OPENAI_RESPONSES_APIS, etc.) are a pre-existing gap that exists on main today and are tracked separately. Happy to address in a follow-up if maintainers want it bundled.

@Cypherm force-pushed the fix/config-azure-openai-responses-api-enum branch from f33cca1 to 4bbf189 on March 23, 2026 06:47
@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 4bbf189f86


"openai-completions",
"openai-responses",
"openai-codex-responses",
"azure-openai-responses",

P1 Badge Extend Responses guard paths for azure-openai-responses

Adding azure-openai-responses to MODEL_APIS makes models.providers.*.api/models.providers.*.models.*.api configs validate, but the runtime Responses safety logic still only matches openai-responses/openai-codex-responses (src/agents/pi-embedded-runner/openai-stream-wrappers.ts:11-159, src/agents/pi-embedded-runner/google.ts:566-583, src/agents/pi-embedded-runner/run/attempt.ts:2318-2338). With api: "azure-openai-responses", those downgrade/store-guard branches are skipped, so Azure Responses requests can miss the replay/payload sanitization that prevents multi-turn/tool-call rejections. Fresh evidence: the existing Azure regression coverage only asserts the store-force behavior for api: "openai-responses" (src/agents/pi-embedded-runner-extraparams.test.ts:1729-1741), which means this newly valid enum path is currently unprotected.


Cypherm (Contributor, Author) replied:

Same concern as the prior two replies (see above). This PR's scope is the config validation layer only. The runtime Responses guard paths are a pre-existing gap on main, not introduced by this PR.

@Cypherm force-pushed the fix/config-azure-openai-responses-api-enum branch 2 times, most recently from db1d427 to 65cc707 on March 23, 2026 09:23
@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 65cc7078ca


"openai-completions",
"openai-responses",
"openai-codex-responses",
"azure-openai-responses",

P1 Badge Update Responses guards for azure-openai-responses API

Adding "azure-openai-responses" to MODEL_APIS makes models.providers.*.api/models.providers.*.models.*.api accept this value, and applyConfiguredProviderOverrides propagates provider api to the runtime model (src/agents/pi-embedded-runner/model.ts:134), so this commit introduces a newly reachable runtime state where model.api === "azure-openai-responses". The OpenAI Responses safety wrappers still gate on openai-responses only (OPENAI_RESPONSES_APIS in src/agents/pi-embedded-runner/openai-stream-wrappers.ts), which means the store=true and context-management payload patching used to prevent Azure Responses multi-turn failures is skipped for this new config path.


Cypherm (Contributor, Author) replied:

Same concern as prior replies — see above.

@Cypherm force-pushed the fix/config-azure-openai-responses-api-enum branch 2 times, most recently from 2f1c3e9 to cd4634a on March 23, 2026 15:13
@Cypherm (Contributor, Author) commented on Mar 24, 2026

CI triage update: 3 remaining failures are upstream — same failures present on main.

Failing (upstream):

  • check (tsgo) — shared lint/type-check, not our files
  • check-additional — gates on check
  • checks (node, channels) — channel tests, unrelated to config schema

Our area all passes:

  • build-smoke — builds successfully
  • checks (node, test, 1/2) — main test suite shard 1
  • checks (node, test, 2/2) — main test suite shard 2
  • checks (node, extensions) — extension tests
  • checks (node, contracts) — contract tests
  • checks-windows — 5/5 shards pass
  • check-docs — docs scope clean

Our change: +1 line in src/config/types.models.ts + schema regen. All test suites covering config pass.

@Cypherm force-pushed the fix/config-azure-openai-responses-api-enum branch from cd4634a to 80df3a0 on March 24, 2026 08:12
@Cypherm force-pushed the fix/config-azure-openai-responses-api-enum branch from 1da4263 to 509b6eb on March 30, 2026 00:38
@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 509b6eb77d


"openai-completions",
"openai-responses",
"openai-codex-responses",
"azure-openai-responses",

P1 Badge Wire new API enum through model transport normalization

Adding "azure-openai-responses" to MODEL_APIS makes this config value valid, but the runtime transport whitelist still drops it (normalizeResolvedTransportApi in src/agents/pi-embedded-runner/model.ts:92-105). As a result, provider-level configs like models.providers.<id>.api: "azure-openai-responses" are silently coerced back to openai-responses in fallback/override paths (model.ts:322-334, 365-375, 528-543), so requests can route through the wrong OpenAI transport instead of Azure Responses. This means the new accepted enum value does not actually produce the intended runtime behavior for common provider-level configuration.


Cypherm (Contributor, Author) replied:

Same concern as prior replies — see above.
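The silent coercion Codex flags in this round would look roughly like this. A sketch under stated assumptions: the whitelist contents and fallback value are inferred from the review comment, not read from model.ts, and the helper body is hypothetical.

```typescript
// Hypothetical reconstruction of the whitelist behavior flagged at
// src/agents/pi-embedded-runner/model.ts:92-105; contents are assumptions.
const TRANSPORT_APIS = new Set<string>([
  "openai-completions",
  "openai-responses",
  "openai-codex-responses",
]);

// Unknown values fall back silently, so a schema-valid
// "azure-openai-responses" config would be coerced to "openai-responses"
// and routed through the plain OpenAI transport instead of Azure.
function normalizeResolvedTransportApi(api: string): string {
  return TRANSPORT_APIS.has(api) ? api : "openai-responses";
}

console.log(normalizeResolvedTransportApi("azure-openai-responses")); // "openai-responses"
console.log(normalizeResolvedTransportApi("openai-completions")); // "openai-completions"
```

The failure mode is the one the PR description already names: wrong auth headers and HTTP 404s from Azure endpoints, now reachable through validated config rather than a workaround.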

@Cypherm (Contributor, Author) commented on Mar 30, 2026

CI failures are unrelated to this PR:

Our changes: src/config/types.models.ts (MODEL_APIS enum), src/config/schema.base.generated.ts, and regenerated doc baselines. All config-scoped.

Checks covering our area — all pass:

  • check ✅ (full type check)
  • check-additional
  • check-docs
  • checks-fast-contracts-protocol
  • checks-node-test-1
  • security-fast

Failing checks — all in unrelated areas:

  • checks-fast-extensions: resetTelegramThreadBindingsForTests is not a function in extensions/matrix/src/registry-backed.contract.test.ts — upstream telegram plugin refactor
  • checks-node-test-2: install.test.ts daemon token mock assertion — upstream CLI changes
  • build-smoke: CLI startup memory check — infrastructure
  • checks-node-channels-*: test-parallel infrastructure failures in discord/message-utils — upstream
  • checks-windows-*: same infrastructure failures on Windows

Main is currently running 5 CI jobs simultaneously — active refactoring by maintainers is the source of these failures. Happy to rebase once main stabilizes.

@Cypherm force-pushed the fix/config-azure-openai-responses-api-enum branch from 509b6eb to cefa21c on March 31, 2026 03:53
@Cypherm (Contributor, Author) commented on Mar 31, 2026

Closing — the core change (adding azure-openai-responses to MODEL_APIS) was already merged upstream via #50851 by @kunalk16. After rebasing, this branch has no net diff vs main. Thanks!


Labels

docs (Improvements or additions to documentation), size: XS

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Config schema missing 'azure-openai-responses' in models.providers.*.api enum

1 participant