
New: Add support for Azure models (supports GPT-5.4 and more models) #39540

Open

sawyer-shi wants to merge 10 commits into openclaw:main from sawyer-shi:main

Conversation


@sawyer-shi sawyer-shi commented Mar 8, 2026

Summary

  • Problem: OpenClaw lacked first-class Azure OpenAI onboarding/auth support comparable to other providers; Azure host validation was too narrow (did not include *.cognitiveservices.azure.com); and there was no configurable API version for preview endpoints.
  • Why it matters: This made enterprise Azure deployments less reliable, especially in non-interactive/productized setups where manual config errors are costly.
  • What changed:
    • Added first-class Azure auth choice: azure-openai-api-key (interactive + non-interactive).
    • Added Azure flags: --azure-openai-base-url, --azure-openai-model-id, --azure-openai-api-version (default v1, optional preview override).
    • Expanded Azure host validation to support *.openai.azure.com, *.services.ai.azure.com, and *.cognitiveservices.azure.com.
    • Stored azureApiVersion in model params when non-default and forwarded it at runtime via stream options.
    • Split docs: added dedicated Azure provider page, kept OpenAI page as a pointer, and added Azure to providers index.
  • What did NOT change (scope boundary):
    • No behavior changes for non-Azure providers.
    • No new command/tool execution permissions.
    • No new secret storage mechanism (still uses existing auth profile + env/ref paths).

Change Type (select all)

  • Bug fix
  • Feature
  • Refactor
  • Docs
  • Security hardening
  • Chore/infra

Scope (select all touched areas)

  • Gateway / orchestration
  • Skills / tool execution
  • Auth / tokens
  • Memory / storage
  • Integrations
  • API / contracts
  • UI / DX
  • CI/CD / infra

Linked Issue/PR

  • Closes #xxxx
  • Related #xxxx

User-visible / Behavior Changes

  • New first-class Azure onboarding path:
    • openclaw onboard --auth-choice azure-openai-api-key
  • New Azure flags:
    • --azure-openai-base-url
    • --azure-openai-model-id
    • --azure-openai-api-version (defaults to v1)
  • In non-interactive mode, azure-openai-api-key now requires base URL + model ID and fails fast with a clear error if missing.
  • Azure endpoint validation now accepts *.cognitiveservices.azure.com.
  • Docs now have a dedicated Azure OpenAI page and provider index entry.
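
The non-interactive fail-fast rule above can be sketched roughly as follows (an illustrative sketch, not the PR's actual code; the flag-to-field mapping and function name are assumptions):

```typescript
// Illustrative sketch of the fail-fast check for non-interactive mode:
// the azure-openai-api-key auth choice requires both base URL and model ID.
interface AzureFlags {
  azureOpenaiBaseUrl?: string;
  azureOpenaiModelId?: string;
}

function validateNonInteractiveAzureFlags(flags: AzureFlags): void {
  const missing: string[] = [];
  if (!flags.azureOpenaiBaseUrl?.trim()) missing.push("--azure-openai-base-url");
  if (!flags.azureOpenaiModelId?.trim()) missing.push("--azure-openai-model-id");
  if (missing.length > 0) {
    // Fail fast with a clear error before any config is written.
    throw new Error(
      `azure-openai-api-key requires ${missing.join(" and ")} in non-interactive mode.`,
    );
  }
}
```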

Security Impact (required)

  • New permissions/capabilities? (Yes/No)
    • No
  • Secrets/tokens handling changed? (Yes/No)
    • Yes
  • New/changed network calls? (Yes/No)
    • Yes
  • Command/tool execution surface changed? (Yes/No)
    • No
  • Data access scope changed? (Yes/No)
    • No
  • If any Yes, explain risk + mitigation:
    • Added AZURE_OPENAI_API_KEY handling through existing secret paths (plaintext/ref), with no new plaintext leak surface.
    • Added Azure provider network path with host suffix validation + URL normalization to reduce misconfiguration and unintended endpoints.
    • Docs/examples use placeholders for base URLs and env vars to avoid leaking tenant/resource details.

Repro + Verification

Environment

  • OS: Windows
  • Runtime/container: Node 22 + pnpm
  • Model/provider: azure-openai-responses/gpt-5.4
  • Integration/channel (if any): N/A
  • Relevant config (redacted): AZURE_OPENAI_API_KEY, azureOpenaiBaseUrl, azureOpenaiModelId, azureApiVersion

Steps

  1. Run onboarding (interactive or non-interactive) with azure-openai-api-key.
  2. Provide Azure base URL + deployment/model ID; optionally set --azure-openai-api-version.
  3. Verify generated config/auth profile and run Azure-related test suites.

Expected

  • Correct azure-openai-responses provider config is generated.
  • API key is stored through existing auth profile flow (including ref mode).
  • Preview apiVersion is configurable and forwarded.
  • Related tests pass.
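
The "store only when non-default, forward at runtime" behavior for the API version can be sketched as follows (a sketch under the assumption that the default is the literal string "v1" and that stream options accept an apiVersion field; the exact shapes are assumptions):

```typescript
// Sketch: azureApiVersion is persisted in model params only when it differs
// from the default, and forwarded via stream options only when present.
const DEFAULT_API_VERSION = "v1";

function storedApiVersion(apiVersion?: string): string | undefined {
  const v = apiVersion?.trim();
  return v && v !== DEFAULT_API_VERSION ? v : undefined;
}

function buildStreamOptions(
  modelParams: { azureApiVersion?: string },
): { apiVersion?: string } {
  // Forward the preview version only when one was stored.
  return modelParams.azureApiVersion
    ? { apiVersion: modelParams.azureApiVersion }
    : {};
}
```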

Actual

  • Matches expected behavior; Azure-related suites pass.

Evidence

  • Failing test/log before + passing after
  • Trace/log snippets
  • Screenshot/recording
  • Perf numbers (if relevant)

Additional result:

  • Azure-related test suites passed: 142 passed.

Human Verification (required)

What you personally verified (not just CI), and how:

  • Verified scenarios:
    • Azure auth choice flow (interactive + non-interactive).
    • URL normalization and host validation (including *.cognitiveservices.azure.com).
    • --azure-openai-api-version default (v1) and preview override behavior.
    • Docs split + provider index update.
  • Edge cases checked:
    • Missing base URL/model ID in non-interactive mode.
    • Non-Azure host rejection.
    • env/ref secret handling paths.
  • What you did not verify:
    • No live production Azure tenant load/stress validation; verification is local + automated tests.

Compatibility / Migration

  • Backward compatible? (Yes/No)
    • Yes
  • Config/env changes? (Yes/No)
    • Yes
  • Migration needed? (Yes/No)
    • No
  • If yes, exact upgrade steps:
    • N/A (existing configs continue to work; Azure support is additive).

Failure Recovery (if this breaks)

  • How to disable/revert this change quickly:
    • Stop using azure-openai-responses and switch back to existing provider refs (openai/*, openrouter/*, etc.).
  • Files/config to restore:
    • Remove/revert models.providers["azure-openai-responses"] and related agents.defaults.models entries.
  • Known bad symptoms reviewers should watch for:
    • 4xx responses from invalid Azure endpoint/API version settings.
    • Non-interactive onboarding fast-fail when required Azure flags are missing.

Risks and Mitigations

  • Risk:
    • Incorrect endpoint or preview API version causes request failures.
    • Mitigation:
      • Fail-fast validation in onboarding, default v1, and explicit docs for v1 vs preview setup.
  • Risk:
    • New auth branch could regress onboarding stability.
    • Mitigation:
      • Added/updated multi-layer tests (CLI parsing, auth application, non-interactive onboarding, runtime extra params) and verified passing.

@openclaw-barnacle bot added labels on Mar 8, 2026: docs (Improvements or additions to documentation), cli (CLI command changes), commands (Command implementations), agents (Agent runtime and tooling), size: L

greptile-apps bot commented Mar 8, 2026

Greptile Summary

This PR adds first-class Azure OpenAI support to OpenClaw, including a new azure-openai-api-key auth choice, three Azure-specific CLI flags (--azure-openai-base-url, --azure-openai-model-id, --azure-openai-api-version), expanded host validation for *.cognitiveservices.azure.com, and a configurable API version that is stored in model params and forwarded at runtime. The implementation is well-structured, thoroughly tested, and follows the existing provider-auth patterns precisely.

Key observations:

  • normalizeAzureOpenAIBaseUrl (azure-openai-config.ts, lines 43–58) validates the hostname suffix correctly but does not check the URL scheme. An http:// URL passes the Azure host check and is returned unchanged, meaning an accidentally misconfigured endpoint could transmit the API key over plaintext HTTP. Explicitly asserting parsed.protocol === "https:" would close this gap and is consistent with the PR's stated security-hardening goal.
  • applyAzureOpenAIConfig (line 131) constructs the default model reference using params.modelId.trim() directly rather than normalizeAzureOpenAIModelId(params.modelId). Today these are equivalent, but if normalization ever gains additional transforms the keys written by the two sibling functions would diverge silently.
  • Dead code on line 117 — the mergedModels.length > 0 ? mergedModels : [defaultModel] ternary can never reach its fallback branch; mergedModels always contains at least defaultModel.
  • All other touchpoints (env-var registry, model-auth env map, OPENAI_RESPONSES_PROVIDERS, non-interactive inference, CLI flag registration, auth profile wiring, docs) look correct and complete.

Confidence Score: 4/5

  • Safe to merge with minor security hardening recommended — the missing HTTPS scheme check in URL validation is the only notable gap in an otherwise well-implemented feature.
  • The implementation is comprehensive, follows existing provider patterns, and is backed by a solid test suite (interactive, non-interactive, missing-flag, invalid-URL, ref-mode, and preview-version cases all covered). The only non-trivial finding is that normalizeAzureOpenAIBaseUrl validates the hostname but not the protocol, which means an http:// base URL would be stored and used without error. In practice Azure endpoints reject plain HTTP, so this is unlikely to cause a silent credential leak, but it is an omission in a function explicitly designed for security validation. The other two findings are dead code and a minor normalization consistency nit.
  • src/commands/azure-openai-config.ts — specifically the normalizeAzureOpenAIBaseUrl function (HTTPS scheme validation) and applyAzureOpenAIConfig (consistent use of the model-ID normalizer).

Last reviewed commit: e6a0b45

Comment on lines +43 to +58

```ts
  let parsed: URL;
  try {
    parsed = new URL(candidate);
  } catch {
    throw new Error(
      "Azure OpenAI base URL must be a valid URL (for example: https://<resource>.openai.azure.com/openai/v1).",
    );
  }

  if (!isAzureHost(parsed.hostname)) {
    throw new Error(
      "Azure OpenAI base URL must use an Azure host (*.openai.azure.com, *.services.ai.azure.com, or *.cognitiveservices.azure.com).",
    );
  }

  return `${parsed.origin}/openai/v1`;
```

Missing HTTPS protocol enforcement

normalizeAzureOpenAIBaseUrl validates the hostname but never checks that the scheme is https:. An http:// URL would pass the isAzureHost check and be returned as-is, meaning the stored baseUrl (and any subsequent API calls) would use plain HTTP, transmitting the API key in cleartext.

Since the PR explicitly includes security hardening (host suffix validation + URL normalization), adding a protocol check here is consistent with that goal:

Suggested change

```ts
  if (!isAzureHost(parsed.hostname)) {
    throw new Error(
      "Azure OpenAI base URL must use an Azure host (*.openai.azure.com, *.services.ai.azure.com, or *.cognitiveservices.azure.com).",
    );
  }

  if (parsed.protocol !== "https:") {
    throw new Error(
      "Azure OpenAI base URL must use HTTPS (received: " + parsed.protocol.replace(":", "") + ").",
    );
  }

  return `${parsed.origin}/openai/v1`;
```

The test suite in azure-openai-config.test.ts should also cover this case with a dedicated it("rejects non-HTTPS base URLs", ...) assertion.


Author

Fixed

Comment on lines +126 to +132

```ts
export function applyAzureOpenAIConfig(
  cfg: OpenClawConfig,
  params: { baseUrl: string; modelId: string; apiVersion?: string },
): OpenClawConfig {
  const next = applyAzureOpenAIProviderConfig(cfg, params);
  return applyAgentDefaultModelPrimary(next, `${AZURE_OPENAI_PROVIDER_ID}/${params.modelId.trim()}`);
}
```

Normalization bypass — raw .trim() instead of normalizeAzureOpenAIModelId

applyAzureOpenAIConfig passes params.modelId.trim() directly to applyAgentDefaultModelPrimary while applyAzureOpenAIProviderConfig (called one line above) stores the model under normalizeAzureOpenAIModelId(params.modelId) as its key.

Today both produce the same result since normalizeAzureOpenAIModelId only does .trim(). However, if normalization ever gains additional transforms (e.g. lower-casing, stripping deployment-path prefixes), the key written by applyAzureOpenAIProviderConfig and the reference written here would silently diverge, breaking the default-model lookup at runtime.

Suggested change

```ts
export function applyAzureOpenAIConfig(
  cfg: OpenClawConfig,
  params: { baseUrl: string; modelId: string; apiVersion?: string },
): OpenClawConfig {
  const next = applyAzureOpenAIProviderConfig(cfg, params);
  return applyAgentDefaultModelPrimary(next, `${AZURE_OPENAI_PROVIDER_ID}/${normalizeAzureOpenAIModelId(params.modelId)}`);
}
```

Author

Fixed

```ts
      baseUrl,
      api: "openai-responses",
      ...(normalizedApiKey ? { apiKey: normalizedApiKey } : {}),
      models: mergedModels.length > 0 ? mergedModels : [defaultModel],
```

Unreachable fallback — mergedModels is always non-empty

mergedModels is constructed as either existingModels (when the modelId is already present, meaning it has at least one element) or [...existingModels, defaultModel] (which always appends defaultModel). Both branches produce an array with at least one element, so the ? mergedModels : [defaultModel] fallback here can never be reached. The line can be simplified to just models: mergedModels.


Author

Fixed

@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: e6a0b45bc3

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

Comment on lines +52 to +55

```ts
  if (!isAzureHost(parsed.hostname)) {
    throw new Error(
      "Azure OpenAI base URL must use an Azure host (*.openai.azure.com, *.services.ai.azure.com, or *.cognitiveservices.azure.com).",
    );
```

P2: Reject non-HTTPS Azure OpenAI base URLs

normalizeAzureOpenAIBaseUrl currently validates only the hostname suffix and then returns parsed.origin, so inputs like http://<resource>.openai.azure.com pass validation and become persisted provider endpoints. In that misconfiguration path, API-key authenticated requests can be sent over plaintext HTTP (or through insecure redirects/proxies), which is an avoidable credential-exposure risk for Azure onboarding flows.


Comment on lines +547 to +549

```ts
  !(await maybeSetResolvedApiKey(resolved, (value) =>
    setAzureOpenaiApiKey(value, undefined, apiKeyStorageOptions),
  ))
```

P2: Validate Azure required flags before writing credentials

In non-interactive Azure onboarding, the API key is persisted via setAzureOpenaiApiKey before validating --azure-openai-base-url and --azure-openai-model-id. If either flag is missing/invalid, the command exits with an error but has already mutated auth profiles, so a failed run can still overwrite or create Azure credentials unexpectedly.


Author

Fixed

@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 53ed5bc2a4


```ts
  const noteAgentModel = createAuthChoiceAgentModelNoter(params);
  const requestedSecretInputMode = normalizeSecretInputModeInput(params.opts?.secretInputMode);

  await ensureApiKeyFromOptionEnvOrPrompt({
```

P2: Validate Azure inputs before persisting API key

The handler writes credentials via ensureApiKeyFromOptionEnvOrPrompt before validating azureOpenaiBaseUrl/azureOpenaiModelId. In interactive onboarding/configure flows, an invalid CLI-provided Azure URL or model ID will throw during normalization after the key has already been saved, so a failed run still mutates auth profiles. Reordering validation ahead of credential writes (or rolling back on failure) would prevent this partial-state side effect.


```ts
    );
  }

  return `${parsed.origin}/openai/v1`;
```

P2: Preserve api-version query or reject it explicitly

normalizeAzureOpenAIBaseUrl unconditionally returns ${parsed.origin}/openai/v1, which discards any query string on user input. When users provide a full Azure endpoint containing ?api-version=... (notably preview deployments), onboarding silently strips that version and runtime calls default back to v1, which can break preview-only setups with no clear explanation. This should either preserve/migrate api-version into azureApiVersion or fail fast when query parameters are present.


Author

Fixed

@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 078f0280d9


Comment on lines +536 to +540

```ts
    provider: "azure-openai-responses",
    cfg: baseConfig,
    flagValue: opts.azureOpenaiApiKey,
    flagName: "--azure-openai-api-key",
    envVar: "AZURE_OPENAI_API_KEY",
```

P1: Add Azure env-key discovery for non-interactive auth

This new Azure branch relies on resolveApiKey(...), which in turn uses resolveNonInteractiveApiKey/resolveEnvApiKey to detect keys from environment variables, but azure-openai-responses is not registered in src/agents/model-auth-env-vars.ts (PROVIDER_ENV_API_KEY_CANDIDATES). As a result, openclaw onboard --non-interactive --auth-choice azure-openai-api-key fails with “Missing --azure-openai-api-key (or AZURE_OPENAI_API_KEY in env...)” even when AZURE_OPENAI_API_KEY is set, breaking CI/automation flows that intentionally avoid passing secrets on the command line.


Author

Fixed

@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: a776132ed8


Comment on lines +77 to +80

```ts
  await ensureApiKeyFromOptionEnvOrPrompt({
    token: params.opts?.azureOpenaiApiKey,
    tokenProvider: "azure-openai-responses",
    secretInputMode: requestedSecretInputMode,
```

P2: Reject ref mode when Azure key is provided inline

This branch accepts --secret-input-mode ref together with --azure-openai-api-key and immediately routes the inline key through ensureApiKeyFromOptionEnvOrPrompt, which stores a reference credential instead of validating that AZURE_OPENAI_API_KEY is actually present in the environment. In that scenario onboarding reports success but the saved profile points to a missing env var, so auth fails later at runtime with a hard-to-diagnose broken configuration. The non-interactive Azure path already guards this combination, so the interactive path should apply the same validation.


Author

Fixed

@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: d2b9ddb24c


Comment on lines +116 to +118

```ts
  const mergedModels = existingModels.some((model) => model.id === modelId)
    ? existingModels
    : [...existingModels, defaultModel];
```

P2: Backfill Azure model compat when model ID already exists

When onboarding updates a config that already contains azure-openai-responses/<modelId>, this branch keeps existingModels unchanged and skips the compat: { supportsStore: false } default from buildAzureModelDefinition. In that migration path, runtime logic in src/agents/pi-embedded-runner/extra-params.ts (shouldForceResponsesStore) will still treat the model as store-capable and force store=true for Azure Responses calls, so rerunning onboarding does not actually fix existing Azure model entries that need store disabled.


Author

Fixed in a28758c: existing azure-openai-responses model entries now backfill compat.supportsStore: false when the modelId already exists, with regression test coverage in src/commands/azure-openai-config.test.ts.

@sawyer-shi changed the title from "New: Add support for Azure OpenAI models" to "New: Add support for Azure OpenAI models (supports GPT-5.4)" on Mar 9, 2026
@sawyer-shi changed the title from "New: Add support for Azure OpenAI models (supports GPT-5.4)" to "New: Add support for Azure OpenAI models (supports GPT-5.4 and more models)" on Mar 9, 2026
@sawyer-shi force-pushed the main branch 2 times, most recently from cbf706d to 6627906 on March 11, 2026 at 01:36
@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 13f8a55604


Comment on lines +595 to +599

```ts
    provider: "azure-openai-responses",
    cfg: baseConfig,
    flagValue: opts.azureOpenaiApiKey,
    flagName: "--azure-openai-api-key",
    envVar: "AZURE_OPENAI_API_KEY",
```

P1: Pass explicit Azure env var name to resolveApiKey

Fresh evidence in this commit shows the Azure non-interactive path calls resolveApiKey without envVarName, so resolveNonInteractiveApiKey cannot read AZURE_OPENAI_API_KEY unless the provider is registered in its env-candidate map. In this changeset, that means openclaw onboard --non-interactive --auth-choice azure-openai-api-key still errors as missing credentials even when AZURE_OPENAI_API_KEY is set, which breaks CI/automation flows that avoid putting secrets on the command line.


Author

Fixed!

Please accept this PR as soon as possible, otherwise I will have to resolve conflicts and handle it every day.

@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: ee023e5834


Comment on lines +74 to +78
tokenProvider: "azure-openai-responses",
secretInputMode: requestedSecretInputMode,
config: nextConfig,
expectedProviders: ["azure-openai-responses"],
provider: "azure-openai-responses",

P2: Register Azure env key for interactive auth discovery

This new Azure flow delegates key lookup to ensureApiKeyFromOptionEnvOrPrompt, which in turn relies on resolveEnvApiKey(provider) for env discovery; however azure-openai-responses is still not in src/agents/model-auth-env-vars.ts. Even though non-interactive onboarding now works via an explicit envVarName workaround, interactive onboarding with AZURE_OPENAI_API_KEY set will still miss the existing key and force manual entry, and env-only Azure auth resolution remains inconsistent with other API-key providers.


@sawyer-shi (Author)

Fixed!

Please accept this PR as soon as possible; otherwise I will have to resolve conflicts every day.

@sawyer-shi (Author)
Please accept this PR as soon as possible; otherwise I will have to resolve conflicts every day.

@sawyer-shi sawyer-shi force-pushed the main branch 2 times, most recently from f3fba5c to 3b8f002 on March 13, 2026 at 06:09
@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 3b8f002e41


Comment on lines +17 to +18
normalized.endsWith(".services.ai.azure.com") ||
normalized.endsWith(".cognitiveservices.azure.com")

P2: Keep Azure host allowlist consistent with stream wrappers

This change accepts *.services.ai.azure.com and *.cognitiveservices.azure.com as valid Azure base URLs, but direct-OpenAI runtime detection still only matches *.openai.azure.com in src/agents/pi-embedded-runner/openai-stream-wrappers.ts (isDirectOpenAIBaseUrl). As a result, newly accepted Azure endpoints bypass the OpenAI-responses wrapper path (shouldForceResponsesStore and related payload normalization), so they do not get the same Azure handling as existing *.openai.azure.com setups and can still hit Azure-specific response failures in configurations that rely on that wrapper behavior.
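
The suffix check this comment describes can be sketched as a single shared helper, which is also the natural fix for the inconsistency it flags: both the validation path and any runtime wrapper should read one list. The suffixes come from this PR; the function and constant names below are illustrative, not the repo's actual code:

```typescript
// Hypothetical sketch of the expanded Azure host allowlist. Keeping the
// suffix list in one place means validation and runtime endpoint detection
// cannot drift apart, which is the failure mode the review points out.
const AZURE_HOST_SUFFIXES = [
  ".openai.azure.com",
  ".services.ai.azure.com",
  ".cognitiveservices.azure.com",
];

function isAzureHost(baseUrl: string): boolean {
  let host: string;
  try {
    host = new URL(baseUrl).hostname.toLowerCase();
  } catch {
    return false; // not a parsable absolute URL
  }
  return AZURE_HOST_SUFFIXES.some((suffix) => host.endsWith(suffix));
}
```

Matching on the parsed hostname (rather than the raw URL string) also avoids false positives from paths or query strings that happen to contain an Azure domain.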


@sawyer-shi (Author)

Fixed!

Please accept this PR as soon as possible; otherwise I will have to resolve conflicts every day.

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 6c32ad7527


Comment on lines +17 to +18
normalized.endsWith(".services.ai.azure.com") ||
normalized.endsWith(".cognitiveservices.azure.com")

P2: Keep Azure host allowlists consistent across runtime wrappers

This change now accepts *.services.ai.azure.com and *.cognitiveservices.azure.com as valid Azure base URLs, but src/agents/pi-embedded-runner/openai-stream-wrappers.ts still treats only *.openai.azure.com as a direct OpenAI/Azure endpoint in isDirectOpenAIBaseUrl. In configurations that rely on that direct-endpoint path (for example the shouldForceResponsesStore / Responses payload handling path), newly accepted Azure hosts will be classified differently at runtime than legacy Azure hosts, leading to inconsistent request behavior for the same provider.


@sawyer-shi (Author)

Fixed!

Please accept this PR as soon as possible; otherwise I will have to resolve conflicts every day.

@sawyer-shi sawyer-shi changed the title New:Add support for Azure OpenAI models【Supports the GPT-5.4 and more models‌】 New:Add support for Azure models【Supports the GPT-5.4 and more models‌】 Mar 13, 2026
@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 090a38529a


const noteAgentModel = createAuthChoiceAgentModelNoter(params);
const requestedSecretInputMode = normalizeSecretInputModeInput(params.opts?.secretInputMode);

await ensureApiKeyFromOptionEnvOrPrompt({

P2: Validate Azure URL/model before storing API key

applyAuthChoiceAzureOpenAI persists credentials through ensureApiKeyFromOptionEnvOrPrompt before it validates azureOpenaiBaseUrl and azureOpenaiModelId. When users provide an invalid --azure-openai-base-url or --azure-openai-model-id, the command exits with a validation error later in the flow, but the Azure API key has already been written to auth profiles, leaving partial state after a failed onboarding run.
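
The ordering fix this comment asks for amounts to "validate everything, then persist". A minimal sketch of that shape, with hypothetical names (this is not the PR's actual onboarding code, and the model-id rule here is a placeholder assumption):

```typescript
// Illustrative validate-then-persist ordering: no credential is written
// unless every user-supplied setting passes validation first, so a failed
// onboarding run leaves no partial state behind.
interface AzureOpts {
  baseUrl: string;
  modelId: string;
  apiKey: string;
}

function validateAzureOpts(opts: AzureOpts): string[] {
  const errors: string[] = [];
  try {
    new URL(opts.baseUrl); // must be a parsable absolute URL
  } catch {
    errors.push(`invalid --azure-openai-base-url: ${opts.baseUrl}`);
  }
  if (!/^[\w.-]+$/.test(opts.modelId)) {
    errors.push(`invalid --azure-openai-model-id: ${opts.modelId}`);
  }
  return errors;
}

function onboardAzure(opts: AzureOpts, storeKey: (key: string) => void): void {
  const errors = validateAzureOpts(opts);
  if (errors.length > 0) {
    throw new Error(errors.join("\n")); // fail before any credential is written
  }
  storeKey(opts.apiKey); // persist only after everything validated
}
```

The design point is that the store callback sits strictly after the validation gate, so a bad `--azure-openai-base-url` can never leave a freshly written key in the auth profile.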


@sawyer-shi (Author)

Fixed!

Please accept this PR as soon as possible; otherwise I will have to resolve conflicts every day.

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: b81b773a28


Comment on lines +59 to +63
optionKey: "azureOpenaiApiKey",
authChoice: "azure-openai-api-key",
cliFlag: "--azure-openai-api-key",
cliOption: "--azure-openai-api-key <key>",
description: "Azure OpenAI API key",

P1: Add Azure auth choice to grouped onboarding providers

Adding azure-openai-api-key here makes it available in the flat auth-choice list, but interactive onboarding uses promptAuthChoiceGrouped/buildAuthChoiceGroups, which only renders choices mapped in AUTH_CHOICE_GROUP_DEFS (plus plugin-provided groups). Because this new built-in choice was not added to a static group, openclaw onboard users who rely on the grouped provider picker cannot select Azure OpenAI at all unless they pass --auth-choice manually, so the new provider path is effectively inaccessible in the default interactive flow.
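
The gap described here is a data-mapping one: a flat auth-choice entry is only reachable interactively if it is also mapped into a group definition. A hypothetical sketch of that relationship (the interfaces and group contents below are illustrative, not the repo's real `AUTH_CHOICE_GROUP_DEFS`):

```typescript
// Illustrative grouped-picker model: the interactive prompt renders only
// choices that appear inside a group, so a choice present in the flat
// catalog but absent from every group is invisible to interactive users.
interface AuthChoice {
  id: string;
  label: string;
}

interface AuthChoiceGroup {
  title: string;
  choices: AuthChoice[];
}

const AUTH_CHOICE_GROUP_DEFS: AuthChoiceGroup[] = [
  {
    title: "OpenAI",
    choices: [
      { id: "openai-api-key", label: "OpenAI API key" },
      // The mapping the review says is missing: without this entry the
      // grouped picker never offers Azure, even though the flat list has it.
      { id: "azure-openai-api-key", label: "Azure OpenAI API key" },
    ],
  },
];

function listSelectableChoiceIds(groups: AuthChoiceGroup[]): string[] {
  return groups.flatMap((group) => group.choices.map((choice) => choice.id));
}
```

With the Azure entry mapped into a group, the interactive picker and the `--auth-choice` flag expose the same set of providers.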


@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 979e26e2ba


Comment on lines +60 to +63
authChoice: "azure-openai-api-key",
cliFlag: "--azure-openai-api-key",
cliOption: "--azure-openai-api-key <key>",
description: "Azure OpenAI API key",

P1: Add Azure auth choice to grouped provider picker

This adds azure-openai-api-key to the flat auth-flag catalog, but interactive onboarding/configure flows render choices from the grouped definitions (buildAuthChoiceGroups / AUTH_CHOICE_GROUP_DEFS) in src/commands/auth-choice-prompt.ts, and no Azure group mapping was added there. As a result, users in the default grouped picker cannot select Azure OpenAI unless they already know to pass --auth-choice azure-openai-api-key manually, so the new provider path is effectively hidden in normal interactive setup.


@openclaw-barnacle openclaw-barnacle bot added the gateway Gateway runtime label Mar 16, 2026

Labels

  • agents: Agent runtime and tooling
  • cli: CLI command changes
  • commands: Command implementations
  • docs: Improvements or additions to documentation
  • size: XL

1 participant