Config schema missing 'azure-openai-responses' in models.providers.*.api enum #51735

@habonlaci

Description

Summary

The config validation schema for models.providers.*.api does not include "azure-openai-responses" as a valid option, even though pi-ai fully registers it as an API provider and uses it in its built-in model catalog.

Current behavior

The allowed enum values for models.providers.*.api are:

openai-completions, openai-responses, openai-codex-responses,
anthropic-messages, google-generative-ai, github-copilot,
bedrock-converse-stream, ollama

Setting "api": "azure-openai-responses" in openclaw.json fails config validation with an invalid API provider option error.
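A minimal openclaw.json fragment that reproduces the failure. Only the models.providers.*.api path is taken from this report; the provider name, endpoint, and model list are illustrative:

```json
{
  "models": {
    "providers": {
      "my-azure": {
        "api": "azure-openai-responses",
        "baseUrl": "https://example.openai.azure.com",
        "models": ["gpt-5.4-mini"]
      }
    }
  }
}
```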

Impact

Models that exist in pi-ai's built-in catalog (e.g. gpt-5.2, gpt-4o) work correctly because modelRegistry.find() returns them with api: "azure-openai-responses" already set, bypassing the config api field.

However, newer models not yet in the built-in catalog (e.g. gpt-5.4-mini) can only be resolved via config. Since "azure-openai-responses" is not an allowed config value, the provider-level api must be set to "openai-responses", which routes to the plain OpenAI SDK client instead of AzureOpenAI. This causes:

  • Wrong auth header (Authorization: Bearer instead of api-key)
  • Missing api-version query parameter
  • HTTP 404 from Azure endpoint

This also affects heartbeat and subagent sessions using these models — they fail silently with 0 input/0 output tokens.
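The auth mismatch can be sketched as follows. buildRequest is a hypothetical helper, not OpenClaw or pi-ai code, and the api-version value is illustrative; the sketch only contrasts the header and query-parameter shapes the two client types send:

```javascript
// Hypothetical sketch of the two request shapes; not actual OpenClaw/pi-ai code.
function buildRequest(api, endpoint, key) {
  if (api === "azure-openai-responses") {
    // AzureOpenAI-style request: the key goes in the `api-key` header,
    // and Azure endpoints require an `api-version` query parameter.
    return {
      url: `${endpoint}/openai/responses?api-version=2025-01-01-preview`,
      headers: { "api-key": key },
    };
  }
  // Plain OpenAI SDK-style request: bearer token, no api-version parameter.
  return {
    url: `${endpoint}/responses`,
    headers: { Authorization: `Bearer ${key}` },
  };
}

const azure = buildRequest("azure-openai-responses", "https://example.openai.azure.com", "K");
const plain = buildRequest("openai-responses", "https://example.openai.azure.com", "K");
```

Sending the plain-OpenAI shape to an Azure endpoint lacks both the api-key header and the api-version parameter, which is consistent with the 404 described above.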

Root cause

pi-ai registers azure-openai-responses correctly:

  • register-builtins.js calls registerApiProvider({ api: "azure-openai-responses", ... })
  • models.generated.js has built-in models with api: "azure-openai-responses"
  • azure-openai-responses.js provider uses AzureOpenAI client with correct auth handling

But OpenClaw's config schema enum (used for validation) doesn't include it.

Expected behavior

"azure-openai-responses" should be a valid value for models.providers.*.api in the config schema, just like "anthropic-messages", "openai-responses", etc.
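The fix amounts to extending the enum. A hedged sketch, assuming a simple allow-list check (ALLOWED_APIS and validateApi are illustrative names, not OpenClaw's actual schema code):

```javascript
// Illustrative sketch only; OpenClaw's real schema code may differ.
const ALLOWED_APIS = [
  "openai-completions",
  "openai-responses",
  "openai-codex-responses",
  "anthropic-messages",
  "google-generative-ai",
  "github-copilot",
  "bedrock-converse-stream",
  "ollama",
  "azure-openai-responses", // the value currently missing from the enum
];

// Returns true when the configured api value is in the allow-list.
function validateApi(value) {
  return ALLOWED_APIS.includes(value);
}
```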

Workaround

Currently there is no clean workaround for models missing from the built-in catalog. The only options are to use models that are already in the catalog (which carry the correct api value) or to switch to a different provider.

Environment

  • OpenClaw version: 2026.3.13
  • pi-ai: bundled version
  • Provider: Azure OpenAI (Azure AI Services endpoint)
