Custom provider onboarding forces openai-responses on all *.services.ai.azure.com endpoints #50528
Closed
Labels: maintainer, Maintainer-authored PR
Description
Summary
openclaw onboard currently treats every *.services.ai.azure.com custom provider endpoint as Azure Responses and persists api: "openai-responses", even when the user selected OpenAI/chat-completions compatibility.
Why this is wrong
Microsoft's current docs say Azure v1 accepts both:
- `https://<resource>.openai.azure.com/openai/v1/`
- `https://<resource>.services.ai.azure.com/openai/v1/`
And they explicitly document chat-completions models from other providers like DeepSeek/Grok on Foundry endpoints:
- Azure v1 lifecycle docs: https://learn.microsoft.com/en-us/azure/foundry/openai/api-version-lifecycle
- Foundry endpoints docs: https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/endpoints
Relevant docs points:
- the v1 docs say chat completions are supported for "models from other providers like DeepSeek and Grok"
- the v1 docs say `base_url` accepts both `*.openai.azure.com/openai/v1/` and `*.services.ai.azure.com/openai/v1/`
- the Foundry endpoints docs show `POST https://<resource>.services.ai.azure.com/openai/deployments/<deployment>/chat/completions?...` for DeepSeek
Current bad codepath
- `src/commands/onboard-custom.ts:38` matches both `*.services.ai.azure.com` and `*.openai.azure.com` in `isAzureUrl()`
- `src/commands/onboard-custom.ts:681` then forces `api: "openai-responses"` for all of them, ignoring `params.compatibility`
User-visible failure
A valid Foundry `services.ai.azure.com` chat-completions endpoint onboards successfully but is persisted with the wrong runtime API shape. The first real model request then goes to `/responses` instead of the chat-completions route that the host/model actually supports.
Expected behavior
- `*.openai.azure.com` Azure OpenAI endpoints can keep the current Responses-first path
- `*.services.ai.azure.com` should not be forced onto Responses solely based on hostname
- when the user selected OpenAI-compatible chat-completions, persist the matching runtime API shape instead of overriding it
Follow-up from