fix(onboard): restore openai-responses API for all Azure URLs#50740
Mhaizza wants to merge 1 commit into openclaw:main
Conversation
Regression introduced in 91104ac broke local vLLM endpoints that rely on openai-responses API compatibility. The commit narrowed Azure detection from `isAzure` to `isAzureOpenAi` (`*.openai.azure.com` only), causing non-Azure OpenAI-compatible endpoints to route to `/v1/chat/completions` instead of `/responses`. This resulted in 404 errors for local vLLM setups configured with openai-responses mode.

Fix: Revert the providerApi logic to use `isAzure` (covers both `*.services.ai.azure.com` and `*.openai.azure.com`) to preserve backward compatibility.

Fixes openclaw#50719
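The two detection predicates named above can be sketched as follows. Hostname-suffix matching is an assumption for illustration; the actual checks live in `src/commands/onboard-custom.ts` and may differ:

```typescript
// Sketch of the two URL predicates named in this PR. The suffix-matching
// bodies are assumptions, not the repo's actual implementation.
function isAzureUrl(baseUrl: string): boolean {
  const host = new URL(baseUrl).hostname;
  return (
    host.endsWith(".services.ai.azure.com") ||
    host.endsWith(".openai.azure.com")
  );
}

function isAzureOpenAiUrl(baseUrl: string): boolean {
  const host = new URL(baseUrl).hostname;
  return host.endsWith(".openai.azure.com");
}
```

Under this sketch, `isAzureOpenAiUrl` rejects `*.services.ai.azure.com` (Azure Foundry) hosts that the broader `isAzureUrl` accepts, which is the narrowing the regression introduced.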
Greptile Summary

This PR reverts a one-line regression introduced in commit
Confidence Score: 4/5
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: ba811b4629
```ts
const providerApi = isAzure
  ? ("openai-responses" as const)
```
Preserve selected compatibility for Azure Foundry URLs
This change makes every Azure URL (`isAzure`) use openai-responses, which overrides the user-selected compatibility for `*.services.ai.azure.com` endpoints. The onboarding flow still allows and verifies Anthropic-style Azure Foundry endpoints (`requestAnthropicVerification`), but this assignment now writes an OpenAI Responses provider anyway. A successful verification can therefore produce a saved config that later calls the wrong API surface at runtime.
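The reviewer's point can be illustrated with a hypothetical selection helper (all names here are illustrative, not the repo's code): keying the override on the broad Azure check discards a verified Foundry choice, while keying it on the narrower check would preserve it:

```typescript
// Hypothetical illustration of the review comment. With the PR's condition,
// every Azure URL is forced to "openai-responses"; conditioning on the
// narrower check would let *.services.ai.azure.com keep a verified
// "anthropic" selection.
type ProviderApi = "openai-responses" | "openai-completion" | "anthropic";

function forcedByThisPr(isAzure: boolean, selected: ProviderApi): ProviderApi {
  return isAzure ? "openai-responses" : selected;
}

function preservingSelection(
  isAzureOpenAi: boolean,
  selected: ProviderApi
): ProviderApi {
  return isAzureOpenAi ? "openai-responses" : selected;
}
```

For an Azure Foundry URL, `isAzure` is true but `isAzureOpenAi` is false, so the two helpers disagree exactly when the user verified an Anthropic-style endpoint.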
Closing as duplicate; this was superseded by #50535.
Summary
Fixes #50719 — Local vLLM 404 regression introduced in 2026.3.13.
Root Cause
Commit 91104ac changed Azure URL detection in `src/commands/onboard-custom.ts`:

- `isAzure` → covered both `*.services.ai.azure.com` and `*.openai.azure.com`
- `isAzureOpenAi` → only `*.openai.azure.com`

This broke local vLLM endpoints configured to use the `openai-responses` API, causing them to route to `/v1/chat/completions` instead → 404 status code (no body).

Changes

- Reverted the `providerApi` logic from `isAzureOpenAi` back to `isAzure` (line 689)

Testing

- `openai-responses` for Azure Foundry URLs

Impact

- Restores local vLLM endpoints configured with `openai-responses` mode
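For reference, the two API surfaces involved can be sketched with a hedged path mapping. The paths are taken from the root-cause analysis above; the function itself is an illustration, not the repo's actual routing code:

```typescript
// Request paths cited in the root cause: the "openai-responses" surface
// talks to /responses, while the chat-completions surface uses
// /v1/chat/completions. A 404 with no body is what a vLLM server returns
// when a request hits the wrong one of these two paths.
type ApiSurface = "openai-responses" | "openai-completion";

function endpointPath(api: ApiSurface): string {
  return api === "openai-responses" ? "/responses" : "/v1/chat/completions";
}
```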