[Bug]: tui/agent etc not working when azure openai endpoint is onboarded #48939
Description
Bug type
Behavior bug (incorrect output/state without crash)
Summary
When OpenClaw is onboarded via the onboarding wizard and an Azure OpenAI endpoint is configured through the custom providers option, onboarding completes successfully, but "openclaw tui", "openclaw agent", and related commands do not work; they fail with a 404 "Resource not found" error.
Root Cause:
When an Azure OpenAI endpoint is onboarded through the custom provider option in the onboarding wizard, it is treated like a regular OpenAI endpoint, specifically as the openai-completions API type. However, pi-ai has no azure-openai-completions implementation; it only implements azure-openai-responses.
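To make the mismatch concrete, here is an illustrative sketch (not OpenClaw or pi-ai code; the helper names are hypothetical, and the api-version value is an assumption). A generic openai-completions client typically appends the completions path to the stored baseUrl as-is, while Azure OpenAI additionally requires an api-version query parameter, whose absence is one plausible source of the 404:

```python
# Illustrative only -- these helpers are hypothetical, not pi-ai internals.

def openai_completions_url(base_url: str) -> str:
    """A generic OpenAI-style client just appends the completions path."""
    return base_url.rstrip("/") + "/chat/completions"

def azure_openai_url(base_url: str, api_version: str) -> str:
    """Azure OpenAI expects an api-version query parameter on the request."""
    return base_url.rstrip("/") + f"/chat/completions?api-version={api_version}"

base = "https://someresourcexyz.openai.azure.com/openai/deployments/gpt-5.2-chat"

# The generic client omits api-version, which Azure rejects.
print(openai_completions_url(base))
print(azure_openai_url(base, "2024-10-21"))
```

Authentication also differs (Azure uses an api-key header rather than a Bearer token), so routing the endpoint through a dedicated Azure client type rather than the generic one is the cleaner fix.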
Suggested Fix
A good fix would be for the onboarding wizard to store the Azure OpenAI endpoint details as the azure-openai-responses type instead of openai-completions. I would be happy to contribute a fix for this issue.
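For illustration, assuming the fix is limited to the stored api type, the provider entry the wizard writes (shown in full in the config example below) would change roughly as follows. This is a sketch, not verified against the pi-ai schema, and the baseUrl may also need adjusting depending on how the azure-openai-responses client builds its request path:

```json
"custom-someresourcexyz-openai-azure-com": {
  "baseUrl": "https://someresourcexyz.openai.azure.com/openai/deployments/gpt-5.2-chat",
  "apiKey": "someapikey",
  "api": "azure-openai-responses",
  "models": [ ... ]
}
```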
Steps to reproduce
- Install openclaw
- Onboard using openclaw onboard
- Select custom providers when it asks for provider details
- Enter valid azure openai endpoint details
- Onboarding succeeds, but running "openclaw tui" or "openclaw agent ..." fails with a 404 error
Expected behavior
openclaw tui and openclaw agent should work as expected
Actual behavior
"openclaw tui" and "openclaw agent ..." fail with a 404 "Resource not found" error

OpenClaw version
2026.3.13
Operating system
Ubuntu 24.04
Install method
pnpm dev
Model
gpt-5.2-chat
Provider / routing chain
openclaw -> azure openai -> gpt-5.2-chat
Config file / key location
No response
Additional provider/model setup details
Config example:
{
  "wizard": {
    "lastRunAt": "2026-03-15T06:13:42.390Z",
    "lastRunVersion": "2026.3.14",
    "lastRunCommand": "onboard",
    "lastRunMode": "local"
  },
  "models": {
    "mode": "merge",
    "providers": {
      "custom-someresourcexyz-openai-azure-com": {
        "baseUrl": "https://someresourcexyz.openai.azure.com/openai/deployments/gpt-5.2-chat",
        "apiKey": "someapikey",
        "api": "openai-completions",
        "models": [
          {
            "id": "gpt-5.2-chat",
            "name": "gpt-5.2-chat (Custom Provider)",
            "reasoning": false,
            "input": ["text"],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 16000,
            "maxTokens": 4096
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "custom-someresourcexyz-openai-azure-com/gpt-5.2-chat"
      },
      "models": {
        "custom-someresourcexyz-openai-azure-com/gpt-5.2-chat": {}
      },
      "workspace": "/root/.openclaw/workspace"
    }
  },
  "tools": {
    "profile": "coding"
  },
  "commands": {
    "native": "auto",
    "nativeSkills": "auto",
    "restart": true,
    "ownerDisplay": "raw"
  },
  "session": {
    "dmScope": "per-channel-peer"
  },
  "gateway": {
    "port": 18789,
    "mode": "local",
    "bind": "loopback",
    "auth": {
      "mode": "token",
      "token": "sometoken"
    },
    "tailscale": {
      "mode": "off",
      "resetOnExit": false
    },
    "nodes": {
      "denyCommands": [
        "camera.snap",
        "camera.clip",
        "screen.record",
        "contacts.add",
        "calendar.add",
        "reminders.add",
        "sms.send"
      ]
    }
  },
  "meta": {
    "lastTouchedVersion": "2026.3.14",
    "lastTouchedAt": "2026-03-15T06:13:42.400Z"
  }
}
Logs, screenshots, and evidence
Impact and severity
- Impacts Azure customers using Azure OpenAI endpoints
- Blocks workflow
- Frequency: always
- Consequence: onboarding appears successful, but tui and agent are unusable afterward
Additional information
As described above, when an Azure OpenAI endpoint is onboarded as a custom provider through the onboarding wizard, it is stored as a regular openai-completions endpoint. Since pi-ai has no azure-openai-completions implementation and only implements azure-openai-responses, the fix would be to store the Azure OpenAI endpoint details as the azure-openai-responses type instead of openai-completions during onboarding.
I would be more than happy to contribute and fix this issue once and for all.