Bug type
Regression (worked before, now fails)
Summary
Environment:
- OpenClaw version: 2026.3.13 (61d171a)
- OS: macOS (Apple Silicon M4)
- Auth method: OpenAI Codex (ChatGPT OAuth)
Describe the bug:
After upgrading from 2026.3.12 → 2026.3.13, the Web Control Dashboard model switcher
began sending incorrect provider prefixes when switching to openai-codex and google
provider models.
Error shown in dashboard:
Failed to set model: GatewayRequestError: model not allowed: ollama/gpt-5.2-codex
Failed to set model: GatewayRequestError: model not allowed: ollama/gemini-3.1-pro-preview
The gateway is correctly rejecting these requests — the bug is the dashboard
constructing the wrong model ref string (prepending ollama/ instead of the correct
provider prefix).
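The error pattern is consistent with the dashboard falling back to a default provider prefix whenever it does not recognize the selected provider. A minimal hypothetical sketch of that failure mode (all names here are illustrative, not OpenClaw's actual code):

```typescript
// Hypothetical reconstruction of the suspected bug, NOT OpenClaw source.
// KNOWN_PROVIDERS and buildModelRef are illustrative names.
const KNOWN_PROVIDERS = new Set(["ollama", "anthropic", "openai"]);

function buildModelRef(provider: string, model: string): string {
  // Suspected bug shape: providers missing from the hardcoded list
  // silently fall back to "ollama".
  const prefix = KNOWN_PROVIDERS.has(provider) ? provider : "ollama";
  return `${prefix}/${model}`;
}

// Reproduces the observed symptom:
buildModelRef("openai-codex", "gpt-5.2-codex");      // → "ollama/gpt-5.2-codex"
buildModelRef("google", "gemini-3.1-pro-preview");   // → "ollama/gemini-3.1-pro-preview"
```

This would explain why only non-hardcoded providers are affected while the TUI, which routes through the backend, is fine.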
Workaround:
Downgrade to 2026.3.12:
openclaw update --tag [email protected] --yes
Steps to reproduce
- Configure the openai-codex provider via openclaw configure (ChatGPT OAuth method)
- Upgrade to 2026.3.13
- Open Web Control Dashboard
- Use the model dropdown to switch to any openai-codex or google model
- Error appears immediately
Expected behavior
Dashboard should send openai-codex/gpt-5.2-codex, not ollama/gpt-5.2-codex.
TUI /model command works correctly — this is a dashboard-only regression.
Actual behavior
The dashboard prepends ollama/ to the selected model id regardless of the chosen provider. Switching to any openai-codex or google model sends refs like ollama/gpt-5.2-codex and ollama/gemini-3.1-pro-preview, which the gateway rejects:
Failed to set model: GatewayRequestError: model not allowed: ollama/gpt-5.2-codex
Failed to set model: GatewayRequestError: model not allowed: ollama/gemini-3.1-pro-preview
OpenClaw version
2026.3.13 (61d171a)
Operating system
macOS 26.3.1a (Apple Silicon M4)
Install method
No response
Model
openai-codex (all models)
Provider / routing chain
openclaw → openai-codex → all 7 models
Additional provider/model setup details
- The TUI /model command correctly switches models (backend routing is fine)
- Clearing the browser cache, switching browsers, and reinstalling via install.sh did NOT fix the issue
- The bug only affects providers that are not hardcoded in the dashboard frontend (openai-codex, google)
- Confirmed against 2026.3.12: the issue is absent there, though the model switching UI itself was introduced in 2026.3.13, so this is a defect in the new feature rather than a strict regression
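Given that only providers outside the frontend's hardcoded list are affected, the straightforward fix would be to build the ref from the provider the user actually selected, with no fallback, and let the gateway do the validation it already performs. A hedged sketch with illustrative names, not a patch against OpenClaw's actual code:

```typescript
// Illustrative fix sketch, NOT OpenClaw source: pass the selected
// provider through verbatim instead of consulting a hardcoded list.
function buildModelRef(provider: string, model: string): string {
  if (!provider || !model) {
    // Fail loudly rather than silently substituting a default provider.
    throw new Error("model ref requires an explicit provider and model");
  }
  return `${provider}/${model}`;
}

buildModelRef("openai-codex", "gpt-5.2-codex");    // → "openai-codex/gpt-5.2-codex"
buildModelRef("google", "gemini-3.1-pro-preview"); // → "google/gemini-3.1-pro-preview"
```

Failing loudly on a missing provider keeps the "model not allowed" class of errors at the gateway, where they are already handled.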
Logs, screenshots, and evidence
Impact and severity
No response
Additional information
No response