
[Bug]: Web Dashboard model switcher sends wrong provider prefix (ollama/) for openai-codex and google providers — regression in 2026.3.13 #50585

@Ming-Sir-69

Description


Bug type

Regression (worked before, now fails)

Summary

Environment:

  • OpenClaw version: 2026.3.13 (61d171a)
  • OS: macOS (Apple Silicon M4)
  • Auth method: OpenAI Codex (ChatGPT OAuth)

Describe the bug:
After upgrading from 2026.3.12 → 2026.3.13, the Web Control Dashboard model switcher
began sending incorrect provider prefixes when switching to openai-codex and google
provider models.

Error shown in dashboard:
Failed to set model: GatewayRequestError: model not allowed: ollama/gpt-5.2-codex
Failed to set model: GatewayRequestError: model not allowed: ollama/gemini-3.1-pro-preview

The gateway is correctly rejecting these requests; the bug is that the dashboard
constructs the wrong model ref string, prepending ollama/ instead of the correct
provider prefix.
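For triage, here is a minimal sketch of one way this failure mode commonly arises in a frontend model switcher: a hardcoded provider map whose lookup silently falls back to a default. All names below are hypothetical and not taken from the OpenClaw codebase.

```typescript
// Hypothetical reconstruction of the failure mode, not actual OpenClaw code:
// a hardcoded provider map whose lookup silently falls back to "ollama".
const KNOWN_PROVIDERS: Record<string, string> = {
  anthropic: "anthropic",
  openai: "openai",
  ollama: "ollama",
  // "openai-codex" and "google" are missing from the map
};

function buildModelRef(provider: string, model: string): string {
  // Bug: any provider absent from the map silently becomes "ollama"
  const prefix = KNOWN_PROVIDERS[provider] ?? "ollama";
  return `${prefix}/${model}`;
}

// Reproduces the rejected refs from the error messages above:
console.log(buildModelRef("openai-codex", "gpt-5.2-codex")); // ollama/gpt-5.2-codex
console.log(buildModelRef("google", "gemini-3.1-pro-preview")); // ollama/gemini-3.1-pro-preview
```

If the real code follows this pattern, the fix is to derive the prefix from the selected provider's id rather than a lookup with a silent default.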

Workaround:
Downgrade to 2026.3.12:
openclaw update --tag [email protected] --yes

Steps to reproduce

  1. Configure openai-codex provider via openclaw configure (ChatGPT OAuth method)
  2. Upgrade to 2026.3.13
  3. Open Web Control Dashboard
  4. Use the model dropdown to switch to any openai-codex or google model
  5. Error appears immediately

Expected behavior

Dashboard should send openai-codex/gpt-5.2-codex, not ollama/gpt-5.2-codex.
TUI /model command works correctly — this is a dashboard-only regression.
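Expressed as code (illustrative names only, assuming the dashboard builds the ref from a provider id and a model id), the expected behavior is to use the selected provider's own id as the prefix and fail loudly on missing input instead of defaulting:

```typescript
// Sketch of the expected behavior, with hypothetical names: the prefix
// comes from the selected provider, never from a silent default.
function buildModelRef(provider: string, model: string): string {
  if (!provider || !model) {
    throw new Error("both provider and model ids are required");
  }
  return `${provider}/${model}`;
}

console.log(buildModelRef("openai-codex", "gpt-5.2-codex")); // openai-codex/gpt-5.2-codex
```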

Actual behavior

The dashboard builds the model ref with an ollama/ prefix regardless of which
provider is selected, so the gateway rejects every switch to an openai-codex or
google model:

Failed to set model: GatewayRequestError: model not allowed: ollama/gpt-5.2-codex
Failed to set model: GatewayRequestError: model not allowed: ollama/gemini-3.1-pro-preview

OpenClaw version

2026.3.13 (61d171a)

Operating system

macOS 26.3.1a (Apple Silicon M4)

Install method

No response

Model

openai-codex (all models)

Provider / routing chain

openclaw → openai-codex → all 7 models

Additional provider/model setup details

  • The TUI /model command correctly switches models (backend routing is fine)
  • Clearing browser cache, switching browsers, reinstalling via install.sh did NOT fix the issue
  • The bug only affects providers not natively hardcoded in the dashboard frontend (openai-codex, google)
  • 2026.3.12 does not exhibit the issue; note, however, that the model-switching UI was introduced in 2026.3.13, so the bug shipped with the new feature rather than breaking previously working dashboard behavior
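Given the last point, a table-driven regression test over providers that are not hardcoded in the frontend would catch this class of bug. refForModel below is a hypothetical stand-in for whatever helper the dashboard actually uses:

```typescript
// Sketch of a regression test; refForModel stands in for the dashboard's
// real ref-building helper (hypothetical name, shown here already fixed).
function refForModel(provider: string, model: string): string {
  return `${provider}/${model}`;
}

const cases: Array<[provider: string, model: string, expected: string]> = [
  ["openai-codex", "gpt-5.2-codex", "openai-codex/gpt-5.2-codex"],
  ["google", "gemini-3.1-pro-preview", "google/gemini-3.1-pro-preview"],
  ["ollama", "llama4", "ollama/llama4"],
];

for (const [provider, model, expected] of cases) {
  const got = refForModel(provider, model);
  if (got !== expected) {
    throw new Error(`wrong ref for ${provider}: got ${got}, expected ${expected}`);
  }
}
console.log("all provider-prefix cases passed");
```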

Logs, screenshots, and evidence

No response

Impact and severity

No response

Additional information

No response

Metadata

Labels

bug (Something isn't working), regression (Behavior that previously worked and now fails)
