
openrouter/auto fails with "Unknown model: openrouter/auto" — model resolution mismatch #5395

@hazem3500

Description


Summary

When using openrouter/auto as the primary agent model (e.g. in agents.defaults.model.primary), the embedded agent fails before replying with:

Error: Unknown model: openrouter/auto
Embedded agent failed before reply: Unknown model: openrouter/auto

The Control UI Chat then appears stuck (agent shows "A ..." indefinitely).

Cause

  • OpenRouter supports openrouter/auto as a valid model (Auto Router): https://openrouter.ai/docs/guides/routing/auto-model-selection
  • Config uses primary: "openrouter/auto", which is parsed as provider = "openrouter", modelId = "auto" (see parseModelRef in model-selection.js).
  • The embedded runner calls modelRegistry.find(provider, modelId), i.e. find("openrouter", "auto").
  • In pi-ai's models.generated.js, the OpenRouter Auto model is stored with id: "openrouter/auto" (full string), not id: "auto".
  • The registry's find(provider, modelId) does m.provider === provider && m.id === modelId, so find("openrouter", "auto") does not match m.id === "openrouter/auto".

So the lookup fails and the user gets "Unknown model" even though openrouter/auto is valid for the API.
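The mismatch can be reproduced with a minimal sketch (the registry shape, parseModelRef, and find below are simplified assumptions based on the description above, not the actual pi-ai source):

```javascript
// pi-ai stores the Auto Router under the full string id, not "auto":
const models = [
  { provider: "openrouter", id: "openrouter/auto" },
  { provider: "openrouter", id: "openai/gpt-4o-mini" },
];

// parseModelRef splits the config value on the first "/":
function parseModelRef(ref) {
  const idx = ref.indexOf("/");
  return { provider: ref.slice(0, idx), modelId: ref.slice(idx + 1) };
}

// The registry lookup compares the short modelId against the stored id:
function find(provider, modelId) {
  return models.find((m) => m.provider === provider && m.id === modelId);
}

const { provider, modelId } = parseModelRef("openrouter/auto");
// No entry has id === "auto", so the lookup comes back empty:
console.log(find(provider, modelId)); // undefined
console.log(find(provider, "openrouter/auto")); // matches the stored entry
```

The lookup only succeeds when the full "openrouter/auto" string is passed as the modelId, which is exactly what the parsed config never produces.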

Environment

  • OpenClaw: 2026.1.29 (global npm)
  • Config: agents.defaults.model.primary: "openrouter/auto", auth profile openrouter:default with API key set
  • Observed in: Control UI → Chat; gateway logs in ~/.openclaw/logs/gateway.err.log

Suggested fix

In dist/agents/pi-embedded-runner/model.js resolveModel(), when modelRegistry.find(provider, modelId) returns no model and provider === "openrouter" and modelId === "auto", retry with modelRegistry.find(provider, "openrouter/auto") before falling through to the inline/custom/fallback logic, since pi-ai stores that model under the full id.
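A hedged sketch of that fallback (the registry stub and resolveModel signature are assumptions for illustration; the actual pi-embedded-runner code may differ):

```javascript
// Stub registry with the Auto Router stored under its full id, as in pi-ai:
const modelRegistry = {
  models: [{ provider: "openrouter", id: "openrouter/auto" }],
  find(provider, modelId) {
    return this.models.find((m) => m.provider === provider && m.id === modelId);
  },
};

function resolveModel(provider, modelId) {
  let model = modelRegistry.find(provider, modelId);
  // pi-ai stores the Auto Router under the full id "openrouter/auto",
  // so retry with the full string before falling through to the
  // inline/custom/fallback handling:
  if (!model && provider === "openrouter" && modelId === "auto") {
    model = modelRegistry.find(provider, "openrouter/auto");
  }
  return model;
}

console.log(resolveModel("openrouter", "auto")); // now resolves the full-id entry
```

This keeps the fix local to the one known short-id/full-id pair rather than changing the registry's matching rules for every provider.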

(Alternatively, the pi-coding-agent model registry could normalize openrouter + auto to openrouter/auto when loading built-in models, or the model picker could stop hiding openrouter/auto and the resolver could accept both auto and openrouter/auto for that provider.)

Workaround

Use a concrete OpenRouter model id in config instead of openrouter/auto, e.g. openai/gpt-4o-mini or anthropic/claude-3-5-haiku, which are in the registry and resolve correctly.
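As a config sketch of the workaround (the key path is taken from the environment notes above; the JSON file shape and exact value form are assumptions, shown with the report's example model id):

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "openai/gpt-4o-mini"
      }
    }
  }
}
```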
