[Bug]: /model and /switch commands report success but do not actually change the model #14783

@j-asketch

Description

Summary

The /switch command reports success but does not actually change the model for the active session. After executing /switch V3, the session continues using the previous model (Kimi K2.5) instead of switching to DeepSeek V3. The status display continues showing the wrong model's context window (262k for Kimi instead of 128k for V3).

Steps to reproduce

  1. Verify that V3 is running and responding. Start with an active session using Kimi K2.5 (confirmed via sessions_history showing "model": "k2p5")
  2. Execute /switch V3 command
  3. Observe system message: "Model reset to default (V3 (inferencer/deepseek-v3-4bit-mlx))"
  4. Run /status command
  5. Verify via sessions_history: most recent message still shows "model": "k2p5", "provider": "kimi-coding"
  6. List all sessions via openclaw sessions list: all sessions show k2p5 model, none show V3

Expected behavior

After /switch V3, the active session should use DeepSeek V3:

  • Status should show 128k context (V3's context window)
  • sessions_history should show "model": "deepseek-v3-4bit-mlx"
  • New requests should be sent to V3 on Inferencer

Actual behavior

Session continues using Kimi K2.5:

  • Status still shows 262k context (Kimi's 256k window)
  • sessions_history confirms "model": "k2p5", "provider": "kimi-coding"
  • All sessions in openclaw sessions list show k2p5 model
  • Silent failure — no error message, but no actual switch occurs

Environment

  • OpenClaw version: 2026.2.9 (33c75cb)
  • OS: Windows_NT 10.0.26200 (x64)
  • Node: v24.13.0
  • Install method: npm
  • Channel: webchat
  • Primary Model: inferencer/deepseek-v3-4bit-mlx (configured)
  • Fallback Model: kimi-coding/k2p5

Logs or screenshots

Config shows V3 as primary:

"agents": {
  "defaults": {
    "model": {
      "primary": "inferencer/deepseek-v3-4bit-mlx",
      "fallbacks": ["kimi-coding/k2p5"]
    }
  }
}

Session history confirms Kimi still active after /switch:

{
  "role": "assistant",
  "api": "anthropic-messages",
  "provider": "kimi-coding",
  "model": "k2p5",
  "timestamp": 1770913335735
}

Session list shows all sessions using k2p5:

Kind   Key                   Model    Tokens (ctx %) 
direct agent:main:main      k2p5     262k/262k (100%)
direct agent:main:subag...  k2p5     61k/262k (23%)

Status after /switch V3:

🦞 OpenClaw 2026.2.9 (33c75cb)
🧠 Model: inferencer/deepseek-v3-4bit-mlx · 🔑 unknown
🧮 Tokens: 9.1k in / 515 out
📚 Context: 262k/262k (100%) · 🧹 Compactions: 0

Note: Shows 262k context (Kimi's window) despite claiming V3 is active.

Solution: Fix Agent Runtime
Effort: 2-3 days | Risk: Low

Modify the agent runtime to check session.modelOverride before falling back to the model in the default config. This addresses the root cause directly rather than patching the symptom.

Pros: Clean fix, backward compatible
Cons: Requires agent runtime changes
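
The proposed fix could be sketched as below. This is only an illustration of the resolution order, not OpenClaw's actual code: the names Session, ModelRef, AgentConfig, and resolveModel are hypothetical, as is the exact shape of the config object.

```typescript
// Hypothetical types approximating the shapes seen in the report.
interface ModelRef {
  provider: string;
  model: string;
}

interface Session {
  // Set by /switch; undefined when the session uses the configured default.
  modelOverride?: ModelRef;
}

interface AgentConfig {
  primary: ModelRef;
  fallbacks: ModelRef[];
}

// Proposed resolution order: the session's override, if present, wins over
// the configured primary model. The reported bug is consistent with the
// runtime skipping the override and always resolving from config.
function resolveModel(session: Session, config: AgentConfig): ModelRef {
  if (session.modelOverride) {
    return session.modelOverride;
  }
  return config.primary;
}

// After `/switch V3`, the override should determine the active model.
const config: AgentConfig = {
  primary: { provider: "inferencer", model: "deepseek-v3-4bit-mlx" },
  fallbacks: [{ provider: "kimi-coding", model: "k2p5" }],
};
const session: Session = {
  modelOverride: { provider: "inferencer", model: "deepseek-v3-4bit-mlx" },
};
console.log(resolveModel(session, config).model); // "deepseek-v3-4bit-mlx"
```

With this ordering, a session that never ran /switch still resolves to config.primary, which keeps the change backward compatible as noted above.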
