Description
Summary
The /switch command reports success but does not actually change the model for the active session. After executing /switch V3, the session continues using the previous model (Kimi K2.5) instead of switching to DeepSeek V3. The status display continues showing the wrong model's context window (262k for Kimi instead of 128k for V3).
Steps to reproduce
1. Verify V3 is running, available, and working.
2. Start with an active session using Kimi K2.5 (confirmed via `sessions_history` showing `"model": "k2p5"`).
3. Execute the `/switch V3` command.
4. Observe the system message: "Model reset to default (V3 (inferencer/deepseek-v3-4bit-mlx))".
5. Run the `/status` command.
6. Verify via `sessions_history`: the most recent message still shows `"model": "k2p5"`, `"provider": "kimi-coding"`.
7. List all sessions via `openclaw sessions list`: every session shows the `k2p5` model; none show V3.
Expected behavior
After `/switch V3`, the active session should use DeepSeek V3:
- Status should show 128k context (V3's context window)
- `sessions_history` should show `"model": "deepseek-v3-4bit-mlx"`
- New requests should be sent to V3 on Inferencer
Actual behavior
Session continues using Kimi K2.5:
- Status still shows 262k context (Kimi's 256K window, displayed as 262k, i.e. 262,144 tokens)
- `sessions_history` confirms `"model": "k2p5"`, `"provider": "kimi-coding"`
- All sessions in `openclaw sessions list` show the `k2p5` model
- Silent failure: no error message, but no actual switch occurs
Environment
- OpenClaw version: 2026.2.9 (33c75cb)
- OS: Windows_NT 10.0.26200 (x64)
- Node: v24.13.0
- Install method: npm
- Channel: webchat
- Primary Model: inferencer/deepseek-v3-4bit-mlx (configured)
- Fallback Model: kimi-coding/k2p5
Logs or screenshots
Config shows V3 as primary:

```json
"agents": {
  "defaults": {
    "model": {
      "primary": "inferencer/deepseek-v3-4bit-mlx",
      "fallbacks": ["kimi-coding/k2p5"]
    }
  }
}
```

Session history confirms Kimi still active after `/switch`:
```json
{
  "role": "assistant",
  "api": "anthropic-messages",
  "provider": "kimi-coding",
  "model": "k2p5",
  "timestamp": 1770913335735
}
```

Session list shows all sessions using k2p5:
```
Kind    Key                   Model  Tokens (ctx %)
direct  agent:main:main       k2p5   262k/262k (100%)
direct  agent:main:subag...   k2p5   61k/262k (23%)
```
Status after `/switch V3`:

```
🦞 OpenClaw 2026.2.9 (33c75cb)
🧠 Model: inferencer/deepseek-v3-4bit-mlx · 🔑 unknown
🧮 Tokens: 9.1k in / 515 out
📚 Context: 262k/262k (100%) · 🧹 Compactions: 0
```

Note: shows 262k context (Kimi's window) despite claiming V3 is active.
Solution: Fix Agent Runtime
Effort: 2-3 days | Risk: Low
Modify the agent runtime to check `session.modelOverride` before falling back to the default config model. This fixes the root cause directly.
Pros: Clean fix, backward compatible
Cons: Requires agent runtime changes
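The proposed resolution order can be sketched as below. This is a minimal illustration, not OpenClaw's actual code: `resolveModel`, the `Session` and `ModelConfig` shapes, and the `modelOverride` field name are assumptions inferred from the description above.

```typescript
// Hypothetical sketch of the fix: resolve the model for a request by
// preferring a per-session override (set by /switch) over the config default.

interface ModelConfig {
  primary: string;       // e.g. "inferencer/deepseek-v3-4bit-mlx"
  fallbacks: string[];   // e.g. ["kimi-coding/k2p5"]
}

interface Session {
  // Assumed field written by the /switch command handler.
  modelOverride?: string;
}

function resolveModel(session: Session, defaults: ModelConfig): string {
  // Check the session override first; only fall back to the configured
  // primary when no override is present. The reported bug is the agent
  // skipping this check and always reading defaults.primary.
  return session.modelOverride ?? defaults.primary;
}
```

With this ordering, a session where `/switch` recorded an override keeps using that model, while sessions without an override behave exactly as before, which is why the change is backward compatible.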