Description
Bug Report
Version: OpenClaw 2026.2.4 (npm global install)
OS: Linux 6.14.0-37-generic (x64), Node v22.22.0
Summary
The gateway returns the error "request ended without sending any chunks" when attempting to deliver LLM responses to agents. This happens:
- Across ALL model providers (MiniMax, Google Gemini, Anthropic, etc.)
- With both streaming ON and OFF
- On all channels (Slack, WhatsApp, TUI gateway)
- On both new (`/new`) and existing sessions
- 100% of messages fail with this error
Steps to Reproduce
- Run `openclaw` in TUI mode
- Send any message: `yo`
- Error: `request ended without sending any chunks`
- Try `/new` - same error
- Switch models (e.g., `/model anthropic/claude-opus-4.5`) - same error
- Disable streaming - same error
Expected Behavior
The message should be delivered successfully. The gateway's chunk delivery system should either (see the sketch after this list):
- Send chunks incrementally until complete, or
- Return the complete, non-chunked message normally
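For illustration, a minimal TypeScript sketch of that contract, assuming a hypothetical `AgentResult` union type (none of these names are from the OpenClaw codebase):

```typescript
// Hypothetical types; not OpenClaw's actual API.
type AgentResult =
  | { kind: "stream"; chunks: AsyncIterable<string> }
  | { kind: "complete"; text: string };

async function deliver(result: AgentResult): Promise<string> {
  if (result.kind === "complete") {
    // Non-streaming path: forward the full message as-is.
    return result.text;
  }
  // Streaming path: forward chunks incrementally until the stream closes.
  let out = "";
  for await (const chunk of result.chunks) {
    out += chunk;
  }
  return out;
}
```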
Actual Behavior
The gateway reports "request ended without sending any chunks" and appears to treat "no chunks received" as a delivery error.
Error Pattern from Session Transcript
{"type":"message","role":"assistant","content":[],"api":"openclaw","provider":"minimax-portal","model":"MiniMax-M2.1-lightning","usage":{"input":0,"output":0,"cacheRead":0,"cacheWrite":0,"totalTokens":0},"stopReason":"error","timestamp":1770355978327,"errorMessage":"request ended without sending any chunks"}Note: usage shows 0 input/output tokens - this suggests no LLM response was actually delivered to the gateway for chunking.
Root Cause Analysis
This appears to be an issue in OpenClaw's gateway-to-agent streaming protocol (sketched after the list below), where:
- Gateway sends a "prepare for agent" request
- Agent never starts or produces output
- Gateway times out waiting for chunks that never arrive
- Gateway reports "request ended without sending any chunks"
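A minimal sketch of how that failure mode can arise, assuming the gateway consumes an `AsyncIterable` of chunks (the names and structure are illustrative, not taken from the OpenClaw sources):

```typescript
// Illustrative only; not OpenClaw's actual implementation.
interface Chunk {
  text: string;
}

async function forwardToChannel(
  stream: AsyncIterable<Chunk>,
): Promise<string> {
  let received = 0;
  let assembled = "";

  for await (const chunk of stream) {
    received += 1;
    assembled += chunk.text;
    // ...forward the chunk to the channel (TUI, Slack, WhatsApp)...
  }

  // If the upstream request completes before a single chunk arrives
  // (agent never started, serialization failed, or the provider returned
  // an empty body), the loop body never runs and this error is all the
  // caller sees, consistent with usage showing 0 input/output tokens.
  if (received === 0) {
    throw new Error("request ended without sending any chunks");
  }
  return assembled;
}
```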
Possible locations:
- `src/gateway/server-agent.ts` - Agent message handler
- `src/gateway/chat.ts` - TUI agent gateway
- Chunk delivery buffer/timeout logic in agent protocol
Impact
- Severity: P0 - OpenClaw is nearly unusable
- Reproduction: 100% consistent - affects every message
- Workaround: None - `/new` fails with the same error, and model/setting changes have no effect
Additional Context
- This is NOT a provider issue - happens with MiniMax, Google, Anthropic
- This is NOT a streaming setting issue - happens with streaming disabled
- This is NOT a lock file issue (diagnosed separately as [Bug]: Orphaned .jsonl.lock files cause "request ended without getting any chunks" and stuck sessions #10170)
- The error message is misleading - suggests "no chunks" when the actual problem is likely:
  - Agent process crash/startup failure
  - Gateway serialization failure
  - Agent never responding to delivery request
Suggested Investigation Areas
- Agent startup: Check whether the subagent is being spawned correctly when the gateway sends the "agent" method
- Delivery timeout: Is the gateway waiting indefinitely for the first chunk? It should time out and return a descriptive error instead (see the sketch after this list)
- Empty response handling: How does the gateway handle an LLM returning an empty or missing content response?
- Channel routing: Is there special handling for TUI vs. remote channels that causes the issue?
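As a concrete example of the delivery-timeout idea, a hedged sketch of a first-chunk timeout guard (the function name, timeout handling, and error text are hypothetical, not part of OpenClaw's gateway code):

```typescript
// Hypothetical helper; not part of OpenClaw's gateway code.
async function firstChunkWithTimeout<T>(
  stream: AsyncIterable<T>,
  timeoutMs: number,
): Promise<T | undefined> {
  const iterator = stream[Symbol.asyncIterator]();

  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`agent produced no output within ${timeoutMs}ms`)),
      timeoutMs,
    );
  });

  try {
    // Race the first chunk against the timeout so "agent never responded"
    // surfaces as a distinct, actionable error instead of the generic
    // "request ended without sending any chunks".
    const result = await Promise.race([iterator.next(), timeout]);
    return result.done ? undefined : result.value;
  } finally {
    if (timer !== undefined) clearTimeout(timer);
  }
}
```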
Request
High priority - this completely blocks OpenClaw usage. Please investigate the gateway-to-agent streaming/chunk delivery system to understand why "no chunks received" is being reported when usage shows 0 input/output tokens.