
[Bug]: Gateway consistently returns "request ended without sending any chunks" for all providers/channels #10210

@zenchantlive

Description

Bug Report

Version: OpenClaw 2026.2.4 (npm global install)
OS: Linux 6.14.0-37-generic (x64), Node v22.22.0

Summary

Gateway returns the error "request ended without sending any chunks" when attempting to deliver responses from the LLM to agents. This happens:

  • Across ALL model providers (MiniMax, Google Gemini, Anthropic, etc.)
  • With both streaming ON and OFF
  • On all channels (Slack, WhatsApp, TUI gateway)
  • On both new /new sessions and existing sessions
  • 100% of messages fail with this error

Steps to Reproduce

  1. Run openclaw in TUI mode
  2. Send any message: "yo"
  3. Error: request ended without sending any chunks
  4. Try /new - same error
  5. Switch models (e.g., /model anthropic/claude-opus-4.5) - same error
  6. Disable streaming - same error

Expected Behavior

Message should be delivered successfully. Gateway's chunk delivery system should either:

  • Send chunks incrementally until complete, or
  • Return non-chunked message normally

Actual Behavior

Gateway reports "request ended without sending any chunks" - it appears to treat a response that produced no chunks as a delivery error.

Error Pattern from Session Transcript

{"type":"message","role":"assistant","content":[],"api":"openclaw","provider":"minimax-portal","model":"MiniMax-M2.1-lightning","usage":{"input":0,"output":0,"cacheRead":0,"cacheWrite":0,"totalTokens":0},"stopReason":"error","timestamp":1770355978327,"errorMessage":"request ended without sending any chunks"}

Note: usage shows 0 input/output tokens - this suggests no LLM response was actually delivered to the gateway for chunking.
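The zero-token signature above can be checked mechanically. A minimal TypeScript sketch, where the interface mirrors the JSON fields in this transcript line (these are not OpenClaw's internal types):

```typescript
// Hypothetical helper: flag the failure pattern from this report -
// stopReason "error", empty content array, and zero token usage.
interface AssistantMessage {
  type: string;
  role: string;
  content: unknown[];
  usage: { input: number; output: number; totalTokens: number };
  stopReason: string;
  errorMessage?: string;
}

function isZeroChunkFailure(msg: AssistantMessage): boolean {
  return (
    msg.stopReason === "error" &&
    msg.content.length === 0 &&
    msg.usage.totalTokens === 0
  );
}

// The transcript line quoted above parses to a matching failure.
const transcriptLine: AssistantMessage = JSON.parse(
  '{"type":"message","role":"assistant","content":[],"api":"openclaw","provider":"minimax-portal","model":"MiniMax-M2.1-lightning","usage":{"input":0,"output":0,"cacheRead":0,"cacheWrite":0,"totalTokens":0},"stopReason":"error","timestamp":1770355978327,"errorMessage":"request ended without sending any chunks"}',
);
console.log(isZeroChunkFailure(transcriptLine)); // true
```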

Root Cause Analysis

This appears to be an issue in OpenClaw's gateway-to-agent streaming protocol where:

  1. Gateway sends a "prepare for agent" request
  2. Agent never starts or produces output
  3. Gateway times out waiting for chunks that never arrive
  4. Gateway reports "request ended without sending any chunks"
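The timeout path hypothesized in steps 3-4 can be sketched as a guard around an async chunk stream. This is illustrative TypeScript, not OpenClaw source; `streamWithGuard` and `firstChunkTimeoutMs` are invented names:

```typescript
// Hypothetical sketch: race only the FIRST chunk against a timer, and
// surface the observed error if the stream times out or closes empty.
async function* streamWithGuard<T>(
  source: AsyncIterable<T>,
  firstChunkTimeoutMs: number,
): AsyncGenerator<T> {
  const it = source[Symbol.asyncIterator]();
  let timerId: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timerId = setTimeout(
      () => reject(new Error("request ended without sending any chunks")),
      firstChunkTimeoutMs,
    );
  });
  // Later chunks are untimed; the timer is cleared once the race settles.
  const first = await Promise.race([it.next(), timeout]).finally(() =>
    clearTimeout(timerId),
  );
  if (first.done) {
    // Stream closed before producing anything: same error as a timeout.
    throw new Error("request ended without sending any chunks");
  }
  yield first.value;
  for (let next = await it.next(); !next.done; next = await it.next()) {
    yield next.value;
  }
}
```

If the gateway's real code follows this shape, a subagent that never starts (step 2) would hit either the timeout or the empty-close branch, which matches the 100% failure rate reported here.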

Possible locations:

  • src/gateway/server-agent.ts - Agent message handler
  • src/gateway/chat.ts - TUI agent gateway
  • Chunk delivery buffer/timeout logic in agent protocol

Impact

  • Severity: P0 - OpenClaw is nearly unusable
  • Reproduction: 100% consistent - affects every message
  • Workaround: None - /new fails with same error, model/setting changes have no effect

Additional Context

Suggested Investigation Areas

  1. Agent startup: Check if subagent is being spawned correctly when gateway sends "agent" method
  2. Delivery timeout: Is the gateway waiting indefinitely for the first chunk? It should time out and return a clear error instead
  3. Empty response handling: How does gateway handle LLM returning empty/no content response?
  4. Channel routing: Is there special handling for TUI vs remote channels that causes the issue?
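Investigation areas 2 and 3 hinge on distinguishing two failure modes that currently collapse into one error. A hypothetical classifier, assuming the gateway can see both the chunk list and the provider-reported usage (field names are illustrative):

```typescript
// Hypothetical sketch: separate "request never dispatched" from
// "response lost in delivery" instead of one generic chunk error.
interface ProviderResult {
  chunks: string[];
  usage: { input: number; output: number };
}

function classifyFailure(res: ProviderResult): string {
  if (res.chunks.length > 0) return "ok";
  // Zero usage on both sides suggests the request never reached the
  // model at all - the pattern in this report's transcript.
  if (res.usage.input === 0 && res.usage.output === 0) {
    return "request-never-dispatched";
  }
  // Tokens were billed but nothing streamed back: a delivery-side loss.
  return "chunks-lost-in-delivery";
}
```

Under this split, the all-zero usage in the transcript above would classify as "request-never-dispatched", pointing at agent startup rather than the chunking layer.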

Request

High priority - this completely blocks OpenClaw usage. Please investigate the gateway-to-agent streaming/chunk delivery system to understand why "request ended without sending any chunks" is reported when usage shows 0 input/output tokens, i.e., when no LLM response appears to have been received at all.

Metadata

    Assignees: no one assigned
    Labels: bug (Something isn't working)
    Type: none
    Projects: none
    Milestone: none
    Relationships: none yet
    Development: no branches or pull requests