
Raw LLM API errors leak to end users via WhatsApp/messaging channels #20250

@aldoeliacim

Description


Bug

When an LLM call fails with stopReason: "error" due to a transient server error (e.g., an Anthropic 500 Internal Server Error), OpenClaw sends the raw error details as a WhatsApp message to the end user.

Example leaked message:

LLM error api_error: Internal server error (request_id: req_011CYFmpt8r8CFFmnpgGL5cQ)

When it happened: 2026-02-18 at 9:30 AM CST — Anthropic returned 4 consecutive 500 errors, and the raw error text was delivered to an end user on WhatsApp.
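For context, Anthropic's documented error format returns 500-class failures with a JSON body of roughly the shape below (the request ID is surfaced separately, in the request-id response header). The leaked message appears to be a direct concatenation of the error type, message, and request ID:

```ts
// Rough shape of an Anthropic 500-class error body per its documented
// error format; field values here match the leaked message above.
const exampleErrorBody = {
  type: "error",
  error: {
    type: "api_error",
    message: "Internal server error",
  },
};
// request_id (e.g. req_011CYFmpt8r8CFFmnpgGL5cQ) arrives in the
// `request-id` response header, not in the body.
```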

Root Cause

formatRawAssistantErrorForUi() in src/agents/pi-embedded-helpers/errors.ts formats API error payloads but exposes raw details (error type, message, request_id) for transient server errors. These formatted strings get pushed as reply payloads via buildEmbeddedRunPayloads() and delivered to the messaging channel.

Rate limit and overloaded errors already have user-friendly messages, but generic api_error/server_error/internal_error types fall through to the raw formatter.
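A minimal sketch of the fall-through. The function name comes from this issue; the branch structure, helper types, and message copy are assumptions, not the actual implementation:

```ts
// Hypothetical sketch -- NOT the actual code in
// src/agents/pi-embedded-helpers/errors.ts.
type ApiErrorInfo = { type: string; message: string; requestId?: string };

function formatRawAssistantErrorForUi(err: ApiErrorInfo): string {
  // Rate-limit and overloaded errors already map to friendly copy.
  if (err.type === "rate_limit_error") {
    return "The AI service is busy right now. Please try again shortly.";
  }
  if (err.type === "overloaded_error") {
    return "The AI service is overloaded. Please try again shortly.";
  }
  // BUG: generic transient types (api_error, server_error, internal_error)
  // fall through here, leaking type, message, and request_id to the user.
  return (
    `LLM error ${err.type}: ${err.message}` +
    (err.requestId ? ` (request_id: ${err.requestId})` : "")
  );
}
```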

Expected Behavior

Transient LLM API errors (500, api_error, server_error, internal_error) should NEVER expose raw error details to end users. They should show a generic user-friendly message like: "The AI service encountered a temporary error. Please try again in a moment."
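One possible fix, continuing the sketch above under the same assumed error shape (the TRANSIENT_ERROR_TYPES set and wrapper function are suggestions, not existing code):

```ts
// Suggested fix (sketch): treat transient server-side error types as
// non-user-facing and return a generic message instead of raw details.
const TRANSIENT_ERROR_TYPES = new Set([
  "api_error",
  "server_error",
  "internal_error",
]);

function formatAssistantErrorForUi(err: ApiErrorInfo): string {
  if (TRANSIENT_ERROR_TYPES.has(err.type)) {
    // Keep the raw details (type, message, request_id) in server-side
    // logs for debugging, but never forward them to the channel.
    console.error("Transient LLM API error", err);
    return "The AI service encountered a temporary error. Please try again in a moment.";
  }
  // Existing handling (rate_limit_error, overloaded_error, etc.).
  return formatRawAssistantErrorForUi(err);
}
```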

Impact

  • End users see confusing technical error messages with internal request IDs
  • Potential information leakage (API provider details, request IDs)
  • Poor user experience
