Forward OpenAI Responses text.verbosity from model params #47105

@merc1305

Description

Problem

OpenClaw already forwards several OpenAI Responses controls like max_output_tokens, reasoning.effort, and service_tier, but it does not currently forward OpenAI's text.verbosity setting.

That makes it harder to use an important OpenAI-native control for answer length/style without resorting to prompt hacks.

Why this matters

text.verbosity is distinct from reasoning depth:

  • models can think deeply internally
  • but still return a short external answer

This is useful for people who want "think more, say less" behavior from OpenAI models.

Proposed support

Allow model params such as:

agents: {
  defaults: {
    models: {
      "openai/gpt-5.4": {
        params: {
          textVerbosity: "low"
        }
      }
    }
  }
}

and forward that to OpenAI Responses payloads as:

{
  "text": { "verbosity": "low" }
}

Preferably, support both alias styles:

  • textVerbosity
  • text_verbosity
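A minimal sketch of how this could work, assuming a handful of hypothetical helper names (`resolveTextVerbosity`, `shapePayload`) that are not part of any existing OpenClaw API — the only things taken from the proposal itself are the two aliases, the valid values, and the `text.verbosity` payload shape:

```typescript
type Verbosity = "low" | "medium" | "high";

const VALID: ReadonlySet<string> = new Set(["low", "medium", "high"]);

// Accept either alias from model params; the camelCase form wins if
// both are present (an assumed precedence rule, to be settled in review).
function resolveTextVerbosity(
  params: Record<string, unknown>
): Verbosity | undefined {
  const raw = params["textVerbosity"] ?? params["text_verbosity"];
  if (raw === undefined) return undefined;
  if (typeof raw !== "string" || !VALID.has(raw)) {
    throw new Error(`invalid text verbosity: ${String(raw)}`);
  }
  return raw as Verbosity;
}

// Inject the resolved value into an OpenAI Responses request body.
function shapePayload(
  body: Record<string, unknown>,
  params: Record<string, unknown>
): Record<string, unknown> {
  const verbosity = resolveTextVerbosity(params);
  if (verbosity === undefined) return body; // nothing to forward
  // Merge rather than overwrite so other `text` fields survive.
  const text = { ...((body["text"] as object) ?? {}), verbosity };
  return { ...body, text };
}
```

With this shape, `shapePayload({ model: "…" }, { textVerbosity: "low" })` yields a body containing `"text": { "verbosity": "low" }`, and invalid values fail fast instead of reaching the API.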

Scope

  • OpenAI Responses payload shaping
  • config/model params passthrough
  • tests for payload injection / invalid values / precedence

I already have a patch prepared for this and will open a PR shortly.
