
Handle dict reasoning_content from Gemini 2.5 models#5155

Merged
dsfaccini merged 4 commits into pydantic:main from
JulieLiu99:fix/gemini-reasoning-content-dict-type
Apr 22, 2026
Conversation

@JulieLiu99 (Contributor) commented Apr 21, 2026

Summary

Gemini 2.5 models (Flash and Pro) return reasoning_content as a dict in their OpenAI-compatible response, not a string. Other thinking/reasoning models via OpenAI-compatible proxies may exhibit the same behavior.

The existing truthy check (if reasoning:) treats a non-empty dict as valid, storing it as ThinkingPart.content. On the second API call, _into_message_param() calls '\n\n'.join(contents) which raises:

TypeError: sequence item 0: expected str instance, dict found

This crashes any multi-step / ReAct agent using Gemini 2.5 after the first tool call.

Fix

Add an isinstance(reasoning, str) guard in both the non-streamed and streamed _process_thinking paths so non-string values are silently ignored:

- if reasoning:  # pragma: no branch
+ if isinstance(reasoning, str) and reasoning:  # guard against non-string values (e.g. Gemini 2.5 returns a dict)

Reproduction

import types

# Simulate a Gemini 2.5 chat completion message: reasoning_content is a
# dict in the OpenAI-compatible response, not a string.
msg = types.SimpleNamespace(
    reasoning_content={"reasoningContent": {"reasoningText": {"text": "", "signature": "abc"}}},
    reasoning=None,
)

reasoning = getattr(msg, "reasoning_content", None)

# Before the fix: the dict is stored as ThinkingPart.content, and the next
# request crashes when _into_message_param() joins the contents.
try:
    "\n\n".join([reasoning])
except TypeError as e:
    print(f"BUG: {e}")  # sequence item 0: expected str instance, dict found

# After the fix: the isinstance guard rejects non-string values.
if isinstance(reasoning, str) and reasoning:
    pass
else:
    print("Fix: non-string reasoning_content ignored")

Checklist

  • Any AI generated code has been reviewed line-by-line by the human PR author, who stands by it.
  • No breaking changes in accordance with the version policy.
  • PR title is fit for the release changelog.

Gemini 2.5 Flash returns reasoning_content as a dict in its
OpenAI-compatible response, not a string. The truthy check accepted
non-empty dicts, storing them as ThinkingPart.content. On the next
API call _into_message_param() called str.join() on the contents list,
raising TypeError: sequence item 0: expected str instance, dict found.

Fix: add isinstance(reasoning, str) guard in both the non-streamed
and streamed _process_thinking paths so non-string values are ignored.
@github-actions github-actions Bot added size: S Small PR (≤100 weighted lines) bug Report that something isn't working, or PR implementing a fix labels Apr 21, 2026
@JulieLiu99 JulieLiu99 changed the title fix: guard against non-string reasoning_content in _process_thinking Handle dict reasoning_content from Gemini 2.5 models Apr 21, 2026
@JulieLiu99 JulieLiu99 marked this pull request as ready for review April 21, 2026 17:56

@dsfaccini (Collaborator) commented:
this LGTM, I'd normally ask for a regression test + cassette but since this seems to be specific to envoyproxy I'll let it slide!


Addresses Devin review feedback on pydantic#5155. Matches the precedent at
openai.py:2936 / :3207 for unexpected provider content: emit a
UserWarning pointing to the issue tracker rather than silently
skipping, so users can discover gateway/provider bugs.
@dsfaccini dsfaccini merged commit e7aeb2d into pydantic:main Apr 22, 2026
45 checks passed


Development

Successfully merging this pull request may close these issues.

TypeError when Gemini 2.5 returns reasoning_content as a dict (multi-step agents crash on second LLM call)
