Handle dict reasoning_content from Gemini 2.5 models #5155
Merged
dsfaccini merged 4 commits into pydantic:main (Apr 22, 2026)
Conversation
Gemini 2.5 Flash returns reasoning_content as a dict in its OpenAI-compatible response, not a string. The truthy check accepted non-empty dicts, storing them as ThinkingPart.content. On the next API call _into_message_param() called str.join() on the contents list, raising TypeError: sequence item 0: expected str instance, dict found. Fix: add isinstance(reasoning, str) guard in both the non-streamed and streamed _process_thinking paths so non-string values are ignored.
Collaborator:

> this LGTM, I'd normally ask for a regression test + cassette, but since this seems to be specific to envoyproxy I'll let it slide!
Addresses Devin review feedback on pydantic#5155. Matches the precedent at openai.py:2936 / :3207 for unexpected provider content: emit a UserWarning pointing to the issue tracker rather than silently skipping, so users can discover gateway/provider bugs.
Summary

Gemini 2.5 models (Flash and Pro) return `reasoning_content` as a dict in their OpenAI-compatible response, not a string. Other thinking/reasoning models behind OpenAI-compatible proxies may exhibit the same behavior. The existing truthy check (`if reasoning:`) treats a non-empty dict as valid, storing it as `ThinkingPart.content`. On the second API call, `_into_message_param()` calls `'\n\n'.join(contents)`, which raises:

    TypeError: sequence item 0: expected str instance, dict found

This crashes any multi-step / ReAct agent using Gemini 2.5 after the first tool call.
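The failure mode is easy to reproduce in isolation: joining a contents list that contains a dict raises exactly this `TypeError` (a standalone sketch of the mechanism, not the actual `_into_message_param()` code):

```python
# Standalone reproduction of the crash: once a dict has been stored as
# ThinkingPart.content, the later '\n\n'.join() over the contents list
# fails, because str.join requires every item to be a str.
contents = [{'thinking': 'some model reasoning'}, 'a normal string part']
try:
    '\n\n'.join(contents)
except TypeError as exc:
    print(exc)  # sequence item 0: expected str instance, dict found
```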
Fix

Add an `isinstance(reasoning, str)` guard in both the non-streamed and streamed `_process_thinking` paths so non-string values are silently ignored.

Reproduction

- `TypeError` when Gemini 2.5 returns `reasoning_content` as a dict (multi-step agents crash on second LLM call) #5157

Checklist