feat: support OpenAI Responses `phase` on assistant messages #5229
Merged
GPT-5.3-codex / GPT-5.4+ label assistant messages with `phase` (`commentary` for preambles, `final_answer` for the answer); OpenAI recommends preserving and round-tripping it on every follow-up request to avoid preambles being mistaken for final answers in tool-heavy flows. Capture `phase` into `TextPart.provider_details` (non-streamed and streamed) and send it back when `OpenAIModelProfile.openai_supports_phase` is on; default off so unverified OpenAI-compatible APIs (vLLM, Bifrost, ...) never see the field.
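From the consumer side, the effect of this change is that assistant `TextPart`s carry the phase label in `provider_details`, so callers can tell preambles apart from the actual answer. A minimal sketch of that data shape, using a hypothetical stand-in dataclass rather than pydantic-ai's real `TextPart` (only the two fields needed here are modeled):

```python
from dataclasses import dataclass, field


# Hypothetical stand-in for pydantic-ai's TextPart, reduced to the fields
# relevant to this PR; `provider_details` is a free-form provider dict.
@dataclass
class TextPart:
    content: str
    provider_details: dict = field(default_factory=dict)


def final_answer_text(parts: list[TextPart]) -> str:
    """Join only the non-commentary parts, skipping tool-call preambles."""
    return "".join(
        p.content
        for p in parts
        if p.provider_details.get("phase") != "commentary"
    )


parts = [
    TextPart("Let me check the weather tool...", {"phase": "commentary"}),
    TextPart("It is 21 degrees C in Paris.", {"phase": "final_answer"}),
]
print(final_answer_text(parts))
```

The helper `final_answer_text` is invented for illustration; the point is only that `provider_details["phase"]` is where the label surfaces.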
Comment on lines +2840 to +2842:

```python
# Track `phase` (commentary | final_answer) on assistant message items, captured
# from the `output_item.added`/`output_item.done` events and merged into the
# corresponding `TextPart.provider_details` on `output_text.done`.
```
Contributor
Nit: this comment says phase is captured "from the output_item.added/output_item.done events" but the code only captures it from output_item.added (line 2908). The ResponseOutputItemDoneEvent handler doesn't have a branch for ResponseOutputMessage to update _phase_by_item. Consider updating to just "from the output_item.added event".
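The bookkeeping under discussion can be sketched as follows. This is a simplified, hypothetical handler using plain dicts for events, not the PR's actual code; the event and field names mirror the Responses API (`response.output_item.added`, `response.output_text.done`), and, as the nit points out, `phase` is captured only on the `added` event:

```python
def collect_text_parts(events: list[dict]) -> list[dict]:
    # item id -> phase, captured when the output item is first added
    phase_by_item: dict[str, str] = {}
    parts: list[dict] = []
    for event in events:
        if event["type"] == "response.output_item.added":
            item = event["item"]
            if item.get("type") == "message" and "phase" in item:
                phase_by_item[item["id"]] = item["phase"]
        elif event["type"] == "response.output_text.done":
            # Merge the remembered phase into the finished text part.
            details: dict[str, str] = {}
            phase = phase_by_item.get(event["item_id"])
            if phase is not None:
                details["phase"] = phase
            parts.append({"text": event["text"], "provider_details": details})
    return parts


events = [
    {"type": "response.output_item.added",
     "item": {"id": "msg_1", "type": "message", "phase": "commentary"}},
    {"type": "response.output_text.done", "item_id": "msg_1",
     "text": "Checking the tool..."},
    {"type": "response.output_item.added",
     "item": {"id": "msg_2", "type": "message", "phase": "final_answer"}},
    {"type": "response.output_text.done", "item_id": "msg_2", "text": "Done."},
]
parts = collect_text_parts(events)
```

Keying by item id is what lets the phase seen at `output_item.added` survive until the corresponding `output_text.done` arrives, even with interleaved items.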
Contributor
Docs Preview
Alex-Resch pushed a commit to Alex-Resch/pydantic-ai that referenced this pull request on Apr 29, 2026.
Summary
GPT-5.3-codex / GPT-5.4+ label assistant messages with a new `phase` field on the OpenAI Responses API (`commentary` for preambles before tool calls, `final_answer` for the completed answer). OpenAI recommends preserving and round-tripping it on every follow-up request: dropping it can cause preambles to be interpreted as final answers and degrade behavior in long-running or tool-heavy flows.

This PR captures `phase` into `TextPart.provider_details['phase']` (non-streamed and streamed) and sends it back on the assistant `ResponseOutputMessageParam`/`EasyInputMessageParam` when the model profile opts in via a new `OpenAIModelProfile.openai_supports_phase` flag. The flag is enabled for `gpt-5.3-codex`, `gpt-5.4*`, and `gpt-5.5*`, and defaults to `False` everywhere else, so unverified OpenAI-compatible APIs (vLLM, Bifrost, ...) never see the field. Empirically, the official OpenAI API silently ignores `phase` on older models (gpt-5.2, gpt-4o), so the flag is an extra guard for non-OpenAI providers rather than a hard requirement.

Test plan
- `tests/models/test_openai_responses.py::test_openai_responses_phase_live`: real cassette against `gpt-5.5` with a tool call; captures the `commentary` preamble and `final_answer`, and (visible in the cassette) verifies pydantic-ai sends `phase: commentary` back to the API on the follow-up turn.
- `test_openai_responses_phase_non_streamed`: non-streamed read into `provider_details`.
- `test_openai_responses_phase_streamed`: streamed read via `output_item.added` → `output_text.done`.
- `test_openai_responses_phase_round_trip`: a supporting model preserves both `commentary` and `final_answer` on subsequent requests.
- `test_openai_responses_phase_skipped_when_profile_unsupported`: `gpt-5.2` (flag off) does NOT send the field even when it's in history.
- `test_openai_responses_phase_profile_flag`: the flag tracks the documented model list.

Checklist
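The opt-in gating the summary and tests describe can be sketched as follows. This is a hypothetical simplification, not the PR's actual request-mapping code: the profile class name and flag (`OpenAIModelProfile.openai_supports_phase`) come from the PR, but the helper and the flat dict it returns are invented for illustration:

```python
from dataclasses import dataclass


@dataclass
class OpenAIModelProfile:
    # Defaults to False so OpenAI-compatible APIs never see the field.
    openai_supports_phase: bool = False


def assistant_message_param(
    text: str, provider_details: dict, profile: OpenAIModelProfile
) -> dict:
    """Build an outgoing assistant message, echoing `phase` only when opted in."""
    param = {"role": "assistant", "content": text}
    phase = provider_details.get("phase")
    if phase is not None and profile.openai_supports_phase:
        # Round-trip the label so preambles aren't mistaken for final answers.
        param["phase"] = phase
    return param


details = {"phase": "commentary"}
supported = assistant_message_param(
    "Checking...", details, OpenAIModelProfile(openai_supports_phase=True)
)
unsupported = assistant_message_param("Checking...", details, OpenAIModelProfile())
```

With the flag off, the stored `phase` stays in history but is simply never serialized onto the request, which is the behavior `test_openai_responses_phase_skipped_when_profile_unsupported` pins down.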