Migrate conversation continuity to plugin-side encrypted reasoning items (Responses API) #9203
Conversation
…ems (Responses API)

Summary

We moved continuity off OpenAI servers and now maintain conversation state locally by persisting and replaying encrypted reasoning items. Requests are stateless (store=false) while retaining the performance/caching benefits of the Responses API.

Why

This aligns with how Roo manages context and simplifies our Responses API implementation while keeping all the benefits of continuity, caching, and latency improvements.

What changed

- All OpenAI models now use the Responses API; system instructions are passed via the top-level instructions field; requests include store=false and include=["reasoning.encrypted_content"].
- We persist encrypted reasoning items (type: "reasoning", encrypted_content, optional id) into API history and replay them on subsequent turns.
- Reasoning summaries default to summary: "auto" when supported; text.verbosity is set only when supported.
- Atomic persistence via safeWriteJson.

Removed

- previous_response_id flows, suppressPreviousResponseId/skipPrevResponseIdOnce, persistGpt5Metadata(), and GPT‑5 response ID metadata in UI messages.

Kept

- taskId and mode metadata for cross-provider features.

Result

- ZDR-friendly, stateless continuity with equal or better performance and a simpler codepath.
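The "atomic persistence via safeWriteJson" point can be sketched as a write-then-rename. This is a simplified stand-in, not the actual safeWriteJson implementation:

```typescript
import { writeFileSync, renameSync, readFileSync } from "node:fs"
import { join } from "node:path"
import { tmpdir } from "node:os"

// Simplified stand-in for safeWriteJson: write to a temp file first, then
// rename over the target, so a crash mid-write never leaves a torn JSON file.
function safeWriteJsonSketch(filePath: string, data: unknown): void {
  const tmp = `${filePath}.tmp`
  writeFileSync(tmp, JSON.stringify(data, null, 2), "utf8")
  renameSync(tmp, filePath) // rename is atomic on the same filesystem
}

// Example: persist a history containing an encrypted reasoning item.
const target = join(tmpdir(), "api_conversation_history.json")
safeWriteJsonSketch(target, [{ type: "reasoning", encrypted_content: "gAAAA..." }])
```

The rename step is what makes the write atomic from the reader's point of view: any process reading the target path sees either the old file or the new one, never a partial write.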
Reviewed commit 7bd5071. No new issues found: the commit adds proper TypeScript typing to conversation history processing without changing behavior. Mention @roomote in a comment to request specific changes to this pull request or fix all unresolved issues.
Continuity is stateless via encrypted reasoning items that we persist and replay. We now capture the top-level response id in OpenAiNativeHandler and persist the assistant message id into api_conversation_history.json solely for debugging/correlation with provider logs; it is not used for continuity or control flow. Also: silence request-body debug logging to avoid leaking prompts.
hannesrudolph left a comment
Implemented review fixes:
- Guarded request-body debug log in OpenAiNativeHandler to avoid leaking prompts.
- Retain response id from Responses API for troubleshooting/correlation only (not continuity). Continuity remains stateless via encrypted reasoning items persisted in API history.
- Persist assistant message id when present to aid debugging.
All tests pass across src and webview-ui.
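The guarded request-body log described above can be as simple as gating serialization behind an explicit opt-in flag. The env var name here is hypothetical; the real guard in OpenAiNativeHandler may differ:

```typescript
// Sketch of a guarded request-body log: silent unless explicitly enabled,
// so prompt contents never reach logs by default.
function debugLogRequestBody(body: unknown): string | undefined {
  if (process.env.DEBUG_OPENAI_REQUESTS !== "1") {
    return undefined // default: never serialize prompts into logs
  }
  return JSON.stringify(body)
}
```

Returning the serialized string (rather than calling a logger directly) keeps the guard trivially unit-testable.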
…rgs in Task to address Dan's review
Addressed Dan's feedback: typed the cleanConversationHistory and createMessage params in Task.attemptApiRequest(), removed the `any` casts, and added a ReasoningItem type for encrypted_content. All src and webview-ui tests pass locally.
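A hedged sketch of what the typed cleaning step might look like; field names follow the PR discussion, but the real cleanConversationHistory signature may differ:

```typescript
// Reasoning items carry encrypted content from the Responses API; a stub
// without encrypted_content has nothing to replay.
type ReasoningItem = {
  type: "reasoning"
  encrypted_content?: string
  id?: string
}

type ChatItem = { type: "message"; role: "user" | "assistant"; content: string }
type HistoryItem = ReasoningItem | ChatItem

// Keep chat messages, and keep reasoning items only when they actually
// carry encrypted content worth replaying on the next turn.
function cleanConversationHistory(history: HistoryItem[]): HistoryItem[] {
  return history.filter((m) => m.type !== "reasoning" || typeof m.encrypted_content === "string")
}
```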
Summary
We moved continuity off OpenAI servers and now maintain conversation state locally by persisting and replaying encrypted reasoning items. Requests are stateless (store=false) while retaining the performance/caching benefits of the Responses API. This has the side effect of removing complex logic around condensing.
Additional context https://cookbook.openai.com/examples/responses_api/reasoning_items
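The stateless request shape described in the summary can be sketched as follows. Helper and model names are illustrative, not the actual implementation in OpenAiNativeHandler:

```typescript
type ReasoningItem = {
  type: "reasoning"
  encrypted_content: string
  id?: string
}

type InputItem = ReasoningItem | { role: "user" | "assistant"; content: string }

// Build a Responses API request body that is stateless (store: false) but
// replays previously persisted encrypted reasoning items for continuity.
function buildStatelessRequest(instructions: string, history: InputItem[]) {
  return {
    model: "gpt-5", // placeholder model name
    instructions, // system prompt goes in the top-level field
    input: history, // includes replayed encrypted reasoning items
    store: false, // nothing is retained server-side
    include: ["reasoning.encrypted_content"], // ask for encrypted items back
  }
}
```

Because store is false, the server keeps no conversation state; the encrypted reasoning items in input are what carry the model's prior reasoning context forward.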
Why
This aligns with how Roo manages context and significantly simplifies our Responses API implementation while keeping all the benefits of continuity, caching, and latency improvements.
What changed
OpenAI provider (Responses API everywhere)
Task runtime and persistence
Defaults and gating
Removed (server-side continuity and legacy flags)
Kept (provider-agnostic metadata)
Result
Important
Migrate conversation continuity to stateless API requests using encrypted reasoning items, removing server-side continuity dependencies.
- Implements store=false and use of encrypted reasoning items for continuity in OpenAiNativeHandler.
- Handles encrypted reasoning items in OpenAiNativeHandler.getEncryptedContent() and Task.attemptApiRequest().
- Removes server-side continuity logic from OpenAiNativeHandler and Task.
- Extends the ApiMessage type in apiMessages.ts to include type, summary, and encrypted_content for reasoning items.
- Updates tests in openai-native.spec.ts and Task.spec.ts.
- Updates taskMessages.spec.ts to reflect changes in message persistence.
- Removes metadata.gpt5.previous_response_id from clineMessageSchema in message.ts.
- Removes persistGpt5Metadata() and related logic from Task.ts.

This description was auto-generated for 79dadf9.
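The widened ApiMessage shape mentioned above might look roughly like the following union. The exact layout in apiMessages.ts, including the summary entry shape, is an assumption here:

```typescript
// Ordinary chat turns as persisted in api_conversation_history.json.
type ChatMessage = {
  role: "user" | "assistant"
  content: string
  ts?: number
}

// Persisted reasoning items replayed on later turns; the summary entry
// shape is assumed, not taken from the actual source.
type ReasoningMessage = {
  type: "reasoning"
  summary?: { type: "summary_text"; text: string }[]
  encrypted_content: string
  id?: string
  ts?: number
}

type ApiMessage = ChatMessage | ReasoningMessage

// A mixed history should round-trip through JSON persistence unchanged.
const history: ApiMessage[] = [
  { role: "user", content: "hello" },
  { type: "reasoning", encrypted_content: "gAAAA...", summary: [] },
]
```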