
Conversation


@roomote roomote bot commented Sep 22, 2025

This PR attempts to address Issue #8215. Feedback and guidance are welcome.

Problem

When using OpenAI Compatible providers with streaming disabled, the system prompt was incorrectly assigned the "user" role instead of the "system" role. This inconsistency could cause issues with providers that expect the system prompt to carry the system role.

Solution

  • Changed the role from "user" to "system" for system prompts in non-streaming mode
  • Updated the corresponding test to reflect the corrected behavior
  • Ensures consistency between streaming and non-streaming modes

Changes

  • Modified src/api/providers/openai.ts lines 222-224 to use ChatCompletionSystemMessageParam with the "system" role
  • Updated test expectations in src/api/providers/__tests__/openai.spec.ts
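A minimal sketch of the change, assuming the shape of the message-building code; the local type union stands in for the openai SDK's ChatCompletionSystemMessageParam / ChatCompletionUserMessageParam, and buildNonStreamingMessages is an illustrative helper name, not the actual function in openai.ts:

```typescript
// Illustrative sketch only; types and helper name are assumptions,
// not the actual code in src/api/providers/openai.ts.
type ChatCompletionMessageParam =
	| { role: "system"; content: string } // stands in for ChatCompletionSystemMessageParam
	| { role: "user"; content: string }; // stands in for ChatCompletionUserMessageParam

function buildNonStreamingMessages(
	systemPrompt: string,
	conversation: ChatCompletionMessageParam[],
): ChatCompletionMessageParam[] {
	// Before the fix the system prompt was sent as { role: "user", ... };
	// it is now a proper system message, matching the streaming path.
	const systemMessage: ChatCompletionMessageParam = { role: "system", content: systemPrompt }
	return [systemMessage, ...conversation]
}
```

The real code uses the SDK's types directly; the point of the sketch is only the role change on the first message.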

Testing

  • All 42 tests in the OpenAI provider test suite pass successfully
  • Verified that system role is now consistently used in both streaming and non-streaming modes

Fixes #8215


Important

Corrects role assignment for system prompts from "user" to "system" in non-streaming mode in openai.ts, ensuring consistency with streaming mode.

  • Behavior:
    • Corrects role assignment for system prompts from "user" to "system" in non-streaming mode in openai.ts.
    • Ensures consistent role usage between streaming and non-streaming modes.
  • Testing:
    • Updates test in openai.spec.ts to expect "system" role for system prompts in non-streaming mode.
    • All 42 tests in the OpenAI provider test suite pass successfully.

This description was created by Ellipsis for 795f0e9.

… disabled

- Changed role from "user" to "system" for system prompt in non-streaming mode
- Updated test expectations to match the corrected behavior
- Ensures consistency between streaming and non-streaming modes

Fixes #8215
@roomote roomote bot requested review from cte, jr and mrubens as code owners September 22, 2025 16:10
@dosubot dosubot bot added size:XS This PR changes 0-9 lines, ignoring generated files. bug Something isn't working labels Sep 22, 2025

@roomote roomote bot left a comment


Reviewing my own code is like debugging in a mirror - everything looks backwards but somehow still broken.

@hannesrudolph hannesrudolph added the Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. label Sep 22, 2025
@daniel-lxs daniel-lxs moved this from Triage to PR [Needs Prelim Review] in Roo Code Roadmap Sep 23, 2025
@hannesrudolph hannesrudolph added PR - Needs Preliminary Review and removed Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. labels Sep 23, 2025
@hannesrudolph
Collaborator

@roomote go ahead and implement the suggestions


roomote bot commented Oct 20, 2025

Acknowledged. Implementing the suggested review changes now: 1) clarify the inline comment about o1 models vs the general non-streaming path, and 2) add a short rationale for using the system role for OpenAI Compatible when streaming is disabled. I will push an update shortly.

@dosubot dosubot bot added size:S This PR changes 10-29 lines, ignoring generated files. and removed size:XS This PR changes 0-9 lines, ignoring generated files. labels Oct 20, 2025

roomote bot commented Oct 20, 2025

Review Summary

No issues found. The PR correctly addresses issue #8215 by:

  • Changing the role from "user" to "system" for system prompts in non-streaming mode
  • Updating the TypeScript type to match (ChatCompletionSystemMessageParam)
  • Ensuring consistency between streaming and non-streaming modes
  • Removing the misleading comment that was previously flagged

All tests pass and the implementation handles edge cases (o1/o3/o4 models, DeepSeek reasoner) appropriately.




roomote bot commented Oct 20, 2025

Summary update:

  • Implemented the review suggestions:
    • Clarified the inline comment in src/api/providers/openai.ts around the non-streaming branch to explain that o1/o3/o4 models do not support system prompts and that this code path targets general OpenAI-compatible providers that do support system prompts.
    • Added rationale for using the "system" role in non-streaming mode for consistency with streaming behavior and provider expectations.
  • Verified locally: all 42 OpenAI provider tests pass using npx vitest run src/api/providers/__tests__/openai.spec.ts.
  • Pushed commit with the changes: 9d154de.
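The clarified inline comment can be illustrated with a hedged sketch of the role selection; the model-family regex and function name are illustrative assumptions, not the exact logic in openai.ts:

```typescript
// Hypothetical sketch: o1/o3/o4 reasoning models do not accept a "system"
// message, while general OpenAI-compatible providers do, so the general
// non-streaming path can safely use the "system" role.
function systemPromptRole(modelId: string): "system" | "user" {
	if (/^o[134](-|$)/.test(modelId)) {
		return "user" // reasoning models: fall back to a user-role prompt
	}
	return "system" // general OpenAI-compatible providers
}
```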

Request:

  • I do not have permission to rerun the workflow (rerun denied by GitHub permissions). When convenient, please rerun the failed integration-test. If it continues to fail, I can investigate further, but it looks orthogonal to this change set.

@dosubot dosubot bot added size:XS This PR changes 0-9 lines, ignoring generated files. and removed size:S This PR changes 10-29 lines, ignoring generated files. labels Oct 21, 2025
@RooCodeInc RooCodeInc deleted a comment from roomote bot Oct 21, 2025

@roomote roomote bot left a comment


No issues found.

@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Oct 21, 2025
@daniel-lxs daniel-lxs moved this from PR [Needs Prelim Review] to PR [Needs Review] in Roo Code Roadmap Oct 21, 2025

mrubens commented Oct 22, 2025

I think this logic was there because of openai models that don't support the system role. Does this change break anything there? I wrote the original code so long ago...

@daniel-lxs
Member

This change is for the OpenAI compatible provider. Should we worry about OpenAI models when the native provider is available?

Also, the user role is only used when disabling streaming, so if there are models that don't support the system role, these will only work with streaming off. Was that also intended?

@mrubens
Copy link
Collaborator

mrubens commented Oct 24, 2025

This change is for the OpenAI compatible provider. Should we worry about OpenAI models when the native provider is available?

Also, the user role is only used when disabling streaming, so if there are models that don't support the system role, these will only work with streaming off. Was that also intended?

Yeah, at the time I think this was to support using o1 through an openai proxy, not 100% sure the use case but a lot of people were interested in it. Maybe no longer an issue though, and this isn't a very elegant way to handle it either way.

@mrubens mrubens merged commit 1ba0150 into main Nov 6, 2025
17 of 18 checks passed
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Nov 6, 2025
@mrubens mrubens deleted the fix/openai-system-role-streaming branch November 6, 2025 21:39
@github-project-automation github-project-automation bot moved this from PR [Needs Review] to Done in Roo Code Roadmap Nov 6, 2025