Add stream_options.include_usage for OpenAI-compatible API token usage#45812

Merged
benbrandt merged 2 commits into zed-industries:main from Oft3r:feature/stream-options-token-usage
Mar 17, 2026
Conversation

Contributor

@Oft3r Oft3r commented Dec 29, 2025

Summary

This PR enables token usage reporting in streaming responses for OpenAI-compatible APIs (OpenAI, xAI/Grok, OpenRouter, etc).

Problem

Currently, the token counter UI in the Agent Panel doesn't display usage for some OpenAI-compatible providers because they don't return usage data during streaming by default. According to OpenAI's API documentation, the stream_options.include_usage parameter must be set to true to receive usage statistics in streaming responses.
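
Per OpenAI's streaming API documentation, a chat completion request that opts into usage reporting looks roughly like this (model name illustrative):

```json
{
  "model": "grok-2",
  "messages": [{ "role": "user", "content": "Hello" }],
  "stream": true,
  "stream_options": { "include_usage": true }
}
```

With `include_usage: true`, the API emits one extra final chunk whose `usage` field carries the prompt and completion token counts; without it, streamed chunks contain no usage data at all.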

Solution

  • Added StreamOptions struct with include_usage field to the open_ai crate
  • Added stream_options field to the Request struct
  • Automatically set stream_options: { include_usage: true } when stream: true
  • Updated edit_prediction requests with stream_options: None (non-streaming)
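
A minimal sketch of the shape of these changes (simplified: the real `Request` struct in the `open_ai` crate has many more fields, and serde serialization attributes are omitted here):

```rust
/// Mirrors OpenAI's `stream_options` object (sketch; field names assumed
/// to match the API's `include_usage` parameter).
#[derive(Debug, PartialEq)]
struct StreamOptions {
    include_usage: bool,
}

/// Trimmed-down stand-in for the crate's chat completion request.
#[derive(Debug)]
struct Request {
    stream: bool,
    stream_options: Option<StreamOptions>,
}

impl Request {
    /// The PR's core behavior: whenever streaming is requested, also opt
    /// into usage reporting; non-streaming requests (e.g. edit_prediction)
    /// leave `stream_options` as `None`.
    fn new(stream: bool) -> Self {
        Request {
            stream,
            stream_options: if stream {
                Some(StreamOptions { include_usage: true })
            } else {
                None
            },
        }
    }
}

fn main() {
    let streaming = Request::new(true);
    assert_eq!(
        streaming.stream_options,
        Some(StreamOptions { include_usage: true })
    );

    let non_streaming = Request::new(false);
    assert!(non_streaming.stream_options.is_none());
}
```

When serialized with `skip_serializing_if = "Option::is_none"` (the usual serde idiom), a `None` value keeps the field out of non-streaming request bodies entirely.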

Testing

Tested with xAI Grok models - token counter now correctly shows usage after sending a message.

Release Notes:

  • openai: Support usage tracking when streaming responses from OpenAI-compatible providers

@cla-bot cla-bot bot added the cla-signed The user has signed the Contributor License Agreement label Dec 29, 2025
@SomeoneToIgnore SomeoneToIgnore added the area:ai Improvement related to Agent Panel, Edit Prediction, Copilot, or other AI features label Dec 29, 2025
@benbrandt benbrandt requested review from agu-z and removed request for agu-z February 12, 2026 13:15
@benbrandt benbrandt removed the request for review from agu-z February 12, 2026 13:16
This enables token usage reporting in streaming responses for
OpenAI-compatible APIs (OpenAI, xAI/Grok, etc).

Without this parameter, APIs do not return usage data during
streaming, which prevents the token counter UI from displaying
current usage.

Changes:
- Add StreamOptions struct with include_usage field to open_ai crate
- Add stream_options field to Request struct
- Set stream_options to include_usage: true when stream is enabled
- Update edit_prediction requests with stream_options: None (non-streaming)
Member

@benbrandt benbrandt left a comment

Thanks, works great!

@benbrandt benbrandt force-pushed the feature/stream-options-token-usage branch from 56b1b98 to c7be54d Compare March 17, 2026 10:11
@benbrandt benbrandt enabled auto-merge (squash) March 17, 2026 10:11
Co-authored-by: Smit Barmase <[email protected]>
@benbrandt benbrandt merged commit 905d28c into zed-industries:main Mar 17, 2026
29 checks passed