Commit f7dba06
feat: #1186 expose createAiSdkUiMessageStream (#1187)
Co-authored-by: Kazuhiro Sera <[email protected]>
1 parent 8c416ff commit f7dba06

6 files changed, 150 additions & 10 deletions

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
+---
+'@openai/agents-extensions': patch
+---
+
+feat: #1186 expose createAiSdkUiMessageStream for AI SDK UI chunks

docs/src/content/docs/extensions/ai-sdk.mdx

Lines changed: 13 additions & 3 deletions
@@ -10,6 +10,7 @@ import aiSdkProviderDataExample from '../../../../../examples/docs/extensions/ai
 import aiSdkProviderMetadataExample from '../../../../../examples/docs/extensions/ai-sdk-providerMetadata.ts?raw';
 import aiSdkSetupExample from '../../../../../examples/docs/extensions/ai-sdk-setup.ts?raw';
 import aiSdkTransformOutputTextExample from '../../../../../examples/docs/extensions/ai-sdk-transform-output-text.ts?raw';
+import aiSdkUiMessageStreamExample from '../../../../../examples/docs/extensions/ai-sdk-ui-message-stream.ts?raw';
 import aiSdkUiMessageStreamResponseExample from '../../../../../examples/docs/extensions/ai-sdk-ui-message-stream-response.ts?raw';
 import aiSdkTextStreamResponseExample from '../../../../../examples/docs/extensions/ai-sdk-text-stream-response.ts?raw';

@@ -113,18 +114,27 @@ There are two related integrations in `@openai/agents-extensions`:
 `@openai/agents-extensions/ai-sdk-ui` provides response helpers for wiring Agents SDK streams into AI SDK UI routes:

 - `createAiSdkTextStreamResponse(source, options?)` for plain text streaming responses.
+- `createAiSdkUiMessageStream(source)` for a lower-level `ReadableStream<UIMessageChunk>`.
 - `createAiSdkUiMessageStreamResponse(source, options?)` for `UIMessageChunk` streaming responses.

-Both helpers accept a `StreamedRunResult`, stream-like source, or compatible wrapper object and return a `Response` with streaming-friendly headers.
+These helpers accept a `StreamedRunResult`, stream-like source, or compatible wrapper object. The response helpers return a `Response` with streaming-friendly headers.

-Use `createAiSdkUiMessageStreamResponse(...)` when your UI needs structured chunks such as tool calls or reasoning parts. Use `createAiSdkTextStreamResponse(...)` when you only want plain text.
+Use `createAiSdkUiMessageStreamResponse(...)` when your route should return the AI SDK response directly. Use `createAiSdkUiMessageStream(...)` when you want to own the response or rendering layer while still using the maintained Agents SDK to AI SDK `UIMessageChunk` translation. Use `createAiSdkTextStreamResponse(...)` when you only want plain text.

-Both helpers also accept optional response settings through `options`:
+The response helpers also accept optional response settings through `options`:

 - `headers`: additional response headers to merge into the streaming response.
 - `status`: the HTTP status code for the returned `Response`.
 - `statusText`: the HTTP status text for the returned `Response`.

+Example lower-level UI message stream:
+
+<Code
+  lang="typescript"
+  code={aiSdkUiMessageStreamExample}
+  title="UI message stream"
+/>
+
 Example Next.js route for UI message streaming:

 <Code
Lines changed: 12 additions & 0 deletions

@@ -0,0 +1,12 @@
+import { Agent, run } from '@openai/agents';
+import { createAiSdkUiMessageStream } from '@openai/agents-extensions/ai-sdk-ui';
+
+const agent = new Agent({
+  name: 'Assistant',
+  instructions: 'Reply with a short answer.',
+});
+
+export async function createStream() {
+  const stream = await run(agent, 'Hello there.', { stream: true });
+  return createAiSdkUiMessageStream(stream);
+}
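A raw `ReadableStream<UIMessageChunk>` like the one returned above is useful when you want to own the `Response` yourself. As an illustration of that idea — the newline-delimited JSON framing below is invented for this sketch and is NOT the AI SDK wire protocol — a generic chunk stream can be wrapped like this:

```typescript
// Illustrative sketch: serialize a generic chunk stream as newline-delimited
// JSON inside a hand-built Response. This only demonstrates the
// "own the response layer" idea, not the AI SDK's actual framing.
function toNdjsonResponse<T>(source: ReadableStream<T>): Response {
  const encoder = new TextEncoder();
  const body = source.pipeThrough(
    new TransformStream<T, Uint8Array>({
      transform(chunk, controller) {
        // One JSON object per line, encoded as bytes for the Response body.
        controller.enqueue(encoder.encode(JSON.stringify(chunk) + '\n'));
      },
    }),
  );
  return new Response(body, {
    headers: { 'content-type': 'application/x-ndjson' },
  });
}
```

In practice you would pass the stream from `createAiSdkUiMessageStream(...)` into whatever response or rendering layer you maintain; the helper only owns the chunk translation.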

packages/agents-extensions/src/ai-sdk-ui/uiMessageStream.ts

Lines changed: 15 additions & 5 deletions
@@ -483,15 +483,15 @@ async function* buildUiMessageStream(
 }

 /**
- * Creates a UI message stream Response compatible with the AI SDK data stream protocol.
+ * Creates a UI message stream compatible with the AI SDK data stream protocol.
  */
-export function createAiSdkUiMessageStreamResponse(
+export function createAiSdkUiMessageStream(
   source: AiSdkUiMessageStreamSource,
-  options: AiSdkUiMessageStreamResponseOptions = {},
-): Response {
+): ReadableStream<UIMessageChunk> {
   const events = resolveEventSource(source);
   const iterator = buildUiMessageStream(events)[Symbol.asyncIterator]();
-  const stream = new ReadableStream<UIMessageChunk>({
+
+  return new ReadableStream<UIMessageChunk>({
     async pull(controller) {
       const { value, done } = await iterator.next();
       if (done) {
@@ -506,6 +506,16 @@ export function createAiSdkUiMessageStreamResponse(
       }
     },
   });
+}
+
+/**
+ * Creates a UI message stream Response compatible with the AI SDK data stream protocol.
+ */
+export function createAiSdkUiMessageStreamResponse(
+  source: AiSdkUiMessageStreamSource,
+  options: AiSdkUiMessageStreamResponseOptions = {},
+): Response {
+  const stream = createAiSdkUiMessageStream(source);

   return createUIMessageStreamResponse({
     stream,
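The refactor above extracts a common pattern: wrap an async iterator in a pull-based `ReadableStream`. A minimal, SDK-independent sketch of that pattern (the names here are illustrative, not part of the package):

```typescript
// Generic sketch of the async-iterator-to-ReadableStream pattern used by
// createAiSdkUiMessageStream. Illustrative only; not the SDK's code.
function iteratorToStream<T>(iterable: AsyncIterable<T>): ReadableStream<T> {
  const iterator = iterable[Symbol.asyncIterator]();
  return new ReadableStream<T>({
    async pull(controller) {
      // pull() is called whenever the consumer wants more data, giving
      // natural backpressure: the producer only advances on demand.
      const { value, done } = await iterator.next();
      if (done) {
        controller.close();
      } else {
        controller.enqueue(value);
      }
    },
    async cancel(reason) {
      // Forward cancellation so the producer's cleanup (finally blocks) runs.
      await iterator.return?.(reason);
    },
  });
}

// Example producer: any async generator works as a source.
async function* demo(): AsyncGenerator<string> {
  yield 'a';
  yield 'b';
}
```

Returning the `ReadableStream` directly (rather than keeping it in a local `stream` variable) lets the response helper reuse the same construction, as the commit does.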

packages/agents-extensions/test/ai-sdk-ui/uiMessageStream.test.ts

Lines changed: 104 additions & 1 deletion
@@ -11,8 +11,12 @@ import {
   RunToolSearchCallItem,
   RunToolSearchOutputItem,
 } from '@openai/agents';
+import type { RunStreamEvent } from '@openai/agents';
 import type { UIMessageChunk } from 'ai';
-import { createAiSdkUiMessageStreamResponse } from '../../src/ai-sdk-ui/index';
+import {
+  createAiSdkUiMessageStream,
+  createAiSdkUiMessageStreamResponse,
+} from '../../src/ai-sdk-ui/index';

 async function readResponseText(response: Response): Promise<string> {
   if (!response.body) {
@@ -61,7 +65,106 @@ async function readUiMessageChunks(
   return chunks;
 }

+async function readUiMessageStream(
+  stream: ReadableStream<UIMessageChunk>,
+): Promise<UIMessageChunk[]> {
+  const reader = stream.getReader();
+  const chunks: UIMessageChunk[] = [];
+
+  try {
+    while (true) {
+      const { done, value } = await reader.read();
+      if (done) {
+        break;
+      }
+      chunks.push(value);
+    }
+  } finally {
+    reader.releaseLock();
+  }
+
+  return chunks;
+}
+
+function createRunEventStream(
+  events: RunStreamEvent[],
+): ReadableStream<RunStreamEvent> {
+  return new ReadableStream<RunStreamEvent>({
+    start(controller) {
+      for (const event of events) {
+        controller.enqueue(event);
+      }
+      controller.close();
+    },
+  });
+}
+
 describe('createAiSdkUiMessageStreamResponse', () => {
+  test('creates a raw UI message chunk stream from toStream sources', async () => {
+    const agent = new Agent({ name: 'Test Agent' });
+
+    const messageOutput = new RunMessageOutputItem(
+      {
+        type: 'message',
+        role: 'assistant',
+        status: 'completed',
+        content: [{ type: 'output_text', text: 'Raw stream message' }],
+      },
+      agent,
+    );
+
+    const stream = createAiSdkUiMessageStream({
+      toStream: () =>
+        createRunEventStream([
+          new RunItemStreamEvent('message_output_created', messageOutput),
+        ]),
+    });
+
+    const chunks = await readUiMessageStream(stream);
+
+    expect(chunks.map((chunk) => chunk.type)).toEqual([
+      'start',
+      'start-step',
+      'text-start',
+      'text-delta',
+      'text-end',
+      'finish-step',
+      'finish',
+    ]);
+
+    const textDelta = chunks.find((chunk) => chunk.type === 'text-delta');
+    expect(textDelta).toMatchObject({ delta: 'Raw stream message' });
+  });
+
+  test('cancels the underlying event iterator when a raw stream is cancelled', async () => {
+    let cancelled = false;
+
+    const events = (async function* () {
+      try {
+        yield new RunRawModelStreamEvent({ type: 'response_started' });
+        yield new RunRawModelStreamEvent({
+          type: 'output_text_delta',
+          delta: 'unread',
+        });
+      } finally {
+        cancelled = true;
+      }
+    })();
+
+    const stream = createAiSdkUiMessageStream(events);
+    const reader = stream.getReader();
+
+    const first = await reader.read();
+    expect(first).toMatchObject({
+      done: false,
+      value: { type: 'start' },
+    });
+
+    await reader.cancel();
+
+    expect(cancelled).toBe(true);
+  });
+
   test('maps run stream events to UI message chunks', async () => {
     const agent = new Agent({ name: 'Test Agent' });
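The cancellation test above works because cancelling a `ReadableStream` built over an async generator can call the generator's `return()`, which runs its `finally` block. A self-contained sketch of that mechanism (not SDK code):

```typescript
// Self-contained illustration: reader.cancel() reaches the producer's
// `finally` when the stream's cancel() forwards to the generator's return().
async function demoCancel(): Promise<boolean> {
  let cleanedUp = false;

  async function* producer(): AsyncGenerator<number> {
    try {
      yield 1;
      yield 2;
    } finally {
      cleanedUp = true; // runs on early exit via return()
    }
  }

  const gen = producer();
  const stream = new ReadableStream<number>({
    async pull(controller) {
      const { value, done } = await gen.next();
      if (done) {
        controller.close();
      } else {
        controller.enqueue(value);
      }
    },
    async cancel() {
      await gen.return(undefined); // propagate cancellation to the generator
    },
  });

  const reader = stream.getReader();
  await reader.read(); // consume the first chunk
  await reader.cancel(); // invokes cancel() above, so cleanup runs
  return cleanedUp;
}
```

Without the `cancel()` hook on the underlying source, the generator would stay suspended at its last `yield` and its cleanup would never run, which is exactly what the new test guards against.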

packages/agents-extensions/tsconfig.dist-check.json

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@
     "noEmit": true,
     "skipLibCheck": false,
     "types": ["node"],
-    "lib": ["ES2018", "DOM"],
+    "lib": ["ES2018", "ES2022.Intl", "DOM"],
     "baseUrl": ".",
     "paths": {
       "@openai/agents-core/_shims": [
