Commit 1bd757e

Fix: Use api.getModel() directly instead of cachedStreamingModel
Addresses a review comment: cachedStreamingModel is set during streaming, but buildCleanConversationHistory is called before streaming starts, so reading the cached value could return stale model info when switching models between requests. The code now reads this.api.getModel().info.preserveReasoning directly, ensuring it always checks the current model's flag rather than a potentially stale cached value.
1 parent: 219e73d

1 file changed: +3 additions, -2 deletions

src/core/task/Task.ts (3 additions, 2 deletions)

@@ -3437,8 +3437,9 @@ export class Task extends EventEmitter<TaskEvents> implements TaskLike {
 // Check if the model's preserveReasoning flag is set
 // If true, include the reasoning block in API requests
 // If false/undefined, strip it out (stored for history only, not sent back to API)
-const modelInfo = this.cachedStreamingModel?.info ?? this.api.getModel().info
-const shouldPreserveForApi = modelInfo.preserveReasoning === true
+// Note: Use api.getModel() directly instead of cachedStreamingModel since
+// buildCleanConversationHistory is called before streaming starts
+const shouldPreserveForApi = this.api.getModel().info.preserveReasoning === true
 let assistantContent: Anthropic.Messages.MessageParam["content"]

 if (shouldPreserveForApi) {
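The stale-cache hazard described in the commit message can be sketched as follows. The class and method names below (ApiHandler, TaskSketch, startStreaming, buggyCheck, fixedCheck) are hypothetical stand-ins, not the real codebase APIs beyond the identifiers quoted in the diff; this is a minimal illustration of the ordering bug, assuming the cache is only populated once streaming begins.

```typescript
// Illustrative sketch of why a cache populated during streaming is
// stale (or empty) when read before streaming starts.

interface ModelInfo {
  preserveReasoning?: boolean
}

interface Model {
  id: string
  info: ModelInfo
}

// Hypothetical stand-in for the task's API handler.
class ApiHandler {
  constructor(private model: Model) {}
  getModel(): Model {
    return this.model
  }
  setModel(model: Model): void {
    this.model = model
  }
}

class TaskSketch {
  private cachedStreamingModel?: Model

  constructor(private api: ApiHandler) {}

  // During streaming, the current model is snapshotted into the cache.
  startStreaming(): void {
    this.cachedStreamingModel = this.api.getModel()
  }

  // Buggy pattern: prefers the cache, which still holds the model
  // from the previous request when called before streaming starts.
  buggyCheck(): boolean {
    const modelInfo = this.cachedStreamingModel?.info ?? this.api.getModel().info
    return modelInfo.preserveReasoning === true
  }

  // Fixed pattern: always reads the current model from the handler.
  fixedCheck(): boolean {
    return this.api.getModel().info.preserveReasoning === true
  }
}

// Request 1 streams with a model whose preserveReasoning flag is set...
const api = new ApiHandler({ id: "model-a", info: { preserveReasoning: true } })
const task = new TaskSketch(api)
task.startStreaming()

// ...then the user switches to a model without the flag before request 2.
api.setModel({ id: "model-b", info: {} })

console.log(task.buggyCheck()) // true  -- stale flag from model-a's cache
console.log(task.fixedCheck()) // false -- current model-b has no flag set
```

Because the cache is only refreshed once streaming begins, any pre-stream read races against a model switch; reading the handler directly sidesteps the ordering problem entirely.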

0 commit comments
