fix(session): fix token usage double-counting w/ anthropic & bedrock due to AI SDK v6 upgrade#19758

Merged
rekram1-node merged 3 commits into anomalyco:dev from ualtinok:fix/token-usage-double-counting
Mar 29, 2026

Conversation

@ualtinok
Contributor

Issue for this PR

Closes #19757

Type of change

  • [x] Bug fix
  • [ ] New feature
  • [ ] Refactor / code improvement
  • [ ] Documentation

What does this PR do?

AI SDK v6 (merged in v1.3.4 via #18433) normalized inputTokens to include cached tokens for all providers. Previously Anthropic/Bedrock excluded cache from inputTokens, and the excludesCachedTokens flag handled this by skipping the cache subtraction for those providers.

After the v6 upgrade, inputTokens includes cache for Anthropic too, but excludesCachedTokens still skipped the subtraction, so cache tokens were counted once in adjustedInputTokens and again when added back into the total. This doubled the reported total and inflated tokens.input.

The fix removes the excludesCachedTokens provider check and always subtracts cache from inputTokens, matching v6 semantics where all providers report inputTokens inclusive of cache.
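A minimal sketch of the corrected accounting (the Usage shape, field names, and adjustUsage are illustrative, not the actual opencode internals): under v6 semantics, cached tokens are always subtracted from inputTokens before the per-bucket totals are rebuilt, with no per-provider exception.

```typescript
// Hypothetical usage shape, loosely modeled on the AI SDK v6 usage object.
interface Usage {
  inputTokens: number;      // v6: includes cached tokens for ALL providers
  outputTokens: number;
  cacheReadTokens: number;
  cacheWriteTokens: number;
}

// v6 semantics: inputTokens already contains cache, so always subtract it
// (no excludesCachedTokens provider check).
function adjustUsage(u: Usage) {
  const adjustedInput = u.inputTokens - u.cacheReadTokens - u.cacheWriteTokens;
  return {
    input: adjustedInput,
    output: u.outputTokens,
    cacheRead: u.cacheReadTokens,
    cacheWrite: u.cacheWriteTokens,
    // Cache tokens are added back exactly once here; under the old code
    // path, Anthropic/Bedrock skipped the subtraction above, so the
    // add-back counted them a second time.
    total: adjustedInput + u.cacheReadTokens + u.cacheWriteTokens + u.outputTokens,
  };
}
```

With inputTokens = 1001, cacheRead = 600, cacheWrite = 400, and output = 50, this yields input = 1 and total = 1051; the buggy path would have reported input = 1001 and total = 2051.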

How did you verify your code works?

  • Updated the 4 failing tests that assumed the old v5 Anthropic behavior so they now match v6 semantics
  • Full test suite: 1623 pass, 0 fail
  • Verified against real session data showing input ≈ cache.read + cache.write + 1 (confirming v6 includes cache in inputTokens)
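The session-data check above can be sketched as follows (function and field names are hypothetical, and the numbers are made up for illustration):

```typescript
// Illustrative sanity check: with v6, Anthropic's inputTokens already
// includes cache, so subtracting cache reads/writes should leave only a
// small uncached remainder, mirroring the observed
// input ≈ cache.read + cache.write + 1.
function uncachedInput(inputTokens: number, cacheRead: number, cacheWrite: number): number {
  return inputTokens - cacheRead - cacheWrite;
}
```

For example, uncachedInput(1001, 600, 400) leaves a remainder of 1, confirming that cache tokens are folded into inputTokens rather than reported separately.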

Checklist

  • [x] I have tested my changes locally
  • [x] I have not included unrelated changes in this PR

AI SDK v6 normalized inputTokens to include cached tokens for all
providers including Anthropic/Bedrock. The excludesCachedTokens flag
assumed the old v5 behavior where Anthropic excluded cache from
inputTokens, causing cache tokens to be counted twice in total and
inflating cost calculations.
@rekram1-node rekram1-node changed the title fix(session): fix token usage double-counting after AI SDK v6 upgrade fix(session): fix token usage double-counting w/ anthropic & bedrock due to AI SDK v6 upgrade Mar 29, 2026
@rekram1-node rekram1-node force-pushed the fix/token-usage-double-counting branch from 417bb48 to e6d7eb4 on March 29, 2026 17:29
@rekram1-node rekram1-node merged commit 72c77d0 into anomalyco:dev Mar 29, 2026
8 checks passed
e-n-0 pushed a commit to e-n-0/opencode that referenced this pull request Mar 29, 2026
…due to AI SDK v6 upgrade (anomalyco#19758)

Co-authored-by: Aiden Cline <[email protected]>
loocor pushed a commit to loocor/opencode that referenced this pull request Mar 30, 2026
afanty2021 pushed a commit to afanty2021/opencode that referenced this pull request Mar 30, 2026
Development

Successfully merging this pull request may close these issues.

Token usage double-counting after AI SDK v6 upgrade
