Bug: Pre and post summarization shows same token count, token count not immediately updated #3538

@cbruyndoncx

Description

Describe the bug
Pre- and post-summarization show the same token count; it is only updated after the next LLM call.

$ goose session -r
resuming session | provider: openrouter model: anthropic/claude-sonnet-4
logging to /home/cb/.local/share/goose/sessions/20250720_141858.jsonl
working directory: /home/cb/projects/github/goose

Goose is running! Enter your instructions, or try asking what goose can do.

Context: ●●●●●●●○○○ 68% (136479/200000 tokens)
( O)> /summarize
◇ Are you sure you want to summarize this conversation? This will condense the message history.
│ Yes

Summarizing conversation...
Conversation has been summarized.
Key information has been preserved while reducing context length.
Context: ●●●●●●●○○○ 68% (136479/200000 tokens)
After the next LLM call, it shows correctly:
Context: ●●●○○○○○○○ 26% (52824/200000 tokens)

Expected behavior
After summarization, the display should reflect the reduced token count, or show an estimate, or print a message saying the count will be updated after the next call; anything that gives confidence that the summarization worked.
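One way to meet this expectation would be to recompute (or at least estimate) the token count from the condensed history immediately after summarization, instead of reusing the stale value from the last LLM call. The sketch below is purely illustrative and does not use goose's actual API; `estimate_tokens`, `context_usage`, and the 4-characters-per-token heuristic are assumptions, not goose internals.

```python
# Hypothetical sketch: rebuild the context display from the summarized
# message history right after /summarize, rather than waiting for the
# next LLM call to report usage.

def estimate_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def context_usage(messages: list[str], limit: int) -> str:
    # Sum estimated tokens across the (now condensed) history and render
    # the same bar/percentage line the CLI already prints.
    used = sum(estimate_tokens(m) for m in messages)
    pct = round(100 * used / limit)
    filled = round(pct / 10)
    bar = "●" * filled + "○" * (10 - filled)
    return f"Context: {bar} {pct}% ({used}/{limit} tokens, estimated)"

# After summarization, the display would be refreshed from the summary:
summary = ["Summary: condensed conversation history ..."]
print(context_usage(summary, 200_000))
```

Even a rough estimate like this, clearly labeled as such, would signal that the summarization actually reduced the context.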

Please provide the following information:

  • OS & Arch: Ubuntu 22.04 x86
  • Interface: CLI
  • Version: v1.1.3
  • Extensions enabled: developer
  • Provider & Model: openrouter anthropic/claude-sonnet-4

Labels

compaction, p1 (Priority 1 - High, supports roadmap)
