
feat(memory): MemScene consolidation, compress_context tool, A-MAC admission control#2355

Merged
bug-ops merged 1 commit into main from memory-engram-lifecycle on Mar 28, 2026

Conversation


bug-ops (Owner) commented on Mar 28, 2026

Summary

Implements three research-backed memory improvements:

  • EverMemOS (#2332): MemCell→MemScene engram consolidation with a background scene sweep loop
  • Focus (#2218): agent-invokable compress_context native tool with an autonomous compression strategy
  • A-MAC (#2317): five-factor write-time admission gate on the memory_save path

All new LLM-calling subsystems expose *_provider config fields. New [memory.admission] and [memory.tiers] scene fields are added to the config.

New config

[memory.admission]
enabled = true
threshold = 0.4
admission_provider = "fast"
[memory.admission.weights]
future_utility = 0.3
factual_confidence = 0.2
semantic_novelty = 0.25
temporal_recency = 0.15
content_type_prior = 0.1

[memory.tiers]
scene_enabled = true
scene_similarity_threshold = 0.80
scene_batch_size = 50
scene_sweep_interval_secs = 7200
scene_provider = "fast"

[memory.compression]
strategy = "autonomous"
compress_provider = "quality"
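
The admission weights above sum to 1.0, so the gate reduces to a weighted sum of per-factor scores compared against `threshold`. A minimal sketch, assuming a simple weighted-sum formulation; the struct and function names here are illustrative, not the crate's actual API:

```rust
// Illustrative five-factor A-MAC score under normalized weights.
// `FactorScores` and `admission_score` are assumed names.
#[derive(Clone, Copy)]
struct FactorScores {
    future_utility: f64,
    factual_confidence: f64,
    semantic_novelty: f64,
    temporal_recency: f64,
    content_type_prior: f64,
}

/// Weighted sum of per-factor scores, each expected in [0.0, 1.0].
fn admission_score(w: FactorScores, f: FactorScores) -> f64 {
    w.future_utility * f.future_utility
        + w.factual_confidence * f.factual_confidence
        + w.semantic_novelty * f.semantic_novelty
        + w.temporal_recency * f.temporal_recency
        + w.content_type_prior * f.content_type_prior
}

fn main() {
    // Weights from [memory.admission.weights] above.
    let weights = FactorScores {
        future_utility: 0.3,
        factual_confidence: 0.2,
        semantic_novelty: 0.25,
        temporal_recency: 0.15,
        content_type_prior: 0.1,
    };
    // Hypothetical per-factor scores for one candidate memory.
    let factors = FactorScores {
        future_utility: 0.9,
        factual_confidence: 0.8,
        semantic_novelty: 0.5,
        temporal_recency: 0.4,
        content_type_prior: 0.5,
    };
    let score = admission_score(weights, factors);
    // Admit when the weighted score clears `threshold` (0.4 above).
    println!("score = {score:.3}, admit = {}", score >= 0.4);
}
```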

Breaking changes

  • remember() returns Result<Option<MessageId>> (was Result<MessageId>) — all call sites updated
  • remember_with_parts() returns Result<(Option<MessageId>, bool)> — all call sites updated
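
The call-site change can be sketched as follows; `Memory` and `MessageId` here are stand-ins for illustration, not the real crate types. The key point is that `Ok(None)` now means the admission gate declined the write, which is distinct from an error:

```rust
// Hypothetical call site showing the migration for the new
// Result<Option<MessageId>> return type of remember().
type MessageId = u64;

struct Memory {
    admit_writes: bool, // stand-in for the A-MAC gate's decision
    next_id: MessageId,
}

impl Memory {
    fn remember(&mut self, _content: &str) -> Result<Option<MessageId>, String> {
        if !self.admit_writes {
            return Ok(None); // rejected by the admission gate, not an error
        }
        self.next_id += 1;
        Ok(Some(self.next_id))
    }
}

fn main() {
    let mut mem = Memory { admit_writes: true, next_id: 0 };
    // Before the change, remember() always produced an id on success.
    // After it, callers must distinguish "stored" from "not admitted".
    match mem.remember("user prefers dark mode") {
        Ok(Some(id)) => println!("stored as message {id}"),
        Ok(None) => println!("skipped: below admission threshold"),
        Err(e) => eprintln!("memory error: {e}"),
    }
}
```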

Test plan

  • Verify cargo nextest run --workspace --features full --lib --bins passes (7074 tests)
  • Enable [memory.admission] in testing.toml and verify A-MAC gate logs admission decisions
  • Run a session with strategy = "autonomous" and call compress_context manually
  • Verify scene consolidation loop starts with scene_enabled = true (log: scene consolidation: starting sweep)

Follow-up

  • compress_provider runtime resolution for compress_context is deferred to a follow-up issue

Closes #2332
Closes #2218
Closes #2317

github-actions bot added labels on Mar 28, 2026: documentation (Improvements or additions to documentation), memory (zeph-memory crate, SQLite), rust (Rust code changes), core (zeph-core crate), enhancement (New feature or request), size/XL (Extra large PR, 500+ lines)
bug-ops enabled auto-merge (squash) on March 28, 2026 17:02
…mission control

Implements three memory improvements based on research papers:

- EverMemOS (#2332): brain-inspired engram lifecycle with MemCell→MemScene
  consolidation. New SQLite tables (migration 049), background scene
  consolidation loop decoupled from tier promotion, cosine-similarity
  clustering, LLM-generated label/profile per scene, scene members
  excluded from regular recall results.

- Focus (#2218): agent-invokable compress_context native tool. Knowledge
  block in ContextBuilder survives compaction. AtomicBool concurrency
  guard prevents double-compression. Index-based message removal.
  CompressionStrategy::Autonomous variant with compress_provider config.

- A-MAC (#2317): write-time admission gate on memory_save path. Five-factor
  scoring (future_utility, factual_confidence, semantic_novelty,
  temporal_recency, content_type_prior) with normalized weights. Fast path
  skips LLM call when heuristic score exceeds threshold+margin. remember()
  return type changed to Result<Option<MessageId>>. Wired in AppBuilder.
  Admission decisions logged via AuditLogger.
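
The fast path in the A-MAC bullet above can be sketched as a single comparison: when a cheap heuristic score already exceeds threshold + margin, the LLM scoring call is skipped and the memory is admitted directly. The function name and the margin value below are illustrative assumptions:

```rust
// Decide whether the expensive LLM scoring call is needed, per the
// fast path described above. Names and values are illustrative.
fn needs_llm_scoring(heuristic: f64, threshold: f64, margin: f64) -> bool {
    // Confidently above the band: admit without spending an LLM call.
    heuristic <= threshold + margin
}

fn main() {
    let (threshold, margin) = (0.4, 0.15); // threshold from config; margin assumed
    // 0.80 clears 0.55, so the gate admits on heuristics alone.
    assert!(!needs_llm_scoring(0.80, threshold, margin));
    // 0.45 falls inside the uncertainty band, so the LLM is consulted.
    assert!(needs_llm_scoring(0.45, threshold, margin));
    println!("ok");
}
```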

All new LLM-calling subsystems expose *_provider config fields. New
[memory.admission] and [memory.tiers] scene fields added to config.
Migration step added for remember() call-site changes.
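
The AtomicBool guard mentioned in the Focus bullet can be sketched as below: only the caller that wins the false-to-true transition runs compaction, and any concurrent invocation bails out. All names here are illustrative, not the crate's actual API:

```rust
// Minimal sketch of a double-compression guard using an AtomicBool.
use std::sync::atomic::{AtomicBool, Ordering};

static COMPRESSING: AtomicBool = AtomicBool::new(false);

fn try_compress_context() -> bool {
    // compare_exchange succeeds only for the first caller; a concurrent
    // caller sees the flag already set and returns without compressing.
    if COMPRESSING
        .compare_exchange(false, true, Ordering::AcqRel, Ordering::Acquire)
        .is_err()
    {
        return false; // compression already in progress
    }
    // ... compact the context here ...
    COMPRESSING.store(false, Ordering::Release);
    true
}

fn main() {
    assert!(try_compress_context()); // guard acquired, work done, guard released
    println!("ok");
}
```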

Closes #2332
Closes #2218
Closes #2317
bug-ops force-pushed the memory-engram-lifecycle branch from 14f7c16 to e167496 on March 28, 2026 17:25
bug-ops merged commit eccacd5 into main on Mar 28, 2026
25 checks passed
bug-ops deleted the memory-engram-lifecycle branch on March 28, 2026 17:32
