Stop re-explaining your project to your AI every session. memtomem turns your notes, docs, and code into a searchable memory that any MCP-compatible agent can use — across sessions, across agents, all on your machine.
All context is lost when a session ends. Architecture decisions, coding patterns, and debugging history must be re-explained every time.
Knowledge from Claude Code can't be carried over to Cursor. Each agent is trapped in its own isolated memory silo.
Existing memory systems retrieve only when the agent explicitly searches, are locked to a single runtime, and offer nothing beyond one long-term memory layer.
Your agent doesn't have to ask. STM watches tool calls and slips in relevant past memories automatically — tuned by your feedback.
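The gating-and-feedback behavior can be sketched roughly as follows. This is an illustrative sketch only; the class, method, and parameter names here (`InjectionGate`, `select`, `feedback`, `min_score`) are hypothetical and not memtomem's actual API:

```python
# Hypothetical sketch: score-gated memory injection with a feedback-tuned
# threshold. All names and numbers are illustrative, not memtomem's API.

class InjectionGate:
    def __init__(self, min_score: float = 0.5, step: float = 0.05):
        self.min_score = min_score  # only memories scoring above this are injected
        self.step = step            # how far one piece of feedback moves the gate

    def select(self, candidates: list[tuple[str, float]]) -> list[str]:
        """Keep only memories whose relevance score clears the gate."""
        return [text for text, score in candidates if score >= self.min_score]

    def feedback(self, helpful: bool) -> None:
        """Loosen the gate after helpful injections, tighten it after noise."""
        if helpful:
            self.min_score = max(0.0, self.min_score - self.step)
        else:
            self.min_score = min(1.0, self.min_score + self.step)

gate = InjectionGate(min_score=0.5)
picked = gate.select([("project uses pytest", 0.8), ("likes tabs", 0.3)])
# Only the 0.8-scored memory clears the 0.5 gate.
```

The point of the feedback loop is that the threshold is not a static config value: each thumbs-up or thumbs-down nudges it, so the gate converges toward your tolerance for unsolicited context.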
STM sits invisibly between your agent and its tools. Existing MCP servers keep working unchanged — no code, no new integrations.
Define an agent or skill once. Context Gateway syncs it to Claude Code, Cursor, Codex CLI, and more — in each tool's native format.
Big tool responses get trimmed to fit your context window. Ten strategies for JSON, markdown, or free-form text, plus a zero-loss progressive mode for huge payloads.
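The simplest of those strategies, plain tail truncation against a token budget, can be sketched like this. Everything here is illustrative (the function name, the rough 4-characters-per-token heuristic, the cut marker), not memtomem's actual implementation:

```python
# Hypothetical sketch of one trimming strategy: tail truncation to a token
# budget. The 4-chars-per-token heuristic and all names are illustrative.

def trim_to_budget(text: str, token_budget: int, chars_per_token: int = 4) -> str:
    """Keep the head of an oversized response and mark where it was cut."""
    limit = token_budget * chars_per_token
    if len(text) <= limit:
        return text  # already fits; no trimming needed
    marker = "\n[...trimmed to fit context budget...]"
    return text[: max(0, limit - len(marker))] + marker

big_response = "x" * 10_000
small = trim_to_budget(big_response, token_budget=100)
# small now fits within the ~400-character budget, ending with the cut marker.
```

Structure-aware strategies for JSON and markdown work on the same budget idea but cut along element boundaries instead of raw characters, and the progressive mode pages the full payload instead of discarding anything.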
Each agent gets its own private memory, plus a shared one. Knowledge flows between agents — or from you to every agent at once.
SQLite + ONNX under the hood. No GPU, no external API, no cloud dependency — your memory stays on your machine.
From install to first memory in under 5 minutes. uv tool install → interactive mm init → ask your agent.
- Guide: How BM25 + vector + RRF fusion search works and how to tune it.
- LTM: 10 strategies, auto-selection logic, and query-aware budget allocation.
- STM: 5-level gating, feedback loop, and min_score auto-tuning deep dive.
- STM: 5-step session workflow, namespace inheritance, cross-agent sharing.
- LTM: Cross-runtime sync, format conversion, LangGraph adapter.
- LTM: No GPU. No external services. One uv install is all you need.
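The fusion step named above, Reciprocal Rank Fusion (RRF), is a standard technique for merging a BM25 ranking with a vector-search ranking: each document scores the sum of 1/(k + rank) over every list it appears in, with k = 60 as the conventional default. This sketch shows the general formula, not memtomem's actual code:

```python
# Sketch of Reciprocal Rank Fusion (RRF) over multiple ranked result lists.
# The formula and k=60 default are the standard RRF formulation; the function
# name and doc IDs are illustrative.

def rrf_fuse(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Merge ranked lists: each doc scores sum(1 / (k + rank)) across lists."""
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score wins; docs found by both searches naturally rise.
    return sorted(scores, key=scores.get, reverse=True)

bm25_hits = ["doc_a", "doc_b", "doc_c"]
vector_hits = ["doc_b", "doc_d", "doc_a"]
fused = rrf_fuse([bm25_hits, vector_hits])
# doc_b ranks first: it placed high in both lists.
```

Because RRF works on ranks rather than raw scores, it needs no score normalization between the keyword and embedding retrievers, which is why it is a common default for hybrid search.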