# shodh-memory — Persistent Cognitive Memory for AI Agents

Persistent memory for AI agents: memories strengthen with use, decay naturally over time, and form associative networks. Built on Hebbian learning, knowledge graphs, and a three-tier architecture based on Cowan's model. Runs offline, single binary, no cloud required.
## Features

What makes shodh-memory different:
- **Hebbian Learning.** Connections that fire together wire together: frequently accessed associations become permanent through long-term potentiation.
- **Runs Offline.** A single ~30MB binary with no cloud dependency. Works on Raspberry Pi, Jetson, industrial PCs, and air-gapped systems.
- **Sub-Millisecond.** Graph lookups in <1μs; semantic search in 34-58ms. Fast enough for real-time agent decision-making.
- **Neuroscience-Grounded.** A three-tier architecture based on Cowan's working memory model, with hybrid decay (exponential + power-law) from cognitive research.
- **Knowledge Graph.** Not just vector search: includes spreading activation, interference detection, and memory replay during consolidation.
- **MCP Integration.** First-class Model Context Protocol support. Works with Claude Code, Cursor, and any MCP-compatible agent.
## Why shodh-memory?

What makes this different from mem0, zep, cognee, and other memory solutions:
- **Not another vector database.** Most "memory" solutions are just vector search with a wrapper. Shodh-memory adds a knowledge graph, temporal indices, and hybrid ranking, and connections between memories strengthen when accessed together, like biological synapses.
- **No cloud required.** Mem0, Zep, and others are cloud-first. Shodh-memory is a single ~30MB binary: no API keys, no Docker, no external dependencies. Your agent's memory runs on your hardware.
- **Memory that gets smarter.** Static storage forgets nothing and learns nothing. Shodh-memory uses Hebbian learning: frequently co-accessed memories form stronger bonds, while rarely used knowledge decays naturally. Just like your brain.
- **Knows what it doesn't know.** CLAUDE.md files get compacted, and rules degrade to suggestions at 60-70% enforcement. Shodh's knowledge graph detects blind spots, shallow knowledge, orphaned clusters, and stale zones, surfacing what your agent has forgotten before it matters.
- **Edge-first architecture.** Designed for robots, IoT, and air-gapped systems. Runs on a Raspberry Pi Zero, with sub-microsecond graph lookups. Your drone doesn't need WiFi to remember.
| Feature | shodh | mem0 | zep | cognee |
|---|---|---|---|---|
| Runs fully offline | ✓ | — | — | — |
| Single binary, no Docker | ✓ | — | — | — |
| Hebbian learning | ✓ | — | — | — |
| Knowledge graph | ✓ | — | ✓ | ✓ |
| Memory decay model | ✓ | — | — | — |
| Blind spot detection | ✓ | — | — | — |
| Runs on Raspberry Pi | ✓ | — | — | — |
| Sub-millisecond lookup | ✓ | — | — | — |
| Open source | ✓ | ✓ | ✓ | ✓ |
Others give you storage. We give you cognition.
See the full comparison →

## Context Durability
How long do memories actually last? Here's the science.
| Time | Normal retention | Potentiated retention |
|---|---|---|
| Day 1 | 50% | 70% |
| Day 7 | 35% | 55% |
| Day 30 | 18% | 40% |
| Day 90 | 10% | 28% |
| Day 365 | >1% | >5% |
```
Strength
 100% |*
      | *
  70% |  *   <- Potentiated (10+ accesses)
      |   *____
  50% | *      \____
      |  *          \________
  30% |   *                  \___________
      |    *                             \____
  10% |-----*----------------------------------\__
      |       Normal decay   *
   1% |--------------------------------------------*-
      +----+----+----+----+----+----+----+----+----+->
          3d   7d  14d  30d  60d  90d 180d 365d
                         Time

[====] Exponential      [----] Power-law (heavy tail)
       (0-3 days)              (3+ days)
```
- **Hybrid Decay Model.** Exponential decay for the first 3 days (consolidation phase), then power-law for long-term retention. Memories never truly hit zero.
- **Long-Term Potentiation.** Memories accessed 10+ times become "potentiated" and decay 10x slower; the effective half-life jumps from 14 days to ~140 days.
- **Hebbian Strengthening.** Co-accessed memories form stronger bonds. Fire together, wire together: associations strengthen with use and weaken with neglect.
- **Memory Replay.** During maintenance cycles, important memories are replayed and strengthened, mimicking hippocampal replay during sleep.
Memories accessed 10+ times become potentiated and effectively permanent.
Even rarely-accessed memories retain >1% strength after a year due to power-law decay.
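The hybrid curve can be sketched in a few lines. This is a minimal illustration with made-up constants (the rate, exponent, and slowdown factor here are assumptions, not the shipped values): exponential decay through a 3-day consolidation phase, then a power-law heavy tail, with potentiated memories decaying ~10x slower.

```python
import math

CONSOLIDATION_DAYS = 3.0

def retention(days: float, potentiated: bool = False) -> float:
    """Fraction of original strength remaining after `days` (toy model)."""
    slowdown = 10.0 if potentiated else 1.0   # LTP: ~10x slower decay
    t = days / slowdown                       # stretch the time axis
    if t <= CONSOLIDATION_DAYS:
        lam = 0.23                            # illustrative exponential rate
        return math.exp(-lam * t)
    # power-law tail, matched at the 3-day boundary for continuity
    boundary = math.exp(-0.23 * CONSOLIDATION_DAYS)
    alpha = 0.5                               # illustrative power-law exponent
    return boundary * (CONSOLIDATION_DAYS / t) ** alpha

print(round(retention(30), 3), round(retention(30, potentiated=True), 3))
# → 0.159 0.502
```

Note the shape, not the exact numbers: retention falls fast at first, then flattens into a heavy tail that never reaches zero, and potentiation shifts the whole curve up.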
See the research behind our memory model →

## Architecture
Cowan's working memory model, implemented
- **Sensory Buffer.** The immediate context window: raw input before processing.
- **Working Memory.** The active manipulation space: current task context and associations.
- **Long-Term Memory.** Persistent storage: episodic + semantic, with Hebbian strengthening.
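The hand-off between tiers can be sketched as a toy. Capacities follow the diagram (~7 sensory items, ~4 working-memory chunks, unbounded long-term store); the promotion rules are simplified stand-ins for attention and consolidation, not the real implementation.

```python
from collections import deque

class ThreeTierMemory:
    def __init__(self):
        self.sensory = deque(maxlen=7)   # raw input; oldest items evicted
        self.working = deque(maxlen=4)   # active task context
        self.long_term = []              # persistent store, unbounded

    def perceive(self, item):
        """New input lands in the sensory buffer."""
        self.sensory.append(item)

    def attend(self, item):
        """Attention promotes a sensory item into working memory."""
        if item in self.sensory:
            self.sensory.remove(item)
            self.working.append(item)

    def consolidate(self):
        """Consolidation moves working-memory items into long-term storage."""
        while self.working:
            self.long_term.append(self.working.popleft())

mem = ThreeTierMemory()
for i in range(10):
    mem.perceive(f"event-{i}")   # only the last 7 survive the buffer
mem.attend("event-9")
mem.consolidate()
print(len(mem.sensory), len(mem.working), mem.long_term)
# → 6 0 ['event-9']
```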
```
                 ┌─────────────────────────────────┐
                 │         MCP / API Layer         │
                 │  remember   recall   forget ... │
                 └───────────────┬─────────────────┘
                                 │
                                 ▼
┌───────────────────────────────────────────────────────────────────┐
│                        MEMORY CORE (Rust)                         │
│                                                                   │
│  ┌─────────────┐   ┌─────────────┐   ┌─────────────────────────┐  │
│  │   SENSORY   │   │   WORKING   │   │        LONG-TERM        │  │
│  │   BUFFER    │──▶│   MEMORY    │──▶│         MEMORY          │  │
│  │  ~7 items   │   │  ~4 chunks  │   │        unlimited        │  │
│  │  decay:<1s  │   │ decay:mins  │   │     decay:power-law     │  │
│  └─────────────┘   └─────────────┘   └─────────────────────────┘  │
│         │                 │                      │                │
│         └────attention────┴────consolidation─────┘                │
│                           │                                       │
│                           ▼                                       │
│  ┌────────────────────────────────────────────────────────────┐   │
│  │                    RETRIEVAL SUBSYSTEM                     │   │
│  │                                                            │   │
│  │   VECTOR INDEX      KNOWLEDGE GRAPH      TEMPORAL INDEX    │   │
│  │      (HNSW)            (Hebbian)            (decay)        │   │
│  │                                                            │   │
│  │        │                   │                   │           │   │
│  │        └───────────────────┼───────────────────┘           │   │
│  │                            ▼                               │   │
│  │                      HYBRID RANKING                        │   │
│  │                   vector + graph + time                    │   │
│  └────────────────────────────────────────────────────────────┘   │
│                           │                                       │
│                           ▼                                       │
│  ┌────────────────────────────────────────────────────────────┐   │
│  │                          RocksDB                           │   │
│  │           memories | graph | vectors | episodes            │   │
│  └────────────────────────────────────────────────────────────┘   │
│                                                                   │
└───────────────────────────────────────────────────────────────────┘
                                  │
             ┌────────────────────┴────────────────────┐
             ▼                                         ▼
┌──────────────────────────┐              ┌───────────────────────────┐
│  HEBBIAN CONSOLIDATION   │◀────────────▶│    INTERFERENCE ENGINE    │
│                          │              │                           │
│co-activation strengthens │              │ similar memories compete  │
│   edge.weight += η·Δw    │              │ old decays when new fits  │
└──────────────────────────┘              └───────────────────────────┘
```
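The consolidation rule in the diagram (`edge.weight += η·Δw`) can be sketched as a toy graph. The learning rate, decay factor, and clamping here are illustrative assumptions, not the shipped constants: co-activated pairs strengthen, unused edges weaken during maintenance cycles.

```python
ETA = 0.1      # learning rate (η); illustrative
DECAY = 0.99   # per-cycle decay for unused edges; illustrative

class HebbianGraph:
    def __init__(self):
        self.weights = {}  # (a, b) -> edge weight

    def co_activate(self, a, b, delta=1.0):
        """Fire together, wire together: edge.weight += eta * delta."""
        key = tuple(sorted((a, b)))
        w = self.weights.get(key, 0.0) + ETA * delta
        self.weights[key] = min(w, 1.0)  # clamp to [0, 1]

    def maintenance_cycle(self, active=()):
        """Unused edges weaken; replayed (active) edges are spared."""
        for key in self.weights:
            if key not in active:
                self.weights[key] *= DECAY

g = HebbianGraph()
for _ in range(5):
    g.co_activate("rust", "rocksdb")   # frequently co-accessed pair
g.co_activate("rust", "onnx")          # accessed only once
for _ in range(100):
    g.maintenance_cycle()
print(g.weights)  # the frequent pair keeps a much stronger edge
```

The asymmetry after decay is the whole point: the bond that was reinforced five times survives neglect far better than the one reinforced once.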
## Installation

Get started in seconds.
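One assumed route, since the core is a Rust crate, is building from source; the repository URL comes from the FAQ, and the exact package or binary names may differ from what the project's README documents.

```shell
# Assumed build-from-source route -- check the repository README
# for the published install command.
git clone https://github.com/varun29ankuS/shodh-memory
cd shodh-memory
cargo build --release   # produces the single ~30MB binary
```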
## FAQ

Frequently asked questions about shodh-memory.
**How is this different from a vector database?** Vector databases give you similarity search. Shodh-memory gives you cognition: memories strengthen when accessed together (Hebbian learning), decay naturally over time (power-law forgetting), and form associative networks. It's the difference between storage and memory.
**Does it require an internet connection?** No. Shodh-memory runs 100% offline. The embeddings, vector index, and knowledge graph all run locally. Perfect for edge devices, air-gapped systems, or anywhere you need data privacy.
**How much disk space does it use?** The binary is ~30MB. Models add ~50MB (22MB MiniLM embeddings + 14MB NER model + 14MB ONNX runtime). Each memory entry uses roughly 2-5KB, so a system with 10,000 memories uses approximately 50MB of storage.
**Can it run on edge devices?** Yes. Shodh-memory is designed for edge deployment. It runs on Raspberry Pi Zero, Jetson Nano, industrial PCs, and other resource-constrained devices. Graph lookups are <1μs.
**How does memory decay work?** We use a hybrid model: exponential decay for the first 3 days (consolidation phase), then power-law decay for long-term retention. Memories accessed 10+ times become "potentiated" and decay 10x slower. Based on Wixted & Ebbesen (1991).
**What is Hebbian learning?** "Cells that fire together, wire together." When memories are accessed together, their connection strengthens; when memories compete, interference effects occur. It's how biological brains work, now in your AI agent.
**Is there cloud sync?** No, and that's intentional. Shodh-memory is built for local-first, privacy-preserving AI. Your agent's memories stay on your hardware. If you need multi-device sync, you can replicate the RocksDB storage yourself.
**What languages and interfaces are supported?** The core is Rust. We provide an MCP server (for Claude, Cursor), Python bindings (via PyO3/maturin), and a REST API. The Rust crate can be embedded directly in your application.
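To illustrate the shape of that API, here is a toy in-memory stand-in for the remember/recall/forget verbs exposed by the MCP layer. The class and method signatures below are hypothetical; the real bindings' names and signatures may differ, and real recall ranks by vector + graph + time rather than keyword match.

```python
class ToyMemory:
    """Toy stand-in illustrating the remember/recall/forget surface."""

    def __init__(self):
        self._store = {}
        self._next_id = 0

    def remember(self, text: str, tags=()) -> int:
        """Store a memory, returning its id."""
        self._next_id += 1
        self._store[self._next_id] = (text, set(tags))
        return self._next_id

    def recall(self, query: str):
        """Naive keyword recall (the real system uses hybrid ranking)."""
        q = query.lower()
        return [t for t, _ in self._store.values() if q in t.lower()]

    def forget(self, memory_id: int) -> bool:
        """Drop a memory by id; returns True if it existed."""
        return self._store.pop(memory_id, None) is not None

m = ToyMemory()
a = m.remember("User prefers Rust for systems work", tags=["prefs"])
m.remember("Deploy target is a Raspberry Pi")
print(m.recall("rust"))  # → ['User prefers Rust for systems work']
assert m.forget(a)
```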
**How can I contribute?** Check out github.com/varun29ankuS/shodh-memory. Open issues, submit PRs, or join discussions. The codebase is well-documented with 688+ tests, and all constants have neuroscience citations.
More questions? Ask on GitHub Discussions