███████╗██╗  ██╗ ██████╗ ██████╗ ██╗  ██╗
██╔════╝██║  ██║██╔═══██╗██╔══██╗██║  ██║
███████╗███████║██║   ██║██║  ██║███████║
╚════██║██╔══██║██║   ██║██║  ██║██╔══██║
███████║██║  ██║╚██████╔╝██████╔╝██║  ██║
╚══════╝╚═╝  ╚═╝ ╚═════╝ ╚═════╝ ╚═╝  ╚═╝
M E M O R Y
> Memory that learns with use

shodh-memory — Persistent Cognitive Memory for AI Agents

Persistent memory for AI agents — memories strengthen with use, decay naturally over time, and form associative networks. Built on Hebbian learning, knowledge graphs, and a three-tier architecture based on Cowan's model. Runs offline, single binary, no cloud required.

[01]

Features

What makes shodh-memory different

hebbian-learning.rs
🧠

Hebbian Learning

Connections that fire together wire together. Frequently accessed associations become permanent through Long-Term Potentiation.

runs-offline.rs
🔌

Runs Offline

Single ~30MB binary with no cloud dependency. Works on Raspberry Pi, Jetson, industrial PCs, air-gapped systems.

sub-millisecond.rs

Sub-Millisecond

Graph lookups in <1μs. Semantic search in 34-58ms. Fast enough for real-time agent decision making.

neuroscience-grounded.rs
🔬

Neuroscience-Grounded

3-tier architecture based on Cowan's working memory model. Hybrid decay (exponential + power-law) from cognitive research.

knowledge-graph.rs
🌐

Knowledge Graph

Not just vector search—includes spreading activation, interference detection, and memory replay during consolidation.

mcp-integration.rs
🔧

MCP Integration

First-class Model Context Protocol support. Works with Claude Code, Cursor, and any MCP-compatible agent.

[02]

Why shodh-memory?

What makes this different from mem0, zep, cognee, and other memory solutions

Not another vector database

Most "memory" solutions are just vector search with a wrapper. Shodh-memory has a knowledge graph, temporal indices, and hybrid ranking. Connections between memories strengthen when accessed together—like biological synapses.

🔒

No cloud required

Mem0, Zep, and others are cloud-first. Shodh-memory is a single ~30MB binary. No API keys, no Docker, no external dependencies. Your agent's memory runs on your hardware.

🧠

Memory that gets smarter

Static storage forgets nothing and learns nothing. Shodh-memory uses Hebbian learning—frequently co-accessed memories form stronger bonds. Rarely used knowledge decays naturally. Just like your brain.

🔍

Knows what it doesn't know

Claude.md files get compacted. Rules degrade to suggestions at 60-70% enforcement. Shodh's knowledge graph detects blind spots, shallow knowledge, orphaned clusters, and stale zones — surfacing what your agent has forgotten before it matters.

📡

Edge-first architecture

Designed for robots, IoT, air-gapped systems. Runs on Raspberry Pi Zero. Sub-microsecond graph lookups. Your drone doesn't need WiFi to remember.

comparison.md
Feature checklist: shodh vs mem0, zep, and cognee. Shodh-memory ships every item below:
Runs fully offline
Single binary, no Docker
Hebbian learning
Knowledge graph
Memory decay model
Blind spot detection
Runs on Raspberry Pi
Sub-millisecond lookup
Open source

Others give you storage. We give you cognition.

See the full comparison →
[03]

Context Durability

How long do memories actually last? Here's the science.

retention_curve.rs
// Power-law decay: memories never truly hit zero
Time       Normal   Potentiated
Day 1      50%      70%
Day 7      35%      55%
Day 30     18%      40%
Day 90     10%      28%
Day 365    >1%      >5%

Potentiated = accessed 10+ times
decay_model.txt

Strength
  100% |*
       | *
   70% |  *  <- Potentiated (10+ accesses)
       |   *____
   50% |    *   \____
       |         *   \________
   30% |              *       \___________
       |                                  \____
   10% |----------------------------------------\__
       |  Normal decay                            *
    1% |--------------------------------------------*-
       +----+----+----+----+----+----+----+----+----+->
           3d   7d   14d  30d  60d  90d  180d 365d
                        Time

  [====] Exponential   [----] Power-law (heavy tail)
         (0-3 days)           (3+ days)

Hybrid Decay Model

Exponential decay for the first 3 days (consolidation phase), then power-law for long-term retention. Memories never truly hit zero.

Long-Term Potentiation

Memories accessed 10+ times become "potentiated" and decay 10x slower. Effective half-life jumps from 14 days to ~140 days.

Based on: Bi & Poo (1998)
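The two mechanisms above compose into a single retention function. Here is a minimal Rust sketch; the constants (rate, tail exponent, consolidation window) are illustrative assumptions, not the crate's actual parameters.

```rust
const CONSOLIDATION_DAYS: f64 = 3.0;
const LAMBDA: f64 = 0.23;        // exponential rate during consolidation (assumed)
const TAIL_EXPONENT: f64 = 0.8;  // power-law tail exponent (assumed)
const LTP_SLOWDOWN: f64 = 10.0;  // potentiated memories decay 10x slower

/// Retention in [0, 1] after `days`, for a normal or potentiated memory.
fn retention(days: f64, potentiated: bool) -> f64 {
    // LTP stretches the time axis: potentiated memories decay 10x slower.
    let t = if potentiated { days / LTP_SLOWDOWN } else { days };
    if t <= CONSOLIDATION_DAYS {
        // Consolidation phase: exponential decay.
        (-LAMBDA * t).exp()
    } else {
        // Long-term phase: heavy power-law tail, continuous at the boundary.
        // Strength approaches zero but never reaches it.
        let at_boundary = (-LAMBDA * CONSOLIDATION_DAYS).exp();
        at_boundary * (t / CONSOLIDATION_DAYS).powf(-TAIL_EXPONENT)
    }
}

fn main() {
    for days in [1.0, 7.0, 30.0, 90.0, 365.0] {
        println!(
            "day {:>3}: normal {:.1}%  potentiated {:.1}%",
            days,
            100.0 * retention(days, false),
            100.0 * retention(days, true),
        );
    }
}
```

The power-law branch is what keeps year-old memories above zero: with these constants a normal memory still retains roughly 1% strength at day 365.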

Hebbian Strengthening

Co-accessed memories form stronger bonds. Fire together, wire together. Associations strengthen with use, weaken with neglect.
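A Hebbian edge update can be sketched in a few lines. The names, learning rate, and decay factor below are illustrative assumptions, not the crate's real API.

```rust
const ETA: f64 = 0.1; // Hebbian learning rate (assumed)

#[derive(Debug, Clone, Copy)]
struct Edge {
    weight: f64, // association strength in [0, 1]
}

/// Co-activation strengthens the edge. The (1 - weight) term makes the
/// update saturate toward 1.0 instead of growing without bound.
fn co_activate(edge: &mut Edge) {
    edge.weight += ETA * (1.0 - edge.weight);
}

/// Neglect weakens the edge multiplicatively toward 0.
fn neglect(edge: &mut Edge, factor: f64) {
    edge.weight *= factor;
}

fn main() {
    let mut e = Edge { weight: 0.2 };
    for _ in 0..10 {
        co_activate(&mut e); // repeated co-access -> stronger bond
    }
    println!("after 10 co-activations: {:.3}", e.weight); // ~0.72
    neglect(&mut e, 0.9);
    println!("after one decay step:    {:.3}", e.weight);
}
```

The saturating update mirrors the `edge.weight += η·Δw` rule shown in the architecture diagram: frequent co-access drives weights toward a ceiling, while unused edges drift back down.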

Memory Replay

During maintenance cycles, important memories are replayed and strengthened—mimicking hippocampal replay during sleep.

Memories accessed 10+ times become potentiated and near-permanent.

Even rarely-accessed memories retain >1% strength after a year due to power-law decay.

See the research behind our memory model →
[04]

Architecture

Cowan's working memory model, implemented

👁

Sensory Buffer

Capacity: ~7 items
Decay: < 1 second

Immediate context window. Raw input before processing.

💭

Working Memory

Capacity: ~4 chunks
Decay: Minutes

Active manipulation space. Current task context and associations.

🧠

Long-Term Memory

Capacity: Unlimited
Decay: Power-law

Persistent storage. Episodic + semantic with Hebbian strengthening.
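The three tiers above can be written down as plain data. This is an illustrative sketch of the configuration, not the crate's API; the working-memory window of 5 minutes is an assumed value.

```rust
#[derive(Debug)]
enum Decay {
    Seconds(f64),
    Minutes(f64),
    PowerLaw,
}

#[derive(Debug)]
struct Tier {
    name: &'static str,
    capacity: Option<usize>, // None = unlimited
    decay: Decay,
}

// Capacities and decay regimes follow Cowan's model as described above.
const TIERS: [Tier; 3] = [
    Tier { name: "sensory buffer",   capacity: Some(7), decay: Decay::Seconds(1.0) },
    Tier { name: "working memory",   capacity: Some(4), decay: Decay::Minutes(5.0) }, // window assumed
    Tier { name: "long-term memory", capacity: None,    decay: Decay::PowerLaw },
];

fn main() {
    for t in &TIERS {
        println!("{:<16} capacity={:?} decay={:?}", t.name, t.capacity, t.decay);
    }
}
```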

architecture.rs

                    ┌─────────────────────────────────┐
                    │         MCP / API Layer         │
                    │  remember  recall  forget  ...  │
                    └───────────────┬─────────────────┘
                                    │
                                    ▼
┌───────────────────────────────────────────────────────────────────┐
│                        MEMORY CORE (Rust)                         │
│                                                                   │
│  ┌─────────────┐   ┌─────────────┐   ┌─────────────────────────┐  │
│  │   SENSORY   │   │   WORKING   │   │       LONG-TERM         │  │
│  │   BUFFER    │──▶│   MEMORY    │──▶│        MEMORY           │  │
│  │   ~7 items  │   │  ~4 chunks  │   │      unlimited          │  │
│  │  decay:<1s  │   │  decay:mins │   │    decay:power-law      │  │
│  └─────────────┘   └─────────────┘   └─────────────────────────┘  │ 
│         │                 │                      │                │
│         └────attention────┴────consolidation─────┘                │
│                                  │                                │
│                                  ▼                                │
│  ┌────────────────────────────────────────────────────────────┐   │
│  │                    RETRIEVAL SUBSYSTEM                     │   │
│  │                                                            │   │
│  │   VECTOR INDEX      KNOWLEDGE GRAPH      TEMPORAL INDEX    │   │
│  │      (HNSW)          (Hebbian)            (decay)          │   │
│  │                                                            │   │
│  │        │                  │                   │            │   │
│  │        └──────────────────┼───────────────────┘            │   │
│  │                           ▼                                │   │
│  │                    HYBRID RANKING                          │   │
│  │               vector + graph + time                        │   │
│  └────────────────────────────────────────────────────────────┘   │
│                                  │                                │
│                                  ▼                                │
│  ┌────────────────────────────────────────────────────────────┐   │
│  │                         RocksDB                            │   │
│  │         memories | graph | vectors | episodes              │   │
│  └────────────────────────────────────────────────────────────┘   │
│                                                                   │
└───────────────────────────────────────────────────────────────────┘
                                    │
              ┌─────────────────────┴─────────────────────┐
              ▼                                           ▼
┌──────────────────────────┐            ┌───────────────────────────┐
│  HEBBIAN CONSOLIDATION   │◀────────▶ │   INTERFERENCE ENGINE     │
│                          │            │                           │
│co-activation strengthens │            │  similar memories compete │
│  edge.weight += η·Δw     │            │  old decays when new fits │
└──────────────────────────┘            └───────────────────────────┘
Vector Index
HNSW for semantic similarity
Knowledge Graph
Entities + relationships + spreading activation
Temporal Index
Time-based retrieval and decay
Episode Manager
Conversation threading and context
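The hybrid ranking stage blends the three retrieval signals into one score. A minimal sketch, assuming normalized scores and example weights (field names and weights are illustrative, not the crate's API):

```rust
struct Candidate {
    id: u64,
    vector_sim: f64, // cosine similarity from the HNSW index, in [0, 1]
    activation: f64, // spreading-activation score from the graph, in [0, 1]
    recency: f64,    // temporal-index score after decay, in [0, 1]
}

/// Weighted blend; the weights sum to 1.0 and trade semantic match
/// against associative and temporal relevance.
fn hybrid_score(c: &Candidate) -> f64 {
    0.5 * c.vector_sim + 0.3 * c.activation + 0.2 * c.recency
}

/// Sort best-first by the blended score.
fn rank(mut candidates: Vec<Candidate>) -> Vec<Candidate> {
    candidates.sort_by(|a, b| hybrid_score(b).total_cmp(&hybrid_score(a)));
    candidates
}

fn main() {
    let ranked = rank(vec![
        Candidate { id: 1, vector_sim: 0.9, activation: 0.1, recency: 0.2 },
        Candidate { id: 2, vector_sim: 0.6, activation: 0.9, recency: 0.8 },
    ]);
    for c in &ranked {
        println!("memory {} -> score {:.2}", c.id, hybrid_score(c));
    }
}
```

Note how a memory with moderate semantic similarity can outrank a closer vector match when its graph activation and recency are strong: that is the point of blending rather than ranking on vectors alone.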
[05]

Installation

Get started in seconds

terminal
$ npx -y @shodh/memory-mcp # Run with npx (recommended)

# Or add to Claude Code:
$ claude mcp add shodh-memory npx -y @shodh/memory-mcp
[06]

Try It

Type commands to interact with a simulated memory system

shodh-memory --interactive
shodh-memory v0.1.90 - Cognitive Memory System
Type "help" for available commands
$

Ready to add memory to your AI agents?

[07]

FAQ

Frequently asked questions about shodh-memory

How is shodh-memory different from a vector database?

Vector databases give you similarity search. Shodh-memory gives you cognition—memories strengthen when accessed together (Hebbian learning), decay naturally over time (power-law forgetting), and form associative networks. It's the difference between storage and memory.

Does it need an internet connection?

No. Shodh-memory runs 100% offline. The embeddings, vector index, knowledge graph—everything runs locally. Perfect for edge devices, air-gapped systems, or anywhere you need data privacy.

How much disk space does it use?

The binary is ~30MB. Models add ~50MB (22MB MiniLM embeddings + 14MB NER model + 14MB ONNX runtime). Each memory entry uses roughly 2-5KB, so a system with 10,000 memories uses approximately 50MB of storage.

Can it run on edge devices?

Yes. Shodh-memory is designed for edge deployment. It runs on Raspberry Pi Zero, Jetson Nano, industrial PCs, and other resource-constrained devices. Graph lookups are <1μs.

How does memory decay work?

We use a hybrid model: exponential decay for the first 3 days (consolidation phase), then power-law decay for long-term retention. Memories accessed 10+ times become "potentiated" and decay 10x slower. Based on Wixted & Ebbesen (1991).

What is Hebbian learning?

"Cells that fire together, wire together." When memories are accessed together, their connection strengthens. When memories compete, interference effects occur. It's how biological brains work, now in your AI agent.

Does it sync across devices?

No, and that's intentional. Shodh-memory is built for local-first, privacy-preserving AI. Your agent's memories stay on your hardware. If you need multi-device sync, you can replicate the RocksDB storage yourself.

What languages and interfaces are supported?

The core is Rust. We provide an MCP server (for Claude, Cursor), Python bindings (via PyO3/maturin), and a REST API. The Rust crate can be embedded directly in your application.

How can I contribute?

Check out github.com/varun29ankuS/shodh-memory. Open issues, submit PRs, or join discussions. The codebase is well documented with 688+ tests, and all constants have neuroscience citations.