
Local GGUF memory embeddings can deadlock due to Promise.all concurrency (node-llama-cpp) #7547

@Asentient

Description


### Summary

When using local GGUF embeddings for memory search, memory indexing can hang indefinitely (repeating "batch start" with 0 chunks indexed).

The root cause appears to be concurrent embedding calls against the node-llama-cpp embedding context, which can deadlock or hang when calls are executed concurrently.

### Environment

- OpenClaw: 2026.1.29
- MemorySearch provider: (GGUF)
- Runtime: node-llama-cpp (via OpenClaw local embeddings)
- Symptom: Memory Index (main)

```
Provider: local (requested: local)
Model: /home/gaius/.openclaw/models/embeddings/nomic-embed-text-v1.5.Q4_K_M.gguf
Sources: memory (MEMORY.md + ~/.openclaw/workspace/memory/*.md)

[memory] sync: indexing memory files
```
`Memory index updated (main)` prints repeatedly and never completes; the index stays at 0 chunks, and the process may eventually be SIGKILLed.

### Fix

Change embedding generation from concurrent to sequential.

Before (deadlocks/hangs):

After (works):

### Location

In the installed build this was at:
- (around lines ~54–61)

### Notes

After applying the sequential loop, memory indexing completed successfully (5/5 files, 55 chunks) and began returning results normally.

If desired, I can help test a follow-up improvement (e.g., a limited concurrency of 1 / configurable concurrency) once the deadlock is addressed.
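The original before/after snippets did not survive in this copy of the issue. A minimal sketch of the change being described, using illustrative names (`embedChunks*`, `embed`) rather than the actual OpenClaw source:

```javascript
// Hypothetical sketch of the fix described above. `embed(chunk)` stands in
// for a call into the node-llama-cpp embedding context; the function names
// are illustrative, not OpenClaw's real identifiers.

// Before: every chunk hits the single embedding context at once via
// Promise.all, which is reported to deadlock/hang.
async function embedChunksConcurrent(chunks, embed) {
  return Promise.all(chunks.map((chunk) => embed(chunk)));
}

// After: only one embedding call is in flight at a time, so the
// embedding context is never entered concurrently.
async function embedChunksSequential(chunks, embed) {
  const results = [];
  for (const chunk of chunks) {
    results.push(await embed(chunk));
  }
  return results;
}
```

A follow-up "configurable concurrency" version would replace the plain loop with a worker-pool pattern, but with a single shared embedding context the sequential form above is the safe default.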

Metadata

Assignees: none
Labels: bug (Something isn't working), stale (Marked as stale due to inactivity)
