Local GGUF memory embeddings can deadlock due to Promise.all concurrency (node-llama-cpp) #7547
Description
### Summary

When using local GGUF embeddings for memory search, memory indexing can hang indefinitely, repeating "batch start" with 0 chunks indexed while `Memory index updated (main).` is printed over and over.

The root cause appears to be concurrent embedding calls (via `Promise.all`) against the node-llama-cpp embedding context, which can deadlock/hang when embedding requests are executed concurrently.

### Environment

- OpenClaw: 2026.1.29
- MemorySearch provider: local (GGUF)
- Runtime: node-llama-cpp (via OpenClaw local embeddings)
- Symptom: `Memory Index (main)` startup output repeats without progress:
```
Provider: local (requested: local)
Model: /home/gaius/.openclaw/models/embeddings/nomic-embed-text-v1.5.Q4_K_M.gguf
Sources: memory (MEMORY.md + ~/.openclaw/workspace/memory/*.md)
[memory] sync: indexing memory files
```
`Memory index updated (main).` prints repeatedly and never completes; the index stays at 0 chunks, and the process may eventually get SIGKILLed.

### Fix

Change embedding generation from concurrent to sequential.

Before (deadlocks/hangs): all chunks were embedded at once via `Promise.all`.

After (works): chunks are embedded one at a time in a sequential loop.

### Location

In the installed build this was at:
- (around lines ~54–61)

### Notes

After applying the sequential loop, memory indexing completed successfully (5/5 files, 55 chunks) and began returning results normally.

If desired, I can help test a follow-up improvement (e.g., a limited concurrency of 1 / configurable concurrency) once the deadlock is addressed.
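The shape of the change can be sketched as follows. This is a minimal, self-contained illustration, not the actual OpenClaw code: `FakeEmbeddingContext`, `embedAllConcurrent`, and `embedAllSequential` are hypothetical names, and the fake context stands in for the node-llama-cpp embedding context by flagging any re-entrant call instead of deadlocking.

```typescript
// A stand-in for the embedding context. Like the real one, it must not be
// entered concurrently: here a re-entrant call is detected and rejected
// (the real context hangs instead).
class FakeEmbeddingContext {
  private busy = false;

  async getEmbeddingFor(text: string): Promise<number[]> {
    if (this.busy) throw new Error("re-entrant call: would deadlock");
    this.busy = true;
    await new Promise((resolve) => setTimeout(resolve, 1)); // simulate native work
    this.busy = false;
    return [text.length]; // dummy "embedding"
  }
}

// Before (deadlocks/hangs with the real context): all chunks at once.
async function embedAllConcurrent(ctx: FakeEmbeddingContext, chunks: string[]) {
  return Promise.all(chunks.map((c) => ctx.getEmbeddingFor(c)));
}

// After (works): one chunk at a time, each call fully awaited.
async function embedAllSequential(ctx: FakeEmbeddingContext, chunks: string[]) {
  const embeddings: number[][] = [];
  for (const chunk of chunks) {
    embeddings.push(await ctx.getEmbeddingFor(chunk));
  }
  return embeddings;
}
```

With this stand-in, `embedAllConcurrent` rejects because the second `getEmbeddingFor` starts while the first is still inside its awaited work, whereas `embedAllSequential` completes: each `await` guarantees the previous embedding has finished before the next call begins.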
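The follow-up idea (a concurrency limit of 1) could also be implemented without changing call sites, by serializing access to the single embedding context behind a small async mutex. A hypothetical sketch, with `AsyncMutex` as an illustrative name rather than anything in OpenClaw or node-llama-cpp:

```typescript
// Serializes async work: each run() waits for the previous one to settle,
// so the wrapped resource only ever sees one call at a time even when
// callers fire requests concurrently.
class AsyncMutex {
  private tail: Promise<void> = Promise.resolve();

  run<T>(fn: () => Promise<T>): Promise<T> {
    const result = this.tail.then(fn);
    // Keep the chain alive even if fn rejects, so later calls still run.
    this.tail = result.then(() => undefined, () => undefined);
    return result;
  }
}
```

Wrapping each `getEmbeddingFor` call in `mutex.run(...)` would let the indexer keep its concurrent structure while the embedding context executes strictly sequentially; a counting semaphore would generalize this to a configurable concurrency limit.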