fix: llm-task tool schema compatibility with llama.cpp + doctor SecretRef crash #35452

jmzlx wants to merge 1 commit into openclaw:main from
Conversation
fix: llm-task tool schema compatibility with llama.cpp + doctor SecretRef crash
Two fixes:
1. llm-task plugin: Type.Unknown() produces JSON Schema without a 'type'
field, which llama.cpp rejects during schema-to-grammar conversion.
Replace with Type.Unsafe<unknown>({type: 'object', ...}) to emit
a concrete type. Fixes openclaw#35443.
2. openclaw doctor: resolved?.remote?.apiKey?.trim() crashes when apiKey
is a SecretRef object (not a string). Guard with typeof check.
Fixes openclaw#35444.
Splitting into separate PRs, one per fix.
**Greptile Summary**

This PR contains two targeted bug fixes: one addressing JSON Schema compatibility for the `llm-task` tool, and one fixing the `openclaw doctor` SecretRef crash.

Both fixes are correct and narrowly scoped. The llm-task fix resolves the reported llama.cpp incompatibility (400 error on schema validation). The doctor fix correctly handles SecretRef objects without triggering a TypeError. No regressions are introduced.

Confidence Score: 5/5
Last reviewed commit: 0909f2b
## Summary

Two small fixes for local LLM backend compatibility and `openclaw doctor` reliability.

### 1. `llm-task` tool schema breaks llama.cpp backends (#35443)

The `llm-task` plugin defines its `input` and `schema` parameters using `Type.Unknown()`, which produces JSON Schema without a `type` field:

```json
{"description": "Optional input payload for the task."}
```

llama.cpp's OpenAI-compatible endpoint rejects this during JSON schema-to-grammar conversion.
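The incompatibility can be mimicked in a minimal, self-contained sketch. This is not the plugin's actual code: the schema literals below are hand-written stand-ins for what `Type.Unknown()` and `Type.Unsafe<unknown>({ type: "object", ... })` serialize to, and `acceptedByGrammarConverter` is a hypothetical stand-in for llama.cpp's acceptance check.

```typescript
// Assumed shapes only; the real plugin builds these via TypeBox.
type JsonSchema = { type?: string; description?: string };

// What Type.Unknown() serializes to: no "type" field at all.
const unknownSchema: JsonSchema = {
  description: "Optional input payload for the task.",
};

// What Type.Unsafe<unknown>({ type: "object", ... }) serializes to.
const unsafeSchema: JsonSchema = {
  type: "object",
  description: "Optional input payload for the task.",
};

// llama.cpp's schema-to-grammar conversion needs a concrete "type";
// this hypothetical check mimics that requirement.
function acceptedByGrammarConverter(schema: JsonSchema): boolean {
  return typeof schema.type === "string";
}

console.log(acceptedByGrammarConverter(unknownSchema)); // → false (rejected, 400)
console.log(acceptedByGrammarConverter(unsafeSchema)); // → true (accepted)
```

The fix keeps the TypeScript-side static type as `unknown` while emitting the explicit `"type": "object"` the grammar converter requires.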
**Fix:** Replace `Type.Unknown()` with `Type.Unsafe<unknown>({ type: "object", ... })` to emit a concrete `type` field. Both parameters accept JSON objects in practice (`input` is a payload, `schema` is a JSON Schema definition).

**Impact:** Any agent with `llm-task` enabled that routes through llama.cpp-based backends (directly or via TensorZero) gets a 400 on every request. Streaming requests return `data: [DONE]` immediately with no content.

### 2. `openclaw doctor` crashes on SecretRef apiKey (#35444)

`doctor-memory-search.ts` calls `.trim()` on `resolved?.remote?.apiKey`, which crashes when the value is a SecretRef object (an env var reference) rather than a resolved string.

**Fix:** Guard with `typeof rawApiKey === "string"` before calling `.trim()`.

## Testing

- llm-task fix verified by sending tool definitions with and without `type` fields to llama-server (Qwen3.5-35B-A3B, Qwen3.5-9B): missing `type` produced a 400 error (reproducible); `type: "object"` returned 200 OK.
- doctor fix verified by confirming the TypeError occurs at the exact line changed, and that the SecretRef is a non-string object at runtime while the gateway resolves it correctly.
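The `typeof` guard from fix 2 can be sketched as follows. The `SecretRef` and `Resolved` shapes and the `hasApiKey` helper are assumptions for illustration, not the project's actual types.

```typescript
// Assumed shapes: a SecretRef points at an env var instead of holding a value.
type SecretRef = { env: string };
type Resolved = { remote?: { apiKey?: string | SecretRef } };

function hasApiKey(resolved: Resolved | undefined): boolean {
  const rawApiKey = resolved?.remote?.apiKey;
  // Before the fix: rawApiKey?.trim() throws a TypeError when rawApiKey is a
  // SecretRef object, since optional chaining only guards null/undefined,
  // not objects lacking a .trim method.
  // After the fix: only call .trim() once narrowed to string.
  return typeof rawApiKey === "string" && rawApiKey.trim().length > 0;
}

console.log(hasApiKey({ remote: { apiKey: " sk-123 " } })); // → true
console.log(hasApiKey({ remote: { apiKey: { env: "OPENAI_API_KEY" } } })); // → false, no crash
console.log(hasApiKey(undefined)); // → false
```

The key point is that `?.` does not protect against calling a string method on a non-string object, so an explicit `typeof` narrowing is required.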
Environment tested on