
CLI headless recipe runs fail with 'Scheduler not available' error on Linux AARCH64 (v1.18.0+) #6405

@blackgirlbytes

Bug Description

When running recipes in headless mode via CLI on Linux AARCH64, tool calls consistently fail with the error:

-32603: Scheduler not available. This tool only works in server mode.

Additionally, some extensions (like mcp-kubernetes) fail with:

-32603: Transport closed

Note: the filesystem extension works fine, and interactive mode works correctly.

Environment

  • OS: DGX Spark Linux AARCH64
  • Mode: CLI headless (goose --recipe <recipe_file>)
  • LLM Provider: Ollama
  • Workload: ~20 recipes with 9-10 subrecipes running concurrently
  • Affected versions: v1.18.0, v1.19.1
  • Working version: v1.15.0

Steps to Reproduce

  1. Install goose v1.18.0 or later on Linux AARCH64
  2. Run a recipe in headless mode: goose --recipe <recipe_file>
  3. Observe the scheduler error when tool calls are made
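The reporter's workload runs many recipes at once, which may matter for reproduction. A hypothetical script along these lines approximates it (recipe filenames are placeholders; only the `goose --recipe` invocation is taken from the report):

```shell
# Launch ~20 recipes concurrently in headless mode.
# Recipe paths are placeholders; drop 'echo' to actually invoke goose.
for i in $(seq 1 20); do
  echo goose --recipe "recipe_$i.yaml" &
done
wait
```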

Expected Behavior

Recipe should execute successfully as it does in:

  • Interactive mode (same system)
  • v1.15.0 headless mode

Actual Behavior

  • Tool calls fail with -32603: Scheduler not available error
  • Some extensions fail with -32603: Transport closed

Relevant Code

The error originates from schedule_tool.rs#L28:

let scheduler = match self.scheduler_service.lock().await.as_ref() {
    Some(s) => s.clone(),
    None => {
        return Err(ErrorData::new(
            ErrorCode::INTERNAL_ERROR,
            "Scheduler not available. This tool only works in server mode.".to_string(),
            None,
        ))
    }
};

Analysis

In CLI headless mode, the scheduler_service is apparently never initialized, so the guard above takes the None branch and every schedule tool call fails. The regression was introduced somewhere between v1.15.0 and v1.18.0.

The root cause is likely a difference in how the agent is constructed in CLI mode versus server mode: the schedule tool expects a scheduler service that only server-mode initialization provides.

The concurrent workload (9-10 subrecipes running simultaneously) and use of Ollama as the LLM provider may also be relevant factors.
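A minimal, self-contained sketch of the suspected failure mode, using stand-in names (Agent, SchedulerService, new, schedule_tool are all hypothetical; goose's real code uses tokio's async Mutex and its own types): the agent holds an Option-wrapped scheduler that only server-mode construction populates, so the guard in schedule_tool.rs always errors for CLI agents.

```rust
use std::sync::{Arc, Mutex};

// Stand-in for goose's scheduler service (hypothetical type).
#[derive(Clone)]
struct SchedulerService;

struct Agent {
    // Some(...) when initialized in server mode; None in CLI headless mode.
    scheduler_service: Arc<Mutex<Option<SchedulerService>>>,
}

impl Agent {
    fn new(server_mode: bool) -> Self {
        // Only server-mode construction populates the scheduler.
        let svc = if server_mode { Some(SchedulerService) } else { None };
        Agent { scheduler_service: Arc::new(Mutex::new(svc)) }
    }

    // Mirrors the guard in schedule_tool.rs: clone the service or bail out.
    fn schedule_tool(&self) -> Result<SchedulerService, String> {
        match self.scheduler_service.lock().unwrap().as_ref() {
            Some(s) => Ok(s.clone()),
            None => Err(
                "Scheduler not available. This tool only works in server mode.".into(),
            ),
        }
    }
}

fn main() {
    let cli_agent = Agent::new(false);   // CLI headless: scheduler never set
    let server_agent = Agent::new(true); // server mode: scheduler populated

    assert!(cli_agent.schedule_tool().is_err());
    assert!(server_agent.schedule_tool().is_ok());
    println!("CLI agent errors; server agent succeeds");
}
```

Under this reading, the fix would be either to initialize the scheduler service in the CLI code path as v1.15.0 apparently did, or to degrade gracefully when it is absent.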

Additional Context

  • Same recipe works perfectly on v1.15.0
  • Interactive mode always works (even on affected versions)
  • Issue is consistent and reproducible
  • User runs ~20 recipes with 9-10 subrecipes concurrently

Reported By

Community member Sp2k4w via Discord

Metadata

Labels: p1 (Priority 1 - High, supports roadmap)