
Ollama streaming ignores chat mode - still sends tools to models that don't support them #6117

@blackgirlbytes


Bug Description

When using chat mode with Ollama models that don't support tool calling (e.g., deepseek-coder:6.7b), users still get the error:

Request failed: Bad request (400): registry.ollama.ai/library/deepseek-coder:6.7b does not support tools.

This happens even with chat mode enabled and tools disabled.

Root Cause

In crates/goose/src/providers/ollama.rs, the complete_with_model function correctly filters out tools when in chat mode:

let goose_mode = config.get_goose_mode().unwrap_or(GooseMode::Auto);
// In chat mode, send no tools to the provider.
let filtered_tools = if goose_mode == GooseMode::Chat {
    &[]
} else {
    tools
};
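As a minimal standalone sketch of this filtering rule (with stub GooseMode and Tool types standing in for goose's real ones), the behavior is:

```rust
// Sketch of the chat-mode tool filter; GooseMode and Tool are
// illustrative stubs, not goose's actual types.
#[derive(PartialEq)]
enum GooseMode {
    Auto,
    Chat,
}

struct Tool;

// In Chat mode, return an empty slice; otherwise pass tools through.
fn filter_tools<'a>(mode: &GooseMode, tools: &'a [Tool]) -> &'a [Tool] {
    if *mode == GooseMode::Chat {
        &[]
    } else {
        tools
    }
}

fn main() {
    let tools = [Tool, Tool];
    assert!(filter_tools(&GooseMode::Chat, &tools).is_empty());
    assert_eq!(filter_tools(&GooseMode::Auto, &tools).len(), 2);
    println!("chat mode strips tools: ok");
}
```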

However, the stream function omits this check and always passes tools through unfiltered:

let mut payload = create_request(
    &self.model,
    system,
    messages,
    tools,  // BUG: Not filtered for chat mode
    &super::utils::ImageFormat::OpenAi,
)?;

Since streaming is enabled by default (supports_streaming: true), the streaming code path is used, and tools are sent to Ollama regardless of the chat mode setting.

Fix

Add the same GooseMode::Chat check to the stream function:

async fn stream(
    &self,
    system: &str,
    messages: &[Message],
    tools: &[Tool],
) -> Result<MessageStream, ProviderError> {
    let config = crate::config::Config::global();
    let goose_mode = config.get_goose_mode().unwrap_or(GooseMode::Auto);
    // Apply the same chat-mode filtering as complete_with_model.
    let filtered_tools = if goose_mode == GooseMode::Chat {
        &[]
    } else {
        tools
    };

    let mut payload = create_request(
        &self.model,
        system,
        messages,
        filtered_tools,
        &super::utils::ImageFormat::OpenAi,
    )?;
    // ...
}
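One way to keep the two code paths from diverging again is to extract the check into a shared helper that both complete_with_model and stream call. A hedged sketch of that design, with stubbed types and hypothetical function names (not goose's actual API):

```rust
// Sketch of a shared filtering helper; all names here are hypothetical.
#[derive(Clone, Copy, PartialEq)]
enum GooseMode {
    Auto,
    Chat,
}

struct Tool;

// Single source of truth for the chat-mode rule.
fn tools_for_mode(mode: GooseMode, tools: &[Tool]) -> &[Tool] {
    if mode == GooseMode::Chat {
        &[]
    } else {
        tools
    }
}

// Both entry points route through the helper, so a future code path
// cannot forget the check.
fn complete_tool_count(mode: GooseMode, tools: &[Tool]) -> usize {
    tools_for_mode(mode, tools).len()
}

fn stream_tool_count(mode: GooseMode, tools: &[Tool]) -> usize {
    tools_for_mode(mode, tools).len()
}

fn main() {
    let tools = [Tool, Tool];
    assert_eq!(stream_tool_count(GooseMode::Chat, &tools), 0);
    assert_eq!(complete_tool_count(GooseMode::Chat, &tools), 0);
    assert_eq!(stream_tool_count(GooseMode::Auto, &tools), 2);
    println!("both paths agree: ok");
}
```

Routing both functions through one helper turns a "remember to copy this check" convention into a structural guarantee.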

Steps to Reproduce

  1. Configure goose to use Ollama with a model that doesn't support tools (e.g., deepseek-coder:6.7b)
  2. Enable chat mode
  3. Try to send a message
  4. Observe the 400 error about tools not being supported

Expected Behavior

Chat mode should work with any Ollama model, regardless of tool support, since tools should not be sent in chat mode.

Affected Versions

Branch: current main
Version: 1.16.0+
Provider: Ollama
Model: deepseek-coder:6.7b (any model without tool support)
Mode: Chat
