Block version 1.1.* not displaying LLM output when used with GitHub Models #3615

@rzhade3

Description

LLM output is not displayed when using Goose version 1.1.* with the GitHub Models configuration below. The problem persists after validating the configuration file and performing a complete fresh reinstallation.

The same configuration continues to work properly with Goose version 1.0.35.


To Reproduce

Steps to reproduce the behavior:

  1. Ensure the following configuration is used:
    OPENAI_HOST: https://api.githubcopilot.com
    OPENAI_CUSTOM_HEADERS: Copilot-Integration-Id=playground-dev
    OPENAI_BASE_PATH: chat/completions
    GOOSE_MODEL: gpt-4o
    extensions:
      computercontroller:
        bundled: true
        display_name: Computer Controller
        enabled: false
        name: computercontroller
        timeout: 300
        type: builtin
      developer:
        bundled: true
        display_name: Developer
        enabled: true
        name: developer
        timeout: 300
        type: builtin
      jetbrains:
        bundled: true
        display_name: Jetbrains
        enabled: false
        name: jetbrains
        timeout: 300
        type: builtin
      memory:
        bundled: true
        display_name: Memory
        enabled: false
        name: memory
        timeout: 300
        type: builtin
      tutorial:
        bundled: true
        display_name: Tutorial
        enabled: false
        name: tutorial
        timeout: 300
        type: builtin
    GOOSE_MODE: chat
    GOOSE_PROVIDER: openai
  2. Open Goose.
  3. Attempt any task that should display LLM output.
  4. Observe the absence of output in the interface.
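For reference, here is a minimal sketch of how the `OPENAI_*` settings above would plausibly combine into the actual request endpoint and headers. The function names (`build_endpoint`, `parse_custom_headers`) are illustrative, not Goose APIs, and the exact joining/parsing logic inside Goose may differ:

```python
def build_endpoint(host: str, base_path: str) -> str:
    """Join OPENAI_HOST and OPENAI_BASE_PATH into a full request URL."""
    return host.rstrip("/") + "/" + base_path.lstrip("/")

def parse_custom_headers(raw: str) -> dict:
    """Parse OPENAI_CUSTOM_HEADERS entries of the form 'Name=Value'
    (comma-separated if there are several) into a header dict."""
    headers = {}
    for pair in raw.split(","):
        name, _, value = pair.partition("=")
        if name:
            headers[name.strip()] = value.strip()
    return headers

endpoint = build_endpoint("https://api.githubcopilot.com", "chat/completions")
headers = parse_custom_headers("Copilot-Integration-Id=playground-dev")
print(endpoint)  # https://api.githubcopilot.com/chat/completions
print(headers)   # {'Copilot-Integration-Id': 'playground-dev'}
```

Requests built this way succeed on 1.0.35 but produce no visible output on 1.1.*, which suggests the endpoint and headers themselves are valid.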

Expected behavior

LLM output should be displayed when using Goose version 1.1.* with the given configuration.


Screenshots

Screenshot of Goose 1.1.3 behavior:

Note that Goose correctly summarizes the input; however, it does not display or handle any output.


Please provide the following information:

  • OS & Arch: macOS Sequoia 15.5, M1 Max (ARM64)
  • Interface: UI and CLI
  • Version: 1.1.3
  • Extensions enabled: None
  • Provider & Model: OpenAI - gpt-4o (with GitHub Models)
