
github-copilot: gpt-5.3-codex fails - config models missing /responses support #13487

@sahilchouksey

Description


Problem

When adding new GitHub Copilot models via config (e.g., gpt-5.3-codex), they fail with a "model not supported" error, even though the same models work in VS Code Copilot. For example, the following provider config triggers the error:

```json
{
  "provider": {
    "github-copilot": {
      "models": {
        "gpt-5.3-codex": {
          "modalities": {
            "input": ["text", "image"],
            "output": ["text"]
          }
        }
      }
    }
  }
}
```

Root Cause

Two issues compound to cause this:

  1. Wrong SDK package: Config-defined models fall back to @ai-sdk/openai-compatible, which lacks the responses() method needed to call the /responses endpoint. Models like gpt-5.3-codex require that endpoint, not /chat/completions.

  2. Plugin loading blocked: External auth plugins (like opencode-copilot-auth) that use the correct GitHub App client ID (Iv1.b507a08c87ecfe98) were explicitly skipped from loading in plugin/index.ts.
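To make the first issue concrete, here is a hedged sketch of the SDK-selection behavior described above. The names (`requiresResponsesEndpoint`, `resolveSDK`) and the model-matching patterns are illustrative assumptions, not opencode's actual API:

```typescript
// Illustrative: models assumed to require the /responses endpoint
// rather than /chat/completions. The patterns are hypothetical.
const RESPONSES_ONLY_MODELS: RegExp[] = [/^gpt-5\.\d+-codex$/]

function requiresResponsesEndpoint(modelID: string): boolean {
  return RESPONSES_ONLY_MODELS.some((re) => re.test(modelID))
}

// Sketch of the buggy fallback vs. the intended routing.
// Before the fix: config-defined models always fell back to
// @ai-sdk/openai-compatible, which has no responses() method.
// Intended: copilot models needing /responses should resolve to the
// bundled SDK in ./sdk/copilot/, which implements responses().
function resolveSDK(provider: string, modelID: string): string {
  if (provider === "github-copilot" && requiresResponsesEndpoint(modelID)) {
    return "./sdk/copilot" // bundled @ai-sdk/github-copilot
  }
  return "@ai-sdk/openai-compatible" // /chat/completions only
}
```

Under this sketch, `resolveSDK("github-copilot", "gpt-5.3-codex")` returns the bundled SDK path, while other models keep the openai-compatible fallback.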

Technical Details

  • The bundled @ai-sdk/github-copilot SDK in ./sdk/copilot/ has the responses() method
  • The npm package @ai-sdk/openai-compatible does not
  • The npm override in provider.ts ran BEFORE config processing, so config-defined models never got the correct SDK
  • Line 56 in plugin/index.ts contained if (plugin.includes("opencode-copilot-auth")) continue, which prevented the external auth plugin from loading
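A minimal sketch of the plugin-loading guard described in the last bullet. The loader shape (`loadPlugins` taking a list of plugin names) is a hypothetical simplification of plugin/index.ts:

```typescript
// Hypothetical simplification of the plugin loader loop.
// skipCopilotAuth = true models the buggy guard on line 56;
// the fix is equivalent to passing false (removing the guard).
function loadPlugins(plugins: string[], skipCopilotAuth: boolean): string[] {
  const loaded: string[] = []
  for (const plugin of plugins) {
    // The buggy guard: silently skip the external auth plugin.
    if (skipCopilotAuth && plugin.includes("opencode-copilot-auth")) continue
    loaded.push(plugin)
  }
  return loaded
}
```

With the guard in place, opencode-copilot-auth never reaches the loaded list, so its GitHub App client ID is never used; removing the guard lets it load like any other plugin.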

Related

Fix
