github-copilot: gpt-5.3-codex fails - config models missing /responses support #13487
Closed as duplicate of #13312
Description
Problem
When adding new GitHub Copilot models via config (e.g., gpt-5.3-codex), they fail with a "model not supported" error even though the same model works in VS Code Copilot. Example config:
```json
{
  "provider": {
    "github-copilot": {
      "models": {
        "gpt-5.3-codex": {
          "modalities": {
            "input": ["text", "image"],
            "output": ["text"]
          }
        }
      }
    }
  }
}
```

Root Cause
Two issues compound to cause this:
- Wrong SDK package: config-defined models fall back to `@ai-sdk/openai-compatible`, which lacks the `responses()` method needed for the `/responses` endpoint. Models like `gpt-5.3-codex` require this endpoint, not `/chat/completions`.
- Plugin loading blocked: external auth plugins (such as `opencode-copilot-auth`) that use the correct GitHub App client ID (`Iv1.b507a08c87ecfe98`) were explicitly skipped from loading in `plugin/index.ts`.
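As a sketch of the first issue (names and routing logic here are illustrative, not the actual opencode provider code), the bug amounts to a missing routing decision between the two SDK packages:

```typescript
// Illustrative sketch only -- not the real opencode source.
// Models assumed (for illustration) to support only the /responses endpoint.
const RESPONSES_ONLY = new Set(["gpt-5.3-codex"]);

type SdkChoice = "@ai-sdk/github-copilot" | "@ai-sdk/openai-compatible";

// Buggy behavior: config-defined models always fell through to
// "@ai-sdk/openai-compatible", which has no responses() method.
// Intended behavior: route /responses-only models to the bundled SDK.
function pickSdk(modelId: string): SdkChoice {
  return RESPONSES_ONLY.has(modelId)
    ? "@ai-sdk/github-copilot"
    : "@ai-sdk/openai-compatible";
}
```

With a routing step like this, a config-defined `gpt-5.3-codex` would reach the SDK that actually implements `responses()` instead of failing on `/chat/completions`.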
Technical Details
- The bundled `@ai-sdk/github-copilot` SDK in `./sdk/copilot/` has the `responses()` method
- The npm package `@ai-sdk/openai-compatible` does not
- The npm override in `provider.ts` ran BEFORE config processing, so config-defined models never got the correct SDK
- Line 56 in `plugin/index.ts` contained `if (plugin.includes("opencode-copilot-auth")) continue`, which prevented external plugins from loading
Related
- models.dev PR: feat(github-copilot): add gpt-5.3-codex model models.dev#857
- Discussion thread with affected users reporting the same issue
Fix