fix(copilot): enable gpt-5.3-codex and other models requiring /responses endpoint #13485

sahilchouksey wants to merge 4 commits into anomalyco:dev
Conversation
Models like gpt-5.3-codex require the /responses endpoint, which only the bundled @ai-sdk/github-copilot SDK supports. Without this fix, config-defined models fall back to @ai-sdk/openai-compatible, which lacks the responses() method, causing 'model not accessible via /chat/completions' errors. This override must run after config processing to catch user-defined models not present in models.dev.
The opencode-copilot-auth npm plugin was being skipped by an explicit check, preventing users from using the external plugin with the correct GitHub App client ID (Iv1.b507a08c87ecfe98) needed for token exchange. This allows the external plugin to load and take precedence over the built-in plugin when specified in opencode.json config, enabling models like gpt-5.3-codex that require proper OAuth token exchange for the /responses endpoint.
Thanks for your contribution! This PR doesn't have a linked issue. All PRs must reference an existing issue. Please see CONTRIBUTING.md for details.
Pull request overview
Enables GitHub Copilot models (including config-defined ones like gpt-5.3-codex) to route through an SDK path that supports the /responses endpoint, and allows the external opencode-copilot-auth plugin to load from config for correct OAuth behavior.
Changes:
- Force all `github-copilot`/`github-copilot-enterprise` models to use `@ai-sdk/github-copilot` after config processing.
- Stop skipping `opencode-copilot-auth` during plugin loading so it can be installed/initialized from config.
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| packages/opencode/src/provider/provider.ts | Post-config override to ensure Copilot models use the bundled Copilot SDK (responses-capable). |
| packages/opencode/src/plugin/index.ts | Allows opencode-copilot-auth to be loaded from config by removing the explicit skip. |
```ts
for (const providerID of ["github-copilot", "github-copilot-enterprise"]) {
  if (database[providerID]) {
    for (const model of Object.values(database[providerID].models)) {
      model.api.npm = "@ai-sdk/github-copilot"
    }
  }
}
```
github-copilot-enterprise is cloned from github-copilot before config provider/model overrides are applied (see earlier creation of database["github-copilot-enterprise"]). That means user-defined/copied models added under provider.github-copilot.models in config will not appear under github-copilot-enterprise, even though this loop tries to patch both providers. Consider rebuilding/refreshing the enterprise provider after config processing (or cloning lazily) so both providers see the same post-config model set.
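The clone-ordering gap the reviewer describes can be sketched with plain objects standing in for the real models.dev database (the shapes here are illustrative assumptions, not opencode's actual code):

```typescript
// Simplified stand-in for the models.dev database; shapes are illustrative.
type Model = { api: { npm: string } }
type Provider = { models: Record<string, Model> }

const database: Record<string, Provider> = {
  "github-copilot": { models: { "gpt-4.1": { api: { npm: "@ai-sdk/openai-compatible" } } } },
}

// 1. The enterprise provider is cloned BEFORE config overrides are applied.
database["github-copilot-enterprise"] = structuredClone(database["github-copilot"])

// 2. Config processing later adds a user-defined model to github-copilot only.
database["github-copilot"].models["gpt-5.3-codex"] = { api: { npm: "@ai-sdk/openai-compatible" } }

// 3. The PR's override loop patches both providers, but the enterprise clone
//    never received the config-defined model in the first place.
for (const providerID of ["github-copilot", "github-copilot-enterprise"]) {
  for (const model of Object.values(database[providerID].models)) {
    model.api.npm = "@ai-sdk/github-copilot"
  }
}
console.log("gpt-5.3-codex" in database["github-copilot-enterprise"].models) // false

// One possible fix, as the reviewer suggests: re-clone (refresh) the enterprise
// provider AFTER config processing so both providers share the post-config model set.
database["github-copilot-enterprise"] = structuredClone(database["github-copilot"])
console.log("gpt-5.3-codex" in database["github-copilot-enterprise"].models) // true
```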
```ts
// Force github-copilot models to use @ai-sdk/github-copilot instead of @ai-sdk/openai-compatible.
// Models like gpt-5.3-codex require the /responses endpoint, which only @ai-sdk/github-copilot supports.
// This must run after config processing to catch user-defined models not present in models.dev.
for (const providerID of ["github-copilot", "github-copilot-enterprise"]) {
  if (database[providerID]) {
    for (const model of Object.values(database[providerID].models)) {
      model.api.npm = "@ai-sdk/github-copilot"
    }
  }
}
```
This change introduces important behavior for config-defined Copilot models (forcing model.api.npm to @ai-sdk/github-copilot after config parsing), but there doesn’t appear to be a unit test covering it. Add a test case similar to the existing “custom model inherits npm package from models.dev provider config” test, but for provider.github-copilot.models (and ideally github-copilot-enterprise too) to prevent regressions back to @ai-sdk/openai-compatible.
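A hedged sketch of such a regression test, written against a simplified stand-in rather than opencode's real test harness (the `applyNpmOverride` helper and data shapes are assumptions introduced for illustration):

```typescript
// Stand-in for the post-config override loop from provider.ts; names are illustrative.
type Model = { api: { npm: string } }
type Database = Record<string, { models: Record<string, Model> }>

function applyNpmOverride(database: Database): void {
  for (const providerID of ["github-copilot", "github-copilot-enterprise"]) {
    if (database[providerID]) {
      for (const model of Object.values(database[providerID].models)) {
        model.api.npm = "@ai-sdk/github-copilot"
      }
    }
  }
}

// Regression test: a config-defined model that defaulted to the
// openai-compatible SDK must be rewritten to the Copilot SDK.
const database: Database = {
  "github-copilot": {
    models: { "gpt-5.3-codex": { api: { npm: "@ai-sdk/openai-compatible" } } },
  },
}
applyNpmOverride(database)
if (database["github-copilot"].models["gpt-5.3-codex"].api.npm !== "@ai-sdk/github-copilot") {
  throw new Error("config-defined Copilot model regressed to @ai-sdk/openai-compatible")
}
console.log("override test passed")
```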
**Additional fix: `x-initiator` override conflict**

Added a third commit to fix an issue discovered during testing.

Problem: the @ai-sdk/github-copilot SDK has its own fetch wrapper that sets `x-initiator` based on message content. Overriding it in the `chat.headers` hook causes "invalid initiator" validation errors from the Copilot API when the message structure doesn't match the forced "agent" initiator.

Solution: skip the `x-initiator` override when the model uses @ai-sdk/github-copilot:

```ts
// Skip x-initiator override when using @ai-sdk/github-copilot
if (incoming.model.api.npm === "@ai-sdk/github-copilot") return
```
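The guard above can be sketched in context as a small runnable example; the hook shape (`incoming`) and helper name are assumptions based on the comment, not opencode's exact plugin API:

```typescript
// Illustrative sketch of the x-initiator guard; `Incoming` and
// `applyInitiatorHeader` are hypothetical names, not opencode's real API.
type Incoming = { model: { api: { npm: string } }; headers: Record<string, string> }

function applyInitiatorHeader(incoming: Incoming, isAgentCall: boolean): void {
  // Skip the override: @ai-sdk/github-copilot's own fetch wrapper already sets
  // x-initiator from message content, and forcing "agent" here can trigger
  // "invalid initiator" validation errors from the Copilot API.
  if (incoming.model.api.npm === "@ai-sdk/github-copilot") return
  incoming.headers["x-initiator"] = isAgentCall ? "agent" : "user"
}

const req: Incoming = { model: { api: { npm: "@ai-sdk/github-copilot" } }, headers: {} }
applyInitiatorHeader(req, true)
console.log("x-initiator" in req.headers) // false: header is left for the SDK to set
```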
Pull request overview
Copilot reviewed 3 out of 3 changed files in this pull request and generated no new comments.
Is this PR going to be merged?

I'm also waiting for this issue to get fixed ASAP, please.

Ran it on a local build today for the entire day, no issues, works nicely. Hope it will be merged by tomorrow 🤞

@thdxr any chance to get this merged soon?

@sahilchouksey can you advise how I can test it? Can you share your model config?
1. Add to your `opencode.json`:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "plugin": [
    "opencode-anthropic-auth@latest",
    "opencode-copilot-auth@latest"
  ],
  "provider": {
    "github-copilot": {
      "models": {
        "gpt-5.3-codex": {
          "modalities": {
            "input": ["text", "image"],
            "output": ["text"]
          }
        }
      }
    }
  }
}
```

2. Run opencode (locally) and select the model.
@sahilchouksey works perfectly, thanks!

Hi @sahilchouksey, I have done these steps, but I'm missing something:

Do you have any idea what I'm missing? FYI, I installed opencode with curl.

@JReis23 you need the changes from this branch. I built it locally (with some other changes which are stuck in PR limbo).
Any update on merging this pull request?
What is the hold-up on getting this merged? It's a big shame not being able to use 5.3-codex.

Yeah, I still don't see 5.3 in IntelliJ, so they are still rolling it out slowly.
The problem with IntelliJ is their plugin.

Not quite. As mentioned in anomalyco/models.dev#912 (comment), the rollout is not complete. The IntelliJ plugin is a plugin just like the VS Code one; it is just another client ID that they decided not to enable yet. Or it might be per user, I don't know. I don't use VS Code, just IntelliJ and opencode.

Respectfully disagree. I have it in VS Code, and it works with the changes in this branch. But I still don't see it in the IDEA plugin (updated). They probably need to whitelist it / add the model definition. Also, this comment is fresher than the statement you're referring to. Not saying OSS devs must react in seconds to fix our problems, but there should be no external blockers to adding the support.

Well yes, 5.3-codex (as seen in this PR) requires the /responses endpoint. So they rolled out 5.3-codex to all users, but not all editors.
@krzyk The routing is already handled in opencode/packages/opencode/src/provider/provider.ts (lines 50 to 60 in 8ebdbe0). The issue isn't endpoint support; it's that GPT-5.3-Codex hasn't been rolled out to third-party clients like opencode (or presumably IntelliJ). GitHub appears to have restricted this model to VS Code only for now. Additionally, the VS Code team has explicitly prohibited opencode from using VS Code's client ID, which means we cannot bypass these client-based restrictions or access the model without their client ID.

Can we just get a comment from the OpenCode team to confirm whether this is the issue?
Copying my response from the issues: we already route to the correct API endpoints. It's not an issue of that; it's the client ID.

@JReis23 You need to merge this PR, then build opencode and install it from the opencode/ folder.
You can also use the official opencode build and instead copy the index.mjs from the opencode-copilot-auth plugin, rename it to index.js, and put it in project/.opencode/plugin/ or the global plugin folder.
Hi, I see the rollout has happened for some other third-party clients. Has the rollout reached opencode?
@sahilchouksey Hey, is there any way to change the thinking effort of 5.3 with your current workaround?

@Khang5687 you can add variants to your config.
Easiest solution for me following this PR:

Hope it saves someone's time!
@rekram1-node any official updates from the GitHub Copilot team about this?
It's obvious Copilot is trying to pull people toward their Copilot CLI.
It seems to be rolled out!
Yes guys, it's rolled out now!
The reason he says it is might be because it is now in the models.dev list. But the gist of it is that it's available through other providers, not Copilot yet.

Closed without comment? WTF?
GPT-5.3-Codex is now officially available in the opencode client. You can use it by adding the model and provider details to your config. The Copilot OAuth workaround is no longer needed. Since there are already multiple open PRs for GPT-5.3-Codex, @rekram1-node please merge one of the existing Codex 5.3 PRs (#1058, #993) so it is available by default for everyone in opencode.

I can confirm adding it manually works.
As a small addition: limits can be added to the config so that opencode correctly displays the percentage of used tokens (and I guess it then also correctly knows when to do compaction).

```json
{
  "provider": {
    "github-copilot": {
      "models": {
        "gpt-5.3-codex": {
          "modalities": {
            "input": ["text", "image"],
            "output": ["text"]
          },
          "limit": {
            "context": 400000,
            "input": 272000,
            "output": 128000
          },
          "variants": {
            "low": {
              "reasoningEffort": "low",
              "reasoningSummary": "auto",
              "include": ["reasoning.encrypted_content"]
            },
            "medium": {
              "reasoningEffort": "medium",
              "reasoningSummary": "auto",
              "include": ["reasoning.encrypted_content"]
            },
            "high": {
              "reasoningEffort": "high",
              "reasoningSummary": "auto",
              "include": ["reasoning.encrypted_content"]
            },
            "xhigh": {
              "reasoningEffort": "xhigh",
              "reasoningSummary": "auto",
              "include": ["reasoning.encrypted_content"]
            }
          }
        }
      }
    }
  }
}
```
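Given limits like these, a client-side usage readout could look like the following sketch (the 90% compaction threshold and the `contextUsage` helper are assumptions for illustration, not opencode's actual logic):

```typescript
// Sketch: deriving a usage percentage and a compaction signal from the
// configured context limit. Threshold and names are illustrative only.
const limit = { context: 400000, input: 272000, output: 128000 }

function contextUsage(tokensUsed: number): { percent: number; shouldCompact: boolean } {
  const percent = Math.round((tokensUsed / limit.context) * 100)
  return { percent, shouldCompact: percent >= 90 } // 90% threshold is an assumption
}

console.log(contextUsage(100000)) // { percent: 25, shouldCompact: false }
console.log(contextUsage(380000)) // { percent: 95, shouldCompact: true }
```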











Summary
- Force `github-copilot` models to use the `@ai-sdk/github-copilot` SDK after config processing

Fixes #13487
Changes
- `provider.ts`: Added an npm override loop after config processing to ensure config-defined github-copilot models use the bundled SDK with `responses()` support instead of falling back to `@ai-sdk/openai-compatible`.
- `plugin/index.ts`: Removed the explicit skip for the `opencode-copilot-auth` plugin, allowing external auth plugins to load when specified in user config.
- `copilot.ts`: Skip the `x-initiator` header override when the model uses the `@ai-sdk/github-copilot` SDK. The bundled SDK has its own fetch wrapper that sets `x-initiator` based on message content; overriding it caused "invalid initiator" validation errors from the Copilot API.