
fix(copilot): enable gpt-5.3-codex and other models requiring /responses endpoint #13485

Closed

sahilchouksey wants to merge 4 commits into anomalyco:dev from sahilchouksey:fix/copilot-responses-endpoint-routing

Conversation


@sahilchouksey commented Feb 13, 2026

Summary

  • Force all github-copilot models to use @ai-sdk/github-copilot SDK after config processing
  • Allow external copilot auth plugins to load from config
  • Skip x-initiator override when using bundled SDK to prevent validation errors

Fixes #13487

Changes

provider.ts

Added an npm override loop after config processing to ensure config-defined github-copilot models use the bundled SDK with responses() support instead of falling back to @ai-sdk/openai-compatible.

plugin/index.ts

Removed explicit skip for opencode-copilot-auth plugin, allowing external auth plugins to load when specified in user config.

copilot.ts

Skip the x-initiator header override when the model uses the @ai-sdk/github-copilot SDK. The bundled SDK has its own fetch wrapper that sets x-initiator based on message content; overriding it caused "invalid initiator" validation errors from the Copilot API.
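A minimal sketch of that guard, assuming a hook shape roughly like the one in copilot.ts (the `applyInitiatorOverride` name, `Incoming` type, and `isSubagent` flag are illustrative, not opencode's exact API):

```typescript
// Illustrative sketch of the chat.headers hook guard. The `Incoming` shape
// and the function name are assumptions for this example, not opencode's
// real types.
type Incoming = { model: { api: { npm: string } } }

function applyInitiatorOverride(
  incoming: Incoming,
  headers: Record<string, string>,
  isSubagent: boolean,
): void {
  // The bundled SDK's fetch wrapper sets x-initiator itself based on
  // message content; forcing it here caused "invalid initiator" errors,
  // so bail out when that SDK is in use.
  if (incoming.model.api.npm === "@ai-sdk/github-copilot") return
  if (isSubagent) headers["x-initiator"] = "agent"
}
```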

…esponses endpoint support

Models like gpt-5.3-codex require the /responses endpoint, which only the bundled
@ai-sdk/github-copilot SDK supports. Without this fix, config-defined models fall
back to @ai-sdk/openai-compatible which lacks the responses() method, causing
'model not accessible via /chat/completions' errors.

This override must run after config processing to catch user-defined models
not present in models.dev.

The opencode-copilot-auth npm plugin was being skipped by an explicit check,
preventing users from using the external plugin with the correct GitHub App
client ID (Iv1.b507a08c87ecfe98) needed for token exchange.

This allows the external plugin to load and take precedence over the built-in
plugin when specified in opencode.json config, enabling models like gpt-5.3-codex
that require proper OAuth token exchange for the /responses endpoint.
Copilot AI review requested due to automatic review settings February 13, 2026 14:19
@github-actions
Contributor

Thanks for your contribution!

This PR doesn't have a linked issue. All PRs must reference an existing issue.

Please:

  1. Open an issue describing the bug/feature (if one doesn't exist)
  2. Add Fixes #<number> or Closes #<number> to this PR description

See CONTRIBUTING.md for details.

@sahilchouksey
Author

(screenshot)

Contributor

Copilot AI left a comment


Pull request overview

Enables GitHub Copilot models (including config-defined ones like gpt-5.3-codex) to route through an SDK path that supports the /responses endpoint, and allows the external opencode-copilot-auth plugin to load from config for correct OAuth behavior.

Changes:

  • Force all github-copilot / github-copilot-enterprise models to use @ai-sdk/github-copilot after config processing.
  • Stop skipping opencode-copilot-auth during plugin loading so it can be installed/initialized from config.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

Reviewed files:

  • packages/opencode/src/provider/provider.ts: Post-config override to ensure Copilot models use the bundled Copilot SDK (responses-capable).
  • packages/opencode/src/plugin/index.ts: Allows opencode-copilot-auth to be loaded from config by removing the explicit skip.


Comment on lines +843 to +849
for (const providerID of ["github-copilot", "github-copilot-enterprise"]) {
  if (database[providerID]) {
    for (const model of Object.values(database[providerID].models)) {
      model.api.npm = "@ai-sdk/github-copilot"
    }
  }
}

Copilot AI Feb 13, 2026


github-copilot-enterprise is cloned from github-copilot before config provider/model overrides are applied (see earlier creation of database["github-copilot-enterprise"]). That means user-defined/copied models added under provider.github-copilot.models in config will not appear under github-copilot-enterprise, even though this loop tries to patch both providers. Consider rebuilding/refreshing the enterprise provider after config processing (or cloning lazily) so both providers see the same post-config model set.
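A sketch of what the reviewer's suggestion could look like, using plain mock objects (`applyCopilotOverrides`, `Provider`, and `Model` are illustrative names over a simplified `database` shape, not opencode's real internals):

```typescript
// Sketch: clone github-copilot into github-copilot-enterprise AFTER config
// processing, then force the bundled SDK, so both providers share the
// post-config model set. The types below are simplified mocks.
type Model = { api: { npm: string } }
type Provider = { models: Record<string, Model> }

function applyCopilotOverrides(database: Record<string, Provider>): void {
  // Re-clone lazily so user-defined models added under github-copilot by
  // config also appear under github-copilot-enterprise.
  if (database["github-copilot"]) {
    database["github-copilot-enterprise"] = structuredClone(database["github-copilot"])
  }
  for (const providerID of ["github-copilot", "github-copilot-enterprise"]) {
    const provider = database[providerID]
    if (!provider) continue
    for (const model of Object.values(provider.models)) {
      model.api.npm = "@ai-sdk/github-copilot"
    }
  }
}
```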

Comment on lines +840 to +849
// Force github-copilot models to use @ai-sdk/github-copilot instead of @ai-sdk/openai-compatible.
// Models like gpt-5.3-codex require the /responses endpoint, which only @ai-sdk/github-copilot supports.
// This must run after config processing to catch user-defined models not present in models.dev.
for (const providerID of ["github-copilot", "github-copilot-enterprise"]) {
  if (database[providerID]) {
    for (const model of Object.values(database[providerID].models)) {
      model.api.npm = "@ai-sdk/github-copilot"
    }
  }
}

Copilot AI Feb 13, 2026


This change introduces important behavior for config-defined Copilot models (forcing model.api.npm to @ai-sdk/github-copilot after config parsing), but there doesn’t appear to be a unit test covering it. Add a test case similar to the existing “custom model inherits npm package from models.dev provider config” test, but for provider.github-copilot.models (and ideally github-copilot-enterprise too) to prevent regressions back to @ai-sdk/openai-compatible.

The @ai-sdk/github-copilot SDK has its own fetch wrapper that sets
x-initiator based on message content. Overriding it in the chat.headers
hook causes 'invalid initiator' validation errors from Copilot API
when the message structure doesn't match the forced 'agent' initiator.
@sahilchouksey
Author

Additional Fix: x-initiator Override Conflict

Added a third commit to fix an issue discovered during testing:

Problem: When using @ai-sdk/github-copilot SDK with subagent sessions, the chat.headers hook was overriding x-initiator to agent, conflicting with the SDK's fetch wrapper which sets it based on message content. This caused "invalid initiator" validation errors from Copilot API.

Solution: Skip the x-initiator override when the model uses @ai-sdk/github-copilot SDK, letting the SDK's message-based logic handle it correctly.

// Skip x-initiator override when using @ai-sdk/github-copilot
if (incoming.model.api.npm === "@ai-sdk/github-copilot") return

Files changed:

  • copilot.ts - Skip x-initiator override for bundled SDK

Contributor

Copilot AI left a comment


Pull request overview

Copilot reviewed 3 out of 3 changed files in this pull request and generated no new comments.



@JReis23

JReis23 commented Feb 16, 2026

Is this PR going to be merged?

@RagnarokButMemorySafe

I'm also waiting for this issue to be fixed ASAP, please.

@antonio-ivanovski

Ran on a local build today for the entire day, no issues, it works nicely. Hope it will be merged by tomorrow 🤞

@NicolaiOksen

@thdxr any chance to get this merged soon?

@roman-makarov-yunex

@sahilchouksey can you advise how I can test it? Can you share your model config?

@sahilchouksey
Author

@sahilchouksey can you advise how I can test it? Can you share your model config?

1. Add to your opencode.json:

{
  "$schema": "https://opencode.ai/config.json",
  "plugin": [
    "opencode-anthropic-auth@latest",
    "opencode-copilot-auth@latest"
  ],
  "provider": {
    "github-copilot": {
      "models": {
        "gpt-5.3-codex": {
          "modalities": {
            "input": ["text", "image"],
            "output": ["text"]
          }
        }
      }
    }
  }
}

2. Run opencode (locally) and select gpt-5.3-codex as your model.

Note: You must use the opencode-copilot-auth plugin (not the built-in Copilot auth) because gpt-5.3-codex isn't available yet with opencode's OAuth app; only VS Code's OAuth flow has access currently.

@roman-makarov-yunex

@sahilchouksey works perfectly, thanks

@JReis23

JReis23 commented Feb 17, 2026

Hi @sahilchouksey ,

I have done these steps, but I'm missing something:

(screenshots)

Do you have any idea what I'm missing?

FYI, I've installed opencode with curl.

@roman-makarov-yunex

@JReis23 you need the changes from this branch. I built it locally (along with some other changes that are stuck in PR limbo).

@MoonFuji

Any update on merging this pull request?

@NicolaiOksen

What is the hold up on getting this merged? It's a big shame not being able to use 5.3-codex.

@krzyk

krzyk commented Feb 19, 2026

For those longing for 5.3: there is a workaround posted above that also explains that this PR isn't the only thing you need: #13485 (comment)

A bit more details: anomalyco/models.dev#912 (comment)

Yeah, I still don't see 5.3 in IntelliJ, so they are still rolling it out slowly.

@roman-makarov-yunex

The problem with IntelliJ is their plugin.
They said on Reddit that the rollout of 5.3-codex is finished and it should be available everywhere (if you have enabled it in Copilot features). You can check the Copilot CLI or the VS Code plugin for reference.

@krzyk

krzyk commented Feb 19, 2026

Not quite; as mentioned in anomalyco/models.dev#912 (comment), the rollout is not complete. The IntelliJ plugin is also a plugin, just like the VS Code one. It is just another client ID that they decided not to enable yet. Or it might be per user; I don't know, I don't use VS Code, just IntelliJ and opencode.

@roman-makarov-yunex


Respectfully disagree. I have it in VS Code, and it works with the changes in this branch. But I still don't see it in the IDEA plugin (updated). They probably need to whitelist it / add the model definition.

Also, this comment is fresher than the statement you're referring to:
https://www.reddit.com/r/GithubCopilot/comments/1r5rxgc/comment/o5nftxb/

Not saying OSS devs must react in seconds to fix our problems, but there should be no external blockers to adding the support.

@krzyk

krzyk commented Feb 19, 2026

Well yes, 5.3-codex (as seen in this PR) requires /responses. I'm not sure if that is a UI for the user to select answers or some other API endpoint, but this probably makes it a bit harder to implement in IntelliJ as well as in opencode.

So they rolled out 5.3-codex to all users, but not all editors.

@sahilchouksey
Author

sahilchouksey commented Feb 19, 2026


@krzyk The /responses endpoint is actually already implemented in opencode's codebase:

function isGpt5OrLater(modelID: string): boolean {
  const match = /^gpt-(\d+)/.exec(modelID)
  if (!match) {
    return false
  }
  return Number(match[1]) >= 5
}

function shouldUseCopilotResponsesApi(modelID: string): boolean {
  return isGpt5OrLater(modelID) && !modelID.startsWith("gpt-5-mini")
}
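For example, these predicates (copied here so the snippet is self-contained and runnable) route gpt-5.x models to /responses while excluding gpt-5-mini:

```typescript
// Copied from the snippet above for a self-contained example.
function isGpt5OrLater(modelID: string): boolean {
  const match = /^gpt-(\d+)/.exec(modelID)
  if (!match) {
    return false
  }
  return Number(match[1]) >= 5
}

function shouldUseCopilotResponsesApi(modelID: string): boolean {
  return isGpt5OrLater(modelID) && !modelID.startsWith("gpt-5-mini")
}

console.log(shouldUseCopilotResponsesApi("gpt-5.3-codex")) // true
console.log(shouldUseCopilotResponsesApi("gpt-5-mini"))    // false
console.log(shouldUseCopilotResponsesApi("gpt-4.1"))       // false
```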

The issue isn't endpoint support; it's that GPT-5.3-Codex hasn't been rolled out to third-party clients like opencode (or presumably IntelliJ). GitHub appears to have restricted this model to VS Code only for now.

Additionally, the VS Code team has explicitly prohibited opencode from using VS Code's client ID, which means we cannot bypass these client-based restrictions or access the model without their client ID.

This is just a workaround that may not get merged due to the restriction on using their client ID.

@sweepies


Can we just get a comment by the OpenCode team to confirm whether this is the issue?

@rekram1-node
Collaborator

rekram1-node commented Feb 20, 2026

Copying my response from the issue:

Yeah, it's not supported in opencode yet; it hasn't been fully rolled out.

It should roll out soon.

You may ask: "I can use it in the GitHub Copilot CLI but not in opencode, why is that?"

This is because we have a separate client ID from the official Copilot CLI, so they haven't rolled it out to all clients quite yet.

We already route to the correct API endpoints; it's not an issue of that, it's the client ID.

@Khang5687

Hi @sahilchouksey ,

Do you have any idea what I'm missing?

FYI, I've installed opencode with curl.

@JReis23 You need the changes from this PR. Clone opencode, then merge it:

git clone https://github.com/anomalyco/opencode.git
cd opencode
git fetch origin pull/13485/head:pr-13485
git merge pr-13485 --no-edit

Then build opencode and install it from the opencode/ folder.

@dedonnodev

You can also use the official opencode build: copy index.mjs from the opencode-copilot-auth plugin, rename it to index.js, and put it in project/.opencode/plugin/ or the global folder.

@antonio-ivanovski

The best way that works for me is cloning the branch with the fix and running a dev server in "serve" mode (bun run dev serve). I can then attach any opencode client to it; for example, you can have the "prod" OpenCode Desktop, click "Status" in the top bar, add the dev server with the fix, and use it.

(screenshot)

@TheAnig

TheAnig commented Feb 24, 2026

Hi, I see the rollout has happened for some other third-party clients; has the rollout completed for opencode?

@Khang5687

@sahilchouksey Hey, any way we can change the thinking effort of 5.3 with your current workaround?
The official 5.2-codex model can change the thinking effort.

(screenshots)

@sahilchouksey
Author

@sahilchouksey Hey any way we can change the thinking effort of 5.3 with your current workaround? The official 5.2 codex model can change the thinking effort.

(screenshots)

@Khang5687 you can add variants to your opencode.json config

"github-copilot": {
  "models": {
    "gpt-5.3-codex": {
      "modalities": {
        "input": ["text", "image"],
        "output": ["text"]
      },
      "variants": {
        "low": {
          "reasoningEffort": "low",
          "reasoningSummary": "auto",
          "include": ["reasoning.encrypted_content"]
        },
        "medium": {
          "reasoningEffort": "medium",
          "reasoningSummary": "auto",
          "include": ["reasoning.encrypted_content"]
        },
        "high": {
          "reasoningEffort": "high",
          "reasoningSummary": "auto",
          "include": ["reasoning.encrypted_content"]
        },
        "xhigh": {
          "reasoningEffort": "xhigh",
          "reasoningSummary": "auto",
          "include": ["reasoning.encrypted_content"]
        }
      }
    }
  }
}

@matheusfalcaopinto

Easiest solution for me following this PR:

1. Clone, fetch, and merge the opencode repo with this PR:

git clone https://github.com/anomalyco/opencode.git
cd opencode
git fetch origin pull/13485/head:pr-13485
git merge pr-13485 --no-edit
bun install

2. Edit opencode.jsonc, adding (with reasoning variants):

"plugin": [
  "opencode-anthropic-auth@latest",
  "opencode-copilot-auth@latest"
],
"provider": {
  "github-copilot": {
    "models": {
      "gpt-5.3-codex": {
        "modalities": {
          "input": ["text", "image"],
          "output": ["text"]
        },
        "variants": {
          "low": {
            "reasoningEffort": "low",
            "reasoningSummary": "auto",
            "include": ["reasoning.encrypted_content"]
          },
          "medium": {
            "reasoningEffort": "medium",
            "reasoningSummary": "auto",
            "include": ["reasoning.encrypted_content"]
          },
          "high": {
            "reasoningEffort": "high",
            "reasoningSummary": "auto",
            "include": ["reasoning.encrypted_content"]
          },
          "xhigh": {
            "reasoningEffort": "xhigh",
            "reasoningSummary": "auto",
            "include": ["reasoning.encrypted_content"]
          }
        }
      }
    }
  }
}

3. Run the merged opencode auth and server:

bun run dev auth login
bun run dev serve

4. Attach the TUI with a normal opencode anywhere:

opencode attach <url>

Hope it saves someone's time!

@Waishnav

@rekram1-node any official updates from GitHub copilot team about this?

@dlukt

dlukt commented Feb 25, 2026

It's obvious Copilot is trying to pull people into using their Copilot CLI.
Today Copilot CLI GA was announced, half-baked at best.
"Not fully rolled out" is obviously fake news from GitHub.
Micro$oft doing M$ things.
For me, the only reason to get a Copilot subscription was its official availability in opencode.

@AleksanderBondar

It seems to be rolled out!

@sahilchouksey
Author

Yes guys, it’s rolled out now!
Closing this PR.

@aysbg

aysbg commented Mar 3, 2026

No, it's not.

(screenshot)

@Bubbi

Bubbi commented Mar 3, 2026

The reason he says it is might be because it is now in the models.dev list:
https://models.dev/?search=5.3+codex (let the page load for the search to kick in; it's slow)

But the gist of it is that it's available through other providers, not Copilot yet.

(screenshot)

@dlukt

dlukt commented Mar 3, 2026

closed without comment? WTF?

@sahilchouksey
Author


GPT-5.3-Codex is now officially available in the opencode client.

You can use it by adding the model and provider details to your config. Copilot OAuth workaround is no longer needed.

Since there are already multiple open PRs for GPT-5.3-Codex in models.dev, I am not opening another duplicate PR.

@rekram1-node please merge one of the existing Codex 5.3 PRs (#1058, #993) so it is available by default for everyone in opencode.

@michaellopez

I can confirm adding it manually works
(screenshot)


@rbcb-dev

rbcb-dev commented Mar 4, 2026

As a small addition: limits could be added to the config so that opencode correctly displays the percentage of used tokens (and I guess it then also correctly knows when to do compaction).

  "provider": {
    "github-copilot": {
      "models": {
        "gpt-5.3-codex": {
          "modalities": {
            "input": ["text", "image"],
            "output": ["text"]
          },
          "limit": {
            "context": 400000,
            "input": 272000,
            "output": 128000
          },
          "variants": {
            "low": {
              "reasoningEffort": "low",
              "reasoningSummary": "auto",
              "include": ["reasoning.encrypted_content"]
            },
            "medium": {
              "reasoningEffort": "medium",
              "reasoningSummary": "auto",
              "include": ["reasoning.encrypted_content"]
            },
            "high": {
              "reasoningEffort": "high",
              "reasoningSummary": "auto",
              "include": ["reasoning.encrypted_content"]
            },
            "xhigh": {
              "reasoningEffort": "xhigh",
              "reasoningSummary": "auto",
              "include": ["reasoning.encrypted_content"]
            }
          }
        }
      }
    }
  }


Development

Successfully merging this pull request may close these issues.

github-copilot: gpt-5.3-codex fails - config models missing /responses support