Commit 9fb90f3

docs: clarify Codex subscription runtime (#75910)
1 parent f6cb44a commit 9fb90f3

10 files changed

Lines changed: 143 additions & 101 deletions

File tree

CHANGELOG.md

Lines changed: 1 addition & 0 deletions
@@ -6,6 +6,7 @@ Docs: https://docs.openclaw.ai
 
 ### Changes
 
+- Docs/Codex: clarify that ChatGPT/Codex subscription setups should use `openai/gpt-*` with `agentRuntime.id: "codex"` for native Codex runtime, while `openai-codex/*` remains the PI OAuth route. Thanks @pashpashpash.
 - Plugins/source checkout: load bundled plugins from the `extensions/*` pnpm workspace tree in source checkouts, so plugin-local dependencies and edits are used directly while packaged installs keep using the built runtime tree. Thanks @vincentkoc.
 - Plugins/beta: prepare Brave, Codex, Feishu, Synology Chat, Tlon, and Twitch for `2026.5.1-beta.1` npm and ClawHub publishing. Thanks @vincentkoc.
 - Providers/xAI: add Grok 4.3 to the bundled catalog and make it the default xAI chat model.

docs/concepts/agent-runtimes.md

Lines changed: 19 additions & 16 deletions
@@ -37,25 +37,26 @@ There are two runtime families:
 through Claude CLI." `claude-cli` is not an embedded harness id and must not
 be passed to AgentHarness selection.
 
-## Three things named Codex
+## Codex surfaces
 
-Most confusion comes from three different surfaces sharing the Codex name:
+Most confusion comes from several different surfaces sharing the Codex name:
 
-| Surface | OpenClaw name/config | What it does |
-| ---------------------------------------------------- | ------------------------------------ | --------------------------------------------------------------------------------------------------- |
-| Codex OAuth provider route | `openai-codex/*` model refs | Uses ChatGPT/Codex subscription OAuth through the normal OpenClaw PI runner. |
-| Native Codex app-server runtime | `agentRuntime.id: "codex"` | Runs the embedded agent turn through the bundled Codex app-server harness. |
-| Codex ACP adapter | `runtime: "acp"`, `agentId: "codex"` | Runs Codex through the external ACP/acpx control plane. Use only when ACP/acpx is explicitly asked. |
-| Native Codex chat-control command set | `/codex ...` | Binds, resumes, steers, stops, and inspects Codex app-server threads from chat. |
-| OpenAI Platform API route for GPT/Codex-style models | `openai/*` model refs | Uses OpenAI API-key auth unless a runtime override, such as `runtime: "codex"`, runs the turn. |
+| Surface | OpenClaw name/config | What it does |
+| ---------------------------------------------------- | ------------------------------------------ | ---------------------------------------------------------------------------------------------------------- |
+| Native Codex app-server runtime | `openai/*` plus `agentRuntime.id: "codex"` | Runs the embedded agent turn through Codex app-server. This is the usual ChatGPT/Codex subscription setup. |
+| Codex OAuth provider route | `openai-codex/*` model refs | Uses ChatGPT/Codex subscription OAuth through the normal OpenClaw PI runner. |
+| Codex ACP adapter | `runtime: "acp"`, `agentId: "codex"` | Runs Codex through the external ACP/acpx control plane. Use only when ACP/acpx is explicitly asked. |
+| Native Codex chat-control command set | `/codex ...` | Binds, resumes, steers, stops, and inspects Codex app-server threads from chat. |
+| OpenAI Platform API route for GPT/Codex-style models | `openai/*` model refs | Uses OpenAI API-key auth unless a runtime override, such as `agentRuntime.id: "codex"`, runs the turn. |
 
 Those surfaces are intentionally independent. Enabling the `codex` plugin makes
 the native app-server features available; it does not rewrite
 `openai-codex/*` into `openai/*`, does not change existing sessions, and does
 not make ACP the Codex default. Selecting `openai-codex/*` means "use the Codex
 OAuth provider route" unless you separately force a runtime.
 
-The common Codex setup uses the `openai` provider with the `codex` runtime:
+The common ChatGPT/Codex subscription setup uses Codex OAuth for auth, but keeps
+the model ref as `openai/*` and selects the `codex` runtime:
 
 ```json5
 {
@@ -71,8 +72,9 @@ The common Codex setup uses the `openai` provider with the `codex` runtime:
 ```
 
 That means OpenClaw selects an OpenAI model ref, then asks the Codex app-server
-runtime to run the embedded agent turn. It does not mean the channel, model
-provider catalog, or OpenClaw session store becomes Codex.
+runtime to run the embedded agent turn. It does not mean "use API billing," and
+it does not mean the channel, model provider catalog, or OpenClaw session store
+becomes Codex.
 
 When the bundled `codex` plugin is enabled, natural-language Codex control
 should use the native `/codex` command surface (`/codex bind`, `/codex threads`,
@@ -85,7 +87,8 @@ This is the agent-facing decision tree:
 
 1. If the user asks for **Codex bind/control/thread/resume/steer/stop**, use the
    native `/codex` command surface when the bundled `codex` plugin is enabled.
-2. If the user asks for **Codex as the embedded runtime**, use
+2. If the user asks for **Codex as the embedded runtime** or wants the normal
+   subscription-backed Codex agent experience, use
    `openai/<model>` with `agentRuntime.id: "codex"`.
 3. If the user asks for **Codex OAuth/subscription auth on the normal OpenClaw
    runner**, use `openai-codex/<model>` and leave the runtime as PI.
@@ -142,10 +145,10 @@ OpenClaw chooses an embedded runtime after provider and model resolution:
 `fallback: "none"` to make unmatched `auto`-mode selection fail instead.
 
 Explicit plugin runtimes fail closed by default. For example,
-`runtime: "codex"` means Codex or a clear selection error unless you set
+`agentRuntime.id: "codex"` means Codex or a clear selection error unless you set
 `fallback: "pi"` in the same override scope. A runtime override does not inherit
-a broader fallback setting, so an agent-level `runtime: "codex"` is not silently
-routed back to PI just because defaults used `fallback: "pi"`.
+a broader fallback setting, so an agent-level `agentRuntime.id: "codex"` is not
+silently routed back to PI just because defaults used `fallback: "pi"`.
 
 CLI backend aliases are different from embedded harness ids. The preferred
 Claude CLI form is:

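The fail-closed rule for explicit plugin runtimes described in this file's diff can be written out as a config sketch. It only assembles key paths that appear in the examples elsewhere in this commit (`agents.defaults.model.primary`, `agentRuntime.id`, `fallback`); treat it as illustrative, not canonical:

```json5
{
  agents: {
    defaults: {
      model: { primary: "openai/gpt-5.5" },
      agentRuntime: {
        id: "codex",     // explicit plugin runtime: Codex, or a clear selection error
        fallback: "pi",  // opt-in; must live in the same override scope as `id`
      },
    },
  },
}
```

Omit the `fallback` line (or set `fallback: "none"`) and an unavailable Codex runtime surfaces a selection error instead of silently running the turn on PI.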
docs/concepts/model-providers.md

Lines changed: 12 additions & 5 deletions
@@ -29,15 +29,15 @@ Reference for **LLM/model providers** (not chat channels like WhatsApp/Telegram)
 <Accordion title="OpenAI provider/runtime split">
 OpenAI-family routes are prefix-specific:
 
-- `openai/<model>` uses the direct OpenAI API-key provider in PI.
+- `openai/<model>` plus `agents.defaults.agentRuntime.id: "codex"` uses the native Codex app-server harness. This is the usual ChatGPT/Codex subscription setup.
 - `openai-codex/<model>` uses Codex OAuth in PI.
-- `openai/<model>` plus `agents.defaults.agentRuntime.id: "codex"` uses the native Codex app-server harness.
+- `openai/<model>` without a Codex runtime override uses the direct OpenAI API-key provider in PI.
 
 See [OpenAI](/providers/openai) and [Codex harness](/plugins/codex-harness). If the provider/runtime split is confusing, read [Agent runtimes](/concepts/agent-runtimes) first.
 
 Plugin auto-enable follows the same boundary: `openai-codex/<model>` belongs to the OpenAI plugin, while the Codex plugin is enabled by `agentRuntime.id: "codex"` or legacy `codex/<model>` refs.
 
-GPT-5.5 is available through `openai/gpt-5.5` for direct API-key traffic, `openai-codex/gpt-5.5` in PI for Codex OAuth, and the native Codex app-server harness when `agentRuntime.id: "codex"` is set.
+GPT-5.5 is available through the native Codex app-server harness when `agentRuntime.id: "codex"` is set, through `openai-codex/gpt-5.5` in PI for Codex OAuth, and through `openai/gpt-5.5` in PI for direct API-key traffic when your account exposes it.
 
 </Accordion>
 <Accordion title="CLI runtimes">
@@ -148,11 +148,18 @@ Anthropic staff told us OpenClaw-style Claude CLI usage is allowed again, so Ope
 - Shares the same `/fast` toggle and `params.fastMode` config as direct `openai/*`; OpenClaw maps that to `service_tier=priority`
 - `openai-codex/gpt-5.5` uses the Codex catalog native `contextWindow = 400000` and default runtime `contextTokens = 272000`; override the runtime cap with `models.providers.openai-codex.models[].contextTokens`
 - Policy note: OpenAI Codex OAuth is explicitly supported for external tools/workflows like OpenClaw.
-- Use `openai-codex/gpt-5.5` when you want the Codex OAuth/subscription route; use `openai/gpt-5.5` when your API-key setup and local catalog expose the public API route.
+- For the common subscription plus native Codex runtime route, sign in with `openai-codex` auth but configure `openai/gpt-5.5` plus `agents.defaults.agentRuntime.id: "codex"`.
+- Use `openai-codex/gpt-5.5` only when you want the Codex OAuth/subscription route through PI; use `openai/gpt-5.5` without the Codex runtime override when your API-key setup and local catalog expose the public API route.
 
 ```json5
 {
-  agents: { defaults: { model: { primary: "openai-codex/gpt-5.5" } } },
+  plugins: { entries: { codex: { enabled: true } } },
+  agents: {
+    defaults: {
+      model: { primary: "openai/gpt-5.5" },
+      agentRuntime: { id: "codex", fallback: "none" },
+    },
+  },
 }
 ```
 
docs/concepts/models.md

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ sidebarTitle: "Models CLI"
 </Card>
 </CardGroup>
 
-Model refs choose a provider and model. They do not usually choose the low-level agent runtime. For example, `openai/gpt-5.5` can run through the normal OpenAI provider path or through the Codex app-server runtime, depending on `agents.defaults.agentRuntime.id`. See [Agent runtimes](/concepts/agent-runtimes).
+Model refs choose a provider and model. They do not usually choose the low-level agent runtime. For example, `openai/gpt-5.5` can run through the normal OpenAI provider path or through the Codex app-server runtime, depending on `agents.defaults.agentRuntime.id`. In Codex runtime mode, the `openai/gpt-*` ref does not imply API-key billing; auth can come from a Codex account or `openai-codex` auth profile. See [Agent runtimes](/concepts/agent-runtimes).
 
 ## How model selection works
 
docs/gateway/doctor.md

Lines changed: 1 addition & 1 deletion
@@ -260,7 +260,7 @@ That stages grounded durable candidates into the short-term dreaming store while
 Doctor does not repair this automatically because both routes are valid:
 
 - `openai-codex/*` + PI means "use Codex OAuth/subscription auth through the normal OpenClaw runner."
-- `openai/*` + `runtime: "codex"` means "run the embedded turn through native Codex app-server."
+- `openai/*` + `agentRuntime.id: "codex"` means "run the embedded turn through native Codex app-server."
 - `/codex ...` means "control or bind a native Codex conversation from chat."
 - `/acp ...` or `runtime: "acp"` means "use the external ACP/acpx adapter."
 
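For reference, the first two routes in that list correspond to config shapes like the following sketches, assembled from the json5 examples elsewhere in this commit (illustrative only, not authoritative):

```json5
// Route 1: Codex OAuth/subscription auth on the normal PI runner.
{
  agents: { defaults: { model: { primary: "openai-codex/gpt-5.5" } } },
}
```

```json5
// Route 2: native Codex app-server runtime.
{
  agents: {
    defaults: {
      model: { primary: "openai/gpt-5.5" },
      agentRuntime: { id: "codex" },
    },
  },
}
```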
docs/help/faq-first-run.md

Lines changed: 11 additions & 8 deletions
@@ -594,26 +594,29 @@ and troubleshooting see the main [FAQ](/help/faq).
 
 <Accordion title="How does Codex auth work?">
 OpenClaw supports **OpenAI Code (Codex)** via OAuth (ChatGPT sign-in). Use
-`openai-codex/gpt-5.5` for Codex OAuth through the default PI runner. Use
-`openai/gpt-5.5` for direct OpenAI API-key access. GPT-5.5 can also use
-subscription/OAuth via `openai-codex/gpt-5.5` or native Codex app-server
-runs with `openai/gpt-5.5` and `agentRuntime.id: "codex"`.
+`openai/gpt-5.5` with `agentRuntime.id: "codex"` for the common setup:
+ChatGPT/Codex subscription auth plus native Codex app-server execution. Use
+`openai-codex/gpt-5.5` only when you want Codex OAuth through the default
+PI runner. Use `openai/gpt-5.5` without the Codex runtime override for
+direct OpenAI API-key access.
 See [Model providers](/concepts/model-providers) and [Onboarding (CLI)](/start/wizard).
 </Accordion>
 
 <Accordion title="Why does OpenClaw still mention openai-codex?">
 `openai-codex` is the provider and auth-profile id for ChatGPT/Codex OAuth.
 It is also the explicit PI model prefix for Codex OAuth:
 
-- `openai/gpt-5.5` = current direct OpenAI API-key route in PI
+- `openai/gpt-5.5` + `agentRuntime.id: "codex"` = ChatGPT/Codex subscription auth with native Codex runtime
 - `openai-codex/gpt-5.5` = Codex OAuth route in PI
-- `openai/gpt-5.5` + `agentRuntime.id: "codex"` = native Codex app-server route
+- `openai/gpt-5.5` without a Codex runtime override = direct OpenAI API-key route in PI
 - `openai-codex:...` = auth profile id, not a model ref
 
 If you want the direct OpenAI Platform billing/limit path, set
 `OPENAI_API_KEY`. If you want ChatGPT/Codex subscription auth, sign in with
-`openclaw models auth login --provider openai-codex` and use
-`openai-codex/*` model refs for PI runs.
+`openclaw models auth login --provider openai-codex`. For native Codex
+runtime, keep the model ref as `openai/gpt-5.5` and set
+`agentRuntime.id: "codex"`. Use `openai-codex/*` model refs only for PI
+runs.
 
 </Accordion>
 
docs/help/faq-models.md

Lines changed: 5 additions & 4 deletions
@@ -145,11 +145,12 @@ troubleshooting, see the main [FAQ](/help/faq).
 </Accordion>
 
 <Accordion title="Can I use GPT 5.5 for daily tasks and Codex 5.5 for coding?">
-Yes. Set one as default and switch as needed:
+Yes. Treat model choice and runtime choice separately:
 
-- **Quick switch (per session):** `/model openai/gpt-5.5` for current direct OpenAI API-key tasks or `/model openai-codex/gpt-5.5` for GPT-5.5 Codex OAuth tasks.
-- **Default:** set `agents.defaults.model.primary` to `openai/gpt-5.5` for API-key usage or `openai-codex/gpt-5.5` for GPT-5.5 Codex OAuth usage.
-- **Sub-agents:** route coding tasks to sub-agents with a different default model.
+- **Native Codex coding agent:** set `agents.defaults.model.primary` to `openai/gpt-5.5` and `agents.defaults.agentRuntime.id` to `"codex"`. Sign in with `openclaw models auth login --provider openai-codex` when you want ChatGPT/Codex subscription auth.
+- **Direct OpenAI API tasks through PI:** use `/model openai/gpt-5.5` without a Codex runtime override and configure `OPENAI_API_KEY`.
+- **Codex OAuth through PI:** use `/model openai-codex/gpt-5.5` only when you intentionally want the normal PI runner with Codex OAuth.
+- **Sub-agents:** route coding tasks to a Codex-only agent with its own model and `agentRuntime` default.
 
 See [Models](/concepts/models) and [Slash commands](/tools/slash-commands).
 
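The sub-agent route could be sketched as below. Note this commit only shows the `agents.defaults` key path, so the named per-agent block (`coder`, and where named agents live in the config) is a hypothetical shape for illustration; check the agents reference for the real schema:

```json5
{
  agents: {
    defaults: {
      // Daily-driver default: direct API route through PI.
      model: { primary: "openai/gpt-5.5" },
    },
    // Hypothetical named sub-agent for coding tasks; this key path is a guess.
    coder: {
      model: { primary: "openai/gpt-5.5" },
      agentRuntime: { id: "codex", fallback: "none" },
    },
  },
}
```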