- Docs/Codex: clarify that ChatGPT/Codex subscription setups should use `openai/gpt-*` with `agentRuntime.id: "codex"` for native Codex runtime, while `openai-codex/*` remains the PI OAuth route. Thanks @pashpashpash.
- Plugins/source checkout: load bundled plugins from the `extensions/*` pnpm workspace tree in source checkouts, so plugin-local dependencies and edits are used directly while packaged installs keep using the built runtime tree. Thanks @vincentkoc.
- Plugins/beta: prepare Brave, Codex, Feishu, Synology Chat, Tlon, and Twitch for `2026.5.1-beta.1` npm and ClawHub publishing. Thanks @vincentkoc.
- Providers/xAI: add Grok 4.3 to the bundled catalog and make it the default xAI chat model.
| Surface | Selector | What it does |
| --- | --- | --- |
| Native Codex app-server runtime | `openai/*` plus `agentRuntime.id: "codex"` | Runs the embedded agent turn through Codex app-server. This is the usual ChatGPT/Codex subscription setup. |
| Codex OAuth provider route | `openai-codex/*` model refs | Uses ChatGPT/Codex subscription OAuth through the normal OpenClaw PI runner. |
| Codex ACP adapter | `runtime: "acp"`, `agentId: "codex"` | Runs Codex through the external ACP/acpx control plane. Use only when ACP/acpx is explicitly requested. |
| Native Codex chat-control command set | `/codex ...` | Binds, resumes, steers, stops, and inspects Codex app-server threads from chat. |
| OpenAI Platform API route for GPT/Codex-style models | `openai/*` model refs | Uses OpenAI API-key auth unless a runtime override, such as `agentRuntime.id: "codex"`, runs the turn. |
Those surfaces are intentionally independent. Enabling the `codex` plugin makes
the native app-server features available; it does not rewrite
`openai-codex/*` into `openai/*`, does not change existing sessions, and does
not make ACP the Codex default. Selecting `openai-codex/*` means "use the Codex
OAuth provider route" unless you separately force a runtime.
The common ChatGPT/Codex subscription setup uses Codex OAuth for auth, but keeps
the model ref as `openai/*` and selects the `codex` runtime:
```json5
{
  agents: {
    defaults: {
      // Keep the OpenAI model ref...
      model: { primary: "openai/gpt-5.5" },
      // ...but run the embedded turn through the Codex app-server runtime.
      agentRuntime: { id: "codex" },
    },
  },
}
```
That means OpenClaw selects an OpenAI model ref, then asks the Codex app-server
runtime to run the embedded agent turn. It does not mean "use API billing," and
it does not mean the channel, model provider catalog, or OpenClaw session store
becomes Codex.
When the bundled `codex` plugin is enabled, natural-language Codex control
should use the native `/codex` command surface (`/codex bind`, `/codex threads`, …).

This is the agent-facing decision tree:
1. If the user asks for **Codex bind/control/thread/resume/steer/stop**, use the
   native `/codex` command surface when the bundled `codex` plugin is enabled.
2. If the user asks for **Codex as the embedded runtime** or wants the normal
   subscription-backed Codex agent experience, use
   `openai/<model>` with `agentRuntime.id: "codex"`.
3. If the user asks for **Codex OAuth/subscription auth on the normal OpenClaw
   runner**, use `openai-codex/<model>` and leave the runtime as PI.
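For contrast with the embedded-runtime config example earlier on this page, the
third branch keeps the normal PI runner and only changes the provider prefix.
A minimal sketch, assuming the same `agents.defaults.model.primary` key used in
the other examples in these docs:

```json5
{
  agents: {
    defaults: {
      // Codex OAuth/subscription auth via the openai-codex provider prefix;
      // no agentRuntime override, so the turn runs on the normal PI runner.
      model: { primary: "openai-codex/gpt-5.5" },
    },
  },
}
```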
OpenClaw chooses an embedded runtime after provider and model resolution. Set
`fallback: "none"` to make unmatched `auto`-mode selection fail instead.
Explicit plugin runtimes fail closed by default. For example,
`agentRuntime.id: "codex"` means Codex or a clear selection error unless you set
`fallback: "pi"` in the same override scope. A runtime override does not inherit
a broader fallback setting, so an agent-level `agentRuntime.id: "codex"` is not
silently routed back to PI just because defaults used `fallback: "pi"`.
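A sketch of that scoping rule. The per-agent override location shown here is
hypothetical (labelled in the comments); the `fallback` values are the ones the
docs name:

```json5
{
  agents: {
    defaults: {
      // Broad default: unmatched auto-mode selection falls back to PI.
      agentRuntime: { fallback: "pi" },
    },
    // Hypothetical per-agent override scope, for illustration only.
    coder: {
      // Explicit runtime with no local fallback: Codex runs, or selection
      // fails with a clear error. The defaults-level fallback is NOT inherited.
      agentRuntime: { id: "codex" },
      // To allow PI fallback, set it in this same override scope:
      // agentRuntime: { id: "codex", fallback: "pi" },
    },
  },
}
```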
CLI backend aliases are different from embedded harness ids. The preferred …
`docs/concepts/model-providers.md` (12 additions, 5 deletions)
Reference for **LLM/model providers** (not chat channels like WhatsApp/Telegram) …

<Accordion title="OpenAI provider/runtime split">
OpenAI-family routes are prefix-specific:

- `openai/<model>` plus `agents.defaults.agentRuntime.id: "codex"` uses the native Codex app-server harness. This is the usual ChatGPT/Codex subscription setup.
- `openai-codex/<model>` uses Codex OAuth in PI.
- `openai/<model>` without a Codex runtime override uses the direct OpenAI API-key provider in PI.
See [OpenAI](/providers/openai) and [Codex harness](/plugins/codex-harness). If the provider/runtime split is confusing, read [Agent runtimes](/concepts/agent-runtimes) first.

Plugin auto-enable follows the same boundary: `openai-codex/<model>` belongs to the OpenAI plugin, while the Codex plugin is enabled by `agentRuntime.id: "codex"` or legacy `codex/<model>` refs.

GPT-5.5 is available through the native Codex app-server harness when `agentRuntime.id: "codex"` is set, through `openai-codex/gpt-5.5` in PI for Codex OAuth, and through `openai/gpt-5.5` in PI for direct API-key traffic when your account exposes it.
</Accordion>
<Accordion title="CLI runtimes">

Anthropic staff told us OpenClaw-style Claude CLI usage is allowed again, so OpenClaw …
- Shares the same `/fast` toggle and `params.fastMode` config as direct `openai/*`; OpenClaw maps that to `service_tier=priority`
- `openai-codex/gpt-5.5` uses the Codex catalog native `contextWindow = 400000` and default runtime `contextTokens = 272000`; override the runtime cap with `models.providers.openai-codex.models[].contextTokens`
- Policy note: OpenAI Codex OAuth is explicitly supported for external tools/workflows like OpenClaw.
- For the common subscription plus native Codex runtime route, sign in with `openai-codex` auth but configure `openai/gpt-5.5` plus `agents.defaults.agentRuntime.id: "codex"`.
- Use `openai-codex/gpt-5.5` only when you want the Codex OAuth/subscription route through PI; use `openai/gpt-5.5` without the Codex runtime override when your API-key setup and local catalog expose the public API route.
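The runtime-cap override from the list above can be sketched as follows. The key
path `models.providers.openai-codex.models[].contextTokens` is taken from that
bullet; the `id` field name for the model entry is an assumption:

```json5
{
  models: {
    providers: {
      "openai-codex": {
        models: [
          // Lower the default 272000-token runtime cap for gpt-5.5.
          // "id" is an assumed field name for the model-entry selector.
          { id: "gpt-5.5", contextTokens: 200000 },
        ],
      },
    },
  },
}
```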
`docs/concepts/models.md` (1 addition, 1 deletion)
</Card>
</CardGroup>

Model refs choose a provider and model. They do not usually choose the low-level agent runtime. For example, `openai/gpt-5.5` can run through the normal OpenAI provider path or through the Codex app-server runtime, depending on `agents.defaults.agentRuntime.id`. In Codex runtime mode, the `openai/gpt-*` ref does not imply API-key billing; auth can come from a Codex account or `openai-codex` auth profile. See [Agent runtimes](/concepts/agent-runtimes).
`docs/help/faq-models.md` (5 additions, 4 deletions)
… troubleshooting, see the main [FAQ](/help/faq).

</Accordion>

<Accordion title="Can I use GPT 5.5 for daily tasks and Codex 5.5 for coding?">

Yes. Treat model choice and runtime choice separately:
- **Native Codex coding agent:** set `agents.defaults.model.primary` to `openai/gpt-5.5` and `agents.defaults.agentRuntime.id` to `"codex"`. Sign in with `openclaw models auth login --provider openai-codex` when you want ChatGPT/Codex subscription auth.
- **Direct OpenAI API tasks through PI:** use `/model openai/gpt-5.5` without a Codex runtime override and configure `OPENAI_API_KEY`.
- **Codex OAuth through PI:** use `/model openai-codex/gpt-5.5` only when you intentionally want the normal PI runner with Codex OAuth.
- **Sub-agents:** route coding tasks to a Codex-only agent with its own model and `agentRuntime` default.
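The first bullet's defaults, sketched as one config fragment (key paths as named
in that bullet; the value is this FAQ's example model):

```json5
{
  agents: {
    defaults: {
      // Daily tasks and coding share GPT-5.5; the runtime override is what
      // makes it the native Codex coding agent.
      model: { primary: "openai/gpt-5.5" },
      agentRuntime: { id: "codex" },
    },
  },
}
```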
See [Models](/concepts/models) and [Slash commands](/tools/slash-commands).