Legacy manual openai-codex provider override breaks Codex OAuth after #37558 / #38026; doctor should detect and fix #40066

@sene1337

Description

Summary

Users who manually configured models.providers.openai-codex before the recent Codex OAuth fixes can remain broken even after upgrading to a version that includes both fix(auth): remove bogus codex oauth responses probe #37558 and fix(openai): Enable gpt-5.4 support via Chat Completions fallback on scope error #38026.

In our case, Codex OAuth continued failing until we manually removed the legacy openai-codex provider override from openclaw.json.

Once that override was removed, openai-codex/gpt-5.4 worked immediately.

This appears to be a stale manual-config shadowing problem, not a fresh OAuth bug.

Why this matters

Many users likely set Codex up manually weeks ago, during the period when people were experimenting with:

  • ChatGPT/Codex OAuth
  • custom models.providers.openai-codex
  • explicit baseUrl
  • explicit api
  • manually defined gpt-5.3-codex / related model entries

Those manual overrides can survive upgrades and silently block the newer built-in Codex OAuth provider behavior.

As a result, the upstream fixes can be present while affected users remain broken.

Environment

  • macOS
  • local gateway
  • openai-codex:default auth profile with mode: "oauth"
  • primary model set to openai-codex/gpt-5.4
  • upgraded from an older manual-Codex-config era setup

Symptoms

Before removing the override:

  • Codex appeared configured correctly
  • requests failed with 401 / scope-related behavior
  • model often fell through to fallback providers
  • behavior looked like “Codex OAuth still broken”

After removing the override:

  • same auth profile
  • same account
  • same target model
  • openai-codex/gpt-5.4 worked immediately

Root cause

We had a legacy manual config block under:

models.providers.openai-codex

That override was still forcing the old explicit provider shape, instead of letting the built-in Codex OAuth provider synthesize the correct behavior after the recent fixes.
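A minimal sketch of the kind of legacy block involved; the field names (baseUrl, api, model ids) follow this issue's bullet list and are illustrative, not our exact config:

```json
{
  "models": {
    "providers": {
      "openai-codex": {
        "baseUrl": "https://...",
        "api": "...",
        "models": ["gpt-5.3-codex"]
      }
    }
  }
}
```

Any block of this shape under models.providers.openai-codex is enough to shadow the built-in provider, regardless of the specific values.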

Repro sketch

  1. On an older OpenClaw version, manually configure models.providers.openai-codex
  2. Add explicit base URL / API / model entries for Codex
  3. Upgrade OpenClaw to a version containing fix(auth): remove bogus codex oauth responses probe #37558 and fix(openai): Enable gpt-5.4 support via Chat Completions fallback on scope error #38026
  4. Re-auth Codex OAuth
  5. Try openai-codex/gpt-5.4
  6. Observe failure / fallback behavior
  7. Remove the manual models.providers.openai-codex override
  8. Retry
  9. Observe success

Expected behavior

OpenClaw should detect that a legacy manual openai-codex provider override may be shadowing the built-in OAuth provider and do one of:

  1. Warn loudly

    • “Legacy manual models.providers.openai-codex override detected. This may break built-in Codex OAuth behavior after recent fixes.”
  2. Offer an automatic migration

    • remove or rewrite stale openai-codex provider config to the supported modern form
  3. Teach openclaw doctor to catch this

    • detect manual models.providers.openai-codex
    • check whether an OAuth profile exists for openai-codex
    • warn if the manual override is likely shadowing the built-in provider
    • suggest or optionally apply a fix

Strong suggestion: update openclaw doctor

This feels like exactly the kind of thing doctor should catch.

Suggested doctor check:

  • if auth.profiles contains an OAuth profile for openai-codex
  • and config also contains a manual models.providers.openai-codex
  • then warn that the manual override may be stale and may block the built-in Codex OAuth provider

Suggested output:

Detected manual models.providers.openai-codex override alongside Codex OAuth auth profile.
This may be a legacy configuration that shadows the built-in Codex OAuth provider.
Recent Codex OAuth fixes may not apply until this override is removed or migrated.

Even better: doctor --fix could offer to back up config and remove the stale override automatically.
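A possible shape for that --fix behavior, again as a sketch: back up the config file, delete the stale key, and rewrite. The function name and config layout are assumptions for illustration only.

```python
import json
import shutil
from pathlib import Path

def fix_codex_override(config_path: Path) -> bool:
    """Sketch of a doctor --fix step: back up openclaw.json and
    remove a stale models.providers.openai-codex override.
    Returns True if an override was found and removed."""
    config = json.loads(config_path.read_text())
    providers = config.get("models", {}).get("providers", {})
    if "openai-codex" not in providers:
        return False
    # Keep a backup next to the original before mutating anything
    backup = config_path.with_name(config_path.name + ".bak")
    shutil.copy2(config_path, backup)
    del providers["openai-codex"]
    config_path.write_text(json.dumps(config, indent=2))
    return True
```

Writing the backup before editing means the user can always restore their old manual setup if the migration turns out to be wrong for their case.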

Related fixes / context

This issue looks like a downstream / legacy-config follow-up to:

  • fix(auth): remove bogus codex oauth responses probe #37558
  • fix(openai): Enable gpt-5.4 support via Chat Completions fallback on scope error #38026

Those fixes appear to solve the runtime/provider logic, but users with older manual Codex config can still be stuck because their config shadows the repaired built-in path.

Workaround

Remove the manual models.providers.openai-codex override from config and restart OpenClaw.

In our case, that was the key step that made openai-codex/gpt-5.4 start working.
