Codex and cc Switch configuration problem

I can't figure this out. No matter how I configure it, it's wrong; I've tried many times. I don't understand why requests go to OpenAI's official address instead of the community relay site's address.

Hoping someone can suggest an approach to solving this.



model_provider = "custom"
model = "gpt-5.3-codex"
model_reasoning_effort = "high"
disable_response_storage = true

[model_providers.custom]
name = "custom"
wire_api = "responses"
requires_openai_auth = true

model = "gpt-5.3-codex"
model_reasoning_effort = "xhigh"
network_access = "enabled"
disable_response_storage = true
windows_wsl_setup_acknowledged = true
model_verbosity = "high"

[mcp_servers.amap-maps]
type = "stdio"
command = 'C:\adminSofts\nodejs\npx.cmd'
args = ["-y", "@amap/amap-maps-mcp-server"]

[mcp_servers.amap-maps.env]
AMAP_MAPS_API_KEY = "****"

[projects.'C:\Users\admin']
trust_level = "trusted"

[projects.'C:\Users\admin\Downloads']
trust_level = "trusted"

[windows]
sandbox = "elevated"
base_url = "https://openai.api-test.us.ci"

The URL needs /v1 appended.

After you move base_url onto that line, append /v1 at the end: base_url = "https://openai.api-test.us.ci/v1"
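Putting the two pieces of advice together, a minimal sketch of the fix (the host name is the one from the config above; that the site serves the Responses API under /v1 is an assumption):

```toml
# base_url belongs inside the provider table, not dangling at the
# end of the file under [windows], and it must end with /v1.
model_provider = "custom"
model = "gpt-5.3-codex"

[model_providers.custom]
name = "custom"
wire_api = "responses"
base_url = "https://openai.api-test.us.ci/v1"
```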

Why don't you try turning on proxy mode? I copied your parameters: in normal mode I do indeed get routed to the official OpenAI endpoint, but with proxy mode enabled I get a 401 invalid-token error instead.
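On the 401: with requires_openai_auth = true, Codex tries to authenticate via the OpenAI/ChatGPT login rather than a plain API key. My understanding (an assumption about the Codex CLI config schema, not something confirmed in this thread) is that for a key-based relay you would drop that flag and point env_key at the environment variable holding the site's key:

```toml
[model_providers.custom]
name = "custom"
wire_api = "responses"
base_url = "https://openai.api-test.us.ci/v1"
# Read the relay's key from an environment variable instead of
# forcing OpenAI's own auth flow (requires_openai_auth removed).
env_key = "OPENAI_API_KEY"
```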


Following his setup, I tried adding v1, and the same thing still happens.

I do have a proxy enabled, in virtual network adapter (TUN) mode.

[model_providers.custom]
name = "custom"
wire_api = "responses"
base_url = "https://ai.zhansi.top/v1"

Something like this, right? Still doesn't work.

By the way, are you all actually able to use it?
The community site, I mean.

Give my ccc a try:
npm install -g @tkpdx01/ccc

Run ccc new, choose codex, then follow the prompts to enter a config name, the URL (remember to add /v1!), and the key.

Suppose the config is named ddd: running ccc ddd will launch Codex with it.
If you want to overwrite the Codex config instead, run ccc apply ddd; after that, simply running codex will start Codex with the community site's URL and key.

It avoids touching environment variables as much as possible.
It supports Claude Code too!
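The whole flow described above, as a sequence of commands (ddd is just the example config name from the post):

```shell
npm install -g @tkpdx01/ccc   # install ccc globally
ccc new        # choose codex, then enter a config name, the URL (with /v1) and the key
ccc ddd        # launch Codex with the config named ddd
ccc apply ddd  # or: overwrite the Codex config so plain `codex` uses it from now on
```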

I don't mean a network proxy; ccs (cc Switch) itself also has a proxy mode.

Mine stopped working yesterday too, no idea why.

model_provider = "custom"
model = "gpt-5.3-codex"
model_reasoning_effort = "high"

[model_providers.custom]
name = "custom"
wire_api = "responses"
base_url = ""
requires_openai_auth = true

Configured like this it should be fine; it works for me with GGBOOM's site.


Tested, it works.
