Local-first · BYOK · MIT

Free-Way

Local LLM Control Plane

One localhost gateway for the free LLM APIs you already have keys for.


Bring your own provider keys. Free-Way discovers models, normalizes OpenAI / Anthropic protocols, routes requests, and falls back across compatible providers. No hosted proxy, no shared quota pool.

No hosted proxy · Your keys stay local · No shared quota pool · OpenAI + Anthropic
git clone https://github.com/GoDiao/Free-Way.git
cd Free-Way
npm install && npm run build && npm start
Free-Way Console · ONLINE · localhost:8787

Endpoints
  POST /v1/chat/completions   (OpenAI)
  POST /v1/messages           (Anthropic)
  GET  /v1/models             (model list)

Route Trace
  01  request received    12ms
  02  normalize           OpenAI → Provider
  03  resolve model       llama-3.3-70b
  04  select provider     groq
  05  provider error      rate_limit
  06  fallback route      openrouter
  07  stream response     200 OK

Provider Health
  groq        healthy
  mistral     healthy
  openrouter  fallback-ready
  cloudflare  healthy
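The endpoints above speak standard protocols. As a minimal sketch (in Python, with an illustrative model name; no network call is made here), this is the shape of the body you would POST to the OpenAI-compatible route:

```python
import json

# Default address of a locally running Free-Way gateway (see console above).
BASE_URL = "http://localhost:8787/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,
    }

# Serialize the body you would POST to BASE_URL.
payload = json.dumps(build_chat_request("llama-3.3-70b", "hello"))
```

Any OpenAI-compatible HTTP client can send this payload unchanged; only the base URL differs from talking to a provider directly.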
gateway online · 14 providers indexed · 120+ models discovered · OpenAI / Anthropic BYOK runtime

A local routing layer for fragmented free-tier APIs.

Free-Way acts as a local control plane between your AI tools and provider APIs. It normalizes protocol differences, resolves models, checks route availability, and falls back when a provider fails — all from localhost.

Local Control Plane

Runs entirely on your machine. Manage provider keys, monitor health, browse models, and inspect usage — from a single web console at localhost:8787. No hosted proxy. No shared quota pool.

keys: local
console: localhost:8787
proxy: none
mode: BYOK

Protocol Normalization

Expose OpenAI and Anthropic compatible endpoints from one server. Most clients only need a Base URL change.
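As an illustration of what normalization involves (a sketch, not Free-Way's actual code), the common fields of an Anthropic-style /v1/messages body map onto the OpenAI chat shape roughly like this:

```python
def anthropic_to_openai(body: dict) -> dict:
    """Translate the common fields of an Anthropic /v1/messages request
    into an OpenAI chat completions request. A real normalizer also has
    to handle tool calls, content blocks, and streaming event formats."""
    messages = []
    if "system" in body:
        # Anthropic carries the system prompt as a top-level field;
        # OpenAI expects it as the first message.
        messages.append({"role": "system", "content": body["system"]})
    messages.extend(body["messages"])
    return {
        "model": body["model"],
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
    }
```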

Model Discovery

Fetch available models from supported providers and keep a unified free-tier catalog updated where possible.
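One way such a unified catalog can be built from per-provider model lists (an illustrative sketch, not the project's internals):

```python
def merge_catalogs(per_provider: dict) -> dict:
    """Index models by id, remembering every provider that serves each one.
    Models offered by several providers become fallback candidates."""
    catalog = {}
    for provider, models in per_provider.items():
        for model_id in models:
            catalog.setdefault(model_id, []).append(provider)
    return catalog
```

A model that appears under more than one provider is exactly what makes fallback routing possible.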

Fallback Routing

Free-tier quotas shift constantly. When one route is rate-limited or unavailable, Free-Way tries another compatible provider.
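The fallback behavior can be sketched as trying each compatible route in order (illustrative code; the real router also weighs health and latency):

```python
class RateLimited(Exception):
    """Raised when a provider returns a 429-style rate-limit error."""

def route_with_fallback(routes: list, send):
    """Try each compatible provider route until one succeeds."""
    last_error = None
    for provider in routes:
        try:
            return provider, send(provider)
        except RateLimited as err:
            last_error = err  # quota exhausted here; try the next route
    raise RuntimeError("all routes rate-limited") from last_error
```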

Health Checks

Monitor provider availability and latency from the console. Run on-demand checks for individual providers or all at once.
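An on-demand check amounts to timing one probe request per provider; a minimal sketch, where the probe callable stands in for a real HTTP request:

```python
import time

def check_provider(probe):
    """Run one probe and report status plus elapsed milliseconds."""
    start = time.monotonic()
    try:
        probe()
        status = "healthy"
    except Exception:
        status = "unavailable"
    return status, (time.monotonic() - start) * 1000.0
```

Running this over every configured provider at once is what an "all providers" check amounts to.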

Agent Workflow

Point AI coding tools at the same address. Configure once for Claude Code, Continue.dev, OpenCode, and any client that can call a custom base URL.

One localhost endpoint, multiple provider routes.

Your tools talk to Free-Way once. Free-Way maps each request to a compatible provider route, normalizes the protocol, and streams the response back — falling back across providers when needed.
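Put together, one request through the gateway looks roughly like this (a sketch combining model resolution and fallback; the names and the `send` callable are illustrative):

```python
def handle_request(request: dict, catalog: dict, send):
    """Resolve the requested model to provider routes, then try each."""
    routes = catalog.get(request["model"])
    if not routes:
        raise LookupError("unknown model: " + request["model"])
    for provider in routes:
        try:
            # Protocol normalization and response streaming happen inside send.
            return send(provider, request)
        except Exception:
            continue  # provider failed; fall back to the next route
    raise RuntimeError("no available route for " + request["model"])
```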

Clients: Claude Code · Continue.dev · OpenCode · Cline · + local clients
    ↓
Free-Way Gateway: localhost:8787 · /v1 · /v1/messages · normalize · route · fallback
    ↓
Providers: Groq · Mistral · Cerebras · Cohere · NVIDIA · Cloudflare · OpenRouter · GitHub · +6 more

Bring your own keys from supported free-tier providers.

Free-Way does not provide free API access. You register your own keys with each provider, configure them in the console, and Free-Way handles the rest.

High-speed inference
Groq · Cerebras · NVIDIA NIM

General model platforms
OpenRouter · Mistral · Cohere · GitHub Models

Regional & ecosystem providers
Cloudflare · SiliconFlow · Zhipu · OpenCode · LLM7 · Kilo · ZenMux

* Free-tier limits, quotas, and availability are controlled by each provider and may change at any time.

Point any AI coding agent here.

Most AI coding tools can point to custom OpenAI- or Anthropic-compatible endpoints. Configure once, then use any model Free-Way discovers.

Client         Protocol    Endpoint
Claude Code    Anthropic   /v1/messages
Cursor         n/a         known limitation: localhost may not work
Continue.dev   OpenAI      /v1
OpenCode       OpenAI      /v1
Cline          OpenAI      /v1
Aider          OpenAI      /v1
Codex CLI      OpenAI      /v1
OpenClaw       Anthropic   /v1/messages

Detailed setup guides: docs/agents/