I’ve been playing with the xAI API recently, and noticed something interesting about their pricing model. The API returns usage costs as integers, but the numbers didn’t immediately make sense.
UPDATE: we now know what they call this unit of measure: the USD tick. See https://github.com/xai-org/xai-proto/pull/48
For example, a Grok Imagine generation costs 330,000,000 of these units. On their pricing page, that shows up as $0.033.
Or take grok-4-1-fast-reasoning, which returns prices like:
"promptTextTokenPrice": "2000",
"cachedPromptTokenPrice": "500",
"completionTextTokenPrice": "5000"
Represented as follows on the Models page:
- Input: $0.20
- Cached input: $0.05
- Output: $0.50
Doing the math, that means each unit is 1/10,000,000,000 (one ten-billionth) of a dollar ($10^{-10}$). I needed a name for this conceptual unit to keep things sane, so I started calling it a bitcent.
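To spell out the arithmetic: the raw API values appear to be per-token prices, while the Models page quotes dollars per million tokens, which is the only reading that makes the numbers line up:

$$
\frac{\$0.20 \text{ per } 10^{6} \text{ tokens}}{2000 \text{ units per token}} = \frac{\$2 \times 10^{-7} \text{ per token}}{2000 \text{ units per token}} = \$10^{-10} \text{ per unit}
$$

The image price agrees: $330{,}000{,}000 \times \$10^{-10} = \$0.033$.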
I’ve no idea what other folks are calling this unit of measure, but there seemed to be little consensus after a quick Grok search.
The Problem: Dust Pricing
We are in the era of “dust pricing”. API calls often cost a microscopic fraction of a penny. Dealing with this usually leads to one of two bad places:
- Floating point dollars: $0.00000001. Hello rounding errors and scientific notation confusion.
- Naming confusion: “Microdollars”? “Nanodollars”? “Token-cents”?
xAI’s approach of using an integer with a fixed precision is actually the right way to handle currency (as any fintech dev will tell you). But we need a shared vocabulary for it.
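To see the difference in practice, here is a small Python sketch (illustrative only, not xAI’s implementation) that accumulates a million per-token charges both ways:

```python
# Floating point dollars: add $0.0000002 (one Grok 4.1 Fast input token) a million times.
float_total = 0.0
for _ in range(1_000_000):
    float_total += 0.0000002
print(float_total)        # typically not exactly 0.2: accumulated rounding error

# Integer units of 10^-10 dollars: same million charges, exact result.
unit_total = 0
for _ in range(1_000_000):
    unit_total += 2_000   # 2,000 units per input token
print(unit_total)                    # 2000000000
print(unit_total / 10_000_000_000)   # 0.2, converted to dollars only for display
```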
Introducing the Bitcent
The bitcent represents exactly $10^{-10}$ dollars.
This is what xAI is already using.
Why that specific number? It’s the sweet spot for avoiding floating point math while covering the vast range of AI pricing.
- 1 Bitcent = $0.0000000001
- 1 Cent = 100,000,000 Bitcents
- 1 Dollar = 10,000,000,000 Bitcents
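A minimal sketch of these conversions in Python (the constant and function names are mine, not from any SDK):

```python
BITCENTS_PER_DOLLAR = 10_000_000_000   # 1 dollar = 10^10 bitcents
BITCENTS_PER_CENT = 100_000_000        # 1 cent = 10^8 bitcents

def dollars_to_bitcents(dollars: float) -> int:
    """Convert a displayed dollar price to integer bitcents (rounded)."""
    return round(dollars * BITCENTS_PER_DOLLAR)

def bitcents_to_dollars(bitcents: int) -> float:
    """Convert integer bitcents back to dollars, for display only."""
    return bitcents / BITCENTS_PER_DOLLAR

print(dollars_to_bitcents(0.033))   # 330000000 (one Grok image)
print(bitcents_to_dollars(2_000))   # 2e-07 (one Grok 4.1 Fast input token)
```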
This allows us to express prices that are currently “dust” as clean, readable integers:
| Item | Cost ($) | Cost (Bitcents) |
|---|---|---|
| 1 GPT-4o Token | ~$0.000005 | 50,000 bc |
| 1 Haiku Token | ~$0.00000025 | 2,500 bc |
| Grok 4.1 Fast (In) | $0.0000002 | 2,000 bc |
| Grok 4.1 Fast (Out) | $0.0000005 | 5,000 bc |
| 1 Grok Image | $0.033 | 330,000,000 bc |
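As a usage example, pricing an entire request stays in integers until the final display step (the token counts and helper name below are made up for illustration):

```python
def request_cost_bc(input_tokens: int, output_tokens: int,
                    in_price_bc: int, out_price_bc: int) -> int:
    """Total cost of one request, in bitcents, from per-token bitcent prices."""
    return input_tokens * in_price_bc + output_tokens * out_price_bc

# Grok 4.1 Fast: 2,000 bc per input token, 5,000 bc per output token.
cost = request_cost_bc(input_tokens=12_000, output_tokens=800,
                       in_price_bc=2_000, out_price_bc=5_000)
print(cost)                      # 28000000 bitcents
print(cost / 10_000_000_000)     # 0.0028 dollars
```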
Why “Bitcent”?
I considered other names, but they all fall short:
- Nanodollar: Technically $10^{-9}$ ($0.000000001). It maps poorly to the $10^{-10}$ scale xAI is using.
- Pico-something: Too small ($10^{-12}$).
- Tokencent (or token-cent): a bit long, and since “price per token” will typically appear in the same sentence, you end up with too many “token” words.
- Neurcent (or neurocent): interesting reference to “neural” nets, but perhaps too pretentious?
“Bitcent” works because it hints at digital currency (“bit”) while grounding it in familiar financial terms (“cent”). It sounds like a natural subdivision of modern digital money.
If we want to standardize how we talk about AI costs across providers, adopting the bitcent as the standard unit for $10^{-10}$ USD seems like a solid move. It turns “0.000025 cents” into “2,500 bitcents”—a number humans can actually read and compare.
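A tiny, hypothetical formatter makes that point concrete:

```python
def format_bc(bitcents: int) -> str:
    """Render an integer bitcent amount with thousands separators."""
    return f"{bitcents:,} bitcents"

per_token = 2_500  # GPT-5 mini input price per token, in bitcents
print(f"{per_token / 100_000_000:.6f} cents")   # 0.000025 cents, hard to scan
print(format_bc(per_token))                     # 2,500 bitcents, easy to compare
```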
For example, these are the current (as of Jan/2026) prices for some leading LLMs:
| Provider | Model Variant | Input ($/1M tokens) | Output ($/1M tokens) | Input (bitcents/token) | Output (bitcents/token) | Cached Input ($/1M tokens) | Cached Input (bitcents/token) |
|---|---|---|---|---|---|---|---|
| OpenAI | GPT-5 mini | $0.25 | $2.00 | 2,500 | 20,000 | $0.025 | 250 |
| Google | Gemini 3 Flash | $0.50 | $3.00 | 5,000 | 30,000 | $0.10 | 1,000 |
| xAI | Grok 4.1-fast (reasoning & non-reasoning) | $0.20 | $0.50 | 2,000 | 5,000 | $0.05 | 500 |
| Anthropic | Claude Sonnet 4.5 | $3.00 | $15.00 | 30,000 | 150,000 | $0.75 | 7,500 |
| OpenAI | GPT-5.2 | $1.75 | $14.00 | 17,500 | 140,000 | $0.175 | 1,750 |
As prices continue to plummet, I think bitcents will make even more sense!
/kzu