
The AI Cost Curve Is Lying to You

I remember the first time we tried to generate a playable game from a text description.

The output was bad. Like, genuinely bad. And it cost a small fortune in tokens to get there.

That was 18 months ago.

We do that same thing at BeyondPlay today for cents.

I've been thinking about that gap a lot lately. Because everyone I talk to still has the old mental model: AI is expensive, and infrastructure costs are only going to spiral upwards. The headlines are full of billion-dollar data centre builds. The bubble callers are pointing at capex like it proves something.

What if they are looking at the wrong number?

GPT-4 launched at $30 per million tokens. GPT-5.4 is $2.50. Better model. Twelve times cheaper.
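To make that drop concrete, here's a minimal sketch of per-request inference cost at the two price points quoted above. The request sizes are hypothetical, chosen just to show the scale of the change:

```python
def cost_usd(tokens: int, price_per_million: float) -> float:
    """Cost of processing `tokens` tokens at a given $/1M-token rate."""
    return tokens / 1_000_000 * price_per_million

# Hypothetical workload: 50,000 tokens per generated game description.
old = cost_usd(50_000, 30.00)  # GPT-4 launch pricing
new = cost_usd(50_000, 2.50)   # the newer pricing quoted in the post

print(f"old: ${old:.2f}, new: ${new:.3f}")  # old: $1.50, new: $0.125
```

A request that used to cost a dollar and a half now costs about twelve cents, which is the twelve-fold drop the headline numbers imply.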

Cost per token is falling. Smaller models are punching way above their weight. The stuff that was economically absurd in 2024 is now just... a product you can build on a weekend.

There's a fair counterargument here. Some of what looks like cheap inference is just subsidised by VC dollars. The big labs are pricing below cost to win market share. Today's pricing isn't real pricing.

That's true. And worth saying.

But here's what doesn't go away when the subsidies do: the efficiency gains are structural. Smaller models are hitting 90% of frontier quality at 10% of the cost. That's not a pricing strategy. That's a permanent shift in what the stack can do. The floor moved, and it isn't moving back.

Builders who understand the difference are planning for the world where costs go down while capabilities go up. That's the pattern repeating across every vertical.

The capex story is real. But capex is the cost to train frontier models. That's not your cost. That's not my cost.

Our cost is inference. And inference is in freefall.

The people saying AI is too expensive are watching one game. The builders are playing a different one entirely.

The barrier isn't compute anymore.

It's imagination.