People are underestimating how heavily subsidised this all is right now. The labs are eating cost to drive adoption while we rebuild our workflows around their tools. When that changes, the "AI saves money" story gets a lot less obvious.

The Verge recently published a piece laying out the economics of the AI industry in stark terms. Gartner estimates that between 2024 and 2029, capital investment in AI data centres will hit $6.3 trillion. To avoid a write-down of those assets, providers need to generate roughly $7 trillion in AI-driven revenue by 2030. That means token consumption needs to grow by 50,000 to 100,000x.

Read that again. Not 50%. Not 10x. Fifty to one hundred thousand times.

Where do you think that money is coming from?

The subsidy era

Right now, the AI labs are in a land grab. OpenAI, Anthropic, Google, and the rest are spending faster than any technology sector in history, burning through venture capital and strategic investment to acquire users, establish habits, and embed themselves into workflows.

This is not new. We have seen this exact pattern before.

Uber subsidised rides until entire cities forgot how to hail a cab. WeWork subsidised office space until startups couldn't imagine signing a lease. DoorDash and Deliveroo subsidised delivery until restaurants dismantled their own delivery operations.

In every case, the strategy was the same: price below cost, acquire users, build dependency, then reprice once switching becomes painful.

The AI labs are running the same playbook. The only difference is the scale. This time it is trillions, not billions.

What the numbers actually say

The Verge article quotes Gartner's Will Sommer, who breaks down the return-on-invested-capital thresholds. At 25% ROIC, investors get what they expect. Below 12%, capital moves elsewhere. Below 7%, you are in write-down territory, which Sommer describes as "an unmitigated disaster for all of the investors in this technology."

To hit even that catastrophic floor, the industry needs nearly $7 trillion in cumulative AI revenue by 2029.

OpenAI alone has made $600 billion in spending commitments through 2030. Even in best-case scenarios, Gartner predicts the lab would generate only a fraction of the revenue required for a 7% return.

So either the money has to come from somewhere, or a lot of very expensive data centres become very expensive mistakes. The investors are not going to eat that loss quietly. They are going to come looking for their return. And the people who have rebuilt their operations around cheap AI tokens are the ones holding the bill.

The dependency mechanics

Here is how it works in practice.

A business starts using AI tools. They are cheap, often free. The tools are good. People build them into their daily work. Processes get redesigned. Headcount decisions get made. Budgets get reallocated. Internal capabilities that the AI replaced get quietly decommissioned.

Over twelve to eighteen months, the AI tools are no longer optional. They are structural. The business cannot operate at its current throughput without them.

Then the pricing changes.

Anthropic has already started. They restricted third-party tools that were consuming too much compute. Enterprise plans shifted to token-based pricing. OpenAI introduced advertising into ChatGPT. Both companies are moving toward metered usage models.

This is the gentle version. The real repricing has not started yet.

The token economics problem

Every time you use a reasoning model, it thinks. It considers paths, backtracks, verifies, explores. A one-sentence prompt can generate tens of thousands of tokens behind the scenes. AI agents, the tools that are supposed to handle complex work autonomously, consume vastly more tokens than the chatbot models that came before them.

Right now, the labs are absorbing the cost of that. They have to, because the agents are the growth story, and the growth story is what keeps the capital flowing. But every one of those tokens has a real cost in compute, electricity, and infrastructure. And as the models get more capable, they get more expensive to run, not less.
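The reasoning overhead is easy to make concrete with a back-of-envelope calculation. All of the figures below are illustrative assumptions, not real prices: the point is how a hidden token multiplier dominates the bill.

```python
def daily_token_cost(runs_per_day, visible_tokens_per_run,
                     reasoning_multiplier, usd_per_million_tokens):
    """Estimate daily spend when each run generates hidden reasoning
    tokens on top of the visible output. All figures illustrative."""
    total_tokens = runs_per_day * visible_tokens_per_run * reasoning_multiplier
    return total_tokens / 1_000_000 * usd_per_million_tokens

# A team making 200 agent runs a day, 500 visible tokens each,
# with a hypothetical 40x hidden reasoning overhead, at $10/M tokens:
cost = daily_token_cost(200, 500, 40, 10.0)
print(f"${cost:.2f} per day")  # 4,000,000 tokens -> $40.00 per day
```

Double the reasoning multiplier, or swap in a tenfold-repriced token rate, and the same workload costs an order of magnitude more with no change to what the team actually does.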

Georgia Tech's Mark Riedl puts it bluntly: "Maybe the economics are a little upside down right now."

They are. And "upside down" is a polite way of saying that you are currently paying a fraction of the true cost of the intelligence you are consuming. The difference is being covered by investors who expect a return, and that return will come from you.

Where this gets uncomfortable

The industry narrative is that AI reduces costs. And it does, right now, at subsidised prices.

But the honest calculation should not compare "human labour vs current AI cost." It should compare "human labour vs future AI cost, at a price point that allows the provider to survive."

Nobody is running that calculation. Because when you do, the "AI saves money" story gets a lot more complicated.

Consider a business that has replaced three administrative roles with AI-assisted automation. Current cost: a few hundred pounds per month in subscriptions and API usage. Future cost, once pricing reflects reality: potentially thousands per month, with no upper bound if usage is metered and the workflows have expanded.

The business cannot go back. Those roles are gone. The institutional knowledge that went with them is gone. The processes have been redesigned around AI. The dependency is structural.

This is not a hypothetical. This is the trajectory that thousands of businesses are on right now.

The consolidation squeeze

Gartner predicts that no more than two large language model providers will survive in any regional market. That means the current competitive pressure that keeps prices low is temporary.

When consolidation happens, the surviving providers will have pricing power. They will also have detailed usage data showing exactly how dependent each customer is. The businesses that built the deepest dependencies will have the weakest negotiating position.

This is a well-understood dynamic in enterprise software. Ask anyone who has been through an Oracle or SAP licensing renewal. Now imagine the same dynamic, but applied to tools that have been woven into the daily operations of entire departments.

The sovereign alternative

There is another way to use this window.

Instead of building dependency on cloud-hosted AI services, use the current cheap period to build sovereign capability. That means:

Own your data pipeline. Do not let your operational data flow exclusively through third-party AI systems. Build your own data infrastructure, your own embeddings, your own retrieval systems. These are not expensive to build and they are yours.

Run local models where it matters. Open-source models are good enough for many production tasks. They run on hardware you own. They do not come with usage meters or surprise pricing changes. They are not as capable as the frontier models for every task, but they are capable enough for most operational automation.

Build abstraction layers. If you must use cloud AI, do not wire it directly into your core systems. Build an abstraction layer that lets you swap providers, route between cloud and local models, and fall back gracefully when a provider changes terms.

Invest in your own tooling. The current cheap window is an opportunity to build internal tools, scripts, workflows, and automation that use AI as a component rather than a dependency. The goal is to make the AI layer replaceable, not structural.

Keep human capability. Do not fully decommission the skills and knowledge that AI is currently augmenting. The people who understand the work are your insurance policy against pricing shocks.
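The abstraction-layer idea can be sketched in a few lines. This is a minimal illustration, not a production router; the provider classes and their behaviour are hypothetical stand-ins for a real cloud API and a real local model:

```python
from abc import ABC, abstractmethod

class Provider(ABC):
    """Common interface so backends stay swappable."""
    @abstractmethod
    def complete(self, prompt: str) -> str: ...

class CloudProvider(Provider):
    def complete(self, prompt: str) -> str:
        # In reality: a call to a hosted API. Simulated outage/repricing:
        raise ConnectionError("provider changed terms")

class LocalProvider(Provider):
    def complete(self, prompt: str) -> str:
        # In reality: a call to a model running on your own hardware.
        return f"[local model] answer to: {prompt}"

def route(prompt: str, providers: list[Provider]) -> str:
    """Try each backend in order, falling back gracefully."""
    for p in providers:
        try:
            return p.complete(prompt)
        except Exception:
            continue  # log the failure and try the next backend
    raise RuntimeError("no provider available")

print(route("summarise this invoice", [CloudProvider(), LocalProvider()]))
```

The point of the interface is that a pricing change becomes a one-line reordering of the provider list rather than a rewrite of every system that calls the model.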

The real timeline

The question is not whether pricing will change. It will. The question is when, and how sharply.

Based on the investment timelines, the pressure starts building seriously in 2027 and 2028. That is when the early venture money expects returns. That is when data centre amortisation schedules start demanding revenue. That is when the current "growth at all costs" phase gives way to "show me the margin."

Businesses that have spent 2025 and 2026 building dependency without building sovereignty will discover that their AI efficiency gains were borrowed against future pricing power that they gave away voluntarily.

The dealer analogy is not an exaggeration

I said it on X and I will say it here more plainly.

The first hit is cheap. That is the entire point. The labs need you using their tools, building your workflows around their infrastructure, making decisions that assume current pricing will last. Every token you consume, every process you redesign, every person you do not hire because "AI handles that now" deepens the dependency.

Dependency is where they make their money. Not today, but soon. And by the time the repricing hits, you will have expanded your workload so far beyond what your team can handle without AI that going back is not an option.

By the time most people realise they are locked in, it will be too late to build an alternative. The sovereign window is now, while the tools are cheap and the models are open.

Use it, or accept the terms that come later.

What to do this week

If you run a business that is adopting AI tools, here are five things worth doing now:

  1. Audit your AI dependency. Map every process, workflow, and role that now relies on AI tooling. Understand what breaks if the pricing doubles, or the provider disappears.

  2. Calculate your real exposure. Work out what your AI spend would look like at 3x, 5x, and 10x current token prices. If any of those numbers change your business case, you have a sovereignty problem.

  3. Evaluate local alternatives. For every cloud AI workflow, ask whether an open-source model running on your own hardware could handle 80% of the task. For many operational uses, it can.

  4. Build abstraction now. If you are integrating AI APIs, put an abstraction layer between your systems and the provider. Make it possible to switch without rewriting your stack.

  5. Keep your humans sharp. The worst outcome is a team that has forgotten how to do the work without AI, at the exact moment the AI becomes too expensive to use.
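The exposure check in step 2 takes minutes. A sketch with made-up numbers (a hypothetical £400/month current spend), assuming spend scales linearly with token prices:

```python
def exposure(current_monthly_spend, multipliers=(3, 5, 10)):
    """Project monthly AI spend if token prices rise by each multiplier,
    assuming usage stays flat. Figures are illustrative."""
    return {m: current_monthly_spend * m for m in multipliers}

# A business spending £400/month on AI subscriptions and API usage today:
for mult, projected in exposure(400).items():
    print(f"at {mult}x prices: £{projected}/month")
```

If usage is still expanding, the linear assumption is the optimistic case; metered pricing compounds with growing workloads.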

The cheap window will not last. What you build during it will determine whether AI becomes a genuine competitive advantage or just another vendor lock-in story with better marketing.