OpenAI confirmed it is doing roughly $2 billion a month in revenue as of April 2026, a $24 billion annualized run rate that would have been unthinkable just two years ago. Yet leaked internal projections suggest the company may burn as much as $17 billion in cash this year, and separate projections put its 2026 net loss somewhere around $14 billion, even with revenue projected to climb past $28 billion.
The most valuable AI company on the planet, backed by Microsoft and basically every venture capitalist on earth, is running a cash burn rate that swallows most of what it brings in.
Anthropic is in the same boat. By early 2026, it hit a $30 billion annualized revenue run rate. And one analyst estimated the company is losing 200% to 3,000% of each customer’s subscription fee on power users of its Claude Code tool.
But the money keeps flowing anyway. Big Tech is on track to spend $700 billion on AI infrastructure in 2026, up from about $400 billion the year before. Nvidia became the most valuable company in the world for a hot minute in June 2024. AI startups are raising at valuations that assume revenue will materialize out of thin air.
It will not. The gap between what is being spent and what is being earned was already $600 billion as of mid-2024, according to Sequoia Capital’s David Cahn, who started asking this question back in 2023. That was before capex roughly doubled. The actual gap today is almost certainly larger.
Something has to give.
The subscription lie
Anthropic wants $200 a month for the highest tier of Claude Max. That sounds absurd until you look at what a power user actually costs them.
The Decoder reported that Anthropic’s $200 Claude Code Max subscription can consume as much as $5,000 in compute per power user. Some analysts dispute the methodology and put the real cost closer to $500. Either way, Anthropic is subsidizing power users at scale.
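The back-of-the-envelope math, using the disputed compute estimates above, is straightforward. This is an illustrative sketch only; both cost figures are the analysts' estimates, not disclosed numbers:

```python
# Rough subsidy arithmetic for a $200/month power user,
# using the two disputed compute-cost estimates cited above.
subscription = 200  # Claude Max monthly price, USD

for compute_cost in (500, 5_000):  # low and high analyst estimates, USD
    loss = compute_cost - subscription
    loss_pct = loss / subscription * 100
    print(f"compute ${compute_cost}: loses ${loss}/user, {loss_pct:.0f}% of the fee")
```

Even the conservative estimate implies a 150% loss per power user; the high estimate implies 2,400%, which is roughly the 200% to 3,000% range analysts have cited.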
OpenAI follows the same playbook with ChatGPT Plus at $20 a month and the $200 Pro plan, both priced to grab market share rather than make money on individual users.
This is the subscription lie. You are not paying for the product. You are getting a subsidized demo.
The playbook itself is not new. Amazon lost money for nine years after its 1994 founding, and Bezos called it his “famously unprofitable company” in a 2000 BBC interview while the stock kept climbing. Uber racked up close to $30 billion in operating losses before its first annual profit in 2023 by subsidizing cheap rides with investor cash and then jacking up prices once it owned the market.
The playbook works until the funding dries up, and when it does, the bill always lands on the customer.
The corporate firing spree
Tech companies have shed tens of thousands of jobs in 2026, with Oracle cutting thousands, Amazon laying off 16,000, and Meta cutting about 8,000 roles in April, all to fund AI infrastructure. Salesforce’s CEO said AI agents replaced 4,000 customer support roles. Coinbase just announced that it is laying off 14% of its workforce to make way for AI “hubs.”
But will these companies actually save money over the long haul? Nvidia is one of the leading suppliers of AI-capable computing hardware, and Bryan Catanzaro, Nvidia’s vice president of applied deep learning, told Axios that for his own team, “The cost of compute is far beyond the costs of the employees.”
Not only can compute cost more than the humans it replaces, but the AI’s outputs still have to be checked.
Amazon just learned this the hard way. The company laid off tens of thousands of engineers, triggering WARN filings across four states as it shifted resources to AI. Then in March 2026, the company’s own AI coding tool, Q, contributed to a production change that caused millions of lost orders.
Amazon SVP Dave Treadwell convened an emergency engineering meeting and instituted a 90-day code safety reset. Under the new rules, junior engineers must get senior sign-off on any AI-assisted changes, and internal memos called the problem “high blast radius changes” where AI-generated updates propagated too broadly.
You’re already paying for it
AI is more than just software. It is steel, copper, and megawatts. AI models take massive quantities of computing power and electricity to operate. The tokens you rent for $20 per month are cooked in billion-dollar data centers that did not exist five years ago, and the power bill is not being paid by venture capital alone.
The scale is enormous. The International Energy Agency estimates data centers worldwide consumed about 415 terawatt-hours in 2024, a figure projected to triple by 2035. And you’re eating the cost.
Residential electricity prices have jumped roughly 30% since 2020, rising at twice the rate of inflation, and the increases are worse in areas where data centers are going up. Near those data centers, wholesale electricity prices have climbed as much as 267% over the past five years, according to a Bloomberg analysis.
In Virginia, regulators approved a 2026 rate increase that will add roughly $16 per month to typical residential bills while assigning more grid upgrade costs to data center operators. The utility projects that average residential bills could rise by roughly 50% by 2039. In Columbus, Ohio, residential rates have risen by about $7.90 per month in 2026.
In most places, you are paying for the power plants and transmission lines that feed the data centers, not the tech companies.
A few states are trying to fix this. Ohio regulators approved a landmark tariff for AEP Ohio that forces large data centers to pay minimum demand charges instead of dumping costs on all ratepayers. Texas passed legislation requiring large data centers to cover their own infrastructure costs or pay equitably. Virginia is looking at similar measures. Most states have not caught up.
In March, President Trump secured voluntary pledges from tech companies to pay their own electricity costs and build their own power plants, but it remains to be seen whether those pledges will be honored.
The dependency trap
Here is what happens when the subsidy ends.
Your company fired the customer support team and rebuilt the workflow around AI agents. The headcount budget became the API credits budget. Junior developers who used to review code got replaced by Claude. Senior engineers who could catch the mistakes are gone.
Then OpenAI and Anthropic have to raise prices to cover actual costs. Maybe they triple the API rate. Maybe the $200 Pro plan becomes $800. Maybe the free tier vanishes overnight.
You cannot rehire the workers. They found other jobs, retired, or left the industry, and the knowledge walked out the door. Meanwhile, your CRM, your code pipeline, your customer onboarding flow, and your reporting dashboards are all built around API calls to someone else’s model.
Inside a Fortune 500, admitting the AI replacement was a mistake is politically impossible. The CTO who signed the deal is not standing up in a board meeting to say we should rehire 4,000 people because the math stopped working. The budget officer who cut the department and moved the money to AI subscriptions is not reversing that call. They will pay the tax forever.
Goldman Sachs’ Jim Covello put the question bluntly in mid-2024: “Generative AI: Too Much Spend, Too Little Benefit?”
Covello’s case was simple. AI is not built for the complicated problems that would justify the price tag. The cost is too high for the value delivered, and the payback is not coming soon.
He was right about the spend. What he underestimated was the dependency, because companies are not just buying AI but rebuilding their operations around it, firing the people who knew how work got done, and trapping themselves in a vendor relationship with suppliers losing billions every year.
That is the trap. AI has plenty of value, but the gap between spending and earning keeps widening, and the companies downstream are cutting off their own ability to walk away.
What survives
When the bubble pops, and it will, some things survive.
Local models running on consumer hardware are the hedge against the API tax. A single RTX 4090 can run large language models that required much more expensive hardware just a few years ago, and open source models from Alibaba, Google, and others give you a real alternative to renting access by the token.
Companies that bought their own hardware instead of renting from OpenAI will be in the strongest position.
Own your tools, your data, and your compute. If your entire business is an API call wrapped in someone else’s model, you do not own anything. You are a middleman with a logo, and the model providers can change pricing, terms, or availability whenever they want while you cannot do a thing about it.
The AI companies burning billions right now will need to recoup those losses eventually, which means higher prices and tighter terms for everyone downstream.
The real winners are not the model builders. Nvidia sells picks and shovels no matter who finds gold. Chipmakers and infrastructure providers come out ahead and so do the cloud giants with multiple revenue streams. They are selling to both sides of every bet.
The dot-com bubble wiped out trillions in investor wealth, and the telecom bust that followed destroyed even more. But the internet survived, and so did the fiber in the ground.
AI will survive too. The question is whether the companies currently valued at hundreds of billions of dollars will be the ones standing when the dust settles.
History suggests they will not. This time, the victims will not just be the VCs who placed the bets. It will be every company that traded payroll for a loss-leader API, fired the people who knew how work got done, and discovered too late that the exit ramp had been bulldozed behind them.
