Control

11 minute read Published: 2026-02-20

Nobody can buy RAM right now. SSDs are in a similar situation. Now, even spinning disks are hard to come by. Of course, GPU prices have been skyrocketing for some time. While geopolitics and supply chain issues compound the issue, the largest driver of demand is clear: the AI industry spinning up more and more compute to serve their oracles.

Our inability to buy computing power, while not necessarily the AI companies' direct objective, is absolutely a happy outcome for their bosom buddies, cloud service providers.

The Rise of Services

As we've covered before, Microsoft is now largely a cloud services company. By the numbers, cloud compute accounted for 38% of its total revenue in Fiscal Year 2025—the largest single category and the fastest-growing year-over-year. If you look at what they prioritize, what they invest in, and how they market themselves, it is clear that Microsoft no longer considers itself a software company. They are a cloud services company, and want desperately to be an "AI" company.

When you think Apple, you think iPhone. And sure enough, iPhone sales accounted for 50% of Apple's revenue in FY25. But that's down 1% from the year prior, which was 1% lower than the year before that. In that same timespan, Apple's revenue from services (iCloud, Apple Music, etc.) grew 4%.

As for the rest of the major tech companies, they were born as services. Google? Service. Amazon? Service. Meta? Service, although arguably they're actually a drug cartel. You get the point. "Products" for these major companies have given way to "services" in the era of the cloud. The genAI goldrush has only accelerated this trend.

Generative models are the ultimate service. Practically speaking, they can only exist in massive datacenters, offered to you at a "reasonable" monthly rate. Their nature is incompatible with local computing.

"But what about local models?" I hear you cry. All twelve of you. Since I use ollama for my own AI security research, let's call it a baker's dozen. Look, enthusiasts may be interested in experimenting with these tools and have the equipment to do so—or the disposable income to grab a Mac Mini while supplies last. That is not the general population, and it doesn't shape the direction of the technology, no matter what Mozilla would like you to think. And at any rate, all those "local" models got trained somewhere, and I promise it wasn't on somebody's gaming rig.

So generative models are a service. If we combine the stated ambitions of AI companies and cloud companies, recognizing their deep intermingling, we can arrive at a clear conclusion:

The tech industry's future is one in which you rent computing power from them, accessed via a locked-down device without the ability to operate on its own. The inputs and outputs of your computing will be mediated and captured by language models that obscure their sources and processes from you.

That sounds alarmist and conspiratorial, I'm sure. I don't think it's an evil master plan—not a unified one, anyway—but I do think all the incentives and investments are pointing in this direction.

The Business of Data Centers

Any search along these lines will bring you to a 2024 New York Times interview with Jeff Bezos. In the interview, he discusses how the current state of compute—everyone is building a data center—can't last. Instead, what will happen (and what has happened) is that companies will elect to purchase compute from a handful of major players. The AI boom has proven this to be only sorta true. OpenAI, in addition to its deep ties with Microsoft Azure, is now in bed with Oracle to the tune of $300 billion. Meanwhile, Anthropic drops $50 billion to build its own US data centers. Of course there's xAI, but those data centers aren't exactly exemplars of business practice. So the takeaway here isn't that consolidation is happening—yet, anyway—but that the business of building data centers is booming. In fact, it's the only business that is. But for AI investment, US GDP barely budged last year, according to some economists. Other analyses put the impact somewhat lower, but still about 39% of GDP growth in 2025. That's bigger than the dot-com boom ever was.

So it is a boom. But with a boom comes fear of a bust.

The Bubble

When bubbles burst, the pop tends to come from one of two places: debt or demand. In the case of debt, borrowing to spin up business comes due without the hoped-for revenue to service it. This was the nature of the 2008 crash, as financial institutions found their investment holdings cratering in value, leading to the inability to honor their obligations. Demand, conversely, describes the dot-com bubble of the early 2000s. Massive overvaluation of tech companies sent stock prices soaring and led many companies to go public and offer shares—without a viable business model, as it turned out. When demand was revealed to be far lower than expectations, the market experienced a drastic correction.

The AI bubble is threatened on both fronts. Unlike previous buildouts, which big tech funded with cash on hand, the AI data center expansion is debt-financed in ways never before seen. Of the $443 billion spent by the "Hyperscalers" in 2025, $121 billion was debt-financed. Much of this has been through bond issuance, which is a bet with bondholders on future returns. That only works if the revenue shows up to justify the spend. For what it's worth, investors are already starting to get nervous about this overextension. Oracle's credit default swaps (a kind of insurance against default) have exploded in value since the OpenAI data center announcement.

Once again, we have a misadventure that is "too big to fail." The US economy can't afford the consequences if consumer demand for AI levels off or disappears. And remember: AI has yet to be profitable for anyone. Not just the hyperscalers, mind you—the small companies trying to find value in the product just...aren't.

How do you guarantee revenue on a product that nobody wants and people regret buying?

By making it mandatory.

Not Everything Is Code

I feel compelled to briefly address this viral article from Matt Shumer about the coming revolution prophesied by the latest model from Anthropic. His life changed and you're next, claims Shumer, because these new models are revolutionary.

We've heard this before—with each model release. But let's assume Matt is right that Opus 4.6 is a coding god. I need every software developer to listen very closely:

Code proficiency doesn't translate to the rest of human undertaking.

In fact, code is likely the easiest thing for LLMs to get right. The variance in samples for any given coding task will be vastly lower than variance between, say, every version of an apartment lease or restaurant review. Language models could very well do okay at these tasks, but as we were just reminded, the threats of hallucination never go away.

I mention this because a very specific population keeps telling us this technology works. It works for them, and they believe that means it will soon work for everyone. I can't think of another way to say this, so: software developers, please stop sniffing your own farts. Code isn't that special, and there is no ethical doctor on earth that is willing to give AI agents the same free rein with patients as developers do with their code.

But in the realm of code, there is seemingly no escape from the vibes or the coercion to vibe.

The Prompts Will Continue Until Morale Improves

Consider what has already taken place in the realm of forcing this dreck down our throats. Copilot in Windows. AI Overview in Google Search. Gemini on your Android phone; Apple Intelligence on your iPhone. Meta AI. Grok. Even Firefox bakes this in and makes you turn it off rather than letting you opt in.

The consumer computing world is infected, suffused with generative AI. You need to be a very specific kind of nerd (my kind) to hope to avoid it, and even then it's a challenge.

Is your employer requiring AI usage? Is it...helping? If not, why the mandate? Perhaps for the same reason you had to return to the office: to justify investments. They spent so much money on this, they need it to work too.

Now here's the pivot. Our current state is:

  1. AI investment has not yet paid off and shows no sign of doing so soon.
  2. Companies are massively exposed via debt and stock market correction.
  3. The only way you'll use this stuff is if you have to.
  4. Meanwhile, nobody can buy good computing components because AI investment is eating all the silicon.

So at some point—perhaps tomorrow, perhaps last year—these companies are going to realize that a great way to require AI usage is to require cloud service usage. And a great way to require cloud service usage is to neuter local computing power, which in turn solves competition for supply. Want storage? That'll be another $20/mo. Want graphics for games? Another $20. You want to perform data science and fit ML Models?! You're going to need the Professional plan, starting at $200/mo. It's a rent-seeker's dream.

I'm not saying this happens overnight. I'm also not saying the current supply shortage will last forever, although as long as data centers are getting built, I suspect consumers will be at a disadvantage. I am saying that an industry that already hoodwinked consumers into Chromebooks can do it again, but this time in a market where consumers can't afford high-powered alternatives. The service-delivery device, whatever shape it takes, will be ubiquitous and subsidized by the rental of AI services.

Why fight to get one of the few remaining laptops in inventory with onboard horsepower when you can get an OpenAI Terminal just for subscribing? Sure, all it does is provide you with a single interface, a blinking chat box through which your entire digital experience is mediated, but plans start at only $20/month! Isn't that easier? Look how shiny it is. Just relax. Don't resist. It hurts more when you fight it.

That's it. That's it. Now, what's your question?