Balance sheets and capital expenditure plans are increasingly defining the AI investment cycle. The excitement around models and demos has given way to capex and balance-sheet projections.
The amounts involved are staggering. This could be the biggest capex boom ever.
OpenAI has dominated the AI conversation since 2022, and many people see Sam Altman as the face of the technology’s development.
The past couple of weeks have brought some very interesting and important developments. They are worth a closer look, as are the opinions forming around them.
This is not a concentrated boom
AI expenditure is not concentrated within a single company but spread throughout the whole stack.
Nvidia is still the leading supplier of high-end accelerators, and its data-centre revenues continue to grow at rates that justify further expansion.
Hyperscalers, including Oracle, pour tens of billions of dollars each year into new data centers, networking and power infrastructure. These are not optional growth initiatives; they are core strategy.
Microsoft, Amazon and Meta have all guided to materially higher AI capital expenditure through 2026.
Investors are becoming more sensitive to margins in the near term.
This matters because it shows where confidence sits: capital is flowing first to infrastructure and distribution.
Model developers sit one step removed from those capital decisions.
Spending by hyperscalers is on their terms
The market’s reaction to Microsoft’s latest earnings showed the tension.
Azure is still growing, but AI spending outpaced near-term revenue, and Microsoft suffered one of its sharpest single-day declines in years.
The question was not whether AI works, but how long margins would stay under pressure.
This reaction is instructive. Hyperscalers can absorb the pressure because AI strengthens their ecosystems.
Google is a prime example, as it continues to integrate major Chrome features with the Gemini model.
AI workloads pull customers onto cloud platforms, raise switching costs and justify long-term pricing power.
The strategic logic holds even if margins temporarily dip.
This logic is not applicable to OpenAI.
OpenAI’s economy looks very different
OpenAI’s spending trajectory is astonishing. By some estimates, compute costs will drive losses that could reach $100 billion by 2028.
Revenue continues to grow rapidly, yet the gap between revenue and costs keeps widening.
To close this gap, OpenAI must do two things at once: scale revenue at a historic rate while compute costs fall dramatically.
Neither lever is fully within its control.
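A toy projection makes the squeeze concrete. All figures below are hypothetical assumptions chosen for illustration, not estimates from this article:

```python
# Toy model: hypothetical revenue vs. compute-cost trajectories.
# All figures are illustrative assumptions, not reported numbers.

def project(revenue, cost, rev_growth, cost_growth, years):
    """Compound revenue and cost forward; return the yearly gap (cost - revenue)."""
    gaps = []
    for _ in range(years):
        revenue *= 1 + rev_growth
        cost *= 1 + cost_growth
        gaps.append(round(cost - revenue, 1))
    return gaps

# Assume $10B revenue against $20B compute cost today (hypothetical),
# with revenue compounding at 60% a year and costs at 40%.
print(project(10.0, 20.0, 0.60, 0.40, 3))  # → [12.0, 13.6, 13.9]
```

Even with revenue compounding much faster than costs, the absolute gap keeps widening for years because costs start from a larger base — which is why both levers have to move at once.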
OpenAI, unlike the hyperscalers, does not control the infrastructure on which it relies. It buys compute from Microsoft, and possibly Amazon.
OpenAI is exposed to whichever layer of the stack captures the most economic value, and it feels the resulting margin squeeze at the model layer.
Vertical integration has become the new advantage
Here is where the structural differences are most important.
Google is a chip-maker, a cloud provider and an AI distributor, pushing AI through Search, Chrome, Android and Workspace.
Training Gemini is largely an internal cost.
Meta follows a slightly different logic.
The company funds AI with advertising revenue, trains its models internally, and then deploys them into products with billions of users.
AI lifts engagement and pricing power before any direct monetisation.
Amazon gains from AI demand regardless of which model wins. Every training run and every inference call on its infrastructure helps AWS.
Reported conversations about large investments in OpenAI should be viewed through this lens: Amazon is buying exposure to demand, not betting its future on a single model.
OpenAI is responding by taking partial ownership stakes in compute projects and designing custom chips.
These efforts are helpful, but do not compare to the level of control that vertically integrated competitors enjoy.
The idea of one winner is being eroded by competition
The speed with which Google has closed its perceived performance gap has reset market expectations.
Now, distribution is more important than benchmarks. Gemini doesn’t need to excel at every single task. It just needs to do well enough everywhere.
Anthropic’s enterprise adoption has centred on regulated, technical and long-contract use cases.
Customers in the enterprise sector are less concerned about being at the cutting edge and more interested in reliability, security and support.
Consumer AI, by contrast, is fluid because switching costs are low: chat assistants lack the network effects of social media or operating systems.
Users rotate on price, convenience and preference.
This combination indicates a market where there are several dominant players, rather than one.
Apple’s approach is remarkably cautious.
Apple integrates AI to strengthen its hardware and ecosystem rather than racing to build ever-larger models.
Its acquisitions, such as Q.ai, focus on on-device intelligence rather than frontier-scale compute.
Apple is betting that the relationship with the user matters more than the model.
Nvidia sits in a different position entirely: it sells the tools everyone needs, no matter who wins at the model layer.
Its reported hesitation over a large OpenAI investment speaks volumes. Nvidia does not need OpenAI to stay dominant; it needs AI training and inference to keep scaling.
The distinction between owning demand and enabling it matters. It explains why AI exposure through infrastructure providers is becoming more attractive, while pure-play model economics look less appealing.
Investors are underestimating the most likely outcome
A realistic scenario is now coming into view: AI adoption keeps rising, spending stays high, and several models coexist at comparable capability levels.
In that world, model prices fall while chips, infrastructure and platforms capture most of the value, and model developers compete ever harder for thinner margins.
Microsoft’s share-price drop is a good example of how sensitive investors have become to spending that does not produce immediate results.
Investors now care about how these models are distributed.
The search is on for moats that are defensible, like the ones built by Google or Anthropic.
Spending more money is no longer enough on its own, and markets will not reward it.
OpenAI is still important even if it’s not the economic leader of the AI age.
Recent tech history has taught investors to expect concentration.
Next-generation AI could reward companies that control infrastructure, distribution and balance sheets.
As the cycle develops, both the balance sheets and the models will keep changing.