Is xAI the neocloud now?

On Wednesday, xAI and Anthropic announced a surprise partnership that saw Claude’s maker buy “all the computing power [in xAI’s] Colossus 1 data center” — about 300MW, enough for Anthropic to quickly raise its usage limits. It is a big deal for xAI, potentially worth billions of dollars. More importantly, it quickly monetizes one of the company’s most impressive achievements, transforming xAI from a consumer AI company into a computing provider.
It’s tempting to see the deal as a shot at OpenAI in the middle of ongoing litigation. But Musk’s explanation on X was simpler: xAI had already moved training to a new data center, Colossus 2, and didn’t need both.
In the short term, there is a clear logic at work. xAI’s existing products are heavily focused on Grok, which has seen usage decline since its image generation release earlier this year. If xAI’s data center buildout exceeds what Grok needs to operate, partnering with Anthropic adds revenue to the balance sheet. That is especially useful as the company, now merged with SpaceX, speeds toward an IPO. More broadly, having Anthropic listed as a customer makes it easier to believe that SpaceX’s data center play could actually work.
But beyond the short-term gain, the Anthropic partnership sends a strange message about where Elon Musk’s priorities really lie. It suggests that the company’s real business may be more about building data centers than training AI models.
It’s rare to see a large tech company dispose of computing resources this way at a time when companies like Google and Meta, which are also training models, are racing to build more data centers for themselves. It’s an easy point to miss, because many of these companies act as enterprise AI vendors, internet services, and cloud providers all at once. But when forced to choose between renting out compute to customers and retaining it to build their own tools, they have consistently chosen the latter.
Just last month, Sundar Pichai admitted on an earnings call that Google Cloud’s revenue was lower than it could have been because the company was short on compute — and when given the choice between renting out its GPUs or using them to develop AI products, Google chose AI products.
Meta faced a more extreme version of the same squeeze, spinning up a new compute division just to ensure it would have enough GPU power to chase Mark Zuckerberg’s AI ambitions. As he put it when announcing Meta Compute in January, “The way we engineer, invest, and partner to build this infrastructure will be a strategic advantage.”
The key word there is “strategic.” Both Zuckerberg and Pichai look to a future where AI powers the world’s most popular and profitable systems. Computing power is not just a way to satisfy today’s demand for inference but a way to create tomorrow’s products — and renting out that compute means missing the opportunity.
With its focus on data centers (terrestrial and otherwise), xAI is positioning itself as a neocloud business: buying GPUs from Nvidia and leasing them to model developers like Anthropic. It’s a notoriously tough business, squeezed by chip suppliers on one side and a volatile demand cycle on the other. The valuations of many active neoclouds reflect that fact: xAI was valued at $230 billion in its January funding round, while CoreWeave, which manages a comparable amount of computing power, is worth less than a third of that.
Musk’s version of the neocloud is, as you might expect, extremely ambitious. Some of the data centers may be in space — at least by 2035, if things go according to plan. And xAI will be making its own Terafab chips, which will take away some, but not all, of Nvidia’s pricing power. But neither changes the basic economics of the neocloud business.
As recently as its February all-hands, xAI had real ambitions in software. That was the presentation that revealed the orbital data center project, but it also teased major ambitions in coding (as reinforced by the Cursor partnership) and intriguing ideas like expanding computing into full digital twins (the unfortunately named Macrohard project). These are the kinds of long-term projects that require dedicated computing resources to succeed. As long as xAI is selling huge amounts of compute to its competitors, it’s hard to imagine such ambitions have much of a future.