xAI and Anthropic Strike Billion-Dollar Deal for Full Colossus 1 Compute Capacity, Unlocking Massive Usage Limits



By admin | May 06, 2026 | 3 min read



On Wednesday, xAI and Anthropic revealed an unexpected collaboration, with the company behind Claude acquiring "all of the compute capacity at [xAI's] Colossus 1 data center"—roughly 300 megawatts. This move enabled Anthropic to quickly raise its usage caps. The deal is massive for xAI, likely valued in the billions of dollars. More significantly, it immediately turned one of the company's most notable achievements into revenue, shifting xAI from a consumer of computing power to a provider.

It's tempting to view this arrangement as a jab at OpenAI amid the ongoing legal dispute. However, Musk explained on X that xAI had already moved its training operations to a newer facility, Colossus 2, and simply didn't need both data centers. In the short term, the logic is clear. xAI's primary product, Grok, has seen declining usage since the image generation controversies earlier this year. If the company's data center capacity far exceeds what Grok requires, partnering with Anthropic adds substantial revenue to the balance sheet—especially useful as xAI, now merged with SpaceX, races toward an IPO. On a broader scale, securing Anthropic as a customer makes SpaceX's orbital data center ambitions seem more viable.

But beyond immediate gains, this partnership sends an unusual signal about where Elon Musk's true priorities lie. It suggests xAI's real business might be constructing data centers rather than training AI models. It's rare for a major tech company to treat compute resources this way; companies like Google and Meta, which also train models, keep building more data centers for their own use. This point is easy to overlook because many of these firms operate simultaneously as enterprise AI vendors, online services, and cloud providers. Yet when forced to choose between selling available compute to customers or reserving it for their own tools, they consistently pick the latter. Just last month, Sundar Pichai acknowledged on a call that Google Cloud revenue was lower than possible because the company was "capacity constrained"—and when given the choice between renting out GPUs or using them for AI product development, Google chose AI. Meta faced an even more extreme version of this constraint, creating an entirely new cloud infrastructure just to ensure enough GPU power for Zuckerberg's AI ambitions. As he put it when announcing Meta Compute in January, "How we engineer, invest, and partner to build this infrastructure will become a strategic advantage."

The crucial word there is "strategic." Both Zuckerberg and Pichai envision a future where AI powers the world's most popular and lucrative systems. Computing power isn't just about meeting today's inference needs; it's about building tomorrow's products—and falling short on compute means missing that opportunity. By focusing on data centers, both on Earth and in orbit, xAI is positioning itself more like a neocloud business: buying GPUs from Nvidia and leasing them to model developers like Anthropic. This is a far tougher business, squeezed by both chip suppliers and fluctuating demand cycles. The valuations of most active neoclouds reflect this reality: xAI was valued at $230 billion in its January funding round, while CoreWeave, which manages a comparable amount of computing power, is worth less than a third of that.

Musk's version of a neocloud is more ambitious, as you might expect. Some data centers could be in space—at least by 2035, if plans proceed. xAI will also manufacture its own chips at the Terafab, which will reduce some, but not all, of Nvidia's pricing power. Yet none of this changes the fundamental economics of the neocloud business. As recently as the February all-hands meeting, xAI had genuine software ambitions. That presentation unveiled the orbital data center project but also hinted at significant coding aspirations (since boosted by the Cursor partnership) and intriguing ideas like leveraging computer use into full-scale digital twins (under the unfortunately named Macrohard project). These are long-term initiatives that require dedicated computing resources to succeed. As long as xAI sells large amounts of compute to its competitors, it's hard to believe such new ambitions have much of a future.

