AI: Anthropic's Ambitious AI Data Center Plans. RTZ #994

I’ve written a lot about both the broad AI industry and OpenAI’s massive AI Compute ambitions to spend sums in the trillions to build AI Data Centers and related Power.

The focus of course is building the AI Compute and Power to generate the never-ending, ever-accelerating stream of AI training and, especially, inference tokens for AI tasks on the road to AGI. From chatbots to reasoning to agents and more.

With OpenAI ‘sibling’ Anthropic coming on strong in recent months with its enterprise-focused Claude, Claude Code, Claude Cowork and other AI products, it too is ramping up its AI Infrastructure spend into the hundreds of billions.

And given that Anthropic is at least the ‘Pepsi’ to OpenAI’s ‘Coke’, if not possibly the Coke at some point, it’s worthwhile to look closer at its AI infrastructure plans.

The Information provides details in “Anthropic’s Data Center Ambition—and the Ex-Google Execs Who Could Make It Happen”:

“Anthropic’s leaders have been loud about how its AI will reshape businesses but much less boisterous about how it plans to power that usage.”

“Now the company’s data center ambitions are coming into focus. Anthropic is quietly assembling a veteran team, including two former Google executives, and has discussed securing at least 10 gigawatts of capacity over the next several years, according to two people who have spoken with the firm’s leaders in recent weeks.”

At $50 billion per Gigawatt of AI Data Center compute, the numbers add up quickly:

“Obtaining that amount of computing power would cost hundreds of billions of dollars, though it isn’t clear how much of that Anthropic would be on the hook for. In any case, that figure is far greater than the $180 billion that Anthropic told investors it would spend on servers through 2029.”
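The gap between the quoted figures pencils out in a quick back-of-envelope. The per-gigawatt cost and the 10 GW target are the article's rough numbers, not confirmed budgets:

```python
# Back-of-envelope using the article's rough figures (assumptions, not
# company-confirmed budgets).
COST_PER_GW_USD = 50e9       # ~$50B per gigawatt of AI data center capacity
TARGET_GW = 10               # capacity Anthropic has reportedly discussed
DISCLOSED_SPEND_USD = 180e9  # server spend Anthropic told investors, through 2029

implied_total = COST_PER_GW_USD * TARGET_GW

print(f"Implied 10 GW build-out:  ${implied_total / 1e9:,.0f}B")
print(f"Disclosed server budget:  ${DISCLOSED_SPEND_USD / 1e9:,.0f}B")
print(f"Gap vs. disclosed budget: ${(implied_total - DISCLOSED_SPEND_USD) / 1e9:,.0f}B")
```

At these assumptions the implied build-out is $500B, well above the $180B Anthropic has disclosed, which is the "far greater" gap the article flags.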

“Anthropic has privately discussed getting 10 GW by renting capacity from cloud providers the way it currently does and, notably, by leasing its own data center space, meaning it would likely purchase and operate the server equipment that goes inside of it.”

Like OpenAI, which is working with Oracle, CoreWeave and others to build its AI Compute, Anthropic has its preferred partners:

“So far, Anthropic has said it planned to invest $50 billion in data center capacity through a partnership with startup Fluidstack, involving data centers in Texas, New York, and other locations. It did not provide a timeline for that investment or say how the money would be used or how much capacity it might buy. Ten gigawatts of capacity would require far more capital.”

“Anthropic’s move toward direct leasing suggests it wants to have more control over its servers. It could also help the company save money over time.”

Of course, the race is against bigger Big Tech peers with far stronger financials and greater ability to raise debt and equity financing:

“To put 10 GW into perspective, Amazon Web Services last week said it brought nearly 4 GW of capacity online in 2025, a period in which its sales increased $20 billion year over year. As of September last year, OpenAI privately disclosed it had contracts in place for 8 GW by 2028.”

“For now, OpenAI seems to be content with renting data center capacity from major cloud providers, such as Oracle, Microsoft and AWS. OpenAI has publicly discussed its ambitions to control its own facilities, but its recent announcement about a partnership to develop 10 GW of data centers with Nvidia’s help has not materialized. (However, OpenAI has been discussing deals with “colocation providers,” which essentially offer pre-built data centers to which customers can bring their own server hardware).”

And Anthropic is of course poaching all the AI Infrastructure talent it needs from nearby sources:

“To shape its strategy, Anthropic has hired data center executives with decades of experience. Among them are Tim Hughes, chief development officer at data center firm Stack Infrastructure, who is set to begin in a few weeks, according to someone with direct knowledge. Stack is owned by major data center financier Blue Owl Capital and is building a massive data center for Oracle (and Oracle customer OpenAI) in New Mexico.”

“Anthropic also hired Brett Rogers, who led data center construction and design at Google for nearly six years until 2021, this person said. Rogers joined Anthropic a few months ago, this person said.”

“These previously unreported hires join Winnie Leung, Anthropic’s head of data center infrastructure, who arrived last year after more than two decades at Google, including as a director of engineering for data center operations.”

Google of course has significant advantages with its own TPU powered AI infrastructure I’ve written a lot about:

“Anthropic’s recruiting from Google is no accident. Anthropic is a major user of Google’s tensor processing units and recently struck a deal to buy $20 billion of them from Broadcom, which co-designs the chips. (Google is a major investor in Anthropic, and the startup also uses Google’s cloud service.)”

“Several people in the data center industry said the new hires would give Anthropic greater expertise in identifying data center sites and give it more intelligence about which developers are more reliable than others, so that it can aim to avoid delays that are increasingly common in the field.”

Then of course are the creative ways to finance these AI infrastructure deals:

“Leasing data center capacity is common among large companies, but startups have a harder time signing such deals because they don’t have strong balance sheets or credit ratings, making it too risky for lenders to underwrite debt for the facilities’ owners. To get around these issues, AI startups have historically partnered with cloud providers, which sign long-term leases and rent out servers to the startups.”

“To lease facilities before it goes public—which would likely strengthen its credit rating and balance sheet—Anthropic will likely need financial backing from a major credit-worthy partner, such as a large cloud provider or chip supplier, to serve as a so-called credit “backstop.” (This would make lenders feel more comfortable, because they would still get paid even if Anthropic defaulted on its lease payments.)”

“For instance, Anthropic plans to put TPUs in an upcoming Louisiana data center Fluidstack leased, and Google agreed to provide a financial backstop on Fluidstack’s payments. The Google backstop structure was critical to securing financing from lenders, according to people involved in the deal.”

“Now, it’s Anthropic’s turn to see if it can line up financial arrangements similar to Fluidstack’s. I’m already hearing that major infrastructure investors are willing to fund Anthropic’s buildout, so I’m sure it will only be a matter of time until we see these deals get done.”

Then there’s the whole Elon Musk-driven AI Data Centers in Space thing, with a hyped-up narrative of its own:

“While Anthropic focuses on data centers on earth…everyone is talking about data centers in space! That’s after Elon Musk framed SpaceX’s acquisition of his AI lab, xAI, as part of his lofty vision to build data centers in orbit—though his blog post announcing the $250 billion acquisition didn’t explain how xAI actually fits into that vision. (My colleagues previously wrote about this topic here and here.)”

“Orbital data centers, made up of satellites harnessing the Sun’s rays, are inevitable because power will become too difficult to find on Earth, making space a cheaper and faster way to build computing capacity, Musk argued in an interview with John Collison and Dwarkesh Patel last week.”

Lots of on-the-edge assumptions around current science and technology are laced through his narrative, especially when compared to options back here on Earth.

“Not only does it take too long to get power from the grid, but xAI can’t build its own gas turbine power plants fast enough to keep up with its ambitions.”

“The turbines are sold out through 2030,” he said, implying that he thinks he might have to start making his own turbine blades to ease supply constraints.

“Not everyone sees space data centers as a near-term priority. AWS CEO Matt Garman said in an interview last week that he thinks we’re “pretty far” away from making space data centers possible. OpenAI CEO Sam Altman said on the TBPN podcast that he doesn’t think space data centers will provide a meaningful amount of compute power in the next five years. That may not be a surprise, given how far ahead SpaceX is in launching rockets compared to every other rocket-launching firm.”

“I wish Elon luck,” Altman said.

All of the above highlights how laser-focused Anthropic is on building out its own AI Data Center Infrastructure and Compute vs. the biggest and best of its peers. And we’re only in year four of the post-ChatGPT AI Tech Wave. Stay tuned.

(NOTE: The discussions here are for information purposes only, and not meant as investment advice at any time. Thanks for joining us here)




