AI: Year-end update on 'Braggawatt' powered AI Data Centers. RTZ #938

This year began with the big tech companies boasting about ‘supersizing’ to 100,000 AI GPU data centers, up from last year’s state-of-the-art AI data centers with tens of thousands of AI GPUs. Most of those GPUs come from the key global supplier, Nvidia, with its own AI Data Center roadmap.

That rapidly accelerated this year into the common goal of ‘Gigawatt’ AI data centers with a million-plus AI GPUs, along with the accompanying land rush in the US and beyond. Just as a point of reference, a single one of those costs over $50 billion, with the total industry buildout over a few years running into the trillions. Thus the increasing need for debt financing across the players.

Almost every big tech ‘Mag 7’ company is publicly in the race except Apple. And OpenAI has committed to $1.4 trillion in AI infrastructure by itself, followed by Anthropic with its own deals.

And then there was the Power race for these data centers, which added its own massive costs on top, as well as supply chain constraints worldwide. As well as the cooling imperatives for the latest AI chips. And as we end 2025, the industry is already eyeing AI Data Centers in Space, as I wrote about a few days ago. Such is the AI infrastructure race at this early point in the AI Tech Wave.

And the boasting amongst the leading tech founders and CEOs has moved on to bragging with bigger numbers than ever. With a new term being coined in the process, as discussed below.

So it’s useful to get a year-end update on all this from The Information in “The ‘Bragawatt’ Data Center Era Brings Reality Checks—and Energy Breakthroughs”:

“Two key aspects of the AI race are misunderstood: The gigawatts of electricity needed for the most ambitious AI data centers aren’t coming online nearly as fast or as easily as the announcements about them did. And the acute shortage of “AI compute”—power and servers—is actually producing some thrilling innovations in energy, climate tech, industrial automation and computing much faster than government-led policies and stimulus plans ever could.”

“Let’s start with the reality check and finish with the thrill.”

“Breathless chatter about the AI race makes it seem like the roughly 1 trillion dollars per year that McKinsey estimates will be spent on the infrastructure buildout between now and 2030 is already producing fully electrified AI cities on a hill, one after another.”

There’s nothing like a feel for all this from the road:

“I’ve been traveling the country to see the steel and concrete rising (and, when I can get inside, the racks rolling in), and it may surprise you how many of the multigigawatt sites you hear about won’t hit full power capacity for years. JPMorgan Chase estimates that U.S. firms this year struck deals to add 9 GW of AI data centers, up from 4 GW of deals last year. If and when that capacity does come online, it’ll be just 13% of the way to the roughly 100-plus GW of AI compute that banks like JPMorgan and Goldman Sachs conservatively project will be coming to the U.S. by 2030. (RAND and others project even higher figures of more than 300 GW.)”
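The arithmetic behind that quote is worth making explicit. A minimal back-of-the-envelope sketch, using only the figures cited above (JPMorgan's deal estimates, the banks' ~100 GW conservative projection, and RAND's ~300 GW figure):

```python
# Back-of-the-envelope check of the deal-volume figures quoted above.
# All numbers come from the quote (JPMorgan, Goldman Sachs, RAND estimates);
# this is illustrative arithmetic, not a forecast.
deals_gw = {2024: 4, 2025: 9}           # U.S. AI data center deals struck, in GW
cumulative_gw = sum(deals_gw.values())  # 13 GW of announced capacity so far

conservative_2030_gw = 100  # banks' conservative projection for U.S. AI compute
rand_2030_gw = 300          # RAND and others project even higher

share_conservative = cumulative_gw / conservative_2030_gw  # the "13%" in the quote
share_rand = cumulative_gw / rand_2030_gw                  # barely over 4%

print(f"{share_conservative:.0%} of 100 GW, {share_rand:.1%} of 300 GW")
```

Even against the conservative projection, two years of deal-making covers about an eighth of the projected 2030 capacity; against RAND's figure, well under a twentieth.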

And of course, comments from the loudest ‘braggawatts’:

“Despite public boasts by Elon Musk and other AI leaders about electrified clusters of hundreds of thousands of chips, and announcements of sites so big that people in the data center industry started to joke about “bragawatts,” we’re just barely starting to consolidate one or more gigawatts of data center capacity on a single campus to prove that denser clusters will produce better AI.”

“Unless dreams quickly come true for nuclear fusion or AI servers in space, it doesn’t look likely that OpenAI CEO Sam Altman can get his hoped-for 250 GW of capacity by 2033, or that xAI founder Elon Musk can soon get to his stated goal of a terawatt (1,000 GW)—equivalent to all of the U.S. electricity produced today—by some unknown date.”

And as I’ve outlined in earlier pieces, OpenAI, led by founder/CEO Sam Altman, has been in a class of its own when it comes to ‘Braggawatt’ AI Data Centers. To the point of accelerating the US Economy by its own example.

“OpenAI has been furiously hunting down potential data center sites from Wisconsin to Michigan to move beyond the power limitations of its existing sites. For instance, Crusoe, a developer of OpenAI facilities in Abilene, Texas, just announced it had “topped off” the eighth building there, though that merely refers to the erection of the eighth structural frame. There’s much more to come before chips are installed and all systems are go to achieve 1.2 GW of capacity, expected by mid-2026. In January, Oracle CEO Larry Ellison said the Abilene site could grow to as many as 20 buildings, but the Texas grid operator can only supply power for eight buildings until new transmission lines can be built, sometime in the next several years, people involved in the projects say. Even getting equipment like transformers, which adjust voltage from the grid or self-generated power before it flows inside the data center, has been so difficult that an OpenAI manager told me in September they’d almost considered assembling their own.”

It hasn’t all been progress in a straight line.

“There are many other infrastructure setbacks and pivots for high-profile U.S. data centers you haven’t read about (a power transformer from China had to be rebuilt, a fiber installation flooded a nearby aquifer, a gas pipeline was maxed out), and some you likely have, such as CoreWeave’s troubled Texas project with Core Scientific. In Pennsylvania, where a bipartisan state government and energy CEOs are working to usher in a golden age of gas-fired AI, the regional grid regulator recently failed to approve even one of 12 proposals to address electricity demand growth. That has left the busiest computing region in the country, also home to Data Center Alley in Northern Virginia, in limbo. A federal proposal may supersede those proposals, but rising power bills are making the debates politically fraught.”

And each of the big tech companies has forged its own path to its ‘braggawatt’ ambitions:

“As Microsoft, Amazon and Meta become the single largest electricity customers in states where they’re developing facilities, they’ve faced an increasing public backlash that could shape next year’s midterm elections. Just two projects in Wisconsin from Microsoft and OpenAI could require the doubling of available capacity at Wisconsin’s main utility, We Energies, to about 11 GW from 5.3 GW by decade’s end, if they grow as big as the companies envision, according to many people involved in the power planning.”

With necessary accommodations along the way:

“Such friction is pushing these companies to develop or use energy sources outside the public electrical grid, which could speed up their projects. A leading architect of such behind-the-meter power, KC Mares, believes as much as 4 GW of private power installations got underway in 2025, and he knows of 10 GW of projects that will be started next year. But it’s a complicated dance: Mares says makers of small gas turbines, which the industry turned to because waits for big ones hit five to seven years, are also now virtually sold out until 2028. The massive off-grid renewables, batteries and gas sites—like the one Google is developing with Intersect, a clean energy and data center builder—will still draw some utility electricity and demand heroic logistics; Intersect put in equipment orders long ago.”

As is often the case with these crash infrastructure builds in tech waves, there have been some unexpected ‘innovation dividends’ along the way in this AI ‘Gold Rush’ as well:

“The great news is that the severity of these challenges, and the risk that precious server chips will sit idle, unable to morph into AI products, has unleashed a torrent of innovation beyond AI models.”

“AI doomers severely underappreciate the dividends of this infrastructure race. Power scarcity is catalyzing a badly needed reinvention of a broken, century-old grid. The $250 billion that the largest tech companies spend annually on research and development—over and above capital expenditures—is helping make fossil energy cleaner and nonfossil energy cheaper. We’re on the cusp of massive leaps in data center and power electronics efficiency. Tech and energy firms are rallying around common goals of producing more energy—ideally (although secondarily) with fewer emissions. We are also finally rethinking permitting and regulatory systems that didn’t serve us well.”

Examples abound:

“Google and Amazon, for example, are pushing the adoption of grid-enhancing technologies like sensors and advanced conductor wire. We’ve known about these technologies for years, but utilities didn’t have an incentive to adopt them because they profit from erecting big plants.”

“Nvidia and Emerald AI, an energy software startup named as one of The Information’s 50 most promising startups of the year, are demonstrating that it’s possible to shift computing workloads from one data center node to another to avoid straining the grid and causing power bills to spike.”

The longer-term, glass-half-full view of all this is that this AI Tech Wave is no different from the infrastructure booms of previous tech waves.

Although the numbers in both dollars and resources are FAR bigger, and the ‘bragging’ is louder at times, it’s all directionally going the right way. And that’s a useful takeaway from this snapshot for now. Stay tuned.

(NOTE: The discussions here are for information purposes only, and not meant as investment advice at any time. Thanks for joining us here)




