
Building Mega AI Data Centers Getting HARDER — Power Delays, GPUs at Scale, Local Pushback & More. ARD 69

The frame running through every item today:

Building multi-gigawatt AI data centers is getting HARDER.

The financing side has accelerated to historic speeds — hundreds of billions per year, moving toward trillions. But the physical-world layer — power generation, transmission infrastructure, transformer supply, GPU coherence at scale, local siting permissions — is racing to keep up, and increasingly NOT keeping up. The headline numbers on capex announcements get the attention; the realities of breaker capacity, switchgear lead times, and county-board hearings get the deployment timeline. Today’s three items map the friction.

Three Key Takes today:

(1) Half of Planned AI Data Centers Waiting for Power. The headline number is striking — roughly half of the planned US AI data center builds have been delayed or cancelled, with the bottleneck cited as power infrastructure, transformer supply, and parts coming from China. Capex announced ≠ capex deployed. The reporting: Tom’s Hardware — Half of planned US data center builds delayed or cancelled; AI build-out flips the breakers. The FT angle: FT — Data center delays choke AI expansion. The transformer crunch: Bloomberg — US data center boom relies on hard-to-find electrical equipment. Standing thesis from the AI-RTZ archive: AI-RTZ #407 — The AI Power Grab.

MP Take: “Despite the rapid acceleration of AI data center financing and funding, the technical challenges of building and then maintaining AI data centers will persist through the decade, especially given rapidly changing AI demand profiles: from training-driven loads to inference-driven applications like AI search, chatbots, and AI Agents. That leaves room for tech and AI companies up and down the AI Tech Stack to raise prices, including at the Nvidia and TSMC levels.

The build target keeps moving as customer-facing demand re-shapes what each gigawatt actually needs to serve. Power, transformers, switchgear, and parts pipelines are physical-world inputs that don’t respond to capex acceleration the way GPUs do. A gigawatt data center is at least a $50 billion expense, and power turbines take months to build with very precise tolerances. The capex willingness is necessary but not sufficient.”

(2) xAI’s Challenges to Big AI GPU Installations. xAI’s run at gigawatt scale is showing the operational reality of the new computer. Two recent pieces from The Information surface what the rest of the industry is about to learn — multi-GPU coherence at this scale is genuinely hard, and the “fast and cheap” data center build that xAI piloted is now revealing hidden costs. The reporting on GPU coherence at scale: The Information — xAI shows it’s tough using a lot of GPUs at once. The hidden-costs piece: The Information — xAI’s fast, cheap data center build hidden costs. Standing thesis on the gigawatt buildout: AI-RTZ #481 — Power Plays Begin to Build Gigawatt.

MP Take: “The big tech companies have a lot of teething pains ahead as they truly scale to multi-gigawatt data centers. These massive computers need constant, low-latency communications between tens of thousands of GPUs running in parallel — and the failure modes get harder, not easier, as the cluster scales.

The workload is rapidly evolving — hardware and software combinations have to keep up with fast-changing customer needs at both the consumer and enterprise levels. xAI is the early field test of what the rest of the cohort is about to encounter at the gigawatt scale they’ve all committed to. Elon Musk’s team tends to go around AND through obstacles, and they’re finding that this work is as much art as engineering, with hidden costs that get baked in when these builds run on a sustained basis.”

(3) Local Community and Regulatory Pushback to AI Data Centers. The third axis of friction is under-reported but increasingly material — local community and state-level regulatory opposition to AI data center siting. Power pricing, water draw, and resource crowding are translating into county-board hearings, zoning fights, and state-level legislative pushback that federal AI policy can’t override. The reporting: NY Times — AI data centers face local opposition as construction accelerates. Standing thesis on AI’s need to play nicer with neighbors: AI-RTZ #503 — AI Needs to Play Nice With Others.

MP Take: “Local community and regulatory pushback against AI data centers is on the rise, driven by a variety of factors — ranging from local power pricing to resource crowding on water and grid capacity. Despite attempts to govern these uses from a federal perspective, state and local political realities will continue to be a strong headwind for AI data centers through the decade.

This is not a phase. The siting fight is the new permitting fight — and just like the older permitting fight on roads, refineries, and pipelines, it gets resolved at the county and state level on a case-by-case basis. National AI policy can’t pre-empt municipal water rights or state utility commissions. Build cycles get longer because of this; expect that to be the story for years.”

Plus: Gadget AI — Operating Systems and Software Integrations Need to Change for AI Agents. Our computer and smartphone operating systems, plus the broader software-integration layer (including emerging standards like MCP), will need to evolve rapidly to host AI Agents at scale. The current OS surfaces were designed for human users, not autonomous-agent users. Source: Axios — Agents, AI software, model context protocol. Standing thesis on AI Agents needing their own internet: AI-RTZ #1023 — AI Agents Increasingly Need an Agent-Native Internet. The daily-use anchor: AI-RTZ #1054 — Working Out Daily With AI and AI Agents.

MP Take: “This is a multi-year effort to evolve operating systems and an internet designed for humans to accommodate AI Agents at scale. The current desktop OS, mobile OS, and web stack assume human users with hands, eyes, and attention. Agents have none of those. They need machine-readable affordances, structured tool calls, deterministic protocols, and reliable identity.

MCP and the agent-native internet ideas point in the right direction, but it will take a lot longer than currently assumed. The internet of the last 25 years was designed and built for humans. Most websites assume any bot knocking on the door is malware — so they don’t let it in. Anthropic with Claude Code, OpenAI with Codex, and the MCP protocol are doing herculean things despite today’s operating systems, not driven by them.

Expect this layer to be a decade-long re-platforming, not a 12-month sprint — and expect the early years to feel like the late-1990s web before standards settled. Hybrid is the shape of the AI era (per yesterday’s Ep 68 Gadget AI take on Blackberry QNX) — and the OS / agent-protocol layer is where that hybrid stitching happens.”
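To make the “structured tool calls” idea above concrete: here is a minimal sketch of how an AI Agent frames a tool call under MCP, which uses JSON-RPC 2.0 message framing per the public MCP specification. The tool name `get_power_quote` and its arguments are invented purely for illustration — they are not part of any real MCP server.

```python
import json

def make_tool_call(call_id: int, tool_name: str, arguments: dict) -> str:
    """Frame an MCP-style tools/call request as a JSON-RPC 2.0 message."""
    request = {
        "jsonrpc": "2.0",
        "id": call_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)

# A hypothetical agent asking a (made-up) utility-data tool for grid pricing:
msg = make_tool_call(1, "get_power_quote", {"region": "ERCOT", "horizon_days": 30})
parsed = json.loads(msg)
print(parsed["method"])          # → tools/call
print(parsed["params"]["name"])  # → get_power_quote
```

The point of the sketch: unlike a human clicking a web page, the agent gets a deterministic, machine-readable envelope — named method, typed arguments, a correlation id — which is exactly the affordance today’s human-oriented OS and web surfaces lack.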

Bonus — today’s AI-RTZ companion #1077 covers ‘AI Godfather’ Yann LeCun’s common-sense take on AI thus far — a deeper read on LeCun’s framing of where current LLM approaches hit ceilings, and what a more grounded, embodied, common-sense AI architecture might look like as a next-generation alternative.

Closing Questions —

  • MP’s negative AI Agent surprise of the week? Claude Cowork still forgets its core instructions on various projects day to day — despite rigorous efforts to have it memorize and record those instructions in its files. Even with standing playbook files, project-specific context, and per-session pre-flight reads built into the workflow, the system reliably drops core context at predictable points and has to be re-prompted. It’s like having a human assistant or intern who shows up as a blank slate every day. The memory architecture isn’t there yet. These tools are very, very new — Cowork didn’t exist a few months ago. So this is not a criticism, just a status report. Lot more work to do to make this stuff daily reliable for regular users.

  • MP’s positive AI Agent surprise of the week? MP’s getting a lot of utility out of the FREE tiers of Google Gemini, OpenAI ChatGPT, and Perplexity. A couple of friends and relatives asked: “I don’t want to spend $200 a month, what can I do?” So MP took some computers, skipped the top-tier logins, and ran the free tiers. All three are relatively useful for a lot of the basic things people might want to do. By contrast, Anthropic Claude Cowork still needs MORE paid compute credits to handle MP’s daily workflow — the per-token usage profile of Cowork is meaningfully heavier than the chat-style alternatives. Anthropic has even been testing potentially taking Cowork out of the $20 tier entirely. So that may be the one you have to pay for; the others are surprisingly capable on the free tier. Stay tuned.


(NOTE: The discussions here are for information purposes only, and not meant as investment advice at any time. Thanks for joining us here.)


Clips from today’s episode

Short — AI Agents Want Your Steering Wheel. Early adopters of AI agents are finding they need a separate computer for AI agent work versus their own work — because the two clash. It’s like two drivers trying to grab the same steering wheel in a car. Not a long-term workable solution. AI agents need a lot more control of your device. Operating systems need to be redesigned for AI Agents at the device level — a multi-year effort that will take longer than currently assumed.

Short — Today’s Internet Was Built for Humans. The internet of the last 25 years was designed and built for humans. Most websites assume any bot knocking on the door is malware. Anthropic with Claude Code, OpenAI with Codex, and the MCP protocol are doing herculean things despite today’s operating systems — not driven by them. Operating systems need to be redone aggressively for AI Agents: Apple, Microsoft Windows, Android — plus new players. OpenAI may build its own AI smartphone to get around these issues. Hybrid is the shape of the AI era; the OS layer is where the hybrid stitching happens.

Short — AI Data Centers Face Growing Local Opposition. AI data centers are facing accelerating local opposition for both construction and operation — driven by noise, pollution, water draw, power pricing, and resource crowding. Federal AI policy is pro-industry, but state and local regulators are getting tougher. The siting fight is the new permitting fight — county and state level, case-by-case, just like roads, refineries, and pipelines. National AI policy can’t pre-empt municipal water rights or state utility commissions.

Short — AI Agents: Blank Slate Every Day. Claude Cowork is helpful — saves 2-3 hours a day. But every session, the software needs to be reminded of preferences, projects, and how MP likes to do specific things. It’s like having an intern who shows up as a blank slate every day. They’re getting better, but you have to remind that intern every day. The memory architecture isn’t there yet — these tools are very, very new. Lot more work to do to make this stuff daily reliable for regular users.

Ep 69 scope: 1 Main + 4 Shorts — MP’s pre-recording scope decision. No Segments or Hooks (matches Ep 62/63/65/66/67/68 4-clip pattern).


About AI Ramblings Daily (ARD), and AI-RTZ

Both are daily. Both are free. Both are about AI. But they’re different mediums carrying different messages.

AI-RTZ is the morning text — a deeper written take on one idea, published by at least 5 AM EST. Today: post #1077 — Yann LeCun’s common-sense take on AI thus far.

AI Ramblings Daily is the afternoon video + podcast — my ad hoc takes and perspective on the day’s AI issues & news flow, around 16 minutes, with short 1-2 minute clips for quick topic views. Today: episode #69.

Subscribe to either or both on michaelparekh.substack.com. They run as separate Sections you can opt into or out of.


