PRISM News
The Smartest AI Bet Might Have Nothing to Do With AI
TechAI Analysis

5 min read

Over $500 billion has poured into AI startups. But with 36% of data center projects slipping timelines due to power shortages, the real opportunity may lie in energy infrastructure — batteries, transformers, and grid software.

Venture capitalists have poured over $500 billion into AI startups over the past five years. Yet the infrastructure those models actually run on is quietly coming apart at the seams.

The Numbers That Should Worry Every AI Investor

Sightline Climate recently published a report that cuts through the hype with a single uncomfortable ratio: of the 190 gigawatts of data center projects currently being tracked, only 5 gigawatts are actually under construction. Last year, roughly 6 gigawatts came online — but 36% of projects in the database saw their timelines slip in 2025 alone.

The culprit isn't chip shortages or permitting delays. It's electricity. There simply isn't enough power to feed the data centers the industry has promised to build. Goldman Sachs projects that AI will drive data center power consumption up 175% by 2030. The U.S. grid, built for a different era, was not designed to absorb that kind of demand spike.

Those delays don't stay contained to the data center industry. They trickle downstream to every enterprise, startup, and developer that depends on cloud-based AI services. Slower buildout means tighter supply. Tighter supply means higher prices. Higher prices mean slower adoption — the very thing the AI industry is betting against.

Big Tech Is Building Its Own Power Plants

Faced with grid constraints, the largest tech companies have stopped waiting for utilities to catch up. Amazon, Google, Meta, and Oracle are all developing on-site or hybrid power strategies. Less than a quarter of projects that have identified a power source will use on-site or hybrid generation — but together, those projects represent 44% of total planned capacity. The biggest players are leading the shift.

Google's latest move is instructive. For a new data center in Minnesota, the company is blending wind and solar with a 30 gigawatt-hour long-duration battery from Form Energy — a system capable of storing power for up to 100 hours. Google also worked directly with utility Xcel Energy to design a new rate structure intended to accelerate the adoption of emerging energy technologies. This isn't just procurement. It's vertical integration into the energy stack.
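A quick back-of-envelope check makes the scale of that battery concrete. Using only the figures above (30 gigawatt-hours of storage, up to 100 hours of discharge), the implied sustained output is about 300 megawatts; the per-rack comparison below is purely illustrative, based on the 1-megawatt rack density discussed later in this article:

```python
# Back-of-envelope check on the Form Energy system described above.
# Inputs are the article's figures; the rack comparison is illustrative.

storage_gwh = 30.0      # total energy capacity, gigawatt-hours (from the article)
duration_hours = 100.0  # maximum discharge duration, hours (from the article)

# Average power the system can sustain across a full 100-hour discharge:
# convert GWh to MWh, then divide by hours.
sustained_mw = storage_gwh * 1000 / duration_hours
print(f"Sustained output: {sustained_mw:.0f} MW")  # 300 MW

# Illustrative only: at an assumed 1 MW per server rack (the density the
# article says racks are pushing toward), that backs ~300 racks for 100 hours.
assumed_rack_mw = 1.0
racks_supported = sustained_mw / assumed_rack_mw
print(f"Racks backed for 100 h at 1 MW each: {racks_supported:.0f}")
```

In other words, a single long-duration installation of this size can ride a meaningful slice of a data center campus through a multi-day wind and solar lull, which is the gap the 100-hour chemistry is designed to fill.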

Form Energy is now raising a $500 million round ahead of a planned IPO, riding the momentum of deals like this one. According to the U.S. Energy Information Administration, U.S. battery storage capacity should reach nearly 65 gigawatts by the end of this year — a market that barely existed five years ago.

The 140-Year-Old Technology Choking AI Growth

Batteries get the headlines, but there's a quieter bottleneck that rarely makes the news cycle: the transformer.

Most of today's power transformers use iron cores wrapped in copper wire — a design that is roughly 140 years old. It works. But as data center server racks push toward 1 megawatt of power density, the equipment needed to manage that power will occupy twice as much physical space as the rack itself. That's not a rounding error. It's a fundamental scaling problem.

A wave of startups is betting that silicon-based solid-state transformers can replace the old iron-and-copper technology. Amperesand, DG Matrix, and Heron Power are among those developing new power conversion hardware. They're more expensive upfront, but flexible enough to consolidate multiple pieces of data center equipment — potentially making them cost-competitive at scale. On the software side, companies like Camus, GridBeyond, and Texture are building systems to intelligently manage electron flow across facilities and grids.

The investment rounds in this space are smaller than the blockbuster AI fundraises dominating headlines. That's arguably a feature, not a bug. More tractable check sizes, clearer demand signals, and a customer base that includes not just AI companies but every industry electrifying its operations — from transportation to heavy manufacturing.

The Picks-and-Shovels Argument, Revisited

The Trump administration has added political pressure to an already urgent situation, urging tech companies to build their own power sources or pay premium grid rates. Most large players had already begun moving in that direction. But the policy signal accelerates timelines and, critically, validates the investment thesis for energy infrastructure plays.

There's a historical pattern worth considering here. During the California Gold Rush, the merchants who sold shovels often outlasted the miners who bought them. The AI equivalent of that trade might be batteries, transformers, and grid software — the unglamorous infrastructure that every AI application depends on, regardless of which model wins the benchmark wars.

The risk calculus is different too. An investment in a specific AI model is a bet on a particular winner in a fast-moving, winner-take-most competition. An investment in power infrastructure is a bet that AI — and electrification broadly — will keep growing. The second bet doesn't require picking a winner. It just requires the industry to keep building.

This content is AI-generated based on source articles. While we strive for accuracy, errors may occur. We recommend verifying with the original source.
