Data Centers to Run Out of Power in Two Years, Says DigitalBridge CEO

'We're kind of running out of power in the next 18 to 24 months,' warned Marc Ganzi of DigitalBridge. How data center operators might address this power shortage remains unclear.

Light Reading

May 8, 2024


This article originally appeared in Light Reading.

It's no secret that data centers – including those running new AI programs – consume lots and lots of electricity. But according to the CEO of digital infrastructure company DigitalBridge, the situation is much more dire than most in the industry believe.

"Power is really the constraining factor. And that's going to become more evident to you and to the rest of the investor community over the next two years," said Marc Ganzi, CEO of DigitalBridge, a company that builds data centers as well as small cells, cell towers, fiber networks and other such infrastructure. Ganzi was speaking during DigitalBridge's recent quarterly conference call, according to Seeking Alpha.

Continued Ganzi: "We started talking about this over two years ago at the Berlin Infrastructure Conference when I told the investor world, we're running out of power in five years. Well, I was wrong about that. We're kind of running out of power in the next 18 to 24 months."

Of course, Ganzi isn't the only one sounding the alarm. "Amid explosive demand, America is running out of power," reads a recent Washington Post headline. "A.I. Frenzy Complicates Efforts to Keep Power-Hungry Data Sites Green," reads one from the New York Times.

Questing for power


"The problem has been known for a very long time," Denise Lee, VP of Cisco's sustainability engineering office, told Light Reading in a recent interview.

But Lee said that, now, two major trends are getting ready to crash into each other: Cutting-edge AI is supercharging demand for power-hungry data center processing, while slow-moving power utilities are struggling to keep up with demand amid outdated technologies and voluminous regulations.

According to the financial analysts at TD Cowen, the situation is becoming acute.

"Our checks indicate that the minimum lead time to get data center power in most major US markets is +3 years," they wrote in a February report.

Specifically, they wrote that it can take up to two-and-a-half years in Dallas to obtain permits for the power necessary to run a new data center. In Atlanta that's up to six years. And in Silicon Valley, it can take up to seven years.

But it's even worse in Europe, the TD Cowen analysts warned. Lead times are now up to eight years in top markets like Frankfurt, London, Amsterdam, Paris and Dublin.

"This represents an incremental elongation vs. lead times seen a few months ago, a trend which we expect to continue," the TD Cowen analysts wrote.

AI demand

A new report from the International Energy Agency (IEA) found that the 460 terawatt-hours (TWh) consumed by data centers in 2022 represented 2% of all global electricity usage. Much of that was driven by computing and cooling functions within data centers.


The report also predicted that data center electricity usage will double by 2026. It blamed the rise of power-intensive workloads such as AI and cryptocurrency mining.
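As a rough illustration of the scale those figures imply, here is some back-of-the-envelope arithmetic based only on the numbers cited above; it is not a calculation from the IEA report itself.

```python
# Illustrative arithmetic using the IEA figures cited in this article.
data_center_use_2022_twh = 460   # IEA estimate for data centers in 2022
share_of_global = 0.02           # roughly 2% of global electricity use

# Implied global electricity consumption in 2022
implied_global_twh = data_center_use_2022_twh / share_of_global
print(f"Implied global electricity use, 2022: ~{implied_global_twh:,.0f} TWh")

# If data center usage roughly doubles by 2026, as the IEA projects
projected_2026_twh = data_center_use_2022_twh * 2
print(f"Projected data center use, 2026: ~{projected_2026_twh:,.0f} TWh")
```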

AI systems generally run on graphics processing units (GPUs). Those demand more power than traditional central processing units (CPUs), but they also deliver far more computation.

The IEA report isn't the only one forecasting the power demands of AI. For example, the Uptime Institute predicts AI will account for 10% of the data center industry's global power use by 2025 – up from 2% today, according to the New York Times.

A range of solutions

According to Lee, the Cisco executive, data center operators are working to address the situation using a variety of strategies. For example, she said that some data center operators are locating their systems near natural gas or hydropower sources. Amazon put one of its data centers next to a nuclear power plant, she said.

Others are repurposing decommissioned Navy ships so they can use liquid cooling, which lowers the cost of cooling their computers.

"There's no one size that fits all," she said.

"A big piece of the power puzzle centers around renewables," argued Ganzi, the DigitalBridge executive. 

He said Switch, one of DigitalBridge's data center companies, is mostly using electricity generated by wind and solar. Another DigitalBridge company, in Brazil, is using hydropower.

The fiber

The data centers using all that power are also prime customers for fiber networks, according to those in the industry. After all, that's the primary technology for moving data – including AI computations – into and out of a data center.

"We expect that our recent wins for AI data centers will translate into orders and sales during the year," said Wendell Weeks, CEO of fiber provider Corning, during his company's recent quarterly conference call, according to Seeking Alpha.

And according to the financial analysts at TD Cowen, fiber network operators are bulking up their orders to address AI traffic around data centers.

"For example, demand for 144 [fiber] strands (from typical 8-12 strand orders in the past) and 400 gig circuits are becoming far more prevalent as it seems customers are buying network first and asking questions later," wrote the TD Cowen analysts in February of their meetings at the Metro Connect fiber trade show in Florida.

According to Ganzi, most current AI training operations do not need a speedy, low-latency connection between a user and a data center. But that may change as AI operations shift to an "inference" model. Under that scenario, data centers will need to focus on the speedy delivery of AI services to nearby end users.

"The whole fiber industry in general is going to need more new routes, low latency routes, and of course heavy strand count. And that's the way you bridge the gap in terms of creating low-latency environments for AI workloads," Ganzi said.
