Image: The Tennessee Valley Authority’s Bull Run coal-fired power plant (Alamy)

Power Is Key to Unlocking AI Data Center Growth

To capitalize on the AI boom, solving the data center power puzzle is essential, says industry expert Ali Fenn.

The Beatles told the world, “Love is all you need.” But according to Ali Fenn, president of Lancium, power is all you need – at least when it comes to AI in the data center.

At this year’s Data Center World in Washington, DC, Fenn outlined how AI is moving faster than anyone could have imagined. Its demand for compute and new data center capacity is unprecedented.

“Every current model is probably underestimating what is ahead,” she said. “AI’s transformational potential across every aspect of society is fundamentally governed by only one thing: power.” 

Cracking the Power Puzzle

Those wishing to capitalize on the AI boom need to figure out where the power is, how they can gain access to it, and where they can most rapidly build the data center capacity AI needs – all while containing costs.

Some predict that AI will be humankind’s downfall; Fenn views it as a major cause for optimism. It may help researchers find a cure for cancer, eliminate poverty, do away with the drudgery of manual and repetitive tasks, and give people more of their time back. She cited statistics such as AI representing up to $38 trillion a year in climate change spending by 2050.

“There is no AI without data centers,” said Fenn. “This is our moment, our responsibility, our opportunity.”  

AI is already powering major data center advances. Meta’s AI data center work is predicted to reduce overall costs by 33%. Google’s DeepMind AI cut cooling costs by 40%. Microsoft has introduced AI into its platforms to generate security alerts and improve safety. 

“By 2025, half of cloud data centers will deploy advanced robots with AI and machine learning capabilities, resulting in 30% higher operating efficiency,” said Gartner analyst Sid Nag.

Lancium president Ali Fenn at Data Center World 2024 (Image: Data Center World / AFCOM)

The AI Power Era

Fenn believes AI transitioned from being all about software and code in 2021 to adding GPUs in 2022 and finding enough data center capacity in 2023. This year, the equation has shifted to gaining access to enough power to run AI applications and large language models (LLMs).

“The chief obstruction to data center growth is not the availability of land, infrastructure, or talent; it’s local power,” said Pat Lynch, executive managing director and global head of Data Center Solutions for commercial real estate firm CBRE.

Fenn noted that it took 30 years for U.S. data center power demand to reach 17 GW. Even without AI, that figure was predicted to double by the end of the decade. With AI in the picture, estimates for data center power demand by 2030 now range from 50 GW to as high as 80 GW.

“To put those numbers in perspective, 1 GW is enough power for the homes of a million people and 100 GW is enough for 10% of global lighting,” said Fenn. “We are witnessing the largest capital infusion in history into AI.”   

By 2025, generative AI (GenAI) applications such as ChatGPT are projected to consume the equivalent of roughly 75% of the power that was available to U.S. data centers in 2022. By 2027, GenAI power demand is anticipated to nearly double, far outstripping the total power available in 2022.

The simple fact is that power consumption will far outpace any efficiency or modernization gains. As a result, data center power consumption will grow from around 2% of U.S. electricity today to 7.5% by 2030. Analysts and forecasters can’t keep up. Whatever numbers they set for growth become gross underestimates within a few months. No wonder we are seeing so much investment in new data center facilities, campuses, and new land for future sites.
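As a rough sanity check, those projections can be turned into implied growth rates. The short Python sketch below is illustrative only; it assumes a baseline of about 17 GW of U.S. data center demand today and a six-year run-up to the 50-80 GW range cited above.

```python
# Rough sanity check on the demand projections cited above.
# Assumptions: ~17 GW of U.S. data center demand today and a target of
# 50-80 GW by 2030, roughly a six-year horizon.

baseline_gw = 17.0
horizon_years = 6

for target_gw in (50.0, 80.0):
    # Compound annual growth rate implied by hitting the target in time.
    cagr = (target_gw / baseline_gw) ** (1 / horizon_years) - 1
    print(f"{baseline_gw:.0f} GW -> {target_gw:.0f} GW by 2030 "
          f"implies ~{cagr:.0%} growth per year")

# The share-of-grid figures tell the same story: moving from ~2% to 7.5%
# of U.S. electricity is roughly a 3.75x increase in data center load,
# assuming total U.S. consumption stays broadly flat.
print(f"Share growth factor: {7.5 / 2.0:.2f}x")
```

Under those assumptions, the projections imply data center power demand compounding at roughly 20-30% per year through 2030.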

The Grid of the Future

The current grid model has zero chance of coping with AI build-out. The available power capacity in the top five data center markets between now and 2027 is only 4 GW. Yet as much as 50 GW may be needed within a few years. That’s almost as much as is currently consumed by the entirety of the Mid-Atlantic and New England states.

If that wasn’t a big enough problem, the pace of the grid is at odds with AI expansion. Securing power for a major project can take six or seven years and thousands of approvals. To make matters worse, the transition to renewable energy means roughly 60 GW of traditional generating capacity is forecast to come offline in the Northeast by 2030, with no replacement in sight.

“The grid of the future must be very different,” said Fenn. “The major problem is peak demand, which we need to solve in new ways.” 

Instead of flexibility coming only from the supply side (as when ‘peaker’ plants are brought online in the evening or to cover sudden shortfalls), flexibility needs to exist on both the supply and demand sides. Fenn expects demand response initiatives, in which data centers and other large users agree to come off the grid and rely on onsite or backup power for certain periods, to increase 20-fold by 2050. That alone, she said, could account for more than 2,000 GW. Early signs are already visible, with hyperscalers like Google moving workloads around the globe to optimize for supply, demand, and power cost.
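To make that demand-side flexibility concrete, here is a minimal sketch of the decision a flexible campus might make each settlement interval. The class names, thresholds, and signals are hypothetical illustrations, not any operator’s actual program.

```python
# Minimal sketch of a demand-response decision for a flexible data center
# campus. All names and numbers are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class GridSignal:
    price_usd_per_mwh: float   # real-time wholesale price
    grid_stress: float         # 0.0 (slack) to 1.0 (emergency)

@dataclass
class Campus:
    flexible_load_mw: float    # compute that can be paused or shifted
    onsite_backup_mw: float    # generation/storage available on site

def curtailment_mw(signal: GridSignal, campus: Campus,
                   price_threshold: float = 200.0,
                   stress_threshold: float = 0.8) -> float:
    """How many MW the campus offers to take off the grid this interval."""
    if (signal.price_usd_per_mwh < price_threshold
            and signal.grid_stress < stress_threshold):
        return 0.0  # normal conditions: keep drawing from the grid
    # Under stress or high prices: pause flexible compute and move what
    # the on-site backup can carry off the grid.
    return campus.flexible_load_mw + campus.onsite_backup_mw

offer = curtailment_mw(GridSignal(price_usd_per_mwh=450.0, grid_stress=0.9),
                       Campus(flexible_load_mw=120.0, onsite_backup_mw=30.0))
print(f"Offer to curtail {offer:.0f} MW this interval")  # -> 150 MW
```

In practice, demand-response commitments are negotiated with the utility or grid operator in advance; the point of the sketch is only that the trigger logic is straightforward once price and stress signals are available.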

The Data Center of the Future

Along with power capacity, the size of new data centers is expected to soar.

“Traditional data centers will not go away and demand for their services is likely to remain,” said Fenn. “But we might soon see data centers of 1 or 2 GW and perhaps even larger.”

These new mega-data centers will be located on vast campuses, sited first on where enough power is available and then on proximity to networking hubs and load centers – or on transmission infrastructure strong enough to bring power to where it is needed.

Data centers of such dimensions call for a shift in how they are managed: they will likely need to be treated more like power plants and to work in close coordination with utilities and grid operators. Fenn sees Texas as a likely destination for many of these new sites.

“We need a predictive and responsive grid that takes into account weather forecasting, pricing, grid congestion, load, and dynamic orchestration,” said Fenn. “Achieving this will require innovation on all fronts as well as building lots of new gas peaking generators, nuclear, and other sources of energy.”

While much of the discussion surrounding AI and data centers focuses on power usage and capacity constraints, data center operators are themselves using AI systems to help optimize their infrastructure operations.

In a previous interview with Data Center Knowledge, Fenn noted that one of the most significant opportunities for AI in the data center industry lies at the intersection of data centers and the grid.

“The rapid growth of data center demand and the emergence of large-scale, gigawatt-scale data centers presents new challenges for grid operators,” she explained. “At Lancium, we focus on developing AI-driven power orchestration and optimization technology to provide grid reliability and ensure workload reliability for both data centers and their customers, with SLAs that prioritize both reliability and carbon-free energy.”
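Lancium has not published how its orchestration technology works, but the general idea of placing flexible work into the cheapest and cleanest hours while still meeting a deadline can be sketched roughly as follows; every forecast, weight, and parameter below is made up for illustration.

```python
# Hypothetical sketch of deadline-aware workload placement: run a flexible job
# in the hours with the lowest blended "cost" (price plus carbon intensity)
# while still finishing before its SLA deadline. Not Lancium's actual method.

def schedule_hours(price, carbon, hours_needed, deadline_hour, carbon_weight=0.5):
    """Pick which hour indices to run in, cheapest/cleanest first."""
    candidates = range(deadline_hour)  # only hours before the SLA deadline
    # A real system would normalize price and carbon onto a common scale;
    # this toy simply blends the raw numbers.
    score = {h: (1 - carbon_weight) * price[h] + carbon_weight * carbon[h]
             for h in candidates}
    chosen = sorted(candidates, key=score.get)[:hours_needed]
    if len(chosen) < hours_needed:
        raise ValueError("SLA cannot be met before the deadline")
    return sorted(chosen)

# Made-up 8-hour forecasts: wholesale price ($/MWh) and carbon intensity (gCO2/kWh).
price  = [42, 38, 35, 90, 140, 120, 60, 45]
carbon = [300, 250, 200, 400, 450, 420, 320, 280]

print(schedule_hours(price, carbon, hours_needed=3, deadline_hour=8))
# -> [1, 2, 7]: cheap, relatively clean hours that still meet the deadline
```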

Shaolei Ren, Associate Professor of Electrical and Computer Engineering at the University of California, Riverside, shared this optimism about the potential of AI to improve energy efficiency. “AI can offer more precise configuration of the cooling system operation based on real-time demand, and AI can also help predict the power usage effectiveness,” Ren said.
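As a simplified illustration of the kind of model Ren describes, one could fit a regression that predicts PUE from operating conditions and then use it to compare candidate cooling setpoints. The data below are synthetic and the model is deliberately minimal.

```python
# Toy illustration of PUE prediction from operating conditions, in the spirit
# of the approach Ren describes. The data are synthetic; real systems use far
# richer telemetry (flow rates, humidity, per-rack sensors, and so on).

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic history: IT load (MW), outdoor temperature (C), cooling setpoint (C).
n = 500
features = np.column_stack([
    rng.uniform(5, 30, n),     # IT load
    rng.uniform(-5, 40, n),    # outdoor temperature
    rng.uniform(7, 18, n),     # chilled-water setpoint
])
# Fabricated relationship: hotter weather and lower setpoints worsen PUE.
pue = (1.15 + 0.004 * features[:, 1] - 0.003 * features[:, 2]
       + rng.normal(0, 0.01, n))

model = LinearRegression().fit(features, pue)

# Compare candidate setpoints for a forecast day: 20 MW IT load, 32 C outside.
for setpoint in (10, 14, 18):
    predicted = model.predict([[20.0, 32.0, setpoint]])[0]
    print(f"setpoint {setpoint} C -> predicted PUE {predicted:.3f}")
```

A production system would pair such predictions with hard safety constraints (minimum supply air temperature, for example) before acting on them.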

