
What Does the End of Moore's Law Mean for the Data Center Industry?

Moore's Law may not be dead, but it is dying. Here's how the slowdown in the growth of computing power could impact the data center industry.

In case you missed it, Moore's Law — which states that computing power will steadily increase over time — is dead, or, at best, is slowly dying. Computer chips are no longer gaining processing capacity as rapidly as they did in past decades.

What does this change mean for data centers? Quite a bit, potentially. Keep reading for a look at how the slowdown in computing power growth could impact the data center industry.

What Is Moore's Law, and Why Is It Dead?

Moore's Law, named for Intel co-founder Gordon Moore, who posited the concept in 1965, is the principle that the number of transistors that engineers can fit inside computer chips doubles roughly every two years. By extension, the computing power of the average chip should increase at the same rate, and the cost that businesses pay for processing power should come down.
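
To make the math concrete, here is a minimal sketch, in Python and with made-up numbers, of what that doubling assumption implies over a decade:

```python
# Minimal sketch of the doubling assumption behind Moore's Law.
# The starting transistor count is hypothetical, not a real chip's figure.

def projected_transistors(start_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count assuming a doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# A hypothetical 1-billion-transistor chip, projected 10 years out:
print(f"{projected_transistors(1e9, 10):,.0f}")  # about 32 billion, i.e., roughly 32x in a decade
```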

For decades, Moore's prediction proved largely accurate. Computing capacity increased at roughly the rate he anticipated.

But that's no longer true. Although it may be too early to say that Moore's Law is definitely dead, there is reason to believe that we've reached the physical limitations of silicon-based CPUs. Without a practical alternative, engineers can no longer increase the computing power of chips as rapidly or as cheaply as they did in years past.

It's certainly possible that smart people will find ways to work around the current limitations of silicon, or that quantum computing will finally become practical and change the game entirely. For now, though, the data shows that the rate of increase in processing power is slowing, with no clear sign that the trend will reverse anytime soon.

Moore's Law and Data Centers

The fact that CPU capacity is not growing as quickly could have several profound implications for data centers.

More data centers

Perhaps the most obvious is that we're likely to see more data centers being built.

That would probably happen even if Moore's Law held true, of course. Demand for digital services has long outstripped increases in processing power, which means companies have had to increase the footprint of their IT infrastructures even though the per-server processing power of those infrastructures was increasing.

But in a post-Moore's Law world, we'll need even more data centers. If servers cease to grow more powerful year over year, the only way to meet increases in user demand will be to deploy more servers, which means building more data centers.
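
As a back-of-envelope illustration (every growth rate and fleet size below is an assumption chosen for illustration, not an industry figure), here is how quickly the required server count diverges once per-server capacity stops improving:

```python
# Back-of-envelope comparison: servers needed to meet growing demand
# when per-server capacity keeps improving vs. when it stays flat.
# All numbers here are assumptions for illustration only.

starting_servers = 10_000     # hypothetical fleet size today
demand_growth = 1.25          # assume demand grows 25% per year
capacity_growth_moore = 1.40  # assume per-server capacity keeps improving ~40% per year
capacity_growth_flat = 1.00   # per-server capacity stops improving

demand = capacity_moore = capacity_flat = 1.0
for year in range(1, 6):
    demand *= demand_growth
    capacity_moore *= capacity_growth_moore
    capacity_flat *= capacity_growth_flat
    print(f"Year {year}: "
          f"{starting_servers * demand / capacity_moore:,.0f} servers if capacity keeps growing, "
          f"{starting_servers * demand / capacity_flat:,.0f} if it stays flat")
```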

Data center sustainability challenges

An increase in the total number of data centers will exacerbate existing challenges around data center sustainability. More servers translate into higher total energy consumption, especially if performance per watt stops improving along with transistor counts.

As a result, data center providers that can offer clean energy sourcing will likely become even more appealing. So will next-generation data center technologies, such as immersion cooling, that can reduce the carbon footprint of facilities.

More companies enter the chip market

For decades, a relatively small number of vendors — namely, Intel and AMD — have dominated the market for the computer chips that go into commodity servers. These companies could deliver steadily increasing processing power, which gave other businesses little incentive to get into the chip-making game.

But that has changed in recent years as companies like AWS have begun building their own chips, and the end of Moore's Law is likely to push such businesses to invest even more in CPU technology. They will be looking for better ways to squeeze efficiency out of chips, especially for the specific use cases in which they deploy them.

In other words, in a world where generic CPUs are no longer becoming more powerful and less expensive by the year, companies have greater incentive to create their own special-purpose CPUs optimized for the use cases that matter most to them.

Workload optimization grows in importance

Reducing the CPU consumption of workloads has always been a smart way for companies to cut hosting costs. But in a post-Moore's Law world, workload optimization will become even more crucial.

This means we're likely to see more workloads move to containers, for example. The FinOps and cloud cost optimization market will probably boom, too, as more and more businesses look for strategies to maximize the efficiency of their workloads.
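
As a simplified sketch of the kind of arithmetic behind those optimization efforts (the prices, utilization figures, and fleet size below are invented for illustration), consider a basic right-sizing estimate:

```python
# Hypothetical right-sizing estimate of the sort used in FinOps reviews.
# The costs, utilization levels, and fleet size are invented for illustration.

monthly_cost_per_server = 500.0  # assumed fully loaded cost per server, in dollars
fleet_size = 200                 # assumed number of servers running a workload
current_utilization = 0.30       # assumed average CPU utilization today
target_utilization = 0.60        # target after consolidating workloads (e.g., into containers)

right_sized_fleet = round(fleet_size * current_utilization / target_utilization)
monthly_savings = (fleet_size - right_sized_fleet) * monthly_cost_per_server

print(f"Right-sized fleet: {right_sized_fleet} servers")
print(f"Estimated monthly savings: ${monthly_savings:,.0f}")
```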

Conclusion

The data center industry grew up in a world where computer chips were always growing in power and coming down in cost. That world is fading. We're living in the post-Moore's Law age, or close to it.

The result is likely to be more data centers, more special-purpose CPUs, and greater pressure on businesses to optimize their workloads. Data center providers and their customers will need to adapt, or else cross their fingers that the quantum revolution finally happens and makes computing power ridiculously cheap. That's probably not a winning strategy.
