
The Laws of Technology: Driving Demand in the Data Center

Data centers have an insatiable appetite for space, power and bandwidth. But why does demand seem to grow and grow rather than level off? Bob Landstrom of Interxion answers this question using the five laws that drive demand for data center services.

Industry Perspectives

November 4, 2014


Bob Landstrom is the Director of Product Management at Interxion.

What Moore, Kryder, Nielsen, Koomey and Jevons can teach you about the data center.

Data centers have a seemingly insatiable appetite for space, power and bandwidth. But why does this happen? Why does the demand seem to grow and grow, rather than level off? Let’s begin with the five laws that drive demand for data center services.

#1 Moore’s Law

In 1965, Gordon Moore, co-founder of Intel, wrote a paper predicting that the number of transistors manufactured onto a common semiconductor chip would double roughly every two years. This prediction, which has held true, has come to be known as Moore’s Law, and it describes the steady growth of computational capacity on a given amount of chip space.
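As a back-of-the-envelope illustration, compound doubling every two years can be sketched in a few lines of Python. The starting transistor count and time span below are illustrative assumptions for the sketch, not vendor data:

```python
# Illustrative sketch of Moore's Law: transistor count doubling
# roughly every two years. The starting figure is an assumption
# for the example, not an exact Intel specification.

def moores_law(start_count, years, doubling_period_years=2):
    """Project a transistor count after `years` of steady doubling."""
    return start_count * 2 ** (years / doubling_period_years)

# Assume a chip with ~2,300 transistors; after 40 years (20 doublings)
# the same trend projects a count in the billions.
print(f"{moores_law(2_300, years=40):,.0f}")
```

Twenty doublings turn a few thousand transistors into a few billion, which is roughly the scale of modern processors.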

For the data center, this ever-increasing density of data processing electronics translates into more heat produced by those electronics, which in turn drives demand for cooling resources.

#2 Kryder’s Law

Mark Kryder was a researcher and CTO at Seagate in 2005 when he was credited with the creation of Kryder’s Law: the observation that magnetic disk storage density roughly doubles every thirteen months. As disk storage density has approached the physical limits of magnetic media, new storage techniques have been introduced. These include solid-state storage (based on transistors, and therefore following Moore’s Law again), which opens the door to a new era of storage density increases for years to come.

Together, Moore’s Law and Kryder’s Law point toward greater and greater data processing capacity for a given physical footprint. For the data center, this is the continued trend of increasing power consumption and subsequent cooling demand of data processing equipment.

#3 Nielsen’s Law

Jakob Nielsen is a researcher in Web usability who, in the 1990s, gave his name to the observation that common high-end consumer network speeds double approximately every 21 months.
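To put that doubling period in perspective, here is a small sketch. The 21-month period is the only input taken from Nielsen’s observation; the 1,000x growth factor is an arbitrary example:

```python
import math

# Sketch of Nielsen's Law: high-end consumer bandwidth doubling
# roughly every 21 months.

def years_for_growth(factor, doubling_period_months=21):
    """Years until bandwidth grows by `factor` at a steady doubling rate."""
    doublings = math.log2(factor)
    return doublings * doubling_period_months / 12

# A 1,000x increase takes roughly ten doublings, i.e. about 17.4 years.
print(f"{years_for_growth(1_000):.1f} years")
```

Under this trend, the thousandfold bandwidth growth separating dial-up from gigabit fiber fits inside a couple of decades.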

The increasing consumption of content and data services by end users drives demand for the data processing capacity hosted in the data center. Nielsen’s Law reveals how computing delivered to the masses of end users scales: it connects people and things to a global ecosystem of data processing, and with that access comes demand, both from users and from the data processing systems serving them.

#4 Koomey’s Law

Jon Koomey is credited with documenting a trend of increased efficiency of data processing equipment. Koomey’s Law states that the number of computations per joule of energy dissipated doubles approximately every 18 months.

Some have pointed to Koomey’s Law as a reflection of natural energy efficiency as data processing technologies scale. While it certainly reflects computational efficiency improvements, much of the reduced energy consumption of servers comes from energy-specific features increasingly built in by original equipment manufacturers (OEMs), such as 80 PLUS-rated power supplies and clock control using dynamic voltage and frequency scaling.

Koomey’s Law is what unleashes the Internet of Things. The idea is that, for a fixed computing load, the amount of battery energy needed falls by half every 18 months. This enables wide-scale proliferation of mobile and miniaturized computing, sensing and telemetry applications, facilitating the Internet of Things, which in turn drives growth of data processing and content management in the data center.
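That halving schedule is easy to sketch. The workload’s energy budget below is an illustrative assumption, not a measured figure:

```python
# Sketch of Koomey's Law: computations per joule double roughly every
# 18 months, so the energy for a fixed computing load halves on the
# same schedule.

def energy_for_fixed_load(initial_joules, months, halving_period_months=18):
    """Energy needed to run the same workload after `months` have passed."""
    return initial_joules / 2 ** (months / halving_period_months)

# A sensor workload needing 100 J today would need about 25 J
# three years (two halvings) from now under this trend.
print(energy_for_fixed_load(100, 36))
```

The same arithmetic explains why a battery that once powered a device for a day can, a few years later, power the equivalent workload for most of a week.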

With all of these laws rooted in some way in the physical nature of data processing technology, we now come to perhaps the overriding factor driving growth of data center services.

#5 Jevons’ Paradox

William Stanley Jevons was a 19th-century English economist who worked long before the advent of data processing technology. Even so, he observed that making the use of a resource more efficient tends to increase total consumption of that resource, rather than satisfying demand.

As an example, notice that as devices become more computationally powerful, their price comes down. In the 1960s, the U.S. spent over $25 billion to land humans on the moon. Today, there is more computational power in a $200 smartphone than existed for the entire lunar program. We are achieving exponential steps in data processing capacity every 18 months, reducing the price of devices, and with each step we’re using it all and wanting even more.

Continuous growth, now and forever

These laws point to growth in your data center service needs as your business grows. You need more from your data center provider today than you did five years ago, and you will need even more in the future.

As well as providing a strong track record of high availability to minimize risk to your business, your data center provider should be positioned to support growth, both in your business’s data processing demands and in evolving data processing technology. That means a provider that understands how to operationally support contemporary data processing environments integrated with the cloud, with pervasive connectivity to your end users and customers.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

