Riccardo Badalone is co-founder and CEO of Diablo Technologies.
Ask the average CEO to explain why data centers are getting bigger and more expensive, and you’d likely get a lot of wrong answers before you got to the right one: DRAM.
System memory — “commodity” silicon that rarely gets any attention — is more often than not the hidden issue forcing companies to build new data centers, facilities that can cost $1 billion or more.
The reasons are subtle – though not to the data center architects. Fortunately, that issue is on the radar of enterprise hardware vendors, and there are potential solutions.
The problem starts with the fact that having vast amounts of system memory – generally DRAM – has become crucial to getting the necessary performance out of the new breed of enterprise and data center applications.
Big Data is called that for a reason. One large company I’ve talked with is scaling up to run a web application that contains 10 petabytes of data, and that data needs to remain in system memory at all times to be useful. Even with the fastest storage technologies, performance would degrade to unacceptable levels if the servers’ CPUs were continually waiting for data to move back and forth between storage and memory.
This is where DRAM economics enters the picture. Most people unfamiliar with the DRAM market would predict that if you plotted the price points of DRAM modules with 8, 16, 32 and 64 gigabytes of memory, you’d see a line angling slowly and gently upwards. What you actually get, though, is quite different. While the price actually declines from 8 GB to 16 GB per module, it increases sharply between there and 32 GB, and more sharply still from 32 GB to 64 GB. A 32 GB memory module is 2.5 times as expensive as one 16 GB module; a 64 GB module is 7 times as expensive.
The bottom line: DRAM technology is struggling to deliver the capacity and low-energy use that data centers need. As a result, system architects often fill up their motherboard memory module slots with lower-capacity, and thus less expensive, modules. The average data center server today contains less than 256 GB of system memory. It’s a reflection of the fact that a server fully loaded with 64 GB modules costs 17.5 times as much as one with 16 GB modules. With that kind of economics, it’s cheaper to simply add more servers, despite the added capital and operating expenses.
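The per-module multipliers above translate directly into per-gigabyte costs, which is what drives the slot-filling decision. A quick back-of-the-envelope sketch: the baseline price below is a hypothetical placeholder in arbitrary units; only the ratios (32 GB at 2.5x and 64 GB at 7x the 16 GB module price) come from the figures cited above.

```python
# Per-gigabyte DRAM cost derived from the cited module-price multipliers.
# BASELINE_16GB_PRICE is a hypothetical placeholder (arbitrary units);
# the 2.5x and 7.0x ratios are the ones quoted in the text.
BASELINE_16GB_PRICE = 100.0

module_prices = {
    16: BASELINE_16GB_PRICE,
    32: 2.5 * BASELINE_16GB_PRICE,
    64: 7.0 * BASELINE_16GB_PRICE,
}

for capacity_gb, price in sorted(module_prices.items()):
    per_gb = price / capacity_gb
    relative = per_gb / (BASELINE_16GB_PRICE / 16)
    print(f"{capacity_gb:>2} GB module: {per_gb:6.2f}/GB "
          f"({relative:.2f}x the 16 GB module's per-GB cost)")
```

On these assumed ratios, doubling module capacity from 16 GB to 32 GB raises the cost per gigabyte by 25 percent, and moving to 64 GB raises it by 75 percent — quadruple the capacity at seven times the module price. That premium is why architects reach for lower-capacity modules, and more servers, instead.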
The company I referenced earlier uses 100,000 servers to run its application with responsiveness that customers demand. In addition to a $300 million capital expenditure just for the servers, it pays substantial bills for both data center space and power.
What the industry needs is a way to cut costs while maintaining or improving quality of service. Adding more servers is a weak solution. Data center owners require vastly more system memory than today’s generations of machines can support; they need to fit more into their billion-dollar data centers while cutting their power bills. It’s a challenge enterprise vendors are tackling.
Intel and Micron, for example, cited the DRAM issue when they announced a new “3D XPoint” memory substrate that they say will be significantly denser and less expensive than current DRAM. Unfortunately, their technology won’t be ready for a number of years.
But data center architects have a problem right now, and there’s a solution that’s ready: flash memory. When most IT people think of flash, they think of solid-state drives replacing hard drives for data storage. For all its cost advantages, they say, flash is just too slow for system memory. But that’s no longer true. Now, with the right design, you can use flash as memory and still get web-scale performance.
You can currently buy a product that makes it possible to have four times the memory density of DRAM for a fraction of the cost — and other companies are likely taking their own approaches to the issue.
As an industry, it’s fair to say that until recently, we had “forgotten about memory” and the role it plays in making data centers an ever-bigger line item on income statements. The good news is that the vendors are now on the case. The problem will be solved, and every company that runs data centers will benefit.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.