Multi-Core Adds Complexity to Data Center


The technology industry’s shift to multi-core chips has elevated the importance of parallel processing, a trend that has broad implications for how data center operators and customers pay for hardware and code their applications. As IT departments focus on improving data center efficiency, the shift to multi-core processors is introducing a new layer of complexity, and many companies say they are not adequately prepared for the change.

That change is happening fast. The number of cores on new processor models is expected to double every 18 months, according to Carl Claunch, an analyst with Gartner, who said the trend is likely to hold through at least 2015. Speaking at the Gartner Data Center Conference in Las Vegas, Claunch said that performance gains now come not from faster clock speeds but from parallel processing, and software applications must change to take full advantage of each additional core.

But much like data center facilities, multi-core processors present a bevy of apples-to-oranges comparisons. Multi-core processors can run multiple processing threads at once, but not all chips manage threads in the same fashion, according to Claunch. More cores isn’t always better: each multi-core chip must be designed properly to scale and leverage the architecture. Some multi-core chips encounter bottlenecks as threads wait for other threads to finish, creating inefficiencies.
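The diminishing returns Claunch describes are commonly quantified with Amdahl's law (a standard result in the field, not something from the talk itself): if a fraction p of a program can run in parallel, the maximum speedup on n cores is 1 / ((1 - p) + p / n). A minimal sketch in Python:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Maximum speedup on n cores when fraction p of the work parallelizes.

    The serial fraction (1 - p) never speeds up, so it dominates as n grows.
    """
    return 1.0 / ((1.0 - p) + p / n)

if __name__ == "__main__":
    # Even with 90% parallel code, doubling cores from 4 to 8
    # yields well under a 2x gain:
    print(round(amdahl_speedup(0.9, 4), 2))  # 3.08
    print(round(amdahl_speedup(0.9, 8), 2))  # 4.71
```

This is why simply adding cores to a chip, or threads to an application, delivers less than linear gains once serial sections and inter-thread waiting enter the picture.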

“It’s very hard to look at two chips that are quad core and say that they’re comparable,” said Claunch. “When you compare threads, it’s even more difficult.” Variables include the number of cores, threads, arithmetic speeds and cache usage.

Multi-core processors also “break vendor licensing,” according to Claunch, who compared multi-core licensing practices to the variability and confusion seen in virtualization licensing. “When we see multi-cores and multiple threads, we see bizarre things happening with pricing,” said Claunch. “Trying to get something fair out of that can be challenging.”

Multi-cores and multiple threads are commonly used as justification for premium pricing, according to Claunch, who said most ISVs’ business models depend on getting more money from bigger companies. “The problem here is that no one has figured out a clean transition,” he said. “This is producing more pressure on how customers are charged.”

The toughest transition may be for software developers, who must master coding for multi-thread parallel processing for their applications to leverage the performance improvements made possible by multi-core processing.
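What that coding shift looks like in practice can be sketched with a small example (hypothetical, not from the talk): a CPU-bound task split across worker processes so each core handles a share of the input independently. Here in Python, using the standard library's concurrent.futures:

```python
from concurrent.futures import ProcessPoolExecutor

def is_prime(n: int) -> bool:
    """CPU-bound check used as a stand-in for real per-item work."""
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def count_primes_parallel(numbers, workers: int = 4) -> int:
    """Fan the checks out across worker processes, one pool per call.

    Each process runs on its own core; results are combined at the end.
    """
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(is_prime, numbers, chunksize=256))

if __name__ == "__main__":
    print(count_primes_parallel(range(100)))  # 25 primes below 100
```

The hard part Claunch alludes to is not the fan-out itself but restructuring applications so work items really are independent; shared state, ordering dependencies, and synchronization are what keep most code from scaling this cleanly.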

“We believe a minority of developers have the skills to write parallel code,” Claunch said. An audience poll at the Gartner Data Center Conference in Las Vegas in November found that just 17 percent of attendees felt their developers are prepared to code multi-core applications, compared to 64 percent who said they will need to train or hire developers for parallel processing.

About the Author

Rich Miller is the founder and editor-in-chief of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.