Gartner: Capacity Concerns Will Make DCIM a $1 Billion Market
December 12th, 2013 By: Colleen Miller
LAS VEGAS - Roughly 36 percent of the audience in a Gartner conference session said they were likely to significantly implement Data Center Infrastructure Management (DCIM) tools in their data centers in 2014, and another 25 percent predicted they would implement them within the next two years. Gartner analysts Federico de Silva Leon and Jay Pultz said their research indicates that by 2017, DCIM will be deployed in 60 percent of larger data centers (more than 3,000 square feet) in North America.
Those projections are why Gartner believes that the market for DCIM tools is now “north of $1 billion.” It’s also why the sector has attracted dozens of vendors, providing end users with a wide array of options for gaining greater control over their data center environments.
What can DCIM tools do?
So let’s look at DCIM tools and what they can do for data center management. Using a DCIM tool, a team can optimize power, cooling and physical space in the data hall, deferring capital expense because new space doesn’t need to be built as quickly. The tools can model “what if” scenarios, such as the placement of additional racks and gear, letting managers see the impact before actually experiencing it. They can enhance asset management, improving knowledge of where servers and storage equipment are located and how they interrelate. And finally, and maybe most importantly, DCIM can produce energy savings. Gartner analyst Pultz said that DCIM tools can pay for themselves within three years on energy savings alone.
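To make the three-year payback claim concrete, here is a minimal sketch of the arithmetic behind it. All of the figures (DCIM cost, facility consumption, utility rate, savings fraction) are illustrative assumptions, not Gartner data, and the `payback_years` helper is hypothetical.

```python
# Hypothetical payback estimate for a DCIM deployment, illustrating the
# "pays for itself within three years on energy savings alone" idea.
# Every number below is an assumption for the example, not Gartner data.

def payback_years(dcim_cost, annual_kwh, kwh_rate, savings_fraction):
    """Years until cumulative energy savings cover the DCIM investment."""
    annual_savings = annual_kwh * kwh_rate * savings_fraction
    return dcim_cost / annual_savings

# Example: $250k DCIM spend, a 5 GWh/year facility, $0.10/kWh,
# and DCIM-driven efficiency measures trimming 20% of the energy bill.
years = payback_years(250_000, 5_000_000, 0.10, 0.20)
print(f"payback in {years:.1f} years")  # payback in 2.5 years
```

With those assumed inputs the investment is recovered in two and a half years, inside the three-year window Pultz described; a smaller savings fraction or cheaper power stretches the payback accordingly.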
However, capacity management is the most pressing concern of most data center professionals. What happens when the data center is out of power, cooling and space? “The issue is really about useable capacity,” said Pultz. “We get new equipment and say, ‘Here’s a place we can put this,’ but it’s about what’s the best place to put this new equipment.”
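Pultz’s question of “what’s the best place to put this new equipment” is, at its core, a constrained search over remaining rack capacity. The toy sketch below shows the idea; the rack data, field names, and headroom rule are made-up examples, not any DCIM vendor’s actual model.

```python
# Toy sketch of DCIM-style placement: among racks with enough rack units
# and spare power for the new gear, prefer the one that keeps the most
# power headroom after installation. Data and rules are illustrative only.

racks = [
    {"id": "A1", "free_u": 10, "spare_kw": 1.5},
    {"id": "B3", "free_u": 20, "spare_kw": 4.0},
    {"id": "C2", "free_u": 4,  "spare_kw": 6.0},
]

def best_rack(racks, need_u, need_kw):
    """Return the fitting rack that retains the most power headroom."""
    fits = [r for r in racks
            if r["free_u"] >= need_u and r["spare_kw"] >= need_kw]
    return max(fits, key=lambda r: r["spare_kw"] - need_kw, default=None)

# A new 6U server drawing 2.0 kW: A1 lacks power, C2 lacks space.
print(best_rack(racks, need_u=6, need_kw=2.0)["id"])  # B3
```

Real DCIM tools fold cooling, weight, network ports, and redundancy zones into the same kind of decision, which is why doing it by walking the floor with a clipboard does not scale.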
To address the challenge around capacity management, DCIM tools let you “see inside” the data space, without physically going there. The analysts showed a graphic from the iTRACS product, which includes 3-D visualization of the data center white space, showing all the gear and cabling, while visually displaying data about the health and capacity of all the systems.
The other focus of DCIM is the dashboard view of monitoring data about the data center, which can be very valuable, especially for sharing information with others such as C-suite executives. As an example, the analysts showed Schneider Electric’s dashboard.
Vendors in DCIM
The analysts reported that there are more than 70 companies in the DCIM space, including large, established companies such as Emerson and Schneider Electric, and others who have been in the game more than ten years, such as Nlyte Software. Each comes from a different angle, said Pultz, including those that focus on power, such as Power Assure and Cisco. Others they highlighted included CA Technologies, Intel (with a product called DCM, which enables DCIM in other vendors’ products), Siemens, Vigilent, Field View and Panduit. Sentilla was mentioned as a company that has withdrawn from the DCIM space.
How does one start a DCIM selection and deployment project?
The analysts recommended a number of questions to ask during your selection process.
- What do you want DCIM to do for you? This includes the prioritization of needs, and may be your most difficult task.
- Will it really work? Ask for proof of concept, vendor credentials. Do your due diligence.
- Will this product be a good match? What about size of deployment? Existing vendors?
To assist data center managers in selecting which tool to deploy (and invest in heavily), the Gartner analysts reported they will add a DCIM vendor “Magic Quadrant” research product to their portfolio in 2014. They noted that they had previously looked at the space but abandoned the research because the marketplace was unclear. With the market maturing, and a crowded and evolving provider landscape, Gartner now believes there is a strong need for guidance in the selection process.
I learned a long time ago when I was working with GE that ‘You can’t improve what you don’t measure’. Until that point, I had wondered why everyone I ran into measured EVERYTHING. It was to drive improvement and get groups focused on what was important. Really important. To do that, a baseline had to be established, and then a routine to measure against it.
Capacity management is cited as a big issue, and I will suggest that there is a capacity VISIBILITY issue that is driving the bus on this one. There have been a bunch of tools over the past 15 years, from something as crude as Visio to the integrated software from IO, and others like Field View. The DCIM market is as fractured as the data center market itself.
That said, plant a flag in the ground.
When we’re out of shape, we don’t start exercising by doing 100 push-ups or running a marathon our first time out (well, I didn’t anyway); it starts with a few push-ups and a brisk walk to the end of the driveway to get the mail. The point is, get moving. Eat the elephant in bites and start looking at what you have, so you know where to focus your energy on improving your own situation. Nice article to help us get off the couch.
Jim Leach Posted December 15th, 2013
Great article Colleen! DCIM was a big topic at the Gartner Data Center Conference and the analysts, end users, and vendors did a great job educating the audience.
RagingWire is active in the DCIM space. We use our N-Matrix DCIM system to run our data centers and provide the data to our colocation customers — full transparency. Partners are key to our success. We have 20+ systems integrated into N-Matrix, including: CA Technologies, Schneider Electric, Emerson, Power Assure, SynapSense, TrendPoint, and Canara.
Thanks for the great article Colleen.
Cost may be the most commonly cited con for DCIM solutions. Without a doubt, some DCIM solutions are very expensive, particularly when the cost of the initial data collection is added. Part of the misconception about the cost of a DCIM solution comes down to the difficulty of defining the return on investment.