There’s a clear distinction between data center management software and a data center infrastructure management (DCIM) tool. It gets missed all the time, but it’s there. DCIM deals with the basic allocation of resources required to keep a data center functioning: energy, processor power, storage, bandwidth, cool air. Data center management software, by contrast, deals with higher-level activities such as access control and operating system refreshes, all of which at some point pass through the data center, and all of which have at some point been mislabeled “infrastructure.”
There’s also a clear distinction between artificial intelligence and automation. This one isn’t just missed by folks in the general press; it’s actively ignored. AI refers to a class of problems whose solutions, once attained, would appear to a rational person to have required human intelligence to produce. No matter how long or sophisticated a math problem is, attaining its solution is not AI. A spreadsheet is capable of extrapolating the general trend of a chart. AI is about discerning patterns that ordinary mathematics would miss.
It would seem a simple enough matter for us to find the overlap between DCIM and AI. Is there a place for artificial intelligence in optimizing the everyday processes of managing infrastructure, and are there tools available now that use real AI toward that objective?
This May, long-time DCIM tools producer Nlyte Software announced a partnership with IBM’s Watson IoT group; IBM had announced the partnership the previous month. In the earlier announcement, IBM told DCIM users to expect Nlyte to integrate a service called Predictive Maintenance and Optimization (PMO). The service would, IBM’s Amy Bennett wrote, elevate Nlyte’s capability to perform predictive analysis. Nlyte would continue to provide an intuitive interface and comprehensive reports, while PMO would add what Bennett called “the magic in the middle.”
When IBM released PMO version 1.0 in June 2017, the company did not portray it as an AI tool, or hint at any type of magic. According to its own documentation, PMO “offers advanced analytics, business intelligence, dashboards, and visualization to provide programs that enable organizations to develop applications with which to monitor, maintain, and optimize assets for better availability, utilization, and performance.”
Yet, as the company continued marketing the tool along with its own implementations of it (for instance the Maximo Enterprise Asset Management, or EAM, platform), it took on a more magical glow. It’s really AI, IBM explained, because it augments human intelligence and predicts outcomes before humans can. While that sounds like a definition for predictive analytics, company representatives like Watson IoT Analytics senior consultant Andrew Condos asserted it’s not predictive analytics because it requires minimal or no historical data to make forecasts or reach conclusions. Now PMO truly does sound magical.
The Efficiency Factor
Kurt Marko, principal analyst with MarkoInsights, believes AI can provide specific benefits for DCIM. In a note to Data Center Knowledge, Marko suggested AI could help with:
- Equipment placement for minimizing thermal imbalances
- Workload placement to equalize resource usage across rows or facilities
- Predictive maintenance, by analyzing machine telemetry to detect anomalies and equipment trouble
- Facilities power and cooling systems, to improve energy efficiency
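The predictive-maintenance item is the easiest of these to make concrete. A minimal sketch of the idea, flagging telemetry readings that break sharply from their recent baseline using a trailing z-score (the fan-speed signal, window size, and threshold here are hypothetical illustrations, not any vendor’s actual method):

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=20, threshold=3.0):
    """Return indices of readings that deviate more than `threshold`
    standard deviations from the mean of the trailing `window` readings."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A stable fan-speed signal (RPM) with one sudden spike.
telemetry = [1200.0 + (i % 5) for i in range(40)]
telemetry[30] = 2400.0  # simulated bearing fault
print(flag_anomalies(telemetry))  # -> [30]
```

Real predictive-maintenance tooling models many correlated signals at once, but the core move is the same: learn what “normal” looks like for a device and surface departures from it before the device fails outright.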
Nlyte is integrating Watson’s PMO into its Machine Learning (ML) component, which is in turn a critical function of its entire DCIM suite. Those integrations are ongoing at the time of this writing, but in explaining how the new ML engine will be of use in specific management situations, Nlyte ticked off each item on Marko’s list and added one more: it will reassess the priority of logged events, especially in sequences, to come up with more rational severity levels.
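Nlyte hasn’t published how that reprioritization works. One plausible shape for it (the event format, device names, and escalation rule below are hypothetical) is to escalate a warning to critical when the same device logs several warnings in quick succession, the kind of sequence a human operator would read as a developing failure:

```python
from collections import defaultdict

def reassess(events, window=300, repeat_limit=3):
    """Escalate a 'warning' to 'critical' when its device has logged
    `repeat_limit` or more warnings within `window` seconds."""
    warnings_by_device = defaultdict(list)
    reassessed = []
    for ts, device, severity in sorted(events):
        if severity == "warning":
            hist = warnings_by_device[device]
            hist.append(ts)
            recent = [t for t in hist if ts - t <= window]
            if len(recent) >= repeat_limit:
                severity = "critical"
        reassessed.append((ts, device, severity))
    return reassessed

log = [
    (0,   "pdu-7", "warning"),
    (120, "pdu-7", "warning"),
    (200, "ups-2", "info"),
    (240, "pdu-7", "warning"),  # third warning in 240s
]
print(reassess(log))  # last event escalated to 'critical'
```

An ML-driven version would learn the escalation rules and thresholds from historical incidents rather than hard-coding them, but the output is the same: severity levels that reflect sequences of events, not just individual log lines.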
Here is a fact that we could declare undeniable were it not for all the folks following the recent trend of denying it: data centers are best managed through human intellect. Enzo Greco, Chief Strategy Officer for Nlyte, admitted precisely this during a recent webinar with 451 Research.
“What we find first-hand is that customers do have the skills within the organization,” he said. “Certainly, they know how to run a data center, they have knowledge of all the different systems. But unfortunately, often those skills are not allocated to the DCIM project itself. And that is one of the biggest factors regarding successful and unsuccessful projects.”
Absent from both Marko’s and Nlyte’s lists was any notion of cost reduction or ROI improvement. When a DCIM customer with those skills evaluates AI for its own facilities, we asked Greco, are they typically more interested in improving efficiency or cutting costs?
“Ultimately, it becomes more a question of efficiency,” he responded, “which derives from reduced costs. For most organizations, the focus is on optimization: given what I already have, what is the best way of optimizing my existing facilities? That certainly reduces or avoids additional CapEx investment.”
Ted Dunning, chief application architect at MapR Technologies, is an AI developer with several decades of experience. What he envisions is a DCIM environment that scales up a data center operator’s capabilities in proportion to the number of nodes, virtual or physical, in their purview.
“Roughly the same numbers of people are managing vastly more machines,” he told Data Center Knowledge. “IT has to scale output and capacity without scaling people power. If you add 1,000 more nodes, you don’t get to go from 30 people in the IT organization to 300. Maybe 32.”
Machine learning could be more applicable in a vastly scaled-out data center, Dunning said, if it’s learning patterns from the people who manage its equipment, not just from logs and data stores. Perhaps then we could perceive DCIM tools as more “intelligent.” Until then, the answer to the question of whether you can use AI assistance in DCIM today is a definitive almost.