TeamQuest Director of Market Development, Dave Wagner, currently leads the cross-company Innovation Team and serves as a worldwide business and technology thought leader to the market, analysts, and media.
The concepts behind Big Data have been around for a while, and many successful business analytics solutions have been adopted by businesses over the last few years. A key challenge of Big Data is determining how best to glean advantages that improve IT’s ability to efficiently balance infrastructure cost, risk, and performance in support of the lines of business.
Beyond descriptive and through predictive
Many IT veterans know that predictive analytics predate Big Data by several decades. What is evolving, however, is an understanding that the two technologies work well together. Further increasing interest is an emerging approach going beyond descriptive, through predictive, to prescriptive analytics – technology that recommends specific courses of action and forecasts the outcomes of those decisions.
Per Gartner, by 2015 more than 30 percent of analytics projects will provide insights based on both structured and unstructured data. The promise of predictive and prescriptive analytics is appealing to IT decision makers because it adds a highly proactive future view of mission-critical processes and resources – not only potential issues, but also optimization decisions.
Speaking at a TeamQuest ITSO Summit, Mark Gallagher of CMS Motor Sports Ltd. described how Formula One data analysts successfully analyze data not only to ensure the safety of team drivers but also to win races.
“In 2014 Formula One, any one of these data analyst engineers can call a halt to the race if they see a fundamental problem developing with the system like a catastrophic failure around the corner. It comes down to the engineers looking for anomalies. Ninety-nine percent of the information we get, everything is fine,” Gallagher said. “We’re looking for the data that tells us there’s a problem or that tells us there’s an opportunity.”
Gallagher’s Formula One example points to an overarching theme of exception-based, predictive, and prescriptive analytics, as well as the game-changing nature of measuring and analyzing what matters. In my opinion, the best way to describe this is by equation: good data + powerful analytics = better results.
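The exception-based approach Gallagher describes – most telemetry is routine, and analysts hunt for the rare reading that signals a problem or an opportunity – can be sketched as a simple statistical filter. The metric stream, threshold, and function names below are illustrative assumptions, not Formula One’s actual tooling.

```python
# Minimal sketch of exception-based analytics: flag only the readings that
# deviate sharply from the norm, and ignore the ~99% that are fine.
from statistics import mean, stdev

def anomalies(samples: list[float], threshold: float = 2.5) -> list[int]:
    """Return indices of samples more than `threshold` std devs from the mean."""
    mu, sigma = mean(samples), stdev(samples)
    return [i for i, x in enumerate(samples)
            if abs(x - mu) > threshold * sigma]

# Nine routine sensor readings and one spike; only the spike is reported.
readings = [98.0, 101.0, 99.5, 100.2, 100.8, 99.1, 100.4, 99.8, 100.1, 160.0]
print(anomalies(readings))  # flags the spike at index 9
```

In practice the threshold would be tuned per metric; a robust (median-based) statistic would resist outliers better, but the z-score version keeps the idea visible.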
Prescriptive analytics tools for IT and business
As organizations progress along the spectrum from descriptive to diagnostic to predictive analytics, the resulting business intelligence can be used to see ahead, plan, and make decisions when there are too many variables to evaluate the best course(s) of action without help from advanced technology.
Prescriptive analytics tools develop business outcome recommendations by combining historical data, business rules and objectives, mathematical models, variables, and machine-learning algorithms. This enables virtual experimentation when real-world trials are too risky or expensive. Beyond insight, prescriptive analysis can foresee possible consequences of action choices, and conversely, recommend the best course of action for the desired outcome(s).
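The prescriptive pattern described above can be sketched in a few lines: forecast the outcome of each candidate action with a predictive model, then recommend the action that best meets the business objective. The capacity model, candidate actions, and numbers below are hypothetical toys for illustration, not any vendor’s algorithm.

```python
# Prescriptive sketch: virtual experimentation over candidate actions,
# recommending the cheapest one whose forecast outcome meets the SLA.

def forecast_response_ms(servers: int, demand_rps: float) -> float:
    """Toy predictive model: response time worsens as per-server load rises."""
    load = demand_rps / servers
    return 20.0 + 0.5 * load ** 1.5  # assumed relationship, for illustration

def recommend(demand_rps: float, sla_ms: float, cost_per_server: float):
    """Prescribe the fewest servers whose forecast response meets the SLA."""
    candidates = range(1, 65)  # candidate actions: provision 1..64 servers
    feasible = [n for n in candidates
                if forecast_response_ms(n, demand_rps) <= sla_ms]
    if not feasible:
        return None  # no candidate action meets the objective; escalate
    best = min(feasible)  # fewest servers = lowest cost
    return {"servers": best,
            "forecast_ms": forecast_response_ms(best, demand_rps),
            "monthly_cost": best * cost_per_server}

print(recommend(demand_rps=2000.0, sla_ms=150.0, cost_per_server=900.0))
```

A real tool would combine many variables, business rules, and machine-learned models rather than one closed-form curve, but the shape of the decision – simulate, compare, prescribe – is the same.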
Advanced analytics is still in the early stages; Gartner surveys show most organizations are assessing current and past performance (descriptive analytics), with only 13 percent making extensive use of predictive analytics. Under 3 percent use prescriptive capabilities. Growth is certain. Gartner analyst Rita Sallam claims, “Those that can do advanced analytics on top of Big Data will grow 20 percent more than their peers.”
Continuous optimization vs. assured performance
The heart of IT infrastructure optimization lies at the intersection of financial efficiency, resource use effectiveness, and assured customer service. As the IT systems and software that comprise data center infrastructure become ever more sophisticated, automated, and complex, so must management and optimization approaches.
Advanced analytics and intelligent automation deployed across all infrastructure domains will equip IT to cost-effectively scale and optimize resources ahead of the demand curve, yielding improved business agility, market share and customer experience with increased revenue and reduced risk.
Application performance insight must become deeply integrated with data center management tools and data so that automatic resource provisioning can be both cost-effective and service-risk minimizing. Automated provisioning of storage, bandwidth, and computing power is a primary benefit of virtualization and a powerful capability of software-defined data centers (SDDCs). But without integrated business intelligence, the likely result is simply that sub-optimal decisions get implemented automatically, more quickly than ever – with no assurance of continuous, acceptable service performance.
Bridge all the silos!
When teams and tools bridge silos, the synergy becomes the basis for competitive advantage. Gathering good data streams – metrics that matter to both business and IT – and correlating them through powerful analytics will amplify bottom-line results.
As an example, by measuring and analyzing more than just power usage effectiveness (PUE), the focus of continuous optimization shifts to risk reduction, revenue growth, decreased capital and operating expenditures, and enhanced customer experience. What does it mean for a data center to be as efficient as possible according to industry-standard PUE? What are you getting for your use of that “efficiently” delivered power? How much work is accomplished? If efficiently delivered power (a low PUE) goes to servers that are not cost-effectively accomplishing useful work in service to the business, is that really efficient?
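PUE is total facility power divided by the power delivered to IT equipment, so values near 1.0 mean little facility overhead. The sketch below, with made-up numbers, illustrates the point above: a site can score well on PUE while its servers accomplish little useful work per watt. The transactions-per-kilowatt metric here is a hypothetical stand-in for a business-level measure of work.

```python
# PUE measures how efficiently power reaches IT gear, not what the gear
# does with it - so it should be paired with a measure of useful work.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

def work_per_kw(transactions_per_sec: float, it_equipment_kw: float) -> float:
    """Hypothetical business metric: useful work per IT kilowatt."""
    return transactions_per_sec / it_equipment_kw

# Site A: excellent PUE, but its servers are mostly idle.
print(pue(1200.0, 1000.0))          # 1.2 - "efficient" by the PUE yardstick
print(work_per_kw(500.0, 1000.0))   # 0.5 transactions/sec per kW

# Site B: worse PUE, yet far more useful work from the same IT power.
print(pue(1500.0, 1000.0))          # 1.5
print(work_per_kw(9000.0, 1000.0))  # 9.0 transactions/sec per kW
```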
IT teams will be more successful if they’re able to look at the right data in combination with powerful analytics. To succeed, IT must understand what’s important to the business and deliver accurate, strategic advice – sometimes in a matter of seconds.
The spectrum of analytic approaches
It’s important for IT to use the full “descriptive, predictive, prescriptive” spectrum of analytic approaches, which reinforces that it’s not just about getting good information; it’s about knowing what to do with that information, when, and, importantly, why.
This journey along the spectrum can begin at whatever level of tool, process, and skill maturity already exists within an IT environment, and it can yield immediate, game-changing results on the way to complete data center optimization.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.