
Using Big Data to Optimize Business Operations

In order to uphold reliability and preserve reputation, data centers must maintain an unimpeded flow of data at unprecedented levels, writes Bhavesh Patel of ASCO Power Technologies. To do so, data centers need to refine how they value data used in monitoring their own operations.

Industry Perspectives

August 5, 2014

Bhavesh Patel is Director of Marketing and Customer Support at ASCO Power Technologies, Florham Park, NJ, a business of Emerson Network Power.

Business – and life in general – is becoming data-centric. In order to uphold reliability and preserve reputation, a data center must maintain an unimpeded flow of data at a level not anticipated even just a few years ago.

To do that, a data center needs to refine how it values the data used in monitoring its own operations, so that the flow of information generated in running the facility does not overwhelm IT and management capabilities.

Shifting the focus

Data centers would do well to shift emphasis from the volume, variety, and velocity of the data generated to monitor their operations to how best to use that data, mining it for business insights and more efficient operation.

Streaming data center operation information into clusters that talk to each other can help. Ideally, each cluster would not only collect data but also have local intelligence to determine what information to feed upstream.

For example, a building could have the following clusters: power (including efficiency metrics, to enable better power distribution and monitoring of critical power), cooling (for optimized efficiency and environmental control), safety and security, and facility management. Each cluster would have its own monitoring, measurement, and control capabilities, but would share overview and status information with the others, while a building management system orchestrates policy decisions using the aggregated data.
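As a rough sketch of this arrangement (all class and field names here are hypothetical, not drawn from any particular product), each cluster could keep its raw readings local and pass only a compact status summary upstream to the building management system:

```python
# Hypothetical sketch: clusters summarize raw readings locally and
# feed only overview/status data upstream to a building management system.
from statistics import mean

class Cluster:
    def __init__(self, name, alarm_threshold):
        self.name = name
        self.alarm_threshold = alarm_threshold
        self.readings = []          # raw local data stays in the cluster

    def record(self, value):
        self.readings.append(value)

    def status(self):
        """Local intelligence: decide what to feed upstream."""
        avg = mean(self.readings) if self.readings else 0.0
        return {
            "cluster": self.name,
            "avg": round(avg, 1),
            "alarm": avg > self.alarm_threshold,
        }

class BuildingManagementSystem:
    def __init__(self, clusters):
        self.clusters = clusters

    def overview(self):
        """Aggregate overview/status summaries from every cluster."""
        return [c.status() for c in self.clusters]

# Example readings (values are illustrative only).
power = Cluster("power", alarm_threshold=480.0)     # e.g. kW draw
cooling = Cluster("cooling", alarm_threshold=27.0)  # e.g. supply air temp, C
for kw in (420.0, 455.0, 470.0):
    power.record(kw)
for temp in (24.5, 25.0, 28.5):
    cooling.record(temp)

bms = BuildingManagementSystem([power, cooling])
print(bms.overview())
```

The point of the design is that raw samples never leave the cluster; only the distilled overview crosses the boundary, which keeps the upstream data flow manageable.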

Data center IT and management can use the resulting analyses to change behavior and practices within each clustered concern. With clusters talking to each other, data mining becomes more effective, and both the data center operator and the data center customer benefit from the enhanced knowledge.

Redefining how we use data

The strategy of using existing individual data points in newly networked ways is already in place in other industries: in the auto industry, where vehicles use the data points they generate to improve safety and comfort; in fleet management, to enhance efficiency and monitor driver behavior; and in a multi-location carwash business, to improve daily operations.

In today’s cars, monitoring of data points is clustered under the cover (anti-lock braking system and automatic transmission), under the bonnet (automatic wiper control and engine management system), behind the dashboard (climate control), in the boot (parking aid), in the footwell (electric window and central locking), behind the central console (airbag control unit), and behind the glovebox (alarm and immobilizer).

As for fleet management, wireless GPS fleet-tracking and diagnostic software uses data points that have long been collected, such as driving speed and odometer distance, to monitor performance and generate in-depth data on every vehicle in the fleet. Management can know where each vehicle is, and how and when it got there, and can receive alerts and reports that inform decisions to reduce fuel and maintenance costs, improve fleet efficiency, and even modify individual drivers' habits.
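A minimal sketch of how such alerts might be derived from those two long-collected data points (thresholds, field names, and sample figures below are all assumptions for illustration):

```python
# Hypothetical sketch: derive per-vehicle alerts from data points a
# fleet-tracking system already reports (GPS speed, odometer reading).
SPEED_LIMIT_KPH = 100.0        # assumed policy limit
SERVICE_INTERVAL_KM = 10_000   # assumed maintenance interval

def vehicle_alerts(vehicle):
    """vehicle: dict with 'speeds' (recent kph samples),
    'odometer_km', and 'last_service_km'."""
    alerts = []
    if any(s > SPEED_LIMIT_KPH for s in vehicle["speeds"]):
        alerts.append("speeding")
    if vehicle["odometer_km"] - vehicle["last_service_km"] >= SERVICE_INTERVAL_KM:
        alerts.append("service-due")
    return alerts

van = {"id": "van-7", "speeds": [88.0, 104.5, 92.0],
       "odometer_km": 152_300, "last_service_km": 141_900}
print(van["id"], vehicle_alerts(van))
# -> van-7 ['speeding', 'service-due']
```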

A multi-location carwash business takes similar advantage of generated data. Sensors affixed to eight drums of carwash chemicals at each of eight locations feed dedicated software that monitors chemical levels throughout the day; previously, levels were checked weekly with a measuring stick dipped into each drum. With the new approach, management can pull reports at any time and react immediately to any level that deviates from the expected norm.
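The deviation check at the heart of that workflow could look like the following sketch (the expected range and the location/drum identifiers are invented for illustration; the article does not describe the actual software):

```python
# Hypothetical sketch: flag chemical drums whose sensor-reported fill
# levels fall outside an expected range, so staff can react immediately
# instead of waiting for a weekly dipstick check.
EXPECTED_MIN = 20.0   # assumed minimum acceptable fill level, percent
EXPECTED_MAX = 95.0   # assumed maximum (overfill or sensor fault)

def drums_needing_attention(levels):
    """levels: mapping of (location, drum) -> fill level in percent."""
    return sorted(
        key for key, level in levels.items()
        if not (EXPECTED_MIN <= level <= EXPECTED_MAX)
    )

readings = {
    ("site-1", "drum-3"): 12.5,   # low: needs a refill
    ("site-1", "drum-4"): 63.0,   # normal
    ("site-2", "drum-1"): 98.0,   # high: possible sensor fault
}
print(drums_needing_attention(readings))
# -> [('site-1', 'drum-3'), ('site-2', 'drum-1')]
```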

Reporting on the data center

At data centers, Data Center Infrastructure Management (DCIM) systems and Critical Power Management Systems (CPMS) are increasingly popular ways to monitor and report on specific data points related to power generation and distribution. A CPMS could also interact intelligently with the data center's building management system, which, in this increasingly data-centric world, could itself be culling data from other categories of gathered information, creating actionable intelligence in real time.

As Big Data becomes more prevalent, there will be more ways to reap meaningful return on data, enabling a better ROI on data collection systems already in place or still on the horizon.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
