
Three Vs of Big Data: Volume, Velocity, Variety

When we think of Big Data, the three Vs come to mind – volume, velocity and variety. Just as the amount of data is increasing, the speed at which it transits enterprises and entire industries is faster than ever, writes Steve Baunach of Starview.

Steve Baunach is Founder/GM Americas for Starview, Inc. He is responsible for the design and delivery of domain-specific solutions to customers, working to translate challenges into products.

STEVE BAUNACH
Starview

The data explosion is upon us, with increasing amounts produced each day. The trend shows no sign of stopping, or even slowing down. In 2009, research firm IDC noted a 62 percent increase in worldwide data over the previous year, and predicted that digital information, which in 2009 had reached 0.8 zettabytes (one zettabyte equals a trillion gigabytes), could reach as much as 44 zettabytes by 2020. And, as if that wasn’t enough, the Berkeley School of Management forecasts that more data will be created in the next three years than in the previous 40,000.

Big Challenges

When we think of Big Data, the three Vs come to mind – volume, velocity and variety. Just as the amount of data is increasing, the speed at which it transits enterprises and entire industries is faster than ever. The type of data we’re talking about includes hundreds of millions of pages, emails and unstructured data, such as Word documents and PDFs, as well as a nearly infinite number of events and information from every type of enterprise data center – such as financial institutions, utility companies, telecom organizations, manufacturing facilities and more. Content can be generated by everything from common customer transactions, such as phone calls and credit card usage, to manufacturing facility transactions, like machine maintenance and operational status updates. All of this information needs to be analyzed, acted upon (even if that action is deletion), and possibly stored.

Another important aspect of Big Data involves protecting information and keeping it moving, even during disruptive events. Things like inclement weather, a sudden load on an energy grid (such as people plugging in their electric vehicles every evening) or mechanical failure can cause brownouts and blackouts that will have utility companies scrambling to get their service trucks out the door before the flood of service calls begins. For example, last summer in Dublin, Ireland, a transformer failure caused a power outage at major cloud computing data hubs for Amazon and Microsoft – what followed was a series of failures that resulted in partial corruption of the database and the deletion of important data.

Technology Keeps Pace

Fortunately, the following trends promise to provide tools and technologies that can help industries and enterprises involved with handling, storing and transmitting data:

  • Faster data capture and analysis. New tools allow analysis to happen as quickly as the data is generated – for example, by maintaining real-time models of real-world events.
  • More intelligent, automated decision-making. Developers are creating software and languages designed to handle intricate “if/then” logic, empowering administrators to customize responses to a wide range of scenarios.
  • Distributed storage techniques and cloud computing. These include the conversion from tape to disk, de-duplication, flash storage and the rapid adoption of 100 Gigabit Ethernet, which is replacing Fibre Channel. All of this increases storage capacity while introducing new challenges in data retrieval and on-the-fly computing, without necessarily storing everything.
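To make the de-duplication idea above concrete, here is a minimal sketch of content-based de-duplication: each chunk of data is keyed by a hash of its contents, so identical chunks are stored only once. The function names and the fixed-chunk strategy are illustrative assumptions, not any vendor's implementation.

```python
import hashlib

def dedup_store(chunks, store=None):
    """Store only unique chunks, keyed by a hash of their content.

    Returns (store, manifest): the manifest records the hash of each
    incoming chunk in order, so the original stream can be rebuilt.
    """
    store = {} if store is None else store
    manifest = []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = chunk        # new content: keep one copy
        manifest.append(digest)          # duplicates cost only a reference
    return store, manifest

# Three chunks, two identical: only two copies are physically stored,
# yet the full stream is recoverable from the manifest.
store, manifest = dedup_store([b"alpha", b"beta", b"alpha"])
rebuilt = b"".join(store[d] for d in manifest)
```

Real systems add variable-size chunking and persistence, but the core trade – a hash lookup per chunk in exchange for storing each unique block once – is the same.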

Big Opportunities

According to research by McKinsey & Company, Big Data creates value in the enterprise by:

  • Making information transparent and usable at higher frequency;
  • Allowing more accurate and detailed performance information on everything from product inventories to sick days, exposing variability and boosting performance;
  • Enabling segmentation of customers so that products and services can be tailored more precisely;
  • Improving decision-making through more sophisticated analytics; and
  • Optimizing products and services. For example, sensors embedded in products can create innovative after-sales service offerings, such as proactive maintenance (preventive measures that take place before a failure occurs or is even noticed).
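As a rough sketch of the proactive-maintenance idea in the last bullet, a monitor can watch a sensor's moving average and flag a machine for service once the trend crosses a limit, before an outright failure. The window size, threshold, and readings below are hypothetical values chosen for illustration.

```python
from collections import deque

def make_monitor(window=5, limit=80.0):
    """Return a checker that flags a machine for service when the
    moving average of its recent sensor readings exceeds `limit`."""
    readings = deque(maxlen=window)  # keep only the most recent readings

    def check(value):
        readings.append(value)
        avg = sum(readings) / len(readings)
        return avg > limit           # True -> schedule maintenance now
    return check

# Temperature readings trending upward trip the alert before failure.
check = make_monitor(window=3, limit=80.0)
alerts = [check(v) for v in [70, 75, 85, 95, 99]]
# alerts -> [False, False, False, True, True]
```

Averaging over a window rather than reacting to a single reading keeps one noisy sample from triggering a truck roll.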

New and more sophisticated data analysis capabilities support productivity growth, innovation, and consumer surplus, as long as the right policies and enablers are in place.

If 2011 was marked by advances in cloud computing, 2012 is poised to show how the emergence of business analytics and optimization can benefit a wide range of industries. It’s an exciting time to be a part of Big Data.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
