Big Data and Data Center Analytics
How big is a petabyte? Where exactly do exabytes come from? Questions like these make it clear that big data continues to be big business. Although there is a fair amount of hype around the words “big data,” I have to agree with Chuck Hollis of EMC that there is real value in the scale and efficiencies of these information factories.
There are plenty of blog posts, articles, vendor pitches and records being broken around the amount of storage being consumed and its incredible rate of growth. Looking at examples such as sensor data, social and mobile data, and the technological advances being made in storage, it is no wonder that “big data” is a popular topic … and opportunity. Some other examples of big data:
- In 2007 IDC predicted that we would have 601 exabytes (an exabyte is 1,000 petabytes) of digital data in 2010. It turns out we had around 1,200 exabytes. Their prediction for 2020 is 35 zettabytes (35,000 exabytes).
- CERN’s Large Hadron Collider generates 15 petabytes of data every year, which is the same amount of enterprise flash sold by Fusion-io in 2010.
- The Internet Archive stores 650 terabytes per rack, for a total of 5.8 petabytes.
- IBM’s Watson was able to win on Jeopardy! with less than 1 terabyte of stored data.
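The unit conversions behind these figures are simple powers of ten. A minimal sketch (assuming decimal SI prefixes, not the binary IEC units some storage vendors use):

```python
# Decimal (SI) storage units: 1 PB = 10^15 bytes, 1 EB = 10^18, 1 ZB = 10^21.
PB = 10 ** 15
EB = 10 ** 18
ZB = 10 ** 21

# IDC's 2020 estimate of 35 zettabytes, expressed in exabytes.
print(35 * ZB // EB)        # 35000 exabytes

# The LHC's 15 petabytes per year, expressed in terabytes.
print(15 * PB // 10 ** 12)  # 15000 terabytes
```

Note that binary units (1 PiB = 2^50 bytes) run about 12% larger at the petabyte scale, which is one reason published capacity figures rarely line up exactly.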
If big data is big business, then analytics and data warehousing are huge business. There has certainly been no shortage of news about advances in analytics either, or of company acquisitions in this space. Recently, data warehousing leader Teradata announced that it acquired Aster Data Systems.
The VMware announcement Tuesday was really about analytics: vCenter Operations models cloud environments and applies analytics to help customers achieve the degree of automation required to operate their clouds. Driving a new management model for virtualized environments, vCenter Operations aims to mine the wealth of data from underlying physical components (servers, storage, network), provide real-time understanding of the information that matters, and present it visually in a simple, actionable way through dashboards.
VMware also recently acquired WaveMaker, a startup focused on letting users build Java cloud applications without having to write code.
Data Center Infrastructure Management (DCIM) tools are essentially information factories about the data center, mining its data for automation tasks or visual dashboards that assist in the effective operation of the facility. On Tuesday, analytics software provider Netuitive previewed its enhanced virtual data center dashboard, available in the next release of the software scheduled for the second quarter of 2011. Using predictive analytics, its patented Behavior Learning Engine is focused on solving virtualization management issues in the enterprise. “We are pleased to provide a sneak peek of the new dashboard and look forward to announcing all of the features that will be in the next release of Netuitive scheduled for Q2,” said Nicola Sanna, CEO of Netuitive. “Predictive analytics for IT continues to grow as a must-have for virtualization management and cloud infrastructure initiatives in enterprises globally.”
Data Center Intelligence software provider CiRBA announced the general availability of its Efficiency & Risk Dashboard, delivered in CiRBA Version 6.1. The dashboard contains an Efficiency and Risk Spectrum that provides a unique visual representation of provisioning status within a data center at the environment, host, and VM or guest level. CiRBA 6.1 takes an inventory of all components in an environment and provides an intuitive visualization with specific recommendations to remediate or optimize infrastructure.
The Data Center Opportunity
The data center industry has certainly not been absent from the big data opportunity. GigaOm’s Structure Big Data conference will be held in New York in a couple of weeks, and a primetime sponsor of the event is Equinix. Rackspace is a sponsor as well. Cloud storage is also taking the big data spotlight, as there has been a lot of activity (and cash) flowing into this area as well.