Posted by Rich Miller on September 26, 2011, in Storage
If there’s one takeaway from the many insightful discussions at last week’s O’Reilly Strata Summit, it is this: Our society is generating an ever-growing ocean of data, and developers are building tons of new applications to extract value from this data. It’s a trend that crosses the spectrum, with examples in finance, retail, government, entertainment, journalism and the non-profit sector.
Each panel reinforced a key point for the data center industry: all this data needs a place to live. Big data will require storage devices to house it. It will require servers to power the applications that will make sense of it. In many cases, that data analysis will require supercomputers or high-performance computing clusters.
That all adds up to a bullish case for the data center industry, and a huge opportunity for the growing ecosystem of companies specializing in Big Data.
“Data is the new oil,” said Andreas Weigend, social data guru and former chief scientist at Amazon.com. “Oil needs to be refined before it can be useful. Big data startups are the new refineries.”
If big data generates big analogies, analysts say the trend is also reflected in growing investment in storage hardware. Forrester’s James Kobielus said enterprise storage budgets are growing by 20 to 40 percent per year to keep pace with soaring data needs. That’s driving demand for strategies like deduplication, which can shrink the stored data footprint and save hardware costs.
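To illustrate why deduplication saves so much capacity, here is a toy sketch of fixed-block deduplication in Python: chunks are identified by their SHA-256 digest, and each unique chunk is stored only once. (Production systems typically use variable-size, content-defined chunking; all names here are illustrative, not from any particular product.)

```python
import hashlib

def dedup_store(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks and keep only unique ones.
    Returns (store, recipe): the store maps SHA-256 digests to chunk
    bytes; the recipe is the digest sequence needed to rebuild data."""
    store = {}
    recipe = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # keep each unique chunk once
        recipe.append(digest)
    return store, recipe

def rebuild(store, recipe):
    """Reassemble the original bytes from the stored chunks."""
    return b"".join(store[d] for d in recipe)

# Highly repetitive data dedupes well: 100 identical 4 KB blocks
data = b"x" * (4096 * 100)
store, recipe = dedup_store(data)
assert rebuild(store, recipe) == data
assert len(store) == 1   # one unique chunk stored instead of 100
```

The savings come entirely from redundancy in the data: backups and virtual-machine images, which repeat the same blocks over and over, are the classic wins for this technique.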
The silver lining: Storage continues to get cheaper and cheaper. This trend will continue, Kobielus predicted, with “dirt cheap petabytes” on the horizon. “By the end of the decade, petabytes of storage will live in the palm of your hand,” he said.
While that level of data portability poses challenges from a security perspective, the ability to pack more data into smaller devices is important in the data center, where this tidal wave of data must be tamed.
“Our ability to generate data is outstripping our ability to store it,” said Michael Chui of McKinsey. “We believe the use of big data will change the nature of competition.”
That’s creating some interesting opportunities for companies that have been handling large datasets for years. An example is LexisNexis Risk Solutions, which provides data services to the legal, accounting and government markets. “We’ve been doing big data for about 15 years,” said Armando Escalante, who heads HPCC Systems, the technology arm of LexisNexis.
HPCC Systems recently open sourced its data platform and released it on GitHub. “We did it to remain relevant, gain from the community and make our platform better,” said Escalante. “This is not a startup. This is an old company with a new spin on life.”
The Big Data economy will create many of these kinds of opportunities, according to Robert Lefkowitz of analytics specialist 1010data.
“If you want to profit from big data, figure out who you are a middleman between,” said Lefkowitz. “The middleman often knows more than parties on either side. That knowledge has value. The Internet is the greatest engine for the creation of middlemen that our civilization has ever seen.”
Article printed from Data Center Knowledge: http://www.datacenterknowledge.com
URL to article: http://www.datacenterknowledge.com/archives/2011/09/26/in-the-pipeline-a-tidal-wave-of-data/
URLs in this post:
HPCC Systems: http://hpccsystems.com/
1010data: http://1010data.com/
Rich Miller: http://www.datacenterknowledge.com/archives/author/richm/
Copyright © 2012 Data Center Knowledge. All rights reserved.