Posted By Rich Miller On March 25, 2011 @ 7:30 am In Storage | 4 Comments
What exactly is “big data” and what does it mean? Those questions were at the heart of the GigaOm Big Data conference Wednesday in New York, which explored the world of huge datasets and the technology needed to mine their secrets and convert them into business intelligence. There were many examples of how companies are solving these problems. But the big takeaway, from the data center perspective, is that data is being generated at an incredible pace, and as improved analytics tools unlock value from it, processing and storing all that data becomes ever more important. Big data has to reside somewhere, and that trend bodes well for data center demand.
Here’s a roundup of quotable moments from the day-long event:
Luke Lonergan, CTO and Co-Founder, EMC Greenplum: “One of our guys recently indexed the entirety of Wikipedia in 30 seconds. You can take what was previously thought to be unattainable amounts of data and gain insight from it. The question becomes – is this something only eggheads can do?”
Alfred Spector, VP of Research and Special Initiatives, Google: “We have essentially unlimited computational power so long as we are willing to pay for the electrical cost. The scale issue is profound. I see endless technical opportunities. There’s going to be huge impact on computer science for all disciplines.”
Pete Warden, founder of OpenHeatMap: “Did you know you can hire 100 servers from Amazon for $10 an hour? That just blows my mind. For me, that’s really the heart of the big data revolution.”
Jeff Jonas, Distinguished Engineer at IBM: “Observations add up. The more data you have, the better your predictions get. As computers have been getting faster, organizations have been getting dumber. For the most part, organizations have been trying to analyze individual transactions to see what they mean. Now you can take data from many different sources. The future is really about making puzzle pieces into puzzles.”
Marc Parrish, VP of Loyalty Marketing, Barnes & Noble.com: “Americans are data fat. We’re on a diet of 34 gigabytes a day.”
Bill McColl, CEO and Founder of CloudScale: “Up until now, big data has mostly been about queries on offline data and batch jobs. Now it’s shifting to real-time access. This is a huge shift and it’s clear that the users of warehousing tools require this. And it’s not just about volume, it’s about velocity. I think that’s a major trend … At Facebook today, you’ve got a couple hundred million users you’re trying to track in real-time. The number of events per second being generated is just enormous. The data has to be looked at as soon as it’s generated. The stream never stops, and you don’t get a chance to catch up if you fall behind. In the business world, where you want to act on data, you have seconds to act on it or you’re swamped.”
Terry Jones, CEO and Founder of FluidInfo: “We live in this computer world put together decades ago. Lots of information in this world doesn’t have a structure. We get hung up on this computer scientist thinking that we need a hierarchy and a schema. And that doesn’t always exist in the real world … There’s this huge fuss about big data, and a lot of people can’t define what it is.”
Article printed from Data Center Knowledge: http://www.datacenterknowledge.com
URL to article: http://www.datacenterknowledge.com/archives/2011/03/25/big-data-and-what-it-means-for-the-data-center/