
Big Data, and What It Means for the Data Center

GigaOm's Big Data 2011 conference kicked off with a panel titled The New Alchemists: Data Scientists Panel. Shown above, left to right: Bassel Ojjeh, President and CEO, nPario; Bill McColl, Founder and CEO, Cloudscale; Hilary Mason, Chief Scientist, bit.ly; Terry Jones, CEO and Founder, Fluidinfo; and Jason Hoffman, Founder and Chief Scientist, Joyent.

What exactly is “big data” and what does it mean? Those questions were at the heart of the GigaOm Big Data conference Wednesday in New York, which explored the world of huge datasets and the technology needed to mine their secrets and convert them into business intelligence. There were many examples of how companies are solving these problems. But the big takeaway, from the data center perspective, is that data is being generated at an incredible pace, and as improved analytics tools unlock value from it, processing and storing all that data becomes ever more important. Big data has to reside somewhere, and that trend bodes well for data center demand.

Here’s a roundup of quotable moments from the day-long event:

Luke Lonergan, CTO and Co-Founder, EMC Greenplum: “One of our guys recently indexed the entirety of Wikipedia in 30 seconds. You can take what was previously thought to be unattainable amounts of data and gain insight from it. The question becomes – is this something only eggheads can do?”

Alfred Spector, VP of Research and Special Initiatives, Google: “We have essentially unlimited computational power so long as we are willing to pay for the electrical cost. The scale issue is profound. I see endless technical opportunities. There’s going to be huge impact on computer science for all disciplines.”

Pete Warden, founder of OpenHeatMap: “Did you know you can hire 100 servers from Amazon for $10 an hour? That just blows my mind. For me, that’s really the heart of the big data revolution.”

Jeff Jonas, Distinguished Engineer at IBM: “Observations add up. The more data you have, the better your predictions get. As computers have been getting faster, organizations have been getting dumber. For the most part, organizations have been trying to analyze individual transactions to see what they mean. Now you can take data from many different sources. The future is really about making puzzle pieces into puzzles.”

Marc Parrish, VP of Loyalty Marketing, Barnes & Noble.com: “Americans are data fat. We’re on a diet of 34 gigabytes a day.”

Bill McColl, CEO and Founder of Cloudscale: “Up until now, big data has mostly been about queries on offline data and batch jobs. Now it’s shifting to real-time access. This is a huge shift and it’s clear that the users of warehousing tools require this. And it’s not just about volume, it’s about velocity. I think that’s a major trend … At Facebook today, you’ve got a couple hundred million users you’re trying to track in real-time. The number of events per second being generated is just enormous. The data has to be looked at as soon as it’s generated. The stream never stops, and you don’t get a chance to catch up if you fall behind. In the business world, where you want to act on data, you have seconds to act on it or you’re swamped.”

Terry Jones, CEO and Founder of Fluidinfo: “We live in this computer world put together decades ago. Lots of information in this world doesn’t have a structure. We get hung up on this computer scientist thinking that we need a hierarchy and a schema. And that doesn’t always exist in the real world … There’s this huge fuss about big data, and a lot of people can’t define what it is.”

About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.
