
Is Big Data a Bubble Set to Burst?

Sentiment about Big Data appears to be shifting from enthusiasm to skepticism, as the business world questions how much value it can actually provide

Industry Perspectives

March 30, 2015


As Co-Founder & Director of Business Development at SiSense, Adi Azaria is a passionate entrepreneur, author, and an established thought leader in innovative technology.

The term “Big Data” has become increasingly popular over the past few years. For a while, it appeared as though you couldn’t throw a stick (or a Google search) without running into someone speaking of its wondrous possibilities.

Just think about all the data being generated, collected and statistically analyzed by modern organizations, and the ways it would seemingly revolutionize our everyday lives.

 

Interest in Big Data over time (source: Google Trends).

 

In early 2015, however, it seems this trend might be reversing, and the business world is growing disenchanted with data. Today the discussion revolves around whether Big Data will ever live up to its promise, and an increasing number of skeptical voices are being heard. The overall sentiment seems to be veering toward disappointment.

As someone who has been involved in the business intelligence industry for the past 10 years, I actually see this as a blessing: Indeed, the time has come for the Big Data bubble to burst.

I’m not saying this because I believe that data analysis and BI software tools have ceased to be valuable assets for organizations. On the contrary, data analytics is still far from reaching its full potential. It is already delivering real, proven value to businesses, and it will deliver even more as the technology that drives it becomes more efficient and accessible.

This has nothing to do with Big Data, though, because by and large the term “Big Data” is nothing more than a catchphrase.

The Need for More Precise Terminology

The problem is that no one seems sure how to define Big Data, and definitions vary. Some refer to any dataset at terabyte scale as “Big Data,” while others rely on the three (or four) Vs model (typically Volume, Velocity, Variety, and Veracity). Most of the talk around Big Data today, however, is generated by vendors, and they often won’t even bother giving a definition: whatever their product currently does is invariably described as Big Data analytics.

When a commonly used term lacks a clear-cut definition, different expectations inevitably get lumped onto it, and disappointment naturally follows.

Perhaps it would be beneficial if we stopped talking about Big Data as a term and simply stated exactly what types of data we are talking about: structured or unstructured, one million rows or 100 million, homegrown or harvested externally, and so on. This could help us understand what the data is (and isn’t) and set realistic expectations for specific datasets: whether they can be processed within actionable timeframes, and what types of insights and value can be derived from them.
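To make this concrete, here is a minimal sketch in Python of what such a precise description might look like; the class and field names are hypothetical, chosen purely for illustration:

from dataclasses import dataclass

@dataclass
class DatasetProfile:
    """Hypothetical descriptor: state exactly what the data is,
    rather than reaching for the catch-all label 'Big Data'."""
    structured: bool        # relational tables vs. raw text/logs
    row_count: int          # one million rows, or 100 million?
    source: str             # "homegrown" or "harvested externally"
    freshness_minutes: int  # how quickly results must be actionable

# Example: a large but structured, homegrown dataset.
sales = DatasetProfile(
    structured=True,
    row_count=100_000_000,
    source="homegrown",
    freshness_minutes=60,
)

A profile like this makes the expectation-setting explicit: a structured, homegrown table of 100 million rows calls for very different tooling and timelines than unstructured data harvested externally, whether or not either one is “big.”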

Approaching the Plateau of Productivity

Gartner’s 2014 Hype Cycle report, released last August, positioned Big Data as one of the technologies well on its way to the “trough of disillusionment,” meaning that the initial inflated expectations were beginning to recede and the industry’s high hopes for its transformative powers were fading.

 

Gartner hype cycle for 2014 (source: Gartner).

 

According to Gartner, Big Data is expected to reach its plateau of productivity, i.e., take its place as part of the core activity of mainstream business, within five to 10 years. Another interesting aspect to note, however, is the range of other data-driven technologies and fields that are also on the rise: from prescriptive analytics to the quantified self to wearable devices, and, of course, the Internet of Things, which is cited as one of the main reasons behind Hitachi’s $500 million acquisition of Pentaho.

This tells us one thing: data is not going anywhere. There’s absolutely no denying that more data is available, and innovative new ways of analyzing and using it are evolving. So, while it might be as good a time as any to lay the overhyped “Big Data” term to rest, there is definitely a bright future for data analytics as well as the technologies and practices that derive from it.

