Optimizing Infrastructure for the Big Data V’s – Volume, Velocity and Variety


Patrick Lastennet is director of marketing & business development, financial services segment for Interxion.

PATRICK LASTENNET
Interxion

In many industries, the use of big data is still in its early stages, but the financial services industry has been dealing with big data for years; it is already managed and embedded in core financial processes. What used to take hours can now be done in minutes, thanks to advanced data processing capabilities applied to everything from capital market portfolio management applications to financial risk management. Before these advancements, data from previous days or weeks was analyzed to help re-strategize market approaches for the next day’s trading. Now, with more sophisticated data analytics capabilities, financial firms can shorten that processing window and create more up-to-date strategies and trading adjustments in real time.

However, it’s not just the increasing volume of data sets that is of concern to financial firms. There’s also the velocity and variety of the data to consider. When pulling clusters of diverse databases together for both structured and unstructured data analysis, financial firms rely on having powerful processing speeds, especially as real-time insight is increasingly a key strategic factor in market analysis and trading strategies. But are financial institutions equipped with the proper infrastructure to effectively handle the three V’s of Big Data – volume, velocity and variety – and benefit from real-time data analysis?

Increasing the Value of Real-Time Operations

With real-time data analysis, financial institutions are better able to manage risk and alert customers to issues as they occur. If a firm can manage risk in real time, that not only translates into better trading performance but also helps ensure regulatory compliance. On the consumer side, such improvements show up as enhanced credit card transaction monitoring and stronger fraud detection and prevention. On a larger scale, the most recognizable incident that would have benefited from better data analysis may have been the collapse of Lehman Brothers.

When Lehman Brothers went down, it was called the Pearl Harbor moment of the U.S. financial crisis. Yet it took the industry days to fully understand how it was exposed to that kind of devastating risk. For every transaction made, it is imperative that financial firms understand the impact, or, in the extreme, risk another “Lehman-esque” collapse. Today, with advancements in big data analysis and data processing, whenever a trader makes a trade, the risk management department can know the impact in real time, provided the firm has the right infrastructure.
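To make that concrete, here is a minimal sketch of how a risk system might maintain per-counterparty exposure incrementally, flagging a limit breach the moment a trade is booked rather than in an overnight batch. The class, the limits and the breach check below are illustrative assumptions, not any particular firm’s risk methodology.

```python
# Minimal sketch: exposure per counterparty is updated trade by trade instead
# of being recomputed from scratch in an overnight batch. All names, limits
# and the breach check are illustrative assumptions.
from collections import defaultdict

class ExposureBook:
    def __init__(self, limits: dict[str, float]):
        self.limits = limits                  # per-counterparty exposure limits
        self.exposure = defaultdict(float)    # running net exposure per counterparty

    def on_trade(self, counterparty: str, notional: float, side: str) -> list[str]:
        """Apply one trade and report any limit breach immediately."""
        self.exposure[counterparty] += notional if side == "buy" else -notional
        limit = self.limits.get(counterparty)
        if limit is not None and abs(self.exposure[counterparty]) > limit:
            return [f"{counterparty}: exposure {self.exposure[counterparty]:,.0f} "
                    f"exceeds limit {limit:,.0f}"]
        return []

# The risk desk sees a breach as soon as the second trade is booked.
book = ExposureBook(limits={"BANK-A": 50_000_000})
print(book.on_trade("BANK-A", 30_000_000, "buy"))   # []
print(book.on_trade("BANK-A", 40_000_000, "buy"))   # ['BANK-A: exposure 70,000,000 exceeds limit 50,000,000']
```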

Optimizing Current Infrastructure

The crux of handling the volume, velocity and variety of big data in the financial sector lies in the underlying infrastructure. Many financial institutions’ critical systems still depend on legacy infrastructure. Yet to handle increasingly real-time operations, firms need to find a way to wean themselves off legacy systems and become more competitive and more responsive to their own big data needs.

To address this issue, many financial institutions have implemented software-as-a-service (SaaS) applications that are accessible via the Internet. With such solutions, firms can collect and process data through a remote service without worrying about overloading their existing infrastructure. Beyond SaaS apps, other financial companies have addressed their infrastructure concerns with open source software that lets them simply plug their algorithms and trading policies into the system, leaving it to handle their increasingly demanding processing and data analysis tasks.
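As an illustration of that “plug in your algorithm” pattern, the sketch below shows a toy engine that owns the data handling and event dispatch while the firm supplies only a strategy callback. The engine, the tick format and the toy strategy are generic assumptions, not the API of any specific open source trading or streaming framework.

```python
# Minimal sketch of a pluggable-strategy engine: the engine handles the data
# flow, the firm plugs in its own algorithm as a callback. All names and the
# tick/order formats are generic assumptions for illustration only.
from typing import Callable, Iterable

Tick = dict       # e.g. {"symbol": "ABC", "price": 101.5}
Order = dict      # e.g. {"symbol": "ABC", "side": "buy", "qty": 100}
Strategy = Callable[[Tick], list]

def run_engine(ticks: Iterable[Tick], strategy: Strategy) -> list:
    """Feed each market-data tick to the plugged-in strategy and collect its orders."""
    orders = []
    for tick in ticks:
        orders.extend(strategy(tick))
    return orders

def threshold_buyer(tick: Tick, threshold: float = 100.0) -> list:
    """Toy strategy: buy 100 shares whenever the price dips below a threshold."""
    if tick["price"] < threshold:
        return [{"symbol": tick["symbol"], "side": "buy", "qty": 100}]
    return []

ticks = [{"symbol": "ABC", "price": 101.5}, {"symbol": "ABC", "price": 98.7}]
print(run_engine(ticks, threshold_buyer))
# [{'symbol': 'ABC', 'side': 'buy', 'qty': 100}]
```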

In reality, migrating off legacy infrastructure is a painful process. The time and expense required to handle such a process means the value of the switch must far outweigh the risks. Having a worthwhile business case is, therefore, key to instigating any massive infrastructure migration. Today, however, more and more financial firms are finding that big data analysis is impetus enough to make a strong business case and are using solutions like SaaS applications and open source software as stepping stones for complete migrations to ultimately leave their legacy infrastructure behind.

Integrating Social Data

While the volume, velocity and variety of big data generated by everyday trading transactions and market fluctuations may be catalyst enough for infrastructure migration and optimization, the business case becomes even more compelling now that social data is creeping into the mix.



6 Comments

  1. Excellent piece Patrick. You really nail how Gartner's "3Vs" of big data apply to the financial sector. It's good also to see the marketplace at large finally embracing our "3Vs" albeit 12 years after we first posited them in a piece I wrote on the "Three Dimensional Data Challenge" (ref: http://goo.gl/wH3qG). Cheers, Doug Laney, VP Research, Gartner, @doug_laney

  2. Doug, many thanks and hats off to you and Gartner for showing us the way! Cheers, Patrick

  3. Excellent article. I would like to include domain expertise as part of the existing or needed infrastructure. Domain knowledge is very important for making proper use of all that big data has to offer; otherwise correlation can be mistaken for causation, and time and effort can be spent acquiring and utilizing incremental data with diminishing marginal returns. This also ties into my contribution to the Vs of big data: an understanding of the vacuum in whatever the big data is showing. Data limitations have to be understood and factored in, otherwise expensive mistakes can be made. I covered that at International Big Data Week events this year. Cheers, Kamal

  4. Sarah Trell

    Great post Patrick. May I also propose 'V' for viability - http://www.pros.com/big-vs-big-data/

  5. Information is being created at a faster pace than ever before with these varied channels of social data that are increasing their daily output of content. Thanks to social data, businesses are now able to uncover the latent, hidden relationships among these variables.