Optimizing Infrastructure for the Big Data V’s – Volume, Velocity and Variety

The financial services industry has been dealing with big data for years, and today, with more sophisticated analytics capabilities, financial firms can shorten the window for data processing and make more up-to-date strategies and trading adjustments in real time, writes Patrick Lastennet of Interxion.

Patrick Lastennet is director of marketing & business development, financial services segment for Interxion.

The use of big data is still in its early stages for many industries, but the financial services industry has been dealing with it for years. In fact, big data is already managed and embedded in core financial processes. What used to take hours can now be done in minutes, thanks to advanced data processing applied to everything from capital market portfolio management applications to financial risk management. Before such advancements, data from previous days or weeks was analyzed to help re-strategize market approaches for the next day's trading. Now, with more complex data analytics capabilities, financial firms are able to shorten that processing window and make more up-to-date strategies and trading adjustments in real time.

However, it’s not just the increasing volume of data sets that is of concern to financial firms. There’s also the velocity and variety of the data to consider. When pulling clusters of diverse databases together for both structured and unstructured data analysis, financial firms rely on having powerful processing speeds, especially as real-time insight is increasingly a key strategic factor in market analysis and trading strategies. But are financial institutions equipped with the proper infrastructure to effectively handle the three V’s of Big Data – volume, velocity and variety – and benefit from real-time data analysis?

Increasing the Value of Real-Time Operations

With real-time data analysis, financial institutions are better able to manage risk and alert customers to issues as they arise. A firm that can manage risk in real time not only achieves better trading performance but also stays on top of regulatory compliance. On the consumer side, such improvements can be seen in enhanced credit card transaction monitoring and fraud detection and prevention. On a larger scale, the most recognizable incident that would have benefited from better data analysis may have been the collapse of Lehman Brothers.

When Lehman Brothers went down, it was called the Pearl Harbor moment of the U.S. financial crisis. Yet it took the industry days to fully understand how firms were exposed to that kind of devastating risk. For every transaction made, it is imperative that financial firms understand the impact or, in the extreme, risk another "Lehman-esque" collapse. Today, with advancements in big data analysis and data processing, whenever a trader makes a trade, the risk management department can know the impact in real time, provided the firm has the right infrastructure.
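To make that idea concrete, here is a minimal sketch in Python of what a per-trade, pre-trade risk check might look like. The RealTimeRiskEngine class, its exposure limits and the Trade fields are hypothetical illustrations invented for this example, not a description of how any particular firm's risk system works.

# Minimal sketch of a real-time, pre-trade risk check (illustrative only).
# The position store, limits and Trade fields below are hypothetical.
from dataclasses import dataclass
from collections import defaultdict


@dataclass
class Trade:
    trader_id: str
    symbol: str
    quantity: int      # signed: positive = buy, negative = sell
    price: float


class RealTimeRiskEngine:
    def __init__(self, max_exposure_per_symbol: float, max_gross_exposure: float):
        self.max_exposure_per_symbol = max_exposure_per_symbol
        self.max_gross_exposure = max_gross_exposure
        self.exposure = defaultdict(float)  # symbol -> net exposure in currency terms

    def check_and_book(self, trade: Trade) -> bool:
        # Exposure the book would carry in this symbol if the trade is accepted.
        new_symbol_exposure = self.exposure[trade.symbol] + trade.quantity * trade.price
        # Firm-wide gross exposure with the candidate trade included.
        gross = (sum(abs(v) for v in self.exposure.values())
                 - abs(self.exposure[trade.symbol]) + abs(new_symbol_exposure))

        if abs(new_symbol_exposure) > self.max_exposure_per_symbol:
            return False  # reject: single-name limit breached
        if gross > self.max_gross_exposure:
            return False  # reject: firm-wide gross limit breached

        self.exposure[trade.symbol] = new_symbol_exposure
        return True  # accept and update positions immediately


engine = RealTimeRiskEngine(max_exposure_per_symbol=5e6, max_gross_exposure=50e6)
print(engine.check_and_book(Trade("trader-1", "XYZ", 10_000, 42.0)))  # True: within limits

The point of the sketch is simply that the exposure check runs before the trade is booked, on every order, rather than in an overnight batch.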

Optimizing Current Infrastructure

The crux of handling the volume, velocity and variety of big data in the financial sector lies in the underlying infrastructure. Many financial institutions' critical systems still depend on legacy infrastructure. Yet to handle increasingly real-time operations, firms need to find a way to wean themselves off legacy systems and become more competitive and responsive to their own big data needs.

To address this issue, many financial institutions have implemented software-as-a-service (SaaS) applications that are accessible via the Internet. With such solutions, firms can collect data through a remote service without having to worry about overloading their existing infrastructure. Beyond SaaS apps, other financial companies have addressed their infrastructure concerns with open source software that lets them plug their algorithms and trading policies into the system and leave it to handle their increasingly demanding processing and data analysis tasks.

In reality, migrating off legacy infrastructure is a painful process. The time and expense required to handle such a process means the value of the switch must far outweigh the risks. Having a worthwhile business case is, therefore, key to instigating any massive infrastructure migration. Today, however, more and more financial firms are finding that big data analysis is impetus enough to make a strong business case and are using solutions like SaaS applications and open source software as stepping stones for complete migrations to ultimately leave their legacy infrastructure behind.

Integrating Social Data

While the velocity and variety of big data volumes from everyday trading transactions and market fluctuations may be enough of a catalyst for infrastructure migrations and optimization, now that social data is creeping into the mix, the business case becomes even more compelling.

Using unstructured social data in financial algorithms and analysis used to be a fantasy; now it is a reality. Imagine tracking Twitter or Facebook feeds and matching the sentiment of those feeds against market trends. The resulting correlations could have huge implications, especially when the information is processed to anticipate price moves in particular instruments. If there were a lot of negative social sentiment around a specific company, for instance, it could predict or even trigger a market change. As a result, it is becoming increasingly important for financial institutions to find a way to incorporate social data into their portfolios and then manage the associated risks.
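As a rough illustration of how such a match-up might be computed, the Python sketch below correlates a series of daily average sentiment scores with next-day price returns. The sentiment figures and prices are invented for the example; in practice the sentiment column would come from an NLP pipeline running over the social feeds.

# Illustrative sketch: correlating daily social-media sentiment with returns.
# The sentiment scores and closing prices below are made-up sample values.
import pandas as pd

data = pd.DataFrame(
    {
        "date": pd.date_range("2013-01-02", periods=6, freq="B"),
        "avg_sentiment": [0.12, -0.35, -0.10, 0.25, 0.30, -0.05],  # scored on a -1..1 scale
        "close": [101.2, 99.8, 99.5, 100.9, 102.0, 101.6],
    }
).set_index("date")

data["return"] = data["close"].pct_change()

# Does today's sentiment line up with tomorrow's price move?
lagged_corr = data["avg_sentiment"].shift(1).corr(data["return"])
print(f"lag-1 sentiment/return correlation: {lagged_corr:.2f}")

A real deployment would, of course, use far larger samples and control for market-wide moves, but the basic step of joining a sentiment series to a returns series looks much like this.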

Already, some firms are making strides in this regard by linking social media analysis services to their analytics engines. This allows them to monitor social media feeds, but it also compounds the variety and velocity problem. Not only does social data arrive in enormous volumes; it is generated at lightning-fast speeds and pulled from widely distributed sources, which complicates analysis unless the underlying infrastructure can handle all three V's of big data.

How Colocation Can Help

Colocating with a carrier-neutral data center provider offers financial institutions a more cost-effective way of processing and analyzing data. Not only do firms no longer need a large wide area network (WAN) to move data, but they also gain far better proximity to data sources. In a dynamic industry like financial services, where milliseconds can make all the difference, being located near market data feeds, liquidity venues and Internet exchanges is essential for maintaining real-time analysis of quickly growing, varied data.

Moreover, by being in closer proximity to major financial hubs, financial firms can more easily embed real-time analysis and social and consumer insights into trading algorithms, which is a major competitive differentiator. Within a multi-tenant colocation data center, firms gain a better network topology and a range of solutions under one roof, from which they can more easily draw market data, analysis and feeds. And with a carrier-neutral facility that hosts a variety of providers, financial institutions can ensure they have the lowest-latency connection available.

As the need for real-time data analysis and social data integration works its way into financial strategies, driven by big data, more and more firms are realizing the need to optimize their underlying infrastructure. Relying on legacy systems can only sustain such companies for so long. Without the proper foundation to handle the growing volume, velocity and variety of financial data, many firms will lose their competitive edge. But with carrier-neutral colocation data center facilities, financial institutions can take a step toward regaining that edge and set themselves up for a future of innovation.
