Will High Frequency Trading Become a Flash in the Pan?
April 17, 2014
Patrick Mannion is the Director of Data Center Strategy in Align’s Professional Services team.
To the typical data center salesperson, a client who wants to perform high-frequency trading (HFT) in their facility is a wonderful thing. These clients may be demanding on service levels, and they may push the envelope of a site’s capabilities or delivery timelines, but they recognize this and are willing to pay a premium for the elevated service they receive. Not surprisingly, the typical salesperson wants to take care of these clients and nurture them for the long term.
Given recent events, those salespeople and sales managers should be getting a little concerned, especially if they understand their clients’ business models. Most publicly, the Volcker Rule — part of the Dodd-Frank Wall Street Reform and Consumer Protection Act — threatens the basic legality of high-frequency trading.
Also, the recent publication of Michael Lewis’ book, Flash Boys, and the subsequent 60 Minutes exposé on the topic have brought serious concerns about the ethics of the practice into the zeitgeist. Combine these public relations stresses with the realities of the business, and one conclusion becomes apparent: HFT as a business practice is on thin ice.
Origins of High Frequency Trading
In 1998, when the U.S. Securities and Exchange Commission authorized the use of electronic exchanges for stockbrokers, we saw the first glimmer of firms using technology to game the system. Arbitrage plays, in which a stock could be bought from one player and sold to another at a significantly different price, became easier to spot, and automated execution put them only a few keystrokes away. Within a few years, trade execution went from being measured in minutes to less than a second, and trading volume on the major exchanges skyrocketed.
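For readers who want to see the mechanics, here is a minimal sketch of the kind of check an automated system might run. The venue names, quotes and fee figure are hypothetical, invented purely to illustrate the idea of a cross-venue price gap; real systems work from live market data feeds.

```python
# Minimal sketch of a cross-venue arbitrage check (illustrative only).
# Venue names, quotes and the fee assumption are hypothetical.

# Best quotes for the same stock on two venues: bid/ask in dollars.
quotes = {
    "venue_a": {"bid": 100.02, "ask": 100.04},
    "venue_b": {"bid": 100.09, "ask": 100.11},
}

FEES_PER_SHARE = 0.01  # assumed round-trip trading cost per share


def find_arbitrage(quotes, fees=FEES_PER_SHARE):
    """Return (buy_venue, sell_venue, profit_per_share) if buying on one
    venue and selling on another clears the assumed fees, else None."""
    best = None
    for buy, buy_quote in quotes.items():
        for sell, sell_quote in quotes.items():
            if buy == sell:
                continue
            profit = sell_quote["bid"] - buy_quote["ask"] - fees
            if profit > 0 and (best is None or profit > best[2]):
                best = (buy, sell, profit)
    return best


if __name__ == "__main__":
    opportunity = find_arbitrage(quotes)
    if opportunity:
        buy, sell, profit = opportunity
        print(f"Buy on {buy}, sell on {sell}: {profit:.4f} per share")
    else:
        print("No profitable gap at the moment")
```

The sketch only spots the gap; capturing it before anyone else does is where the speed described below comes in.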
Over the next decade, the gap between the trade bid and its execution continued to tighten, and an interesting dynamic began to appear: the Cold War arms race of the late 20th century was reborn in financial circles.
The quest for speed started with network connections and circuits, then moved to server hardware and trading platforms, and finally to physical relocation out of the financial giants’ data centers into colocation nearer the exchanges’ order management systems and matching engines. By eliminating every place a delay could occur, market participants battled each other in a “race to zero latency” along the path a trade travels. By 2010, the industry measured trade executions in milliseconds.
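To make that measurement concrete, here is a rough sketch of how a firm might sample round-trip times in software. The send_order function is a stand-in placeholder; real figures would come from timing the hop to the exchange’s order gateway and back, not from this no-op.

```python
# Minimal sketch of sampling order round-trip latency in microseconds.
# send_order() is a placeholder for submitting an order and waiting for
# the acknowledgement from the exchange's gateway.
import statistics
import time


def send_order():
    """Stand-in for the network hop to the order gateway and back."""
    pass


def measure_latency_us(samples=1000):
    """Return median and 99th-percentile round-trip time in microseconds."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter_ns()
        send_order()
        elapsed_ns = time.perf_counter_ns() - start
        timings.append(elapsed_ns / 1_000)  # nanoseconds -> microseconds
    timings.sort()
    p99 = timings[int(len(timings) * 0.99) - 1]
    return statistics.median(timings), p99


if __name__ == "__main__":
    median_us, p99_us = measure_latency_us()
    print(f"median: {median_us:.2f} us, p99: {p99_us:.2f} us")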
Trades Executed in Nanoseconds
As we enter 2014, many of the big players are talking nanoseconds within their platforms and microseconds for exchange-based executions. At some point soon, technology- and location-focused improvements will stall, and the theoretical maximum speed for a trade will be reached. HFT will no longer be about speed and throwing technology at the race, but about strategy and more sophisticated trading algorithms.
Of course, all of these high-speed circuits, high-end servers and routers and proximity-focused colocation come at a constantly increasing price, not to mention the army of programmers and system administrators behind the algorithms, who are trying to stay ahead of their opponents on the battlefield with new approaches and alternative trading strategies. In parallel, the share of orders that actually result in executed trades — known as the “fill rate” — is constantly decreasing as peers ramp up their own platforms and steal away opportunities as quickly as they appear.
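For clarity on what that fill rate means in practice, a tiny worked example follows; the order counts are invented for illustration, and a real desk would track this per strategy and per venue.

```python
# Minimal fill-rate calculation; the order counts are hypothetical.
orders_sent = 250_000    # orders submitted during a trading session
orders_filled = 9_500    # orders that actually executed

fill_rate = orders_filled / orders_sent
print(f"Fill rate: {fill_rate:.2%}")  # -> Fill rate: 3.80%
```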
As the race goes on, it becomes harder and more expensive for an HFT shop to compete. Between finding a bleeding‐edge technical advantage and building algorithms that can successfully find openings for trades, the brokerages are fighting to capture fractions of pennies on each execution. In fact, the TABB Group estimates that HFT‐focused profit is falling quickly — from $7.3 billion in 2009 to $1.3 billion in 2013 across all markets.1
Changes Within the Ecosystem: What Lies Ahead?
As their profit margins are squeezed from both directions, HFT firms will look to reduce their costs at every point in the chain. Focusing on their core competencies — strategy definition and execution, and the algorithms that drive them — they will make the business of HFT the priority and offload the physical management of the data center and its gear. This provides an opening for third parties to step in and provide value through equipment procurement processes, technology operations teams and business-aware, intelligent move, add and change processes.
There is another pressure to consider as well — the riskiness of the HFT proposition in general. In 2010 and 2013, the financial world was rocked by two so-called “flash crashes,” in which markets saw an immense drop in prices in an incredibly short period of time. Directly tied to the use of HFT and “black-box” strategies searching for arbitrage opportunities, these flash crashes were unpredictable and dangerous to the entire marketplace. Billions of dollars were lost and only partially regained in a matter of seconds, and there is no single explanation of what triggered the crashes or the recoveries across the entire marketplace.
Regulators are rightly concerned that these types of events will recur, possibly at a greater magnitude and with a diminished recovery, or none at all. That brings more uninvited scrutiny onto the HFT community, and that scrutiny has already uncovered the fact that most HFT shops hedge their risk through alternative HFT strategies. They effectively take other bets to control their losses, but those hedges rely on the same methods whose failure they are supposed to protect against.
Modern data center providers with an HFT client base are typically aware of these concerns. Across the industry, there is a movement afoot to contain these clients and hedge against the potential loss of momentum.
Some providers have custom-built facilities to house these clients, satisfying the clients’ appetite for proximity to their trading partners while reducing the potential losses to a small number of easily sold sites. Other providers have attempted to “salt-and-pepper” the clients with a wide vertical-market mix in the same facility in an effort to protect their investments. Whatever the approach, the fact remains: the existing client base of hundreds of thousands of pieces of equipment across tens of thousands of cabinets and circuits, all focused on high-frequency trading as we know it today, will not last forever.
When the change comes, what form will it take? Will the HFT platforms shrink, or be transformed into something smarter? Will regulators force these market players to change their practices, or will the exchanges take drastic action to limit the effect of HFT? If they do, what will the effect be on the data center industry? Will the financial vertical lose its lucrative luster, while those providers that cater to it become more commoditized? Will the New Jersey and Chicago markets, home to the exchanges and the major players, feel a contraction or even bottom out?