SAN JOSE, Calif. - Earlier this month, the Wall Street Journal reported that Bank of America, currently the second largest banking firm in the country, was in the midst of a shift from traditional data center hardware to hardware designed through the collective efforts of the Open Compute Project, the Facebook-led open source data center and hardware design community. The article quoted the bank’s CTO, David Reilly, saying he planned to have about 80 percent of the bank’s workloads running by 2018 on infrastructure fashioned after the web-scale designs companies like Facebook and Google have been building for themselves.
But BofA is only one example of a U.S. financial services giant considering the big changeover. Other Wall Street heavyweights are either testing the waters with Open Compute gear, preparing to buy it wholesale, or are already running it in production. One of them is Goldman Sachs, whose engineers have been involved with OCP from the project’s early years, and which has had a member on the OCP foundation’s board of directors since 2012.
After being involved in development of OCP hardware, firmware, and BIOS, Goldman is now gearing up for a wide-ranging deployment of OCP servers across its data centers. In an interview, Jon Stanley, senior technology analyst and vice president at Goldman’s IT and services division, said 70 percent of new server purchases the company is going to make this year will be OCP gear.
The company started buying servers from Hyve Solutions, one of the official OCP hardware vendors, last year, Stanley said. But Goldman has other suppliers lined up, which is one of the things that make this approach to IT procurement so attractive. There are always multiple vendors selling hardware built to similar designs, which ensures supply continuity and drives down the price for the end user.
‘Inevitable Thing to Happen’
The rise of something like OCP was an “inevitable thing to happen,” Grant Richards, Goldman’s managing director of global data center engineering, said while sitting on a panel of IT infrastructure heads from multiple major financial institutions at the Open Compute Summit in San Jose, California, Tuesday.
The usual process, in which vendors like HP or Dell design proprietary boxes and have them manufactured in Asia before shipping them to customer data centers, is slowly becoming obsolete. The latest sign was HP’s announcement this week that it had joined OCP and launched a line of OCP-compliant servers.
Goldman had to make a lot of adjustments to be able to “consume” OCP hardware, which comes with far less vendor involvement beyond delivery of the boxes themselves than Richards’ department was used to. Goldman changed as a company as a result. Many enterprise IT shops still have organizational barriers that prevent them from deploying something like OCP infrastructure at scale, and whether it is actually cost-competitive with “incumbent” gear is a matter of some controversy. But if more incumbent vendors make announcements similar to HP’s, other end users won’t face a learning curve as steep as the one Goldman faced.
Fidelity Looks Beyond Servers
Goldman is not the only pioneer financial services company driving open data center hardware forward. Another firm that’s been involved with OCP from its early days is Fidelity Investments. Like Goldman, Fidelity has been actively participating in development of the OCP specs and has been testing OCP servers in its data centers for several years. The company may now be considering a deployment at scale.
“I’m starting to see equipment that we can absorb,” Bob Thurston, director of integrated engineering at Fidelity, said on the panel. “This year particularly could be a very good turning point.”
Fidelity has made some substantial contributions to OCP and continues to do a lot of work. One of its biggest contributions was the “Bridge” rack, which can accommodate both traditional 19-inch-wide IT gear as well as the 21.5-inch chassis in some of the OCP designs.
One big ongoing OCP project at Fidelity is called the Open Sensor Network, which aims to bring some data center infrastructure management intelligence to the Bridge rack. Thurston’s team has designed its own sensors that measure temperature, humidity, and the amount of particulate matter in the air, and detect when a cabinet door opens or closes. The sensors currently feed data into Raspberry Pi computers, the tiny low-cost devices powered by ARM chips, but the ultimate goal is to store that sensor data on a Hadoop cluster and write analytics applications on top of it to help improve cooling and power efficiency.
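The pipeline described here, with rack sensors feeding a Raspberry Pi collector and the data eventually landing in Hadoop for analysis, might look roughly like the following sketch. This is purely illustrative: the field names and value ranges are hypothetical, not Fidelity’s actual schema, and the readings are simulated since a real deployment would poll I2C or GPIO sensor hardware.

```python
import json
import random
import time

def read_sensors():
    """Return one snapshot of rack-environment readings.

    Simulated values; a real Raspberry Pi collector would query
    physical temperature/humidity/particulate sensors instead.
    """
    return {
        "timestamp": time.time(),
        "temperature_c": round(random.uniform(18.0, 27.0), 2),
        "humidity_pct": round(random.uniform(30.0, 60.0), 2),
        "particulates_ugm3": round(random.uniform(0.0, 35.0), 2),
        "door_open": random.random() < 0.05,
    }

def batch_readings(n):
    """Collect n readings as newline-delimited JSON, a common format
    for bulk-loading records into HDFS for later analytics."""
    return "\n".join(json.dumps(read_sensors()) for _ in range(n))

batch = batch_readings(5)
print(len(batch.splitlines()))  # 5 records ready to append to a Hadoop-side file
```

Newline-delimited JSON is one convenient interchange format because each line is an independent record, which suits append-only storage and line-oriented Hadoop processing; any columnar or binary format would work just as well.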
Other Giants More Than Curious About OCP
Capital One, another major U.S. banking firm, also recently became a member. “Capital One’s gone to open source in a pretty significant way, and I think we’re going to do that for infrastructure,” Brian Armstrong, director of Open Compute and next-gen infrastructure at the company, said during the panel session.
The company has just joined, so exactly what its involvement in the open source project will look like remains to be seen. Right now, Armstrong and his team are figuring out where they can contribute and how to start contributing as soon as possible, he said.
Others on the financial services panel included Matthew Liste, managing director of cloud development at JP Morgan Chase (currently the largest banking firm in the U.S.), and Justin Erenkrantz, head of compute architecture at Bloomberg, the big financial information services company. Both companies are looking at ways to integrate Open Compute hardware into their environments.
As financial services companies morph into technology companies, they are increasingly focused on driving down the cost of their IT infrastructure while expanding what that infrastructure can do. IT and data center infrastructure is where the interests of the internet industry and the banking industry (and almost every other industry) genuinely align.
The internet giants built this technology for their own needs, so it will take more time for the Open Compute ecosystem to produce designs that can be adapted to a wider range of users. We’re already seeing signs of progress, but this is only the beginning of what many say will be a complete transformation of the way data center hardware is designed, produced, sold, and consumed.