The Internet is ultimately about connections – one computer connecting to another to display a web page, and networks connecting to move the data across the web. The exchange points where these networks meet are some of the most valuable data center real estate in an information-driven world.
The busiest of these Internet intersections is a campus in Ashburn, Virginia housing five large data centers operated by Equinix Inc. Massive volumes of digital traffic flow between the nearly 200 networks that meet inside these buildings. These are the hubs powering the digital economy, and the strategic focal point for Equinix, which has built a prosperous business helping companies avert Internet traffic jams.
The central role of Equinix is noted by one of its co-founders, Jay Adelson, who is now CEO at Digg. On his profile page, Adelson notes that Equinix, “if shut off, would screw up the Internet.”
Equinix hosts most of the infrastructure for Digg and Salesforce.com, and provides colocation and interconnection services to a lengthy list of marquee customers, including Google, Yahoo, IBM, America Online, Akamai, Electronic Arts, GE and Merrill Lynch.
Network Meeting Place
Equinix data centers are where these companies make the physical connections between their networks, usually through traffic-swapping agreements known as peering. Any interruption in that high-speed bitstream would quickly ripple across the Internet.
That’s why the company’s data centers are supported by the industry’s sturdiest power and cooling infrastructure, with several levels of redundancy built into each system to ensure that power failures won’t knock servers offline. Visitors to Equinix data centers must pass through five layers of biometric security between the street and the servers. (See our video tour offering a look inside an Equinix data center).
Why is Ashburn, which lies about 30 miles west of Washington, such an important site? The Equinix campus is just north of a former UUNet facility that was a key hub in MAE-East, the Internet’s first major interconnection point. Equinix built its first data center here in 1998, providing a “carrier-neutral” facility where companies could gain access to Internet backbones operated by UUNet and AT&T.
That first Ashburn site, known as DC1, quickly became the Web’s busiest meeting place. Equinix filled DC1 and has built four additional data centers in Ashburn spanning nearly 500,000 square feet of space. Equinix benefits from the power of a “network effect” in which each new connection adds to the business value of its facilities.
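That network effect is easy to quantify: with n networks present in a facility, the number of possible bilateral peering pairs is n(n-1)/2, so each new arrival expands everyone else’s interconnection options. A back-of-the-envelope sketch (the function name is our own, and this counts potential pairs, not actual peering agreements):

```python
def potential_peerings(n):
    """Number of distinct pairs among n networks that could interconnect."""
    return n * (n - 1) // 2

# With the roughly 200 networks meeting in Ashburn:
pairs = potential_peerings(200)          # 19,900 possible peering pairs
added_by_newcomer = potential_peerings(201) - potential_peerings(200)  # 200 new pairs
```

The marginal math is the point: the 201st network doesn’t add one connection, it adds 200 possible ones, which is why dense exchange points keep getting denser.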
Interconnection as a Differentiator
“The value of the interconnection is our differentiator,” said Phillip Marangella, Director of Marketing Strategy and Market Analysis for Equinix. “Even in the current difficult economy, we’re still seeing significant demand and not much in the way of churn or price erosion.”
High-speed connectivity has expanded the geography of the data center market, allowing cloud-builders to chase cheap power in areas not previously known as technology hubs. But the peering and interconnection business remains tightly focused around the largest and oldest Internet markets, where the aggregation of providers delivers the most bang for the bandwidth buck.
That’s where you’ll find Equinix concentrating its development. The company has 42 data centers in 18 markets around the globe. But nearly half of those facilities are focused in just five major markets: Ashburn, Silicon Valley, Chicago, northern New Jersey and London.
Expansion Capital at the Ready
Equinix (EQIX) was the best performing stock on the NASDAQ market in 2004 and 2005. It reported $199 million in revenue in the first quarter, and expects $855 million to $875 million in revenue for 2009. The company has $284 million in cash, and on Monday announced plans to sell an additional $250 million in convertible debt “to fund the development of expansion opportunities.”
“Because of our financial stability, we are able to grow with our customers,” said Marangella, who noted that because of the credit crunch “some companies are unable to build out to meet their customers’ needs. We’re self-funding with our future capacity. We have the space.”
“We own 32 acres here,” said Detlev Geuss, Director of IBX Operations for Equinix at the DC campus. “There’s room for expansion, and we can build out as needed. We have a whole team that tracks demand and utilization.”
Demand for high-speed peering connections is being driven by the adoption of video, as seen in the uptake of 10 Gigabit Ethernet ports, effectively the biggest pipes available to manage peering traffic. The number of 10 GbE ports purchased by Equinix customers has more than doubled in the past year, from 102 to 224.
Peak bandwidth demand on the Equinix Exchange peering fabric in the U.S. has surged from 220 gigabits per second a year ago to 300 gigabits per second today. By any measure, that’s a lot of connections taking place.