New Microsoft GFS Site Highlights Best Practices

The network supporting online services for Microsoft can provide up to 3.5 terabits per second of capacity to deliver content to end users, the company said today. The new details about the company’s network infrastructure were included in an updated web site launched this morning for Global Foundation Services, which builds and operates Microsoft’s data center network.

The site, which provides significant details about Microsoft’s infrastructure, reflects a growing industry trend toward openness about aspects of data centers that were once closely held secrets. Built on Windows Azure, the new GFS web site showcases Microsoft’s best practices, and features videos in which Microsoft engineers provide deep dives into the company’s approach to data center operations, efficiency and security.

Data Center Video Tour

Highlights include a 10-minute video that provides a look inside Microsoft data centers in Chicago, Dublin and Quincy, Washington to demonstrate the company’s approach to data center design and how it has evolved since Microsoft deployed its first data center in 1989. Another video, titled Data Center 101, provides an introduction to the elements of the data center and how they work together to support IT infrastructure on a global basis. The videos feature 20- to 30-minute presentations by technologists, including Christian Belady, Dileep Bhandarkar and Vijay Gill.

Microsoft says its data center team will also blog more frequently about best practices and issues of interest to Microsoft’s cloud customers. The redesigned GFS site features a blog post by Belady and Gill discussing “operational excellence” spanning the company’s hardware, software and staffing.

Microsoft isn’t the first large data center operator to publicly share details of its operations. Facebook has published the specs and designs for its servers and data centers through the Open Compute Project, while Google dedicates a portion of its web site to its data center operations.

Focus on Cloud Networks

But the GFS site provides extensive insight into Microsoft’s technology and infrastructure, including its network.

“One of the world’s largest fiber backbones powers our data centers, providing more than 3.5 terabits per second of capacity to more than 1,200 networks with robust availability,” Gill and Belady write. “It provides the ability to instantaneously reroute around internet failures and the capacity to withstand significant network interruptions.”

Big networks are nothing new for Gill, who was previously a key player in network architecture at Google. Gill moved to Microsoft in June 2011 to become Director of Engineering.

“One of the things I’m focusing on is how to build a robust network infrastructure in a cost-effective manner,” said Gill, who will discuss additional details of Microsoft’s network in a presentation at NANOG 55 in June. “We have definitely switched around from our enterprise-style vertical network to a horizontally focused network.”

Microsoft distributes its content using an “edge network” that functions much like a CDN. As the company’s cloud business has grown, Microsoft has had to add infrastructure in many new locations, Gill said.

Broadening the Network & Conversation

“We are actively expanding our network reach,” said Gill. “As customer demand takes off for services, we are expanding the edge network and the core service, and scrambling to keep up.”

Belady, who is now the General Manager for Data Center Services at Microsoft GFS, said the focus on the network was part of an intentional effort to broaden the conversation about data center best practices.

“The industry has really been focusing on power and efficiency,” said Belady. “The reality is that resiliency has been about more than power. It’s about the network.”

Here’s the new look for the Global Foundation Services web site:

A screen shot of the new Microsoft Global Foundation Services web site.


About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.
