The New York Times building in New York City, 2016 (Photo by Mike Coppola/Getty Images)

The New York Times to Replace Data Centers with Google Cloud, AWS

Add The New York Times to the quickly growing list of companies replacing their own data centers with public cloud services.

As it continues to modernize its infrastructure, the publisher is planning to shut down three of the four data centers hosting its content and internal applications in the near future, migrating most of the workloads to Google Cloud Platform and Amazon Web Services, Nick Rockwell, the company’s CTO, said in a phone interview with Data Center Knowledge.

The Times is a rare relative success story in the digital age among traditional big media organizations, all of which have been hit hard by the internet’s rise and the virtual takeover of the advertising market by Google and Facebook. The publisher has managed to temper steep declines in print advertising revenue with a growing digital subscription business (the last presidential election and The Times’ subsequent coverage of the Trump administration have boosted that growth substantially) and with expansion into new markets, such as its recent acquisition of the product review site Wirecutter.

This success story, however, has been powered by an outdated infrastructure backend, which Rockwell recently described as a “jumbled mess.” He joined the company near the end of 2015 after 10 months as CTO at Condé Nast, and since then his team has been working to bring that jumbled mess up to today’s standards.

And that usually means shifting as many applications as possible to the cloud. The Times already uses a virtual private cloud in AWS and a variety of Amazon’s public cloud services, in addition to running some apps in the Google cloud. It also has cages of equipment in leased data centers in Newark, Boston, and Seattle, as well as its own internal data center at The New York Times building in Times Square.

See also: How The New York Times Handled Unprecedented Election-Night Traffic Spike

Rockwell’s plan is to shut down the three leased sites, keeping only the internal facility in New York, which primarily hosts infrastructure for video editing, network equipment, and a few older applications that are hard to move to the cloud.

All applications that depend on Oracle databases will be deployed on AWS, while nearly everything else will run in containers on GCP, orchestrated by Kubernetes. “Plus, some other apps that we prefer to run on [virtual machine] instances will probably remain in AWS, mainly packaged enterprise IT apps,” he wrote in an email.
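
For illustration, here is a minimal sketch of what declaring such a containerized workload looks like with the official Kubernetes Python client; the service name, container image, and replica count are hypothetical placeholders, not details The Times has disclosed.

    # A hypothetical Deployment built with the official Kubernetes Python client
    # (pip install kubernetes). All names below are illustrative placeholders.
    from kubernetes import client, config

    config.load_kube_config()  # reads cluster credentials, e.g. for a GKE cluster

    labels = {"app": "content-api"}
    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="content-api"),
        spec=client.V1DeploymentSpec(
            replicas=3,  # Kubernetes keeps this many container replicas running
            selector=client.V1LabelSelector(match_labels=labels),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels=labels),
                spec=client.V1PodSpec(containers=[
                    client.V1Container(
                        name="content-api",
                        image="gcr.io/example-project/content-api:1.0",
                        ports=[client.V1ContainerPort(container_port=8080)],
                    )
                ]),
            ),
        ),
    )

    # Submit the Deployment; Kubernetes then schedules and supervises the pods.
    client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)

The appeal of this model, and part of why it suits a migration like the one Rockwell describes, is that the desired state is declared once and the orchestrator handles placement, restarts, and scaling.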

A critical piece of the publisher’s modernized infrastructure is a CDN (Content Delivery Network) operated by the San Francisco-based startup Fastly. It caches clients’ content on SSDs in edge data centers located in major metros around the world and provides an unusually high level of visibility into the way their traffic flows through its network, a key differentiator from the big traditional CDNs.
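
To make that caching behavior concrete, here is a minimal sketch, using Python’s requests library, of how a client can see whether a response came from a Fastly edge cache; the URL is a hypothetical placeholder, and X-Cache and X-Served-By are debug headers Fastly typically adds to responses.

    # Inspect the cache headers on a response served through Fastly.
    # The URL below is a placeholder, not an actual Times endpoint.
    import requests

    resp = requests.get("https://www.example.com/news/story.html")
    print(resp.headers.get("X-Cache"))       # "HIT" if served from an edge cache, "MISS" otherwise
    print(resp.headers.get("X-Served-By"))   # identifies the Fastly node(s) that handled the request
    print(resp.headers.get("X-Cache-Hits"))  # how many times the cached object has been served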

Using Fastly further reduces the amount of infrastructure The Times needs to deploy, in the cloud or otherwise.

The service that provides the primary API for mobile push notifications for its app, for example, sends out 25 million to 30 million notifications per minute when news breaks. The AWS cloud infrastructure supporting this service had to be scaled to roughly 40 times typical load, costing the company about $25,000 per month, Rockwell said.

His team is working to switch that service to GCP in combination with Fastly (the startup is a GCP partner). He expects the new setup to bring the price tag for delivering the service down to about $5,000 per month, the cost of Fastly’s service in this particular instance being “immaterial.”
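
The back-of-envelope arithmetic behind those figures looks roughly like this; the peak rate and dollar amounts come from the interview, while the typical load is inferred from the “roughly 40 times” figure.

    # Back-of-envelope math for the push-notification service, using the
    # figures quoted above. The typical (non-breaking-news) load is inferred.
    peak_per_minute = 27_500_000               # midpoint of 25-30 million/minute
    typical_per_minute = peak_per_minute / 40  # implied by the ~40x scaling factor
    print(f"typical load: ~{typical_per_minute:,.0f} notifications/minute")

    aws_monthly = 25_000        # cost of keeping AWS sized for the peak
    gcp_fastly_monthly = 5_000  # expected cost of the GCP-plus-Fastly setup
    savings = aws_monthly - gcp_fastly_monthly
    print(f"projected savings: ${savings:,}/month ({savings / aws_monthly:.0%})")

On those numbers, offloading the breaking-news spikes to the CDN edge would cut the monthly bill by about $20,000, or 80 percent.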

See also: Can Google Lure More Enterprises Inside Its Data Centers


About the Author

Yevgeniy Sverdlik is a San Francisco-based business and technology journalist and editor in chief at Data Center Knowledge, covering the global data center industry.


2 Comments

  1. Wes Brookes

    The first paragraph in the article states that they're getting rid of their own data centres and replacing them with cloud services. It then goes on to describe how they're in fact keeping their own data centre and getting rid of their third-party data centres. They're instead taking on other third-party data centres, the ones housing the public cloud services, which possibly sit in yet other third parties' data centres. Anyone still with me?......

  2. Yevgeniy Sverdlik (post author)

    I'm still with you, Wes. From my experience, IT and DevOps folks distinguish between "own" and "cloud," not necessarily "owned" and "leased." They don't care about the real estate aspect.