LitBit’s AI to Help CBRE Manage Hundreds of Data Centers

The deal has the potential to result in the largest case study for AI in data center management yet.

Yevgeniy Sverdlik

February 8, 2018


In what promises to be the largest-scale application of AI to data center management yet, CBRE Data Center Solutions, which manages hundreds of data centers around the world, has agreed to deploy LitBit's AI-driven data center maintenance system across those facilities.

LitBit is a San Jose, California-based startup founded and led by Scott Noteboom, who in the past held data center executive roles at Yahoo and Apple. Although the company says its AI solution is aimed at the broad industrial facilities management space, the data center market is a particular focus.

The startup’s technology monitors data center mechanical and electrical infrastructure and environmental conditions on the data center floor to detect anomalies a human data center manager may not notice. The goal is predictive maintenance, or detecting issues before they cause major operational disruption.
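As a rough illustration of the idea (not LitBit's actual method, which the company has not disclosed), anomaly detection on infrastructure telemetry can be as simple as flagging sensor readings that deviate sharply from a rolling baseline. The sketch below, with hypothetical chiller temperature data, shows the principle:

```python
# Illustrative sketch only -- not LitBit's implementation.
# Flags sensor readings that deviate sharply from a rolling baseline,
# a simple stand-in for the anomaly detection described above.

from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Return indices of readings more than `threshold` standard
    deviations from the mean of the preceding `window` readings."""
    history = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(history) == window:
            mu = mean(history)
            sigma = stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                anomalies.append(i)
        history.append(value)
    return anomalies

# Hypothetical example: a steady chiller supply temperature (Celsius)
# with one sudden spike.
temps = [18.0 + 0.1 * (i % 3) for i in range(40)]
temps[30] = 25.0  # abnormal reading
print(detect_anomalies(temps))  # the spike at index 30 is flagged
```

Production systems use far more sophisticated models trained on many correlated signals, but the core task is the same: learn what normal looks like, then surface deviations early enough to act on them.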

To train its machine learning model, LitBit uses both existing historical operational data and input from human subject-matter experts. LitBit calls the resulting trained system an "AI persona," and this one's name is REMI, short for Risk Exposure Mitigation Intelligence.

Any machine learning model is only as good as the dataset that’s used to train it. The broader the dataset and the cleaner the data, the more accurate and effective the AI’s output will be. And that’s why LitBit’s deal with CBRE has the potential to become the biggest case study yet for the possibilities of AI in data center management.


The Los Angeles-based real estate services giant manages more than 800 data centers around the world on clients' behalf. Drawing on all the operational data it has access to, the expertise of its thousands of technicians, and the millions of machines under its management, CBRE said it will create "the world's largest actionable AI repository of machine operating data."

So far, the largest application of machine learning in data center facilities management that’s been disclosed publicly has been Google’s use of its DeepMind AI technology to improve energy efficiency of its massive data centers around the world. While Google’s facilities are huge – hosting what is probably the world’s largest cloud platform – they are operated by a single end user, highly standardized, and optimized to run a specific, limited set of applications.

That's a very different kind of data center fleet from the one CBRE manages for hundreds of traditional enterprise clients, such as banks and insurance companies. Taken as a whole, the portfolio hosts almost every imaginable model and generation of data center equipment. The operational dataset this portfolio could generate is potentially far richer and more varied than Google's.


But creating that dataset won’t be easy. First, older-generation facilities are not as well instrumented as modern data centers, presenting one potential problem in creating the data repository CBRE is promising. Additionally, because CBRE’s facilities contain such a diverse set of equipment, it will be difficult to create a dataset that’s clean enough for training LitBit’s model. Finally, not all companies whose mission-critical business infrastructure occupies these facilities will be comfortable piping their operational data into a centralized repository, for reasons such as security, compliance, and competition.
