Artificial Intelligence and the Evolution of Data Centers

Charles-Antoine Beyney, Etix Everywhere

Charles-Antoine Beyney is Co-Founder and CEO of Etix Everywhere.

Data centers are proliferating to meet the relentless demand for IT capacity, and operators are seeking greater efficiency every day; each new innovation is a major step forward. To meet these requirements, Artificial Intelligence (AI) has arrived, holding tremendous promise for the industry.

Facility administrators and IT managers have several critical objectives for their data center operations, but none are as important as uptime and energy efficiency. According to 2016 research by the Ponemon Institute, the average cost of a single data center outage today is approximately $730,000. Unplanned outages are costly and time-consuming affairs that can be detrimental not just to a data center, but to an organization’s business operations and bottom line. The figures on data center energy consumption are equally compelling: worldwide, data centers consume approximately three percent of the global electricity supply.


Fortunately, AI is helping to reduce data centers’ energy consumption while improving uptime and cutting costs, all without compromising performance.

What is AI?

AI is technology that enables machines to execute processes that would otherwise require human intelligence. A machine endowed with AI is capable of interpreting data to form its own conclusions and make reasonable operating decisions automatically.

Many progressive businesses today are using AI to optimize resource management and to gain a leg up on the competition. A smart, AI-enabled data center is now a necessity for any business that wants to achieve an operationally efficient, high-performance computing environment.

Common use cases of AI include:

Long-Term Planning: Research and development teams may use AI to predict the short- and long-term implications of strategic business decisions. In a manufacturing setting, for example, AI could be used to make accurate, long-term environmental predictions. This data could be very useful for planning eco-friendly business initiatives.

Game Theory: Some executives are using AI to predict how markets will react to certain business decisions. An AI engine can compile data from many different sources and help executives to better understand how customers and investors will respond to corporate announcements.

Collective Robot Behavior: Imagine a scenario where an unmanned drone has to land on an aircraft carrier. A successful landing would require many different connected systems to act as one, exchanging data in real-time from a variety of sensors monitoring ocean conditions, temperature, the speed of the craft and other vehicles that are attempting to land. In this case, AI is used to control the “collective behavior” of the different systems.

These diverse business cases make AI one of the hottest branches of computer science and a top focal point for technology providers today. According to MarketsandMarkets, the global AI market is expected to grow at an astounding annual rate of 62.9 percent from 2016 to 2022, when it will reach $16.06 billion, with much of the increase driven by technology companies, including IBM, Intel and Microsoft, that serve high-performance data center computing environments.

AI and the Data Center Industry

The same AI applications and strategies that are being used to guide larger business decisions are now making their way into the data center. AI is being used in conjunction with data center infrastructure management (DCIM) technologies to analyze power, cooling and capacity planning, as well as the overall health and status of critical backend systems.

Google, for instance, acquired the AI startup DeepMind in 2014 and began using its technology to slash costs and improve efficiency in its data centers. The AI engine automatically manages power usage in certain parts of Google’s data centers by discovering and reporting inefficiencies across 120 data center variables, including fans, cooling systems and windows.

Using AI, Google was able to reduce its total data center power consumption by 15 percent, which will save the company hundreds of millions of dollars over the next several years. Additionally, the company has already cut the power consumed for cooling alone by 40 percent.

DCIM tools are software and technology products that converge IT and building facilities functions to provide engineers and administrators with a holistic view of a data center’s performance, ensuring that energy, equipment and floor space are used as efficiently as possible. In large data centers, where electricity comprises a large portion of the cost of operation, the insight these software platforms provide into power and thermal management accrues directly to an organization’s bottom line while reducing its carbon footprint.
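The headline efficiency metric these platforms track is Power Usage Effectiveness (PUE): total facility power divided by IT equipment power. A minimal sketch of the calculation, with illustrative numbers:

```python
# Power Usage Effectiveness (PUE): total facility power / IT equipment power.
# A PUE of 1.0 would mean every watt reaches the IT load; real facilities run
# higher because of cooling, lighting and power-distribution losses.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative numbers: 1,500 kW drawn by the whole facility, 1,000 kW by IT.
print(pue(1500.0, 1000.0))  # → 1.5
```

Lowering this ratio toward 1.0, by trimming the non-IT overhead, is precisely where AI-driven analysis of cooling and power data pays off.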

By combining AI with automated software platforms such as DCIM and with smart devices, businesses can make their data centers more secure and more eco-friendly, while improving uptime and reducing costs without compromising performance. With that in mind, why shouldn’t companies follow suit, or at least make AI a discussion point in 2017?

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
