
AI in the Data Center: Automation and Efficiencies Promise to Drive ROI

Investing in AI for data centers promises efficiency gains, cost reductions, and enhanced security, with the potential to deliver substantial ROI.


As companies begin to invest in AI and apply it to their business operations, leveraging AI in data centers and the data lifecycle promises to improve efficiency and reduce costs. AI can also be effective at enhancing security and helping to better manage data, ultimately benefiting both organizations and their customers.

GenAI has generated enormous hype since OpenAI launched ChatGPT in late 2022. Even as that hype begins to plateau, GenAI remains at the top of the C-suite agenda, according to a recent report from BCG. A majority (71%) of the executives BCG surveyed said they plan to increase their company’s tech investments in 2024, and 85% said they will increase their spending on AI and GenAI.

From GenAI Hype to ROI

When it comes to return on investment (ROI), the BCG report states that 54% of business leaders expect AI to deliver cost savings in 2024. Given these high expectations, the pressure is on organizations to determine how and where AI can be applied for the best business outcomes. In the case of data center operations, AI has the potential to transform the data lifecycle and improve the management of critical data center operations and infrastructure.

The effectiveness of AI depends on the quality of the data set: optimizing AI outcomes requires clean, current, high-quality data. To accomplish this, AI can be used to automatically classify and tag data based on its content, identify redundant, obsolete, and trivial (ROT) data that is no longer needed, and schedule that data for secure erasure. Once these initial, critical steps are taken, organizations can maximize their return on investment in AI in the following ways:


  • Automating the data lifecycle from ingestion and processing to storage and archiving. AI is poised to transform the data lifecycle by identifying and eliminating old and unnecessary data, producing higher-quality data sets that drive better business intelligence. It can also improve data quality in a variety of ways, such as correcting errors, resolving inconsistencies, and removing duplicates. AI is ideal not only for automating the data lifecycle but also for ensuring data is reliable, accurate, and managed with compliance and retention policies in mind. (A rough sketch of the ROT-identification step appears after this list.)

  • Analyzing large amounts of data from various touchpoints in the data center to predict equipment failures before they occur. AI is ideal for detecting patterns and anomalies in network traffic, temperature, and power usage (see the anomaly-detection sketch after this list). Going one step further, AI can not only anticipate problems before they happen but also schedule maintenance automatically, minimizing downtime without human intervention.

  • Continuously monitoring network traffic to detect security anomalies and identify potential threats. This application is increasingly important as GenAI is expected to fuel rapid growth in data volumes. The resulting expansion of the threat footprint will force data center operators to keep prioritizing security to protect sensitive data from cyberattacks and unauthorized access. Because AI algorithms can learn, they can improve threat-detection capabilities over time and take proactive measures to safeguard data and data center infrastructure.
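To make the ROT-identification step in the first bullet concrete, here is a minimal, hypothetical sketch in Python. The directory path, the seven-year age threshold, and the erasure hand-off are illustrative assumptions, not references to any specific product or API; a real deployment would layer content-based classification and compliance review on top of simple metadata rules like these.

```python
# Hypothetical sketch: flag files as redundant, obsolete, and trivial (ROT)
# candidates based on a simple age rule, then queue them for review and
# secure erasure. Paths, thresholds, and the hand-off are illustrative.
import time
from pathlib import Path

SEVEN_YEARS_SECONDS = 7 * 365 * 24 * 3600

def find_rot_candidates(root: str) -> list[Path]:
    """Return files not modified in roughly seven years."""
    now = time.time()
    candidates = []
    for path in Path(root).rglob("*"):
        if path.is_file() and now - path.stat().st_mtime > SEVEN_YEARS_SECONDS:
            candidates.append(path)
    return candidates

def schedule_for_erasure(paths: list[Path]) -> None:
    """Placeholder: hand candidates to the organization's secure-erasure
    workflow after legal and compliance review."""
    for path in paths:
        print(f"Queued for review and secure erasure: {path}")

if __name__ == "__main__":
    schedule_for_erasure(find_rot_candidates("/data/archive"))
```

In practice the age rule would be only one signal among many; content-based classifiers and retention policies would decide what is truly ROT.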
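The predictive-maintenance bullet rests on anomaly detection over facility telemetry. The sketch below, again purely illustrative, flags readings that deviate sharply from a recent rolling window; production systems would replace the simple z-score with trained models over many correlated signals.

```python
# Hypothetical sketch: flag anomalous telemetry (e.g., rack inlet temperature
# or power draw) with a rolling z-score. The threshold, window size, and
# sample data are illustrative assumptions.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=60, threshold=3.0):
    """Yield (index, value) pairs that deviate strongly from the recent window."""
    history = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(history) >= 10:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                yield i, value
        history.append(value)

# Example: a stable temperature trace with one injected spike
trace = [22.0 + 0.1 * (i % 5) for i in range(200)]
trace[150] = 35.0
for idx, val in detect_anomalies(trace):
    print(f"Possible anomaly at sample {idx}: {val:.1f} C")
```

A flagged reading could then open a maintenance ticket or trigger an automated inspection, which is the "schedule maintenance automatically" step described above.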


The Impact of AI on Data Center Sustainability

For all the benefits of AI, there is a downside to leveraging it in the data center that must be addressed in the near term: energy consumption. Data centers already consume vast amounts of energy and resources. For context, Google’s data centers consumed approximately five billion gallons of fresh water for cooling in 2022. While machine learning models can be trained to monitor energy demand and optimize data center efficiency, AI will undoubtedly drive huge spikes in energy consumption.
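As one illustration of how such a model might be used, the hedged sketch below fits a simple daily pattern to historical facility power draw and projects the next day's demand so cooling and capacity can be planned ahead. The data, features, and model are illustrative assumptions; real deployments would use richer telemetry and proper time-series methods.

```python
# Hypothetical sketch: fit a daily sinusoidal pattern plus a linear trend to
# historical facility power draw, then project the next 24 hours.
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24 * 7)  # one week of hourly samples (illustrative)
power_kw = 900 + 150 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 20, hours.size)

def daily_features(t):
    """Design matrix: intercept, linear trend, and a 24-hour sinusoid."""
    return np.column_stack([
        np.ones_like(t, dtype=float),
        t.astype(float),
        np.sin(2 * np.pi * t / 24),
        np.cos(2 * np.pi * t / 24),
    ])

coef, *_ = np.linalg.lstsq(daily_features(hours), power_kw, rcond=None)

future = np.arange(24 * 7, 24 * 8)
forecast = daily_features(future) @ coef
print(f"Peak forecast for the next 24 hours: {forecast.max():.0f} kW")
```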

There is also a direct correlation between the amount of data organizations store -- and the energy required to store it -- and their sustainability footprint. Once organizations identify data that is older than seven years and/or no longer required for legal or compliance purposes, it can be categorized as ROT and securely sanitized, erasing it for good. The result is lower energy consumption and lower costs associated with storing unnecessary data.

The International Energy Agency predicts global electricity demand from data centers, AI, and cryptocurrencies may more than double over the next three years. With more than 6,200 data centers in 135 countries and over 2,300 in the US alone, this doesn’t bode well for the global energy grid.

But there is some good news on AI’s sustainability impact: AI workloads run on specialized graphics processing units (GPUs) for increased computation capability, and GPUs are more energy-efficient than general-purpose CPUs for this work, especially when deployed in large cloud data centers. The answer may well be building more hyperscale data centers, which are far larger and more energy-efficient than traditional cloud data centers. For context, a traditional cloud data center may occupy 100,000 square feet, while a hyperscale facility can be 1 million or even 2 million square feet.

The Future Potential of AI

As organizations and data center operators move from investigation and investment in AI to implementation, the nearly unlimited potential of AI will take shape. Not only does AI have the potential to transform the entire data lifecycle, but it also has the power to produce higher quality data sets to drive better business intelligence and competitive differentiation.

Russ Ernst is CTO of Blancco Technologies. Russ joined Blancco in 2016 as executive vice president of products and technology, and in September 2022 he was named chief technology officer. He is responsible for defining, driving, and executing the product strategy across both the data erasure and mobile diagnostics product suites. Critical parts of his role include developing a strong team of product owners and cultivating an organizational product culture based on continuous testing and learning.
