
Could AI Factories for a Single Tenant be the Latest Data Center Trend?

A new type of data center is springing up amid the AI boom: "AI factories" – data centers focused on a single application by a single customer, marking a shift from shared multi-tenant data centers.

The standard structure of a colocation data center is to have dozens, if not hundreds, of customers all running different applications concurrently. But Nvidia has offered insight into a new type of data center – one with very few applications running and as few as one customer using it.

The Emergence of the ‘AI Factory’

Nvidia CEO Jensen Huang discussed this up-and-coming data center model on a recent earnings call with analysts.

“There is a new class of data centers and this new class of data centers, unlike the data centers of the past where you have a lot of applications running used by a great many people that are different tenants using the same infrastructure,” Huang said.

“These new data centers [host] very few applications, if not one application, used by basically one tenant and it processes data, it trains models and then generates tokens and generates AI. And we call these new data centers ‘AI factories.’”

He added: “We're seeing AI factories being built out everywhere. And my guess is that almost every major region will have and surely every major country will have their own AI clouds. And so, we're at the beginning of this inflection, this computing transition.”

The Nvidia head said this trend is currently taking place in India, Sweden, Japan, and France. Huang has since elaborated on this, stating that for AI to be truly effective, it has to reflect local language and cultural standards: the AI needs of Japan are different from those of Sweden. That’s why AI data centers and single-tenant AI factories are being built within individual nations.

Sizing Up to Deploy AI

Data centers from the big cloud service providers like Amazon and Google, and from major colocation providers like Equinix, tend to be massive, around the size of a football stadium. Given the enormous power draw of Nvidia’s Hopper processors, however, these AI factories are expected to be far smaller, closer to the size of a McDonald’s, because the available power is exhausted long before the floor space is.

A typical data center rack power budget is in the 6 kW to 8 kW range, but a single server optimized for running LLMs, such as Nvidia’s DGX H100, consumes about 11 kW of power, equivalent to the average power consumption of roughly 14 general-purpose servers, notes Manoj Sukumaran, principal analyst for data center compute and networking at Omdia.

“In such situations, you can run only a limited number of GPU servers like the DGX H100 in a typical data center,” Sukumaran told Data Center Knowledge. “If you take a 1 MW data center, you can deploy about 50 DGX H100 servers in it. That’s it. And to deploy AI at scale to a large number of concurrent users, you would need large clusters of such servers. This means a typical data center can cater to only a limited number of customers and quite possibly to only a single customer.”
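The arithmetic behind Sukumaran’s figure is straightforward once facility overhead is factored in. The sketch below is a rough illustration, not an Omdia model: the PUE value and the ~800 W general-purpose server draw (implied by the 14-server comparison) are assumptions layered on the numbers quoted above.

```python
# Back-of-envelope sizing for an AI cluster in a 1 MW facility.
# Figures from the article: ~11 kW per DGX H100-class server, and roughly
# 50 such servers per MW. The PUE and general-purpose server draw below are
# assumptions used to illustrate how that figure can fall out.

FACILITY_POWER_KW = 1_000    # 1 MW data center
DGX_H100_KW = 11             # per-server draw cited by Omdia
GENERAL_SERVER_KW = 0.8      # implied by the "about 14 servers" comparison
ASSUMED_PUE = 1.8            # hypothetical overhead for cooling and power losses

# Power left for IT equipment after facility overhead.
it_power_kw = FACILITY_POWER_KW / ASSUMED_PUE

dgx_servers = int(it_power_kw // DGX_H100_KW)
general_servers = int(it_power_kw // GENERAL_SERVER_KW)

print(f"IT power available: {it_power_kw:.0f} kW")
print(f"DGX H100-class servers supported: {dgx_servers}")       # ~50
print(f"General-purpose servers supported: {general_servers}")  # ~690
```

With a leaner facility (a PUE closer to 1.2), the same megawatt would support around 75 DGX-class servers, so the exact count depends heavily on cooling efficiency. The order of magnitude, a few dozen AI servers per megawatt, is the point.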

John Sasser, CTO of Sabey Data Centers, a US data center operator, says the company is seeing a significant uptick in the share of total demand coming from a variety of companies engaged in AI, all with common characteristics: high overall demand, high density, liquid cooling, and often more flexibility in the location of the data center.

The Future of AI Factories

Sasser says the most cost-effective design for single-purpose, GPU environments like AI factories will be a specialized data center designed exclusively for higher density and liquid cooling, and located optimally for the AI companies.

But such facilities are not that common right now. “While we occasionally have a single tenant lease out an entire building, most of our buildings have dozens of tenants, each with distinct needs,” he explained. “As such, our designs are future-proofed for significant densities and new technologies like liquid cooling, but we only implement these solutions on a by-tenant basis.”

Sukumaran says that while he has yet to see a dedicated AI data center used by a single tenant, it is possible that enterprises and governments using AI for various scenarios could end up building dedicated AI clusters.

“The power consumption of AI clusters would be a limiting factor to have a large number of servers in data centers, and it is very likely that some of these data centers would be dedicated to AI.”

The security and regulatory framework around AI could also drive this trend. Generative AI and the development of AGI have raised several security and compliance concerns, so it's possible that enterprises may decide to run such workloads from highly secure, dedicated facilities.

The AI Factory Versus the Data Center

With AI power density five to 10 times that of a traditional facility, an AI factory would not need the footprint of a traditional data center, which has grown north of one million square feet. An AI factory would be more like 10,000 sq ft, said Sean Farney, vice president of data center strategy for the Americas at real estate services firm JLL.

“They are going to be smaller because you can't build a 700,000 sq ft AI data center. The power that would be consumed by that thing would just be monstrous,” he said.
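Farney’s footprint estimate can be roughed out from the density multiple he cites. The baseline density below (about 100 W per square foot for a traditional hall) is an assumed rule of thumb, not a figure from the article, and the 10 MW load is purely illustrative.

```python
# Rough floor-area comparison implied by the 5-10x power density figure.
# The 100 W/sq ft baseline and the 10 MW load are illustrative assumptions.

TRADITIONAL_DENSITY_W_PER_SQFT = 100   # assumed baseline for a traditional hall
AI_DENSITY_MULTIPLIERS = (5, 10)       # range cited in the article

it_load_w = 10 * 1_000_000             # hypothetical 10 MW IT load to house

traditional_sqft = it_load_w / TRADITIONAL_DENSITY_W_PER_SQFT
print(f"Traditional footprint: {traditional_sqft:,.0f} sq ft")   # 100,000 sq ft

for mult in AI_DENSITY_MULTIPLIERS:
    ai_sqft = traditional_sqft / mult
    print(f"At {mult}x density: {ai_sqft:,.0f} sq ft")           # 20,000 and 10,000 sq ft
```

At the top of that density range, the same load that would fill a 100,000 sq ft traditional hall fits in roughly 10,000 sq ft, which is the figure Farney cites.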

Another difference between traditional data centers and AI factories is their location. Whereas giant data centers tend to be developed in remote areas next to renewable energy sources, AI factories could be situated in downtown or big-city areas and in existing facilities that have lots of power available.

“Right now, there's tons of underused office and retail space,” Farney said. “What becomes very, very attractive is an abandoned building or underused building in an urban space, or a part of an old warehouse in the middle of nowhere, that already has power where you can plunk down some AI gear, some liquid cooling and plug in and go.”

Although it’s impossible to predict the future of the data center industry, the rapid growth of artificial intelligence hints that AI factories may soon become a necessity, as digital infrastructure operators scramble to keep up with rising demand.
