
Is PUE K.O.’d? Water-cooled Data Center Builder Nautilus Says Yes

More hyperscalers should be weighing the carbon expended by all the activities that make their power consumable. One innovative builder suggests that, if they did, maybe they wouldn’t be hyperscalers.

Quick: What’s the power usage effectiveness (PUE) of your data center? And while you’re pinning that number down in your mind, did the formula used to derive that ratio take into account all the resources expended in delivering that power to your facility, and in processing or recycling the waste from your facility?

With Equinix now sharing its journey toward carbon neutrality with its customers and the public, breaking down all its carbon expenditures into three categories (including indirect use), the question of efficiency suddenly takes on new meaning. If a so-called “green fuel” produces more carbon during the refinement process than its consumption saves, compared to biodiesel fuel, then in chasing efficiencies, could we simply be pushing our own carbon burdens onto others?

Pepé Le PUE

“PUE is a terrible metric,” remarked Rob Pfleging, veteran engineer and President of Nautilus Data Technologies, speaking with Data Center Knowledge. “It’s the only one we’ve got, so it’s better than nothing. Personally, I think DCIE is more intuitive — it’s more intuitive for me to say ‘80 percent’ than ‘1.2.’ But it’s too easy to game the system. And there’s no regulatory teeth to PUE.”
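For those keeping the two metrics straight: DCIE (Data Center Infrastructure Efficiency) is simply the reciprocal of PUE, expressed as a percentage. A quick sketch of the conversion (note that Pfleging’s “80 percent” is a round number; a PUE of 1.2 works out to about 83 percent):

```python
def pue_to_dcie(pue: float) -> float:
    """DCIE (Data Center Infrastructure Efficiency) is the reciprocal of
    PUE: the share of facility power that actually reaches IT equipment."""
    return 100.0 / pue

def dcie_to_pue(dcie_percent: float) -> float:
    """The inverse conversion: a DCIE of 80 percent is a PUE of 1.25."""
    return 100.0 / dcie_percent

print(pue_to_dcie(1.2))   # ~83.3 -- Pfleging's "80 percent" is rounding
print(dcie_to_pue(80.0))  # 1.25
```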

If you’ve been following along with us this week, you’ve seen that Nautilus will be building a new data center facility on the site of an abandoned paper mill in Maine. (If I’m telling the story right, the new facility will be a new building on the old site, not inside an old building.) The way the site was originally constructed, water from an uphill reservoir flowed downhill, letting gravity drive the turbines of the on-site hydroelectric plant without the need for additional pumps.

Maybe not every data center project site will be as lucky as Millinocket, where the lay of the land could be leveraged to reduce power consumption. But if data center builders aim to follow Equinix’s lead and move not just toward low PUE / high DCIE but also toward carbon neutrality by the Paris Accord milestone date (which shifts around in public discussion, but may still be 2030), then they need every natural advantage they can possibly grab for themselves.

If their key performance indicators are squarely focused on whether they’re devoting most of their resources to computing rather than cooling (which is all PUE really measures, after all), then how will operators and engineers gauge their progress toward carbon neutrality?

PUE, Pfleging remarked, can be easily gamed. A facility that uses cooling doors, for example, may plug them into the same power strips as its IT loads and count that power expenditure as part of the denominator of the PUE formula (the IT load) rather than solely the numerator (total facility power). Or a hyperscale complex may count evaporative cooling toward its efficiency metric, but conveniently exclude the fact that it evaporates several hundred thousand gallons of potable water per day. Sure, processes sound more effective when we selectively exclude inefficiencies from our field of vision.
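To see how the cooling-door trick moves the number, here’s a toy calculation. PUE is total facility power divided by IT power, so reclassifying cooling power as IT load inflates the denominator and flatters the ratio. The wattages are invented for illustration:

```python
# Hypothetical loads, for illustration only
it_load_kw = 1000.0        # servers, storage, network
cooling_kw = 200.0         # in-row cooling doors
other_overhead_kw = 100.0  # lighting, UPS losses, etc.

total_kw = it_load_kw + cooling_kw + other_overhead_kw  # 1300 kW either way

# Honest accounting: cooling counts only toward total facility power
honest_pue = total_kw / it_load_kw                      # 1300/1000 = 1.30

# Gamed accounting: the cooling doors share the IT power strips, so their
# draw gets reported as "IT load," inflating the denominator
gamed_pue = total_kw / (it_load_kw + cooling_kw)        # 1300/1200 ~= 1.08

print(f"honest PUE: {honest_pue:.2f}, gamed PUE: {gamed_pue:.2f}")
```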

“What people aren’t taking into account is, the electricity associated with that water getting to me — which is large,” continued Pfleging.  “It’s not part of the number; it’s borne by the utility company. The electricity associated with that 700-800,000 gallons I’m taking in — or maybe I’m taking in a million, because I’m going to blow down a lot of chemicals and water off the cooling towers every day, that now go into the sanitary systems for them to do cleanup, and take those chemicals back out. There’s electricity associated with that.”
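The embedded electricity Pfleging describes can be roughed out, if you assume an energy intensity for supplying and treating the water. The figures in this sketch are illustrative assumptions, not measurements; real intensities vary widely by utility, source water, and treatment process:

```python
# All figures below are assumptions for illustration only.
gallons_per_day = 800_000

supply_kwh_per_kgal = 2.0      # assumed: pumping + potable treatment, per 1,000 gal
wastewater_kwh_per_kgal = 2.0  # assumed: sanitary-system cleanup, per 1,000 gal

embedded_kwh_per_day = (gallons_per_day / 1_000) * (
    supply_kwh_per_kgal + wastewater_kwh_per_kgal
)
print(f"{embedded_kwh_per_day:,.0f} kWh/day")  # 3,200 kWh/day, none of it in the PUE
```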

Hyperscale data center operation is all based on a template. It’s through this template that large facilities achieve scalability: by identifying the factors and characteristics that can be easily repeated, and building repeatability into designs. The emerging argument, however, is that some factors are selectively excluded from that accounting, and the exclusion will soon put hyperscalers into, to use the technical term, heaps of trouble, especially once carbon-neutrality transitions become unavoidable.

“The current data centers have painted themselves into a corner,” stated Jim Connaughton, Nautilus’ CEO and the former Chairman of the White House Council on Environmental Quality.  “They’re land-locked, they’re built in commercial zones, and they are now constrained from any meaningful future-proofing.” Connaughton points to moratoriums in Singapore and other major industrial centers, where the resource usage problems have become too great for cities’ infrastructures to handle.

“If you had a facility that was reasonably proximate to naturally cold water, you could retrofit that tomorrow, or the plant could be built new yesterday,” he continued, “and the economics of cooling with water would be sufficient to decide to completely replace your chilled-air system with a water cooling system.”

TRUE or False

Nautilus has suggested that the data center industry adopt a metric called Total Resource Utilization Efficiency (TRUE), for which two California-based Nautilus engineers received a patent last December. According to the patent, TRUE takes a plurality of climate-oriented factors into consideration, including the following:

…calibrating at least one of the power unit, the water unit, the compute system, the storage system, the power management system and the operating condition of the data center facility, and determining an environmental impact based on a plurality of environmental impact variables comprising a greenhouse gas (GHG) intensity, a Carbon Intensity, a Particle Matter Intensity and an SO2/NOX intensity, wherein, in determining the environmental impact, the method comprises: aggregating the plurality of environmental impact variables comprising the greenhouse gas (GHG) intensity, the Carbon Intensity, the Particle Matter Intensity, and the SO2/NOX intensity, based on a number of units produced per megawatt hour (MWh); and determining a water treatment chemical intensity, based on a number of units of chemicals used annually at the data center for water treatment, calculated using the number of units used for water treatment per megawatt hour (MWh).

Maybe you get the gist of the idea: Measuring one’s TRUE index would not be an easy thing to do. You can imagine Nautilus itself pulling it off, or at least a reasonable facsimile, but without other values with which to compare it, the number itself might fall flat.
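Still, read loosely, the claim language suggests a shape: a per-MWh intensity for each impact variable, aggregated into one composite figure. The sketch below is a toy reading of that language, not the patented method; the variable names and the unweighted sum are our own assumptions:

```python
from dataclasses import dataclass

@dataclass
class ImpactIntensities:
    """Environmental impact variables, each in units produced per MWh.
    The field names paraphrase the patent's list; they are not its API."""
    ghg: float                   # greenhouse gas intensity
    carbon: float                # carbon intensity
    particulate_matter: float    # particulate matter intensity
    so2_nox: float               # SO2/NOx intensity
    water_treatment_chem: float  # chemical units used per MWh

def aggregate_impact(i: ImpactIntensities) -> float:
    """Aggregate the per-MWh intensities into one composite figure.
    An unweighted sum is assumed here; the patent publishes no weights,
    and a real index would need per-variable normalization."""
    return (i.ghg + i.carbon + i.particulate_matter
            + i.so2_nox + i.water_treatment_chem)

print(aggregate_impact(ImpactIntensities(0.4, 0.35, 0.02, 0.05, 0.1)))
```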

“TRUE will take into account electrical consumption, water consumption, and use of chemicals to include ozone-depleting refrigerants,” noted Pfleging. But even then, the concept could evolve, he admitted. Perhaps it could also take into account use of existing or recycled materials.

“It’s evolving every day,” he added.  “But to me, it’s about total impact to the environment. And it also ought to be able to take into account net benefits you’re providing.” As an example, he cited a data center facility’s ability to pipe its heated output water to a next-door tenant that could actually put that hot water to use, perhaps in warming the building. Maybe someone should hand out points for that.

“To me,” Pfleging said, “TRUE should take into account all of those things. You shouldn’t judge a data center based on other data centers, globally. You should judge them based on other data centers in like environments.”

But maybe that defeats the purpose of a standard metric: enabling comparisons between everything on a level plane. Given that so many variables would need to be taken into account to produce any data center’s TRUE index, would it end up having any practical use?

We put this question to Omdia’s principal analyst for data center power and cooling, Dr. Moises Levy. Just four years ago, Dr. Levy proposed a metric of his own which, like TRUE, would require what he called “a multidimensional approach.”  (Omdia and Data Center Knowledge are sister companies of Informa Tech.)

In that proposition, he told Data Center Knowledge, “I state the importance of using comprehensive metrics to measure several aspects of performance, examined across four sub-dimensions: productivity, efficiency, sustainability, and operations. Risk associated with each sub-dimension is also contemplated, as well as external risk such as location and global risks.  Data center multidimensional metrics have been incorporated in 2019 ANSI/BICSI-002 and BICSI-009 data center standards and best practices, for which I am a contributor.”

There are many new ways, stated Levy, to measure the multitude of factors that a new metric would require, including from sensors that many data centers have already installed. “In ‘A new approach to data center infrastructure monitoring and management’ (DCIMM), I describe a non-invasive and wireless sensor-based system to enable real-time monitoring and optimization of parameters such as power, temperature, humidity, and airflow, among others.

“Historical data — much needed for comparisons and predictions — can be built over time. However, the real challenge is to be consistent and reliable on what to measure and how,” he went on.  “I consider that the key value of real-time comprehensive metrics is linked to AI–enabled predictive and prescriptive analytics. This helps users make more informed decisions to improve data center performance while reducing failures.”

Hyperscale data center development, remarked Nautilus’ Connaughton, “was right for its time. And it’s completely ripe for reconsideration and innovation.” Water cooling, he believes, is the key. Once the industry moves away from airflow as the principal cooling medium, “immediately, you can put two times more racks in there. Without even thinking. You can double the value of that footprint.”

Once those transitions are made, Nautilus’ executives believe, PUE levels will start to average around 1.1 (91 percent DCIE), and then descend from there. But after that point, when the numbers hover around 1.08 and 1.09, PUE / DCIE may become relatively useless. As long as airflow remains the medium, however, PUE can continue to serve for a while... perhaps as a measure of the inefficiencies of which facilities have yet to rid themselves.

Cover photo of a York International water-cooled chiller, circa 2006, in the public domain.