
Wash Dishes or Watch Netflix? The Water Demands of Our Digital World

Imagine a world where, in order to use Spotify or Twitter, you had to sign an agreement saying you will only take one shower per week. Or think instead of how long it took to make your last purchase on Amazon. What if you were charged a “water usage” fee, depending on how long you browsed the site? These examples may seem far-fetched, but if more attention isn’t paid to the amount of water used by data centers, they could soon become reality for millions (if not billions) of people. As our appetite for social media, streaming platforms, self-driving vehicles and 5G cellphones grows, more data centers will be needed to provide these services, and they will need to become increasingly efficient to handle the demand.

Data centers have two main functions: running the information technology (IT) that delivers these services, and providing the power, cooling and security that keep it operating. There are several ways to cool a data center, but most often we see large quantities of water being used to “reject” the heat.

To understand the efficiency of data center cooling, we look at two metrics. The first is “partial power usage effectiveness” (pPUE), which measures how much power is being used for cooling versus computation. This is important because, depending upon which report you’re reading, data centers consume between 2–3% of the world’s energy, and 40% to 55% of that energy is typically used for cooling.
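To make the metric concrete, here is a minimal sketch in Python. One common formulation treats pPUE for the cooling subsystem as (IT energy + cooling energy) divided by IT energy; the numbers are hypothetical, with the 45% cooling share taken from the middle of the range above and all non-IT, non-cooling loads ignored for simplicity.

```python
# Illustrative sketch of the pPUE idea described above. One common formulation
# treats pPUE for the cooling subsystem as (IT energy + cooling energy) / IT energy.
# All numbers are hypothetical; the 45% cooling share sits in the middle of the
# 40-55% range cited above, and non-IT, non-cooling loads are ignored for simplicity.

def cooling_ppue(it_energy_kwh: float, cooling_energy_kwh: float) -> float:
    """Partial power usage effectiveness for the cooling boundary."""
    return (it_energy_kwh + cooling_energy_kwh) / it_energy_kwh

total_facility_kwh = 20_000_000            # hypothetical annual facility energy
cooling_kwh = 0.45 * total_facility_kwh    # 45% of energy spent on cooling
it_kwh = total_facility_kwh - cooling_kwh  # remainder assumed to be IT load

print(f"pPUE (cooling) ≈ {cooling_ppue(it_kwh, cooling_kwh):.2f}")  # ≈ 1.82
```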

The second (and arguably the more overlooked) metric is “water usage effectiveness,” or WUE. WUE is calculated as the annual data center water usage divided by the IT energy usage in kilowatt-hours (kWh). Water usage at a data center includes water used for cooling, regulating humidity and any electricity produced on-site. An extended version of the metric (often called source WUE) also accounts for the water used to generate electricity offsite (for example: steam evaporating from power plant cooling towers). The Lawrence Berkeley National Laboratory estimates that “it takes about 7.6 liters of water on average to generate 1kWh of energy in the US, while an average data center uses 1.8 liters of water for every kWh it consumes.” No matter which part of that sentence you focus on, it’s clear: there is a lot of water being used.
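To make that arithmetic concrete, here is a rough sketch in Python. The IT energy figure is hypothetical; the 1.8 and 7.6 liters-per-kWh values are the Lawrence Berkeley averages quoted above, applied loosely for illustration rather than as a formal accounting.

```python
# Illustrative sketch of the WUE calculation described above.
# Site WUE = annual on-site water use (liters) / annual IT energy use (kWh).
# The IT energy figure is hypothetical; 1.8 L/kWh and 7.6 L/kWh are the
# US averages cited above from Lawrence Berkeley National Laboratory.

def wue(annual_water_liters: float, annual_it_energy_kwh: float) -> float:
    """Water usage effectiveness, in liters per kWh of IT energy."""
    return annual_water_liters / annual_it_energy_kwh

annual_it_energy_kwh = 50_000_000                  # hypothetical 50 GWh IT load
onsite_water_liters = 1.8 * annual_it_energy_kwh   # at the cited on-site average
offsite_water_liters = 7.6 * annual_it_energy_kwh  # rough water cost of generation

print(f"Site WUE ≈ {wue(onsite_water_liters, annual_it_energy_kwh):.1f} L/kWh")
print(f"On-site water ≈ {onsite_water_liters / 1e6:.0f} million liters/year")
print(f"Water used generating that electricity ≈ {offsite_water_liters / 1e6:.0f} million liters/year")
```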

It’s important to note that the data center community has done the right thing by focusing on improving energy efficiency (reflected in pPUE). Making data centers more efficient in their use of electricity lowers the consumption of fossil fuels and other critical energy resources, which in turn lowers the amount of water consumed to generate that power. More attention now needs to be turned to the massive amount of water used for data center cooling.

We’re seeing a reduction in the use of water-based cooling towers in data centers, thanks to air-cooled chillers becoming more efficient. However, data centers in warmer geographies still use large quantities of water for pre-cooling. The opposite end of the spectrum is not doing much better: sites that skip pre-cooling do save water on-site, but they use more electricity to run their air-cooled chillers, which simply pushes the water use upstream to the power provider.

As our demand for digital services continues to increase, data centers will continue to proliferate, especially with the advent of 5G and edge computing. And while water is a renewable resource, it is not distributed equally across the globe. Data centers, however, will need to be spread fairly evenly across it in order to provide services with low latency.

Consider one of the world’s most populous cities, São Paulo, which suffered a devastating drought from 2014 to 2017, the worst the region had seen in a hundred years. At its worst, the city’s reservoirs were only 6% full, and severe rationing was imposed to avoid a total loss of the water supply. Against that backdrop, it is not hard to imagine a future where São Paulo residents must ration their digital activities and/or their water use because too many local data centers rely on water for heat rejection.

The São Paulo drought isn’t an isolated incident. According to leading meteorologist Dr. Jeff Masters, there were eight separate “billion-dollar droughts” in 2019, the highest number on record, with total costs coming to $23 billion. The 2012–2013 drought in the U.S. Midwest was just as severe, costing an estimated $35 billion.

No Water? No Problem

I believe air cooling (with water used for heat rejection) is quickly becoming a relic of the past, thanks to the adoption of cutting-edge “liquid cooling.” Liquid cooling, more specifically two-phase immersion cooling, can reject up to 98% of the IT heat load through a liquid-to-liquid heat exchange and is 40% more efficient than air cooling. In two-phase immersion cooling, the IT hardware is submerged in an environmentally friendly, non-conductive fluid, and dry coolers are used to reject the heat outside the data center. No water is used or required to reject the heat.

Here’s how two-phase immersion cooling works: the servers sit in a sealed tank of dielectric fluid with a low boiling point. Heat from the processors boils the fluid, the vapor rises and condenses on a cooling coil at the top of the tank, and the condensed fluid drips back down to repeat the cycle.
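To put rough numbers on that cycle, here is a small sketch based on a simple energy balance: the heat carried out of the tank equals the vapor flow multiplied by the fluid’s latent heat of vaporization (Q = ṁ · h_fg). The IT load and fluid properties below are assumptions chosen for illustration, not figures for any specific product.

```python
# Back-of-the-envelope energy balance for a two-phase immersion tank.
# Heat leaving the fluid as vapor: Q = m_dot * h_fg, where h_fg is the
# latent heat of vaporization of the dielectric fluid.
# Both values below are assumptions chosen purely for illustration.

IT_LOAD_KW = 100.0        # hypothetical tank holding 100 kW of IT hardware
H_FG_KJ_PER_KG = 100.0    # assumed latent heat of vaporization (kJ/kg)

# Vapor generated per second when the fluid absorbs the full IT load
# (kW divided by kJ/kg gives kg/s):
vapor_flow_kg_per_s = IT_LOAD_KW / H_FG_KJ_PER_KG
print(f"Vapor flow ≈ {vapor_flow_kg_per_s:.1f} kg/s")

# The condenser coil must remove the same 100 kW to turn the vapor back into
# liquid; that is the heat a dry cooler rejects outside the building, with no
# water consumed in the process.
```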

From a water-use perspective, liquid cooling is game-changing: water is no longer needed inside the data center, and because less electricity is consumed, a significant amount of water is also saved in the power generation phase. Existing data centers that switch to liquid cooling can completely eliminate IT fans, and the number of air handlers can be significantly reduced.
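As a hypothetical, back-of-the-envelope illustration of what eliminating on-site water means at facility scale, assume a 10 MW IT load running year-round at the US-average 1.8 L/kWh cited earlier; both the facility size and the uptime assumption are mine, not figures from the article.

```python
# Hypothetical example: on-site water avoided by a facility that moves from the
# cited US-average WUE of 1.8 L/kWh to zero on-site water for heat rejection.

IT_CAPACITY_MW = 10.0             # hypothetical facility IT load
HOURS_PER_YEAR = 8760             # assumes the load runs around the clock
BASELINE_WUE_L_PER_KWH = 1.8      # US average cited earlier

annual_it_kwh = IT_CAPACITY_MW * 1000 * HOURS_PER_YEAR
water_saved_liters = annual_it_kwh * BASELINE_WUE_L_PER_KWH

print(f"Annual IT energy ≈ {annual_it_kwh / 1e6:.0f} GWh")                             # ≈ 88 GWh
print(f"On-site water avoided ≈ {water_saved_liters / 1e6:.0f} million liters/year")   # ≈ 158
```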

Climate change is real. Unless we make serious changes soon, we will see more severe and longer-lasting droughts, along with other critical weather events. The good news is that two-phase immersion cooling can help combat destructive climate change. If all data centers used immersion cooling instead of air cooling, we could reduce the world’s total water consumption by nearly 200 trillion liters a year (and keep data centers running even in extreme drought), meaning you can have your Netflix and your dishwasher too.
