When data centers come to town, their water use gets a lot of attention. It's common to see large numbers thrown out. The numbers don't always make much sense on their own, but anything measured in millions or billions of gallons sounds impressive.
Take, for example, the estimate from Google that one of their data centers uses 450,000 gallons of water a day. This sounds like a lot–especially if it's paired with a comparison to an Olympic-sized swimming pool (660,000 gallons). A 2023 Washington Post report suggested that a large data center "can gobble up anywhere between 1 million and 5 million gallons of water a day – as much as a town of 10,000 to 50,000 people." Both comparisons emphasize scale.
Yet comparisons between sectors are not necessarily informative. Data centers are a commercial and industrial use, so it's not clear how comparing them to recreational and residential water uses clarifies much. If you compare the water use to that of other commercial water users, it may not seem so large. One comparison, again from Google, is that it takes 450,000 gallons of water to produce 160 pairs of cotton jeans.
No single comparison will do the job. Every comparison will have limits and exaggerations, even while it is perfectly defensible. That Washington Post comparison is a useful example. The Environmental Protection Agency says that the average American uses 82 gallons of water per day at home. Scaling this up to a town of 10,000 suggests around 820,000 gallons a day–with some more rounding, that gives us the one million gallons they use for a town of 10,000 people. Yet does it make sense to further scale this up to a town with five times as many people? Maybe, but that means ignoring potential economies of scale that would push the number down for larger cities.
Ignoring scale in the comparison comes on top of already rounding from 82 gallons per day per person to 100. That rounding adds more than 20 percent to our guess. And, remember, this compares an industrial user to residential users. Switching sectors is likely to confuse our comparisons, not clarify them.
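The arithmetic behind this back-of-the-envelope check is simple enough to write down. A quick sketch, using the EPA's 82 gallons per person per day and the town sizes from the Washington Post comparison:

```python
# Back-of-the-envelope check of the town comparison above.
# Inputs: EPA's 82 gallons per person per day, towns of 10,000 and 50,000 people.

PER_CAPITA_GPD = 82  # EPA estimate: gallons per person per day at home

town_small = 10_000 * PER_CAPITA_GPD   # 820,000 gallons/day
town_large = 50_000 * PER_CAPITA_GPD   # 4,100,000 gallons/day

# The rounded "1 million gallons for 10,000 people" figure implies
# 100 gallons per person per day.
rounded_per_capita = 1_000_000 / 10_000
inflation = rounded_per_capita / PER_CAPITA_GPD - 1  # ~22% added by rounding

print(f"Town of 10,000: {town_small:,} gal/day")
print(f"Town of 50,000: {town_large:,} gal/day")
print(f"Rounding to 100 gal/person inflates the estimate by {inflation:.0%}")
```

Even before questioning the choice of comparison, the rounding alone shifts the estimate by more than a fifth.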
Fundamentally, good comparisons are difficult. Even reasonable choices and starting points can add up to dramatically different places. Averages wipe out differences between types of data centers, so they can only take you so far. Still, these kinds of back-of-the-envelope estimates will always be useful for giving a range of what to expect. More broadly, we should expect resource intensity to decline over time because of natural incentives to cut costs.
Companies want to lower costs, so they find efficiencies
Efficiency improvements in computing's electricity requirements have been impressive. One team summarized the global trend as a six-fold increase in computing with only a one-quarter increase in energy use. A 2020 paper projected that even a doubling of computing would not dramatically increase energy use, assuming these efficiency improvements continue.
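The implied change in energy intensity is worth spelling out. A rough calculation, assuming a baseline of 1.0 for both computing and energy:

```python
# Rough arithmetic on the trend above (illustrative baseline of 1.0).
compute_growth = 6.0   # six-fold increase in computing
energy_growth = 1.25   # one-quarter increase in energy use

# Energy used per unit of computing, relative to the baseline.
energy_per_compute = energy_growth / compute_growth  # ~0.21
decline = 1 - energy_per_compute                     # ~79% drop in intensity

print(f"Energy per unit of computing: {energy_per_compute:.2f}x baseline "
      f"({decline:.0%} decline)")
```

In other words, the energy cost of a unit of computing fell by roughly four fifths over the period in question.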

In water, we’re seeing similar efficiencies arise. For instance, Microsoft has reduced its water intensity (water consumed per kilowatt-hour) by over 80% from its first generation of owned data centers in the early 2000s to its current generation in 2023. This enables far more computing with the same amount of water.
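The same kind of arithmetic shows what an 80% intensity reduction means in practice (our illustration, not Microsoft's own math): each gallon now supports roughly five times the computing it once did.

```python
# Illustrative arithmetic on the ~80% water-intensity reduction above.
reduction = 0.80                       # drop in water consumed per kilowatt-hour
remaining_intensity = 1 - reduction    # ~0.2x the original water per kWh

# Computing supported per gallon, relative to the early-2000s generation.
compute_per_gallon = 1 / remaining_intensity  # ~5x

print(f"Each gallon now supports about {compute_per_gallon:.0f}x the computing")
```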
These efficiency improvements are in addition to their private efforts to replenish water. Specifically, Microsoft's commitment is to replenish "more water than we consume across our global operations." These kinds of investments are common in the tech world. Google and Meta have similar endeavors to use water efficiently and to clean and reclaim water. These commitments sometimes even exceed the water the company uses–Google's goal is to "replenish 120% of the water" that they consume. Meta's goal varies by the water needs of the region–with a goal to "restore 200% of consumption in high water stress regions and 100% of consumption in medium water stress regions."
Many of the complaints about a data center's resource use miss these factors. Not only is there an inherent incentive for efficiency as companies cut costs; there are also social incentives for companies to invest in maintaining the resources that they use.
Careful comparisons will always be valuable
Localities are right to make careful plans and ensure they build systems that support all users. It’s particularly important that large users pay their share of the costs. Yet panicked and confused comparisons can easily mislead if they become divorced from overall trends.
This kind of optimism isn’t Pollyannish. Believing that data centers will increasingly find ways to get more from less over time is just a belief that the three decades of the technology sector’s history are a useful guide to the next three decades. In fact, this kind of dematerialization has been an unseen story in the modern world. Consider how many devices your phone replaces! We’ve done more with fewer resources–an economic and environmental success.
To the extent that recent news reports have highlighted rising resource use – energy, water, or emissions – these figures largely reflect temporary trends and upfront costs in developing AI. Policymakers and commentators would do well to think about the last few years, not the last few days, when considering responses.