The AI boom is colliding with a resource constraint: water

Everyone's focused on the GPU shortage, but the real constraint on America's AI boom might flow from the tap.
Data centers don't just consume electricity; they use massive amounts of water to prevent overheating. Servers generate enormous heat during computation, and water circulating through cooling systems absorbs that heat. Roughly 80% of that water evaporates into the atmosphere and can't be reused.
A single large facility can draw up to 5 million gallons per day, roughly the daily use of a town of 10,000 to 50,000 people. In Newton County, Georgia, one Meta data center consumes 10% of the entire county's water supply. In Northern Virginia, data centers used nearly 2 billion gallons in 2023, up 63% from 2019.
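A quick back-of-envelope check of those figures. Note that the per-capita residential use range of 100 to 500 gallons per day is an assumption for illustration, not a number from the reporting:

```python
# Sanity-check the article's water figures (assumptions labeled below).
DAILY_USE_GAL = 5_000_000           # large facility's daily draw, per the article
PER_CAPITA_GAL = (100, 500)         # ASSUMED residential gallons per person per day

# Town-size equivalence: how many people's daily use matches one facility
low = DAILY_USE_GAL // PER_CAPITA_GAL[1]    # heavy users -> smaller town
high = DAILY_USE_GAL // PER_CAPITA_GAL[0]   # light users -> larger town
print(f"Equivalent town: {low:,} to {high:,} people")

# Northern Virginia: 2 billion gallons in 2023, up 63% from 2019,
# which implies the 2019 baseline below
gal_2023 = 2_000_000_000
gal_2019 = gal_2023 / 1.63
print(f"Implied 2019 usage: ~{gal_2019 / 1e9:.2f} billion gallons")
```

Under those assumed per-capita rates, the 10,000-to-50,000-person equivalence in the article falls out directly, and the 63% growth figure implies Northern Virginia's data centers were already using well over a billion gallons a year in 2019.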
These projects were approved locally, driven largely by tax revenue incentives, but the regulatory frameworks weren't designed to assess whether water systems could handle the new demand. When those systems buckle, locals pay the price: water shortages during droughts, higher utility bills as municipalities scramble to expand infrastructure, and restrictions on residential use while data centers keep running.
Fortunately, communities have begun fighting back: 25 projects were scrapped in 2025, with 99 more at risk of cancellation. But communities that have already gone all-in are left to face the consequences.