
WHY AI IS A SUSTAINABILITY AND GOVERNANCE PROBLEM

By Junyu Ke

September 26, 2025 


It started with a rumor and a codename: Project Blue. In August 2025, Tucson residents raised concerns over how a secretive AI data-centre proposal might siphon hundreds of millions of gallons from a desert city already counting every drop. After weeks of protest, Tucson passed a new ordinance [1]: any user expecting to draw more than 7.4 million gallons per month must file a water conservation plan, face council review, and risk penalties for overruns. To put that threshold in perspective, during GPT-4’s training timeline, Microsoft’s Iowa facilities drew ~11.5 million gallons in July 2022—about 6% of the service district’s use that month.


What sounds like local drama is a preview of the next decade.  


Start with electricity. By 2030, global data-centre electricity use is projected to more than double to ~945 TWh, with AI-optimised loads more than quadrupling. That’s just under 3% of worldwide consumption—small globally, but big locally: data centres tend to cluster in a few hubs, putting significant strain on local grids; utilities are seeing unprecedented capacity requests.  


Then water. Cooling electronics is often an invisible cost until a crisis puts numbers on the table. It is estimated that data centres will consume around 1.2 trillion litres of water annually by 2030, up from about 560 billion litres today. Impacts will be most acute in water-stressed regions, where build-out is accelerating. In Texas, policy modelling suggests data centres could consume ~399 billion gallons annually by 2030—about 7% of statewide use—illustrating how quickly AI work can become a watershed issue. Researchers have been warning that AI’s water footprint varies sharply by where and when workloads run, and should be tracked alongside carbon.


None of this is hypothetical; it’s the new baseline for planning. 


AI runs on electricity, water, and land. Those are governed resources: they carry legal duties and regulatory claims, and they intersect with the reciprocal obligations articulated by many Indigenous scholars (e.g., Lewis et al., 2018) [2]. As AI scales, it becomes a water-energy-equity question for municipalities, raising questions about how to consult and co-govern with affected communities, including Tribal communities, when facilities, power lines, and water systems affect Indigenous lands and waters.


Back to Tucson. The municipal ordinance didn’t appear in a vacuum. Nearby Marana had already moved to ban potable water for data-centre cooling; Pima County began exploring industrial siting limits; and there is a deeper governance question: how will officials engage Indigenous peoples whose lands and waters are implicated in these decisions? That’s the pivot from technical choices to shared authority, from better chips or “greener” code to clear disclosures and Indigenous co-governance over the resources and knowledge that AI depends on.  


Learn from Tucson’s playbook: a threshold in gallons per month doesn’t just manage risk. It reveals whose voices count when AI growth meets finite resources. If a workload will push a community above meaningful thresholds, then we slow down and respect local limits. This could mean conservation plans, reclaimed or non-potable sources first, clear enforcement for overruns, full public access to filings, and more. Decisions don’t sit in silos—city, county, and Tribal authorities would have to coordinate so water and power planning follows the flow of the river, not just the flow of data.
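The threshold logic can be sketched in a few lines of code. This is only an illustration: the 7.4-million-gallon figure comes from Ordinance 12188, but the facility names and projected draws below are hypothetical.

```python
# Sketch of a large-water-user threshold check, loosely modelled on
# Tucson's Ordinance 12188. Threshold is real; the proposals are invented.

TUCSON_THRESHOLD_GALLONS = 7_400_000  # large-user threshold, gallons per month

def requires_conservation_plan(projected_monthly_gallons: float) -> bool:
    """Users expecting to exceed the threshold must file a conservation
    plan and face council review."""
    return projected_monthly_gallons > TUCSON_THRESHOLD_GALLONS

# Hypothetical proposals (for comparison, Microsoft's Iowa facilities
# drew ~11.5 million gallons in July 2022)
proposals = {
    "small-colo": 2_000_000,
    "ai-campus": 11_500_000,
}

for name, gallons in proposals.items():
    status = ("council review + conservation plan"
              if requires_conservation_plan(gallons)
              else "below threshold")
    print(f"{name}: {gallons:,} gal/month -> {status}")
```

The point of the sketch is that the rule is legible: a single public number, checkable by anyone, determines when review kicks in.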


Far from being a brake on innovation, this co-governance is what unlocks AI’s climate upside. By building trust, transparency, and shared authority, it helps remove the adoption barriers the IEA flags—data access, digital infrastructure and skills, regulatory and security constraints, and social or cultural obstacles—so we can deploy proven AI applications where they actually reduce emissions. In doing so, institutions can reduce the risk of reproducing appropriative, extractive knowledge economies—the kind of anonymous, low-accountability “hyper-intelligences” serving state or corporate agendas that Lewis et al. caution against—and move toward reciprocal relationships with land, water, and the communities who live with the outcomes.

 

[1] City of Tucson, Ordinance 12188 (adopted Aug. 19, 2025), Mayor & Council Regular Meeting (OnBase Meeting ID 1869), agenda item “Large Water Users,” attachment “ORDINANCE 12188.pdf.”

 

[2] For a full reading, please refer to: Lewis, Jason Edward, Noelani Arista, Archer Pechawis, and Suzanne Kite. “Making Kin with the Machines.” Journal of Design and Science, published July 16, 2018. https://doi.org/10.21428/bfafd97b  

Proudly supported by Western University, through the Western Sustainable Impact Fund, and the Department of Gender, Sexuality, and Women's Studies.

© 2025 RhizomeMind.

All rights reserved.
