AI chatbot water use: should you be concerned?
Summary
Estimates of how much water AI chatbots use differ widely, while data centres do rely on water for cooling and related processes.
Content
AI chatbot water use is under discussion because studies and companies report widely differing estimates of how much freshwater is consumed when people interact with chatbots. The data centres that run these models need water for cooling, and they also draw on water indirectly through electricity generation and hardware manufacturing supply chains. Some industry figures put per-query use at tiny amounts, while academic and industry groups have offered much larger estimates. The debate centres on total volumes, where data centres are located, and whether local water supplies face seasonal pressure.
Key facts:
- Company statements report small per‑query amounts: Sam Altman said ChatGPT uses less than 1/15 of a teaspoon for an average query, and a Google Gemini study reported under 0.3 ml per prompt.
- Some research gives higher figures: a 2023 University of California estimate put ChatGPT's use at about 500 ml of water for every 10–50 medium-length responses (reported as such).
- The UK Government Digital Sustainability Alliance reported AI-related global freshwater use could rise from 1.1 billion to 6.6 billion cubic metres by 2027 (reported as such).
- Data centres use water for cooling, for electricity generation and during hardware manufacture; many sites are noted as being near sensitive biodiversity areas and at times face seasonal peak demand.
- Regulators and researchers have called for more transparency and annual freshwater reporting for data centres; the European Union now requires data centres to report annual freshwater consumption.
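The per-interaction figures above can be put on a common scale with a quick unit conversion. This is only a back-of-envelope sketch: the sources measure different things (a query, a prompt, a "medium-length response"), so it shows order of magnitude rather than a strict comparison, and the teaspoon-to-millilitre constant is a standard US value assumed here.

```python
# Rough comparison of the per-interaction water estimates quoted above.
# Caveat: the underlying sources measure different units of interaction,
# so this is an order-of-magnitude sketch, not a strict comparison.

TSP_ML = 4.93  # one US teaspoon in millilitres (assumed conversion)

altman_ml = TSP_ML / 15   # "less than 1/15 of a teaspoon" per average query
google_ml = 0.3           # Gemini study: under 0.3 ml per prompt
uc_high_ml = 500 / 10     # 500 ml per 10-50 responses: upper bound per response
uc_low_ml = 500 / 50      # ...and lower bound per response

print(f"Altman figure: ~{altman_ml:.2f} ml per query")
print(f"Google figure: <{google_ml:.1f} ml per prompt")
print(f"UC estimate:   {uc_low_ml:.0f}-{uc_high_ml:.0f} ml per response")
```

On these assumptions the company figures sit around a third of a millilitre, while the academic estimate works out to roughly 10–50 ml per response, which is the gap driving the dispute described above.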
Summary:
Estimates and reported figures vary considerably, so the scale of AI chatbots' contribution to water demand remains disputed; attention is focused on local and seasonal pressure where data centres operate.
