AI’s Drinking All The Water!
- Kieren Sharma

- Aug 12
- 5 min read
Updated: Oct 18
In our latest episode, we dove deep into a surprisingly pressing issue that's making waves (pun intended!) in the tech world: AI's growing thirst for water. What started as a seemingly innocuous detail in a blog post by OpenAI CEO Sam Altman has quickly escalated into a global concern, prompting us to ask: Is AI really drinking all the water?

How Much Water Are We Talking About?
The discussion kicked off with Sam Altman's blog post, “The Gentle Singularity”, where he mentioned that an average ChatGPT query uses about 0.34 watt-hours of electricity and a tiny one-fifteenth of a teaspoon of water. Sounds like nothing, right? But as we dug deeper, the numbers started to tell a different story.
OpenAI reportedly receives around 2.5 billion queries every single day. If you multiply Altman's figure by this staggering number of queries, it equates to roughly 200,000 gallons of water per day, or nearly 1 million litres.
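That multiplication is easy to sanity-check yourself. A rough back-of-envelope sketch, assuming 1 US teaspoon ≈ 4.93 mL and 3.785 litres per US gallon:

```python
# Back-of-envelope: Altman's per-query water figure times daily query volume.
TEASPOON_ML = 4.93          # 1 US teaspoon in millilitres
LITRES_PER_GALLON = 3.785   # 1 US gallon in litres

per_query_ml = TEASPOON_ML / 15     # "one-fifteenth of a teaspoon"
queries_per_day = 2.5e9             # reported daily ChatGPT queries

litres_per_day = per_query_ml * queries_per_day / 1000
gallons_per_day = litres_per_day / LITRES_PER_GALLON

print(f"{litres_per_day:,.0f} litres/day ≈ {gallons_per_day:,.0f} gallons/day")
# roughly 820,000 litres/day, or about 217,000 gallons
```

Which lands right around the figures above: a fraction of a millilitre per query, multiplied by billions of queries, adds up fast.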
However, this figure has been widely debated. Research papers suggest Altman's numbers might be significantly underestimated, possibly referring to smaller, older models or only a very narrow aspect of water use. For instance:
A paper titled “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models” found that 10 to 50 medium-length queries (each around a 100-word email) to GPT-3 evaporated approximately 500 millilitres of water.
The Washington Post, recalculating with the more modern GPT-4, found it used 519 millilitres of water for a single 100-word query. That is over half a litre per query, making Altman's initial estimate seem very low indeed.
Even more concerning, the International Energy Agency (IEA) estimates that OpenAI's facility, using about 500 megawatts of power, could be consuming around 10 million litres of water a day. This means each prompt would equate to about 4 millilitres of water, more than ten times Altman's figure.
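The per-prompt figure falls straight out of dividing the facility estimate by the daily query count. A quick sketch, assuming the reported 10 million litres/day and 2.5 billion daily prompts:

```python
# Per-prompt water use implied by the IEA-based facility estimate.
facility_litres_per_day = 10e6     # ~10 million litres/day
queries_per_day = 2.5e9            # reported daily ChatGPT queries

ml_per_prompt = facility_litres_per_day * 1000 / queries_per_day
altman_ml = 4.93 / 15              # one-fifteenth of a US teaspoon, in mL

print(f"{ml_per_prompt:.1f} mL per prompt, "
      f"about {ml_per_prompt / altman_ml:.0f}x Altman's figure")
```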
Globally, data centres currently consume 560 billion litres of water per year, a number the IEA expects to more than double to 1,200 billion litres per year by 2030. These are truly immense figures!
Why Does AI Need Water Anyway?
It might sound odd that computer systems require water, but there's a crucial reason: heat.
AI models run on incredibly powerful computers housed in massive data centres.
Unlike Central Processing Units (CPUs), which are good for single, fast tasks, AI relies heavily on Graphics Processing Units (GPUs).
GPUs are designed for parallel processing, handling many tasks simultaneously, making them much more powerful but also causing them to get way hotter. Remember Sam Altman's X post about GPUs “melting”? That's why.
To cool these hot GPUs, simple air fans are no longer enough. Data centres use sophisticated liquid coolant systems, and this is where water comes in.
Crucially, this isn't just any water; it needs to be clean, fresh (drinking) water to prevent issues like corrosion, salt buildup, or bacteria in the cooling pipes.
The cooling process involves pipes circulating coolant over processors; the coolant then transfers its heat to water in a heat-exchange unit. This hot water goes to cooling towers, where fans drive evaporation: around 80% of the water is lost to the air, carrying the heat away with it. Fresh water is therefore constantly lost and needs to be replaced.
To put this into perspective, a typical 100-megawatt data centre guzzles about 2 million litres of water a day, enough for 6,500 US homes. And OpenAI is building a 5-gigawatt facility – 50 times that size, capable of powering the entirety of London or 4.5 million toasters!
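The 2-million-litre figure is consistent with basic physics. A hedged upper-bound sketch, assuming water's latent heat of vaporization is about 2.26 MJ/kg and (unrealistically) that every watt of heat is rejected by evaporation:

```python
# Upper bound: if a 100 MW facility rejected ALL its heat by evaporating
# water, how much would it lose per day?
POWER_W = 100e6                  # 100 megawatts of heat to dissipate
LATENT_HEAT_J_PER_KG = 2.26e6    # energy absorbed per kg of water evaporated
SECONDS_PER_DAY = 86_400

kg_per_day = POWER_W / LATENT_HEAT_J_PER_KG * SECONDS_PER_DAY
print(f"~{kg_per_day / 1e6:.1f} million litres/day")  # ~3.8 million litres/day
```

Real facilities reject some heat by other means, so the quoted ~2 million litres/day sits comfortably below this evaporation-only ceiling.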
The Real Problem: Local Impact and Water Stress
While the global numbers are staggering, the real concern lies in the local impact. Data centres are often built in areas already struggling with water scarcity:
Two-thirds of all data centres commissioned or built in the United States since 2022 are located in high or extremely high water-stress zones.
Just five states host 72% of these new data centres: California (17), Arizona (26), Texas (26), Illinois (23), and Virginia (a staggering 67 new data centres since 2022). Virginia is particularly popular because it's where much of the internet's original infrastructure was built, offering faster data speeds.
This pattern is echoed globally, with tech companies establishing data centres in water-scarce countries like China, India, Saudi Arabia, and the UAE.
This has led to protests worldwide, from the Netherlands and Uruguay to Chile. Even here in the UK, Anglian Water recently objected to a new AI data centre in North Lincolnshire, citing insufficient water supply.
The unfortunate truth is that water costs and scarcity are often a later consideration when siting data centres, trumped by the price of energy and the need for high-speed internet infrastructure.
So, What's the Solution?
There are efforts underway to address this monumental challenge:
Big Tech & Scientific Innovation:
Most major tech companies have pledged to be “water positive” by 2030, replenishing more water than they consume, though specific plans are often vague.
Some CEOs optimistically suggest that scaling AI will accelerate research into new clean energy sources or cooling mechanisms.
More concretely, companies like Microsoft are investing in “zero-water” designs or closed-loop systems that recycle all the water used for cooling, preventing evaporation. OpenAI is also incorporating these into new facilities.
Amazon's AWS is exploring innovative methods like using treated sewage for cooling.
There's even research into using the heat generated by data centres to warm homes in cooler climates.
However, many of these solutions are still in the research or pilot phases, and for now, AI still relies on evaporating vast amounts of water.
What You Can Do: A Three-Step Decision Plan for AI Use
We’ve put together a simple three-step decision framework to help you consider your own AI usage:
The Social Value Test: Before you send a prompt, ask yourself: Is this genuinely valuable to me or my community, or am I just trying to kill time? Using AI for essential services or research is different from asking for a silly story when you could be doom-scrolling instead.
The Right-Tool Test: Do you actually need a giant cloud model for this task? Many tasks can be done with a smaller, on-device model, traditional search, or even just your own brain, significantly reducing water and energy usage. We explore this in our “Island Explorer” exhibit at We The Curious, building intuition about the water use of different information sources.
The Frequency & Batching Test: Can you ask fewer, better questions? Learning prompt engineering to be more efficient with your AI interactions can reduce the number of queries needed to get the desired result.
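Purely as an illustration, the three tests above could be sketched as a simple checklist. Everything here (the function and its parameter names) is invented for this example:

```python
def should_use_cloud_ai(has_real_value: bool,
                        needs_large_model: bool,
                        batched_prompt: bool) -> str:
    """Illustrative walk through the three-step decision plan."""
    if not has_real_value:
        return "Skip it: fails the Social Value Test."
    if not needs_large_model:
        return "Use a smaller tool: fails the Right-Tool Test."
    if not batched_prompt:
        return "Combine your questions first: the Frequency & Batching Test."
    return "Go ahead: one well-crafted prompt to the cloud model."

# A valuable task that a local model could handle:
print(should_use_cloud_ai(True, False, True))
```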
The picture of AI's energy and water consumption is complex and goes much deeper than we could cover in one short episode. But it's clear that the environmental footprint of this rapidly advancing technology is a critical issue that demands our attention, from global policy to our everyday choices.
If you enjoyed reading, don’t forget to subscribe to our newsletter for more, share it with a friend or family member, and let us know your thoughts—whether it’s feedback, future topics, or guest ideas, we’d love to hear from you!