We're shielded from the environmental impact of our apps, but the effect is very real. Not only are electricity shortages forcing the Norwegian government to choose between ammunition production and TikTok storage, but the amount of water data centres need for cooling is truly astounding.
The paper estimates that training GPT-3 at Microsoft's state-of-the-art US data centres would consume 700,000 litres of clean freshwater. This is due to the sheer scale of the operation: Microsoft has revealed its supercomputers contain 10,000 graphics cards and more than 285,000 processor cores.
And that's a conservative estimate, because training could also be done at the company's less efficient Asian data centres. In that case, the paper estimates, water consumption could rise as high as 4.9 million litres.
The upshot of this is that consumers engaging in a 20-50 question conversation with ChatGPT will see the bot 'drink' a 500ml bottle of water, the researchers say.
'While a 500ml bottle of water might not seem too much, the total combined water footprint for inference is still extremely large, considering ChatGPT's billions of users,' the paper adds.
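The back-of-the-envelope maths behind that claim is easy to reproduce. The sketch below uses only the figures quoted above (roughly 500ml per 20-50 question conversation); the daily conversation count is a hypothetical number chosen purely for illustration, not a figure from the paper.

```python
# Rough water-footprint arithmetic from the article's figures:
# ~500 ml of water per 20-50 question ChatGPT conversation.
ML_PER_CONVERSATION = 500
QUESTIONS_PER_CONVERSATION = (20, 50)  # (low, high) questions per chat

def water_per_question_ml(conversation_ml=ML_PER_CONVERSATION,
                          questions=QUESTIONS_PER_CONVERSATION):
    """Return a (low, high) estimate of water per question, in ml."""
    lo_q, hi_q = questions
    # More questions per conversation means less water per question.
    return conversation_ml / hi_q, conversation_ml / lo_q

low, high = water_per_question_ml()
print(f"{low:.0f}-{high:.0f} ml per question")  # 10-25 ml per question

# Scaled to a hypothetical 100 million conversations a day
# (an assumption for illustration, not the paper's figure),
# the aggregate footprint reaches tens of millions of litres daily.
daily_conversations = 100_000_000
daily_litres = daily_conversations * ML_PER_CONVERSATION / 1000
print(f"{daily_litres:,.0f} litres per day")  # 50,000,000 litres per day
```

Even at 10-25ml per question, the footprint is invisible to any individual user; it only becomes alarming in aggregate, which is the researchers' point.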
Why do data centres need water?
It's all about cooling. Data centres, whether used to train algorithms or store TikTok videos, have a lot of computer hardware inside. This in turn generates a lot of heat.
If left unchecked, this heat could damage the equipment, so server rooms are kept between 10 and 27 degrees centigrade. Because racks and racks of computers emit heat constantly, data centres evaporate water to keep things cool.
Not just any water, either. It has to be fresh water, because using seawater could cause corrosion of the hardware.
That's not a great look when droughts are very much a thing across the world. And, while it's not as headline-grabbing as tech's carbon footprint, big companies are keen to show they're working on the problem.
The paper on AI chatbots' water consumption starts with a selection of quotes from Meta, Amazon and Google. These highlight their awareness of the importance of water security. 'Water is a finite resource, and every drop matters,' reads Facebook's 2020 Sustainability Report.
How to reduce big tech's thirst for water
The paper has a few suggestions as to how AI chat models could be more water efficient.
First, when and where AI models are trained makes a huge difference. Cooler ambient temperatures require less water. This means that models could be trained at night and/or in locations with colder climates.
There is an environmental dilemma here, though. Hotter climates obviously have more sun, and solar power certainly helps to reduce tech's massive carbon footprint. But, in an environmental catch-22, that heat requires more water consumption. 'In other words, only focusing on AI models' carbon footprint alone is far from enough to enable truly sustainable AI,' the researchers say.
What can consumers do? That's a bit harder to answer. They could theoretically time their ChatGPT requests for 'water-efficient hours', in the same way that people sometimes run laundry loads overnight when electricity is cheaper. But that's currently difficult for even the most conscientious user, because chatbot makers keep such data quiet.
'We recommend AI model developers and data centre operators be more transparent,' the researchers conclude.
'When and where are the AI models trained? What about the AI models trained and/or deployed in third-party colocation data centres or public clouds?'
'AI models' water footprint can no longer stay under the radar – water footprint must be addressed as a priority as part of the collective efforts to combat global water challenges,' the paper concludes.