
Artificial intelligence (AI) might seem invisible, yet it has a significant environmental impact.
Even something as simple as asking a chatbot for recipe ideas with sweet potatoes comes with an environmental cost. Determining the exact scale of that cost, however, is challenging, as the corporations that own and operate the most popular AI chatbots are secretive about their data.
Recently, some studies have tried to quantify the problem. For instance, according to an article published in Nature Sustainability, the proliferation of AI servers in the United States could require between 731 million and 1,125 million cubic metres of water and add up to 44 million metric tonnes of carbon dioxide emissions annually between 2024 and 2030.
What do these numbers mean? Should we be worried? To answer these questions and more, Euronews Tech Talks reached out to Jonathan Niesel, leader of the information technology department at Greenpeace Germany, and Maja Kirkeby, a researcher focused on the energy consumption of software at Roskilde University in Denmark.
The environmental challenge of AI data centres
Data centres are specialised facilities that house all the computing infrastructure that IT systems require, including servers, data storage, and network equipment. These facilities are crucial for understanding the connection between AI and the environment.
They aren't new, as they power almost everything we do on the internet. However, data centres dedicated to AI require more electricity than those that power a simple internet search, and this can translate into much greater carbon emissions.
“Not all of the grids are based on 100 percent renewable energy, so of course, we have CO2 emissions corresponding to the energy use of these data centres,” Niesel explained.
Water can also be a concern when it comes to AI, as it is normally used to cool technological equipment in data centres. Therefore, water consumption depends on the infrastructure’s cooling technology and, indirectly, on how the energy it relies on is produced.
For instance, some data centres located in cooler regions rely on outside air to cool the equipment; in such cases, water use is significantly reduced.
The energy source that powers the data centre also has an impact on water usage, according to Niesel.
“If you want to reduce the overall water usage, first you go to a power supply,” Niesel said, underlining that data centres powered by renewable energy could require less water than those powered by non-renewable sources.
Finally, another factor that could aggravate AI’s environmental impact is the quantity of e-waste produced when the hardware underpinning the technologies reaches the end of its life.
“Due to the new infrastructure, all the new servers need to be built, and they only have a certain lifetime,” Niesel explained. “But we are also seeing a decrease in recycling rates.”
Different environmental costs
Not all requests to AI chatbots have the same environmental impact. “Pictures are way heavier than text generation,” Kirkeby said. “You have to run more computations to figure out what should be in the picture,” she added.
Some AI chatbots may also be more environmentally sustainable than others, but this is difficult to determine with the data that is currently available.
“It depends on whether you trust what they [the companies behind big AI chatbots] are saying or not,” Kirkeby said.
This summer, for instance, Google released a report analysing how much energy its AI chatbot Gemini used for each query. The company estimated that the median Gemini Apps text prompt uses 0.24 watt-hours of electricity, emits 0.03 grams of carbon dioxide equivalent, and consumes 0.26 millilitres of water, or about five drops.
But according to Niesel, these figures should be taken with a pinch of salt: “They didn’t actually say which products were observed or how many tokens were used for the research; they just said the overall estimate is this certain amount of energy”.
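To put Google's per-prompt medians in perspective, they can be scaled up with simple arithmetic. The per-prompt figures below are the ones from Google's report; the daily query volume is a purely hypothetical assumption for illustration, not a number Google has disclosed.

```python
# Back-of-the-envelope scaling of Google's reported per-prompt medians.
ENERGY_WH = 0.24   # watt-hours per median Gemini text prompt (Google's figure)
CO2_G = 0.03       # grams of CO2 equivalent per prompt (Google's figure)
WATER_ML = 0.26    # millilitres of water per prompt (Google's figure)

# Hypothetical volume, chosen only to make the scale tangible.
queries_per_day = 1_000_000_000  # one billion prompts a day

energy_mwh = queries_per_day * ENERGY_WH / 1_000_000   # Wh -> MWh
co2_tonnes = queries_per_day * CO2_G / 1_000_000       # g  -> tonnes
water_m3 = queries_per_day * WATER_ML / 1_000_000      # mL -> cubic metres

print(f"{energy_mwh:.0f} MWh, {co2_tonnes:.0f} t CO2e, {water_m3:.0f} m3 of water per day")
# -> 240 MWh, 30 t CO2e, 260 m3 of water per day
```

Tiny individual figures, in other words, compound quickly at the volumes at which popular chatbots operate, which is why the transparency of the underlying estimates matters.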
What can be done?
According to Kirkeby and Niesel, users can and should be thoughtful about the requests they make to AI, but the main environmental responsibility rests with the companies developing and operating these systems.
Niesel pointed out that AI developers can decide where to build data centres, taking into consideration how electricity is produced and the availability of water resources in a given area.
Kirkeby added that many technical choices can also make a difference. For example, some AI models wait for a batch of requests before processing them in parallel, which can be more environmentally efficient.
Finally, both users and AI developers can pay attention to the size of language models they use and develop. “Using a specialised model is a good thing to do if you want to be mindful of energy use,” Kirkeby explained, suggesting that AI tools should be proportionate to the task they are designed to perform.
As an example, Niesel said that an industry customer in need of an email classification algorithm should not use the biggest language model available, but rather an AI tool highly specialised in carrying out this task.