A new study by researchers at the University of California, Riverside, reveals that artificial intelligence tools, like ChatGPT, are using more water than initially thought, mainly for cooling purposes.
OpenAI’s ChatGPT alone requires four times as much water for cooling as previous estimates suggested. The researchers found that handling 10 to 50 user requests on AI chatbots can use up to two liters of water, whereas earlier estimates put the figure at just half a liter.
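For readers who want to sanity-check the ratio, here is a minimal back-of-envelope calculation. The 0.5-liter and 2-liter batch figures come from the article above; the per-request breakdown is an illustrative assumption, not a number from the study itself:

```python
# Back-of-envelope check of the figures quoted above.
old_liters, new_liters = 0.5, 2.0        # earlier vs. revised estimate per batch of requests
requests_low, requests_high = 10, 50     # one batch covers 10-50 chatbot requests

ratio = new_liters / old_liters               # how much higher the revised estimate is
per_request_low = new_liters / requests_high  # best case: 2 L spread over 50 requests
per_request_high = new_liters / requests_low  # worst case: 2 L over only 10 requests

print(f"Revised estimate is {ratio:.0f}x the earlier one")
print(f"Implied water per request: {per_request_low*1000:.0f}-{per_request_high*1000:.0f} mL")
```

Running this prints a 4x ratio and an implied range of roughly 40-200 mL of cooling water per request under the assumptions above.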
This higher water consumption stems from the cooling demands of the data centers that process these AI requests.
The concern isn't limited to ChatGPT; other tech companies face similar issues. Microsoft, for example, reported a 22.5% increase in water use for its AI models, while Google's and Meta's water usage rose by 17% over 2023 and 2024.
This growing water and energy demand is becoming a global concern. In the UK, data centers are projected to consume as much water as a city the size of Liverpool, while in Ireland they account for 21% of the country's electricity consumption.
The study underscores the environmental impact of expanding AI technologies.