AI Datacentre Energy Consumption Surges
Artificial intelligence systems could account for nearly half of datacentre power consumption by the end of 2025, according to analysis by Alex de Vries-Gao, founder of the Digiconomist tech sustainability website. This projection comes as the International Energy Agency (IEA) forecasts that AI will require almost as much energy by the end of this decade as Japan uses today.

De Vries-Gao’s calculations, to be published in the sustainable energy journal Joule, are based on the power consumed by chips made by Nvidia and Advanced Micro Devices used to train and operate AI models. The research also considers energy consumption by chips from other companies like Broadcom. The IEA estimates that all data centres – excluding cryptocurrency mining – consumed 415 terawatt hours (TWh) of electricity last year. De Vries-Gao argues that AI could already account for 20% of that total.
Factors Influencing AI Energy Consumption
Several variables affect these calculations, including datacentre energy efficiency and the electricity consumed by cooling systems for servers handling AI workloads. By the end of 2025, AI systems could consume up to 49% of total datacentre power, reaching 23 gigawatts (GW) – twice the Netherlands’ total power consumption.
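As a rough illustration of the scale of these figures (a back-of-envelope sketch, not de Vries-Gao’s actual methodology), the quoted percentages and power draw can be converted into annual energy like so:

```python
# Illustrative unit conversions for the figures quoted above.
# Assumption: 23 GW is treated as a sustained average draw over a full year.

DATACENTRE_TWH_2024 = 415   # IEA estimate for all datacentres, excluding crypto mining
AI_SHARE_NOW = 0.20         # de Vries-Gao: AI could already account for ~20%

ai_twh_now = DATACENTRE_TWH_2024 * AI_SHARE_NOW
print(f"AI's share of last year's total: ~{ai_twh_now:.0f} TWh")  # ~83 TWh

HOURS_PER_YEAR = 8760
ai_gw_2025 = 23
ai_twh_2025 = ai_gw_2025 * HOURS_PER_YEAR / 1000  # GW x hours -> TWh
print(f"23 GW sustained for a year: ~{ai_twh_2025:.0f} TWh")  # ~201 TWh
```

On these assumptions, AI would already draw roughly 83 TWh a year, rising toward 200 TWh if the 23 GW projection held continuously.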
However, waning demand for applications such as ChatGPT and geopolitical tensions affecting AI hardware production could slow growth. For instance, export controls on Chinese access to chips spurred the development of the DeepSeek R1 AI model, which was trained using fewer chips. “These innovations can reduce the computational and energy costs of AI,” said de Vries-Gao.
Concerns About Increased AI Usage
Any efficiency gains might encourage more AI adoption, potentially increasing hardware demand. The trend of “sovereign AI,” where countries develop their own AI systems, could also boost hardware requirements. A US datacentre startup, Crusoe Energy, has secured 4.5GW of gas-powered energy capacity, with OpenAI among potential customers through its Stargate joint venture. This raises concerns about increased dependence on fossil fuels.
Need for Transparency in AI Energy Use
Prof Adam Sobey from the UK’s Alan Turing Institute emphasised the need for more transparency about AI’s energy consumption and its potential to improve efficiency in carbon-emitting industries. “I suspect we don’t need many very good use cases [of AI] to offset the energy being used on the front end,” Sobey said.
The EU AI Act requires AI companies to disclose the energy consumed in training their models, but not that of day-to-day usage. With major tech companies such as Microsoft and Google admitting that their AI expansion is endangering their environmental targets, the need for sustainable AI practices becomes increasingly critical.