Since writing about the environmental impacts of AI and the importance of a responsible approach last September (2024), AI technologies have become even more deeply embedded in our digital infrastructure. Generative AI tools (e.g. ChatGPT, Gemini, Copilot) have been adopted by millions and are becoming as familiar as checking email. This is particularly concerning, as generative AI is typically where we see the greatest environmental costs. Previously, the focus was largely on the energy cost of training AI models, but attention is now shifting to the ongoing energy demands of running them. The tension between AI’s potential benefits and its growing footprint continues, but both big tech and governments worldwide have begun to acknowledge the challenge, which is a positive step in the right direction.
Given the pace of AI, things have of course moved on since September. Six months later, however, we feel that the message of responsible use remains the same. This blog provides an update on those developments and summarises where we are right now.
The current environmental landscape
Energy and water consumption remain huge factors in the conversation around impact. The data centres that power AI, and Large Language Models (LLMs) in particular, continue to require water for constant cooling, with Microsoft reporting year-on-year increases in operational water consumption. Figures from its environmental sustainability report show an increase from 4.2 million cubic metres in financial year 2020 (FY20) to 7.8 million cubic metres in FY23. It will be interesting to see the FY24 figures, given the company’s ongoing work on quantum computing.
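For context, those two reported figures imply a steep growth rate. A quick back-of-the-envelope calculation, using only the numbers quoted above, shows the scale of the increase:

```python
# Microsoft's reported operational water consumption (million cubic metres),
# as quoted from its environmental sustainability report.
fy20 = 4.2  # FY20
fy23 = 7.8  # FY23

# Total rise over the three-year period, as a percentage.
total_increase = (fy23 / fy20 - 1) * 100

# Equivalent compound annual growth rate over the same three years.
annual_rate = ((fy23 / fy20) ** (1 / 3) - 1) * 100

print(f"Total increase FY20-FY23: {total_increase:.0f}%")  # roughly 86%
print(f"Compound annual growth: {annual_rate:.0f}%")       # roughly 23%
```

In other words, the quoted figures correspond to consumption rising by around 86% over three years, or roughly 23% per year compounded.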
On top of this, we also need to consider the embodied emissions associated with AI and data centres. These include the mining of rare earth minerals, the energy-intensive manufacture of chips and hardware, and the emissions from transporting materials and products, typically from extraction sites in one country to factories in another. Another concern is the short lifespan of hardware components such as servers, GPUs, and CPUs, which typically last only two to five years. This is often driven by constant activity to meet demand and by frequent upgrades to keep pace with rapidly developing systems. While this issue isn’t unique to AI, the advancement and scale of AI technologies are likely to intensify it.
Unfortunately, many of these components cannot be fully recycled either, and the materials can be costly and hazardous to handle. This is a much larger issue and conversation, but it’s clear the industry will need to work hard to ensure a more sustainable future.
Shifting focus: From training costs to ongoing operational impact
As we’ve previously mentioned, training LLMs requires a lot of computational resources, along with significant power and water for cooling. Once trained, however, the models are deployed and continue to need a vast amount of energy for inference. Following DeepSeek’s emergence earlier this year (2025), OpenAI’s Chief Operating Officer told CNBC that ChatGPT had 400 million weekly active users as of February (2025), up 33% from 300 million in December (2024). This rapid growth highlights the immense ongoing energy demand required to keep these systems running at scale.
DeepSeek, which disrupted the market with its release earlier this year, is an example of an inference-heavy application. You can read our thoughts and advice on this tool in a previous blog post. While it was much cheaper to train, it is a reasoning model, which ‘produces responses incrementally, simulating how humans reason through problems or ideas’, and therefore has a much slower response time. Research from Hugging Face found that average responses were ‘6000 tokens long’, with some ‘containing more than 20,000 tokens’, requiring significant GPU use and implying a much higher inference cost. This development highlights how certain types of AI, even if cheaper to train, can place a greater ongoing strain on energy and computing resources because of how they operate.
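To illustrate why response length matters for inference cost, here is a minimal sketch. Only the 6,000-token average comes from the Hugging Face research quoted above; the 500-token ‘standard’ response length and the per-token energy figure are hypothetical assumptions chosen purely to show the ratio, not measured values.

```python
# Illustrative comparison of inference workload by response length.
# ASSUMPTIONS: the standard-model response length and the energy-per-token
# figure below are hypothetical; only the 6,000-token reasoning-model
# average comes from the Hugging Face research cited in the text.

AVG_TOKENS_REASONING = 6_000   # average reasoning-model response (Hugging Face)
AVG_TOKENS_STANDARD = 500      # hypothetical typical chat response length
ENERGY_PER_TOKEN_WH = 0.002    # hypothetical energy per generated token (Wh)

def inference_energy_wh(tokens: int) -> float:
    """Energy for one response, assuming cost scales linearly with tokens."""
    return tokens * ENERGY_PER_TOKEN_WH

ratio = AVG_TOKENS_REASONING / AVG_TOKENS_STANDARD
print(f"Reasoning-model response: {inference_energy_wh(AVG_TOKENS_REASONING):.1f} Wh")
print(f"Standard response:        {inference_energy_wh(AVG_TOKENS_STANDARD):.1f} Wh")
print(f"Ratio: {ratio:.0f}x the energy per response")
```

Whatever the true per-token cost turns out to be, the point stands: a model that generates twelve times as many tokens per answer does roughly twelve times the inference work, which is why reasoning models can carry a higher operational footprint even when training is cheap.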
Scale and the estimation challenge
As more people use the technology and it becomes embedded in daily life, the numbers around water and energy consumption are likely to continue rising. The National Engineering Policy Centre (NEPC) report projected a significant rise in the electricity demand of UK data centres from 2020 to 2050, reflecting the growing energy requirements of digital infrastructure. Although individual efficiency gains may be taking place, they are being outpaced by the scale of deployment.
In the last blog we mentioned that it can be difficult to get accurate data on environmental impacts, as precise numbers are closely guarded by big tech companies (Microsoft, Google, Amazon, etc.). Unfortunately, this remains the case, but there appears to be growing pressure for these numbers to be made available. The energy demand is clearly there: these companies are missing their sustainability targets (as noted in their 2024 sustainability reports), and Microsoft is even planning to reopen a decommissioned nuclear reactor to meet its needs. The NEPC report calls for standardised reporting; without decent data, we’re left with the challenge of measurement and estimation. Referring to the evolving demand for data centres and AI services, the International Energy Agency said in its World Energy Outlook 2024 that such projections should be made with caution and placed in the context of the broader energy sector.
While generative AI is just one piece of the puzzle, it’s clear that its environmental impact deserves attention. Of course, many other technologies and online activities have their own environmental impact too, and in a future blog, we’ll be putting those into perspective.
Are you currently promoting sustainable and responsible use of AI in your institutions? We’d love to hear from you.
If you work in the FE and Skills sector, you may be interested in joining Jisc’s FE and Skills Digital Sustainability Community. If so, please fill out the expression of interest form via this link and we will be happy to email you the invite to join the Teams channel. This online community space will be dedicated to fostering a collaborative environment where members can share ideas, resources, and best practices related to digital sustainability.
Find out more by visiting our Artificial Intelligence page to view publications and resources, join us for events and discover what AI has to offer through our range of interactive online demos.
Join our AI in Education communities to stay up to date and engage with other members.
Get in touch with the team directly at AI@jisc.ac.uk