The Problem
The power demands of AI are growing exponentially and risk becoming unsustainable if this growth continues without intervention. The impacts could be wide-ranging, from the physical environment to society at large.
Training GPT-3 was estimated to have consumed 1,287 MWh (megawatt-hours) of electricity, equivalent to the typical annual usage of 477 average UK households (Broadway et al., 2024).
Training the GPT-3 language model can directly evaporate 700,000 litres of clean freshwater, and inference consumes a further 500 ml of water for roughly every 10–50 medium-length responses (Li et al., 2025).
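The headline figures above imply some simple per-unit equivalences, sketched below as illustrative arithmetic only; the underlying estimates come from the cited studies, and the constant names are ours.

```python
# Sanity-check of the cited figures (illustrative arithmetic, not new data).

TRAINING_ENERGY_MWH = 1_287   # estimated GPT-3 training energy (Broadway et al., 2024)
UK_HOUSEHOLDS = 477           # equivalent annual household usage in the same estimate

# Energy per household implied by the comparison, in kWh/year.
kwh_per_household = TRAINING_ENERGY_MWH * 1_000 / UK_HOUSEHOLDS

WATER_PER_BATCH_ML = 500      # per 10-50 medium-length responses (Li et al., 2025)

# Implied water footprint per single response, in millilitres.
water_per_response_ml = (WATER_PER_BATCH_ML / 50, WATER_PER_BATCH_ML / 10)

print(f"~{kwh_per_household:.0f} kWh per household per year")
print(f"{water_per_response_ml[0]:.0f}-{water_per_response_ml[1]:.0f} ml per response")
```

The first figure (~2,700 kWh/year) is consistent with typical UK household electricity consumption, which is why the comparison holds together.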
The Solution
We have been researching and assessing ways to maximise the efficiency of AI training and inference beyond hyperparameter tuning, and investigating bespoke low-power hardware as an alternative.
Creation of a robust benchmarking system for finding the optimal platform.
Creation of thorough guidance to support decisions on how to run workloads in the most sustainable way.
Consistent & uniform energy metrics across services to facilitate fair comparison.
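One way such a uniform metric could look is energy normalised per unit of work, e.g. joules per 1,000 tokens, so platforms with very different power draws and throughputs can be compared fairly. This is a minimal sketch under that assumption; the schema, field names, and figures are hypothetical, not an existing standard.

```python
from dataclasses import dataclass

@dataclass
class RunMeasurement:
    """Raw readings from one benchmark run on one platform (hypothetical schema)."""
    platform: str
    avg_power_w: float      # mean power draw during the run, in watts
    duration_s: float       # wall-clock run time, in seconds
    tokens_processed: int   # useful work completed during the run

def joules_per_kilotoken(run: RunMeasurement) -> float:
    """Normalise energy use to a common unit of work for fair comparison."""
    energy_j = run.avg_power_w * run.duration_s  # energy = power x time
    return energy_j / (run.tokens_processed / 1_000)

# Example: a fast, power-hungry platform vs. a slow, low-power one
# doing the same amount of work (figures invented for illustration).
runs = [
    RunMeasurement("gpu-server", avg_power_w=300.0, duration_s=60.0, tokens_processed=120_000),
    RunMeasurement("low-power-soc", avg_power_w=15.0, duration_s=900.0, tokens_processed=120_000),
]
for r in runs:
    print(f"{r.platform}: {joules_per_kilotoken(r):.1f} J per 1k tokens")
```

Normalising by work done, rather than reporting raw power draw, is what makes comparisons across heterogeneous platforms meaningful: a device drawing 20x less power is not automatically more efficient if it also takes far longer to finish.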
Outcomes and Benefits
To ensure that AI remains a sustainable solution, there is a need to raise awareness and understanding of its energy requirements and environmental implications by publishing energy-efficiency data.
Through our research we can help the industry to be transparent about the true monetary, environmental, and human costs of AI.
Our aim is to support informed decisions about AI technologies, from when to use them to the potential return on investment.
HPC Workflows