News — Energy-hungry Artificial Intelligence has an unsustainable impact in terms of CO₂ emissions. Training large deep learning models (such as GPT and BERT) carries a significant environmental cost: training a model like GPT-3 (175 billion parameters) required 355 GPU-years (the GPU is the type of processor used for AI), cost an estimated $4.6 million in energy alone, and consumed around 1,300 megawatt-hours (MWh) for training alone, equivalent to the annual consumption of 130 homes in the US. Training BERT (a Google AI model), for example, produced 284 tonnes of CO₂, equivalent to the emissions of 125 transcontinental air journeys. The data centres hosting these models consume about 15 per cent of Google's total energy. There is an urgent need to move towards ‘green’ AI platforms that consume less energy, albeit at the expense of lower accuracy. It is estimated that Green AI (achieved in various ways, such as training on less data or opting for more sustainable processors) can reduce energy consumption and carbon footprint by up to 50 per cent or more, depending on the technique used.
These are the results of a study by Enrico Barbierato and Alice Gatti, researchers from the Department of Mathematics and Physics at the Università Cattolica del Sacro Cuore, Brescia campus, published in the journal IEEE Access. The study highlights that the main problem is that many companies still focus on maximising model accuracy, accepting the high environmental cost this entails.
Background
Artificial intelligence has become a major player in recent years, mainly due to deep learning, which has enabled revolutionary results in various fields. However, advanced models such as ChatGPT have a significant environmental impact due to the high energy consumption required for their training. In particular, so-called Red AI models, trained with resource-intensive methods on large datasets, maximise accuracy and performance but entail high energy costs and a considerable ecological footprint. In contrast, Green AI refers to models designed to reduce environmental impact through the use of smaller datasets, less wasteful training techniques, or the adoption of sustainable energy sources to power them. Green AI aims at efficiency rather than mere accuracy, the Italian experts explain. The energy cost of AI grows exponentially as model size increases: the energy required to improve a model's accuracy by 1 per cent is estimated to be on the order of 100 times higher.
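To make that scaling concrete, here is an illustrative back-of-envelope calculation. The function and the 1 MWh baseline are hypothetical examples; only the roughly 100-fold factor per percentage point comes from the estimate above.

```python
# Illustrative only: compound the ~100x energy factor cited above.
# The 1 MWh baseline is a made-up figure, not taken from the study.
def energy_for_gain(base_mwh, accuracy_points, factor=100):
    """Energy needed if each extra accuracy point multiplies cost ~100x."""
    return base_mwh * factor ** accuracy_points

print(energy_for_gain(1.0, 1))  # 100.0 MWh for +1% accuracy
print(energy_for_gain(1.0, 2))  # 10000.0 MWh for +2% accuracy
```

Even from a modest baseline, a couple of extra accuracy points pushes the hypothetical energy bill from one megawatt-hour into the tens of thousands, which is why the study treats accuracy maximisation as the core problem.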
The study
The paper suggests strategies to reduce the impact of Red AI, starting with improved computational efficiency through specialised hardware such as Tensor Processing Units (TPUs, a type of processor up to 30 times faster and up to 80 times more efficient than the CPUs in ordinary PCs) and GPUs optimised for lower power consumption. It can also be useful to apply techniques that reduce the number of parameters needed to train the model (up to 80 per cent fewer parameters) without reducing its performance; this can cut energy consumption by 30-50 per cent. Finally, one can opt for renewable energy to power the processors: some AI models have been tested running entirely on renewable sources, cutting emissions almost completely.
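One common family of parameter-reduction techniques of the kind mentioned above is magnitude pruning: weights with the smallest absolute values are zeroed out, leaving a sparser model. The sketch below is an illustration of the general idea on a toy weight list, not the specific method evaluated in the study; the 80 per cent fraction mirrors the figure cited above.

```python
# Minimal sketch of magnitude-based pruning (illustrative, toy data).
def prune_by_magnitude(weights, fraction):
    """Zero out the given fraction of weights with the smallest |w|."""
    n_prune = int(len(weights) * fraction)
    # Sort indices so the smallest-magnitude weights come first.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    pruned = list(weights)
    for i in order[:n_prune]:
        pruned[i] = 0.0
    return pruned

weights = [0.9, -0.05, 0.4, 0.01, -0.7, 0.03, 0.2, -0.6, 0.08, 0.5]
sparse = prune_by_magnitude(weights, 0.8)
kept = sum(1 for w in sparse if w != 0.0)
print(f"kept {kept} of {len(weights)} weights")  # kept 2 of 10 weights
```

In practice, frameworks apply the same idea to full weight tensors and then fine-tune the smaller model, which is how large parameter cuts can be achieved with little loss of performance.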
‘Sustainable AI is possible, but requires trade-offs between accuracy and energy consumption,’ the Catholic University researchers explain. ‘Red AI generates a huge carbon footprint, while Green AI tries to reduce it with more efficient methods and the use of clean energy. However, the transition is complex because companies still focus on larger and more accurate models at the expense of environmental efficiency,’ they conclude.