
The rising environmental impact of AI

19th August 2024
Paige West

Google's recent revelation of a 48% increase in emissions over the past five years, driven by the energy demands of its data centres, has spotlighted the environmental toll of artificial intelligence.

Previously a less publicised issue, the significant consumption of computing resources and energy by the rapidly evolving AI industry is now coming to light.

As businesses race to embed AI into their products, processes, and services, the substantial environmental impact can no longer be ignored. Importantly, it's not just the training of AI models that significantly contributes to the carbon footprint; the continuous inference needed to process requests and generate responses also plays a major role.

Dr Leslie Kanthan, CEO and Co-Founder of TurinTech, explores the issue further.

Addressing AI's environmental impact requires comprehensive solutions, including the adoption of green energy and advancements in hardware. However, an often-overlooked aspect that is crucial for both AI performance and sustainability is the efficiency of AI code. Code efficiency is fundamental to reducing AI's carbon footprint and enhancing its operational effectiveness.

By focusing on code optimisation, companies can achieve immediate reductions in their AI's carbon footprint. This process involves refining the code to improve performance and reduce energy consumption, providing both environmental and financial benefits.
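As a purely illustrative sketch (the routine and data below are hypothetical, not drawn from any specific optimisation tool), this is the kind of refinement code optimisation can involve: the same calculation rewritten to use a vectorised library call instead of an interpreted loop, producing identical results in far fewer CPU cycles and therefore with less energy.

```python
# Illustrative only: a hypothetical before/after example of "refining the code".
import math
import numpy as np

def pairwise_products_slow(a, b):
    # Original: element-by-element loop in pure Python, with an intermediate list.
    result = []
    for x, y in zip(a, b):
        result.append(x * y)
    return sum(result)

def pairwise_products_fast(a, b):
    # Optimised: the same computation pushed down to NumPy's compiled kernels.
    return float(np.dot(a, b))

if __name__ == "__main__":
    a = np.random.rand(250_000)
    b = np.random.rand(250_000)
    # The rewrite must stay functionally equivalent to the original.
    assert math.isclose(
        pairwise_products_slow(a, b), pairwise_products_fast(a, b), rel_tol=1e-9
    )
```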

The Generative AI surge

The newest Generative AI (GenAI) models, such as ChatGPT and Copilot, require ever-increasing computational power for training and operation, with these demands doubling approximately every six months. While these models enhance AI capabilities, their escalating compute requirements come with significant environmental and financial costs.

For instance, it was estimated earlier this year that a single ChatGPT query consumes 15 times more energy than a Google search. Furthermore, running the model could cost as much as $700,000 per day. With major tech companies integrating GenAI into their search engines, these costs are set to soar.

With the demand for compute power outpacing advances in AI hardware, inefficient code exacerbates this unsustainable growth and its cost. Although major cloud providers such as AWS, Azure, and GCP offer optimised compute and memory services, they do not address the rapid expansion of data centres needed to run these large AI models. Smaller companies, reliant on open-source models for their AI systems, must find other ways to maximise energy efficiency.

The costs and challenges of inefficient code

When aiming to cut costs or optimise operations, companies frequently underestimate the importance of optimising AI code. This oversight allows inefficient code to persist, making AI models more environmentally and operationally unsustainable. At the same time, optimising code without the right tools is challenging.

Implementing AI also comes at a significant cost, with estimates suggesting that deploying even a basic AI system can cost around $50,000, considering hardware, software, and data requirements. The environmental costs start during the training phase. Five years ago, MIT Technology Review reported that training a single AI model could emit over 626,000 pounds of carbon dioxide, nearly five times the lifetime emissions of an average American car. Given Google's increased emissions, it's reasonable to expect a similar rise in emissions for training the latest AI models today.

The importance of code optimisation

Starting with code optimisation can help companies reduce the computational load on servers and data centres, thereby lowering their carbon emissions. This process also improves software efficiency, reduces latency, and enhances the user experience.

But how can engineers achieve this without resorting to laborious manual processes? The answer lies in using AI itself. Modern code optimisation platforms leverage GenAI and large language models, combined with input from human developers, to optimise AI code. These pre-trained models automatically identify inefficient code by scanning entire codebases and suggest improvements. Engineers can then evaluate these suggestions against the original code, ensuring that the optimised code performs the same functions more efficiently.
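To make that workflow concrete, the hedged sketch below illustrates the evaluation step (all function names and test data are hypothetical, and no specific platform's API is implied): run the original routine and a suggested rewrite on the same inputs, confirm they return the same results, and compare runtime as a rough proxy for energy use.

```python
# Hypothetical sketch of evaluating a suggested rewrite against the original code.
import time

def original_impl(values):
    # Original routine: build a list of squares, then sum it.
    squares = []
    for v in values:
        squares.append(v * v)
    return sum(squares)

def suggested_impl(values):
    # Suggested rewrite: a generator expression avoids the intermediate list.
    return sum(v * v for v in values)

def evaluate(candidate, reference, test_inputs):
    """Accept the candidate only if it matches the reference on every input."""
    for data in test_inputs:
        if candidate(data) != reference(data):
            return False, None
    start = time.perf_counter()
    for data in test_inputs:
        candidate(data)
    return True, time.perf_counter() - start

if __name__ == "__main__":
    inputs = [list(range(100_000)) for _ in range(20)]
    _, t_ref = evaluate(original_impl, original_impl, inputs)
    ok, t_new = evaluate(suggested_impl, original_impl, inputs)
    print(f"equivalent: {ok}, original {t_ref:.3f}s vs suggested {t_new:.3f}s")
```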

Significant benefits from efficient changes

Implementing code optimisation can yield substantial environmental and financial benefits. Improved model performance through code optimisation can result in a 46% reduction in memory and energy consumption, translating to nearly $2 million in annual savings in AI production and deployment. This approach can also save 256kg of CO2 emissions annually.

For data centres, which are central to Google's and the wider AI industry's rising carbon footprint, code optimisation can lead to massive carbon reductions. A 46% energy reduction could lower carbon emissions from 668kg of CO2 equivalent per server annually to around 360kg.
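For readers who want to check the arithmetic, the short calculation below reproduces the per-server figure: applying a 46% reduction to 668kg of CO2 equivalent leaves roughly 360kg a year.

```python
# Arithmetic behind the per-server figure quoted above.
baseline_kg = 668                 # annual CO2 equivalent per server before optimisation
reduction = 0.46                  # 46% energy (and emissions) reduction
optimised_kg = baseline_kg * (1 - reduction)
saved_kg = baseline_kg - optimised_kg
print(f"after optimisation: {optimised_kg:.0f} kg CO2e per server per year")  # ~361 kg
print(f"saved per server:   {saved_kg:.0f} kg CO2e per year")                 # ~307 kg
```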

Building a sustainable AI future

The AI industry urgently needs to address its sustainability challenges. While AI technology holds the potential to provide solutions, it must not be used as an excuse to overlook its significant environmental impact. The rising costs and processing demands underscore an imperative that is both financial and technical.

For companies, prioritising code optimisation offers immediate benefits by enhancing AI performance and significantly reducing costs and emissions. By focusing on this area, the sector can contribute to the broader effort of reducing the substantial energy and resource consumption of data centres.
