In a quiet data center somewhere in the world, tucked between endless rows of servers and miles of cabling, an artificial intelligence model is running at full throttle. It’s learning, analyzing, calculating – and consuming energy at a rate that would have seemed unimaginable just a few years ago. Generative AI systems, from chatbots to image generators and complex language models, are captivating and add immense value to our daily lives. Yet there’s a fundamental drawback: they’re incredibly energy-hungry. But in the face of this challenge, a determined movement is underway, focused on dramatically reducing this energy demand and steering the technology towards a greener future.
The computing power needed to train a large language model is staggering. Training GPT-3, a predecessor of today’s state-of-the-art AI models, required around 1,287 megawatt-hours (MWh) of electricity – enough to power a small town for months. The resulting CO₂ emissions were comparable to what 700 return flights from New York to San Francisco would produce. But this is only one side of the energy coin in AI. Training is merely the starting point, as the models, once trained, continue to consume significant energy with each user query they respond to in real time. What sounds like a quick conversation often involves millions of computations in mere milliseconds – all of which require substantial electricity.
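To give the 1,287 MWh figure some intuition, a quick back-of-envelope conversion helps. The household consumption value below is an assumption for illustration (roughly the average annual electricity use of a US household, per EIA estimates), not a number from the article:

```python
# Back-of-envelope conversion of the GPT-3 training figure cited above.
# Assumption (not from the article): an average household uses roughly
# 10,500 kWh of electricity per year.

TRAINING_MWH = 1_287              # GPT-3 training estimate cited above
HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumed average annual household use

training_kwh = TRAINING_MWH * 1_000
household_years = training_kwh / HOUSEHOLD_KWH_PER_YEAR

print(f"{training_kwh:,} kWh ≈ {household_years:.0f} household-years of electricity")
```

Under that assumption, one training run corresponds to well over a hundred household-years of electricity – which is why the "small town for months" comparison holds up.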
This reality has sparked a sense of urgency. More and more companies, researchers, and policymakers are critically examining AI’s energy footprint and working on innovative solutions to redirect this intelligent power play towards a more sustainable path. The first step lies in specialized hardware. The development of energy-efficient chips and specialized GPUs enables these complex computations to be processed more quickly and with less energy. This technology is purpose-built to maximize AI performance while keeping power consumption as low as possible. As a result, a new generation of data centers is emerging, designed with efficiency and sustainability in mind.
Alongside hardware, advancements are being made in software and algorithms that form the core of AI models. Developers and researchers are striving to design these algorithms to require less computational power while still delivering high-quality results. This includes creating leaner models, tailored to specific tasks rather than capable of doing everything. This specialization not only saves energy but also allows AI to be deployed in areas previously constrained by resources. Companies like Google, Microsoft, and Meta, aware of their social responsibility in deploying AI, are heavily investing in this research and are implementing optimized software solutions that could significantly reduce AI’s carbon footprint.
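One widely used technique for making models "leaner" is post-training quantization: storing weights as 8-bit integers instead of 32-bit floats, which cuts memory and energy per computation. The snippet below is a toy, self-contained sketch of symmetric 8-bit quantization – an illustration of the principle, not any framework's actual implementation:

```python
def quantize_int8(weights):
    """Toy symmetric post-training quantization: map float weights onto
    integers in [-127, 127], cutting storage from 32 bits to 8 per weight."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    """Recover approximate float weights from the integer codes."""
    return [q * scale for q in quantized]

weights = [0.82, -1.54, 0.03, 1.27, -0.66]
quantized, scale = quantize_int8(weights)
restored = dequantize(quantized, scale)

# 4x smaller storage, at the cost of a small rounding error per weight.
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(quantized, f"max rounding error ≈ {max_err:.4f}")
```

The trade-off is visible directly: each weight is reconstructed only approximately, but for many tasks this loss of precision barely affects output quality while shrinking the model fourfold.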
But the origin of the energy itself is also coming under increasing scrutiny. If data centers are powered by renewable energy, the impact on the climate can be minimized. Google, for instance, plans to run most of its data centers on renewable sources in the near future – and has taken a novel approach by exploring small, modular nuclear reactors. While the idea may sound controversial, it could be a solution for meeting rising energy demands without further burdening the environment. Through this step, the technological revolution driven by AI could be aligned with climate goals.
Beyond optimization and sustainable energy sources, there’s another emerging concept: harnessing waste heat. Every data center generates heat, much of which is currently wasted. But what if this waste heat could be used to warm buildings or swimming pools in urban areas? This idea is already being tested in several pilot projects and could make a substantial difference. What was once viewed as energy waste could become a valuable resource, with excess heat being repurposed for entirely new uses.
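The potential of heat reuse is easy to estimate, because nearly all of the electricity a server draws is ultimately released as heat. The figures below are illustrative assumptions (a small 1 MW facility and a typical home's annual heating demand), not data from the article:

```python
# Rough illustration of the waste-heat idea above.
# Assumptions (illustrative, not from the article):
#   - a small data center drawing 1 MW continuously, year-round
#   - an average home needing ~12,000 kWh of heat per year
#   - essentially all electrical power ends up as recoverable heat

DATA_CENTER_MW = 1.0
HOURS_PER_YEAR = 8_760
HOME_HEAT_KWH_PER_YEAR = 12_000   # assumed annual heating demand per home

heat_kwh_per_year = DATA_CENTER_MW * 1_000 * HOURS_PER_YEAR
homes_heated = heat_kwh_per_year / HOME_HEAT_KWH_PER_YEAR

print(f"≈ {homes_heated:.0f} homes' worth of annual heating")
```

Even under these simple assumptions, a single modest facility could in principle cover the heating demand of several hundred homes – which is why district-heating pilots around data centers are attracting attention.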
The challenges surrounding generative AI’s energy consumption don’t lend themselves to simple solutions. Yet they are fueling the pioneering spirit and innovation needed to guide this high-energy journey toward a sustainable future. While AI’s power consumption may seem relentless today, research shows that targeted measures and creative thinking can make a real difference. A future in which we reap the benefits of AI while living in harmony with our planet’s resources is not a distant utopia – it’s an achievable goal.
The technologies and approaches being developed today lay the groundwork for an AI that is not only smarter but also more eco-friendly. They demonstrate that we can, indeed, channel even the strongest energy flows and give us hope that the next generation of AI models will not only be the smartest but also the most sustainable.