As AI reshapes our world, it's gobbling up electricity at an alarming rate. This surge in energy consumption isn't just a tech industry problem—it's a climate issue that's sparking debates about sustainability, innovation, and our digital future. We dive into the power-hungry world of AI to understand its energy appetite and what it means for our planet.
Editor’s note: When there isn’t a big headline making news, we often pick a Big Story on a topic that we think will be interesting to you. We’d be just as happy to take requests from you. Do write to us at talktous@splainer.in. We’d also love to hear what you think of our leads on these kinds of less-newsy stories on the complicated truth about narcissism, ‘philanthropy’ of the very rich, the problem with pandas, and why animals have ‘virgin births’.
Written by: Samarth Bansal
When you chat with an AI assistant or marvel at the latest image generation tool, you're probably not thinking about electricity bills. But behind every witty response and pixel-perfect creation lies a voracious appetite for energy that worries climate scientists and tech leaders alike.
Here's a startling fact to kick things off: Training a single large language model like GPT-3 — an AI model that can generate human-like text, powering many chatbots — is estimated to consume about 1,300 megawatt hours (MWh) of electricity. That's roughly the same amount of power used by 130 US homes in an entire year. And that's just for training—not even counting the energy needed to run these models day in and day out.
But it's not just about electricity: Data centres, which power AI operations, use significant amounts of water for cooling their massive banks of computers. A non-peer-reviewed study led by researchers at UC Riverside estimates that training GPT-3 in Microsoft's state-of-the-art US data centres may have consumed around 700,000 litres (about 185,000 gallons) of freshwater.
These are just estimates. Exact figures for AI-specific energy usage are hard to come by. But this should give you a sense that the concerns are real—and worth our attention.
Why AI is different
So, what makes AI such an energy hog compared to other computing tasks? It all comes down to the sheer scale and complexity of these systems.
Data centres on steroids: AI models, especially large language models (which power applications like ChatGPT), require massive computing power. They run on specialised hardware called Graphics Processing Units (GPUs), which consume far more energy than standard computer chips.
Training vs deployment: The initial training of AI models—the process where the AI learns from vast amounts of data—is particularly energy-intensive.
Constant learning: Many AI systems are designed to continuously learn and improve, which means they're constantly crunching data and consuming energy.
Scale matters: As AI models grow larger and more complex, their energy needs grow exponentially. It's not just a linear increase—it's more like a hockey stick graph of power consumption.
Zooming out
To grasp the scale of AI's energy consumption, let's compare it to streaming movies.
Watching an hour of Netflix uses about 0.8 kWh of electricity. Training a large language model like GPT-3 uses an estimated 1,300,000 kWh. That means training a single AI model consumes as much energy as streaming Netflix for over 1.6 million hours or 185 years straight.
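For readers who like to check the maths, here is a quick back-of-the-envelope calculation using the estimates quoted above (0.8 kWh per streaming hour, 1,300 MWh for training, and a ballpark of roughly 10,000 kWh per US home per year, which is our assumption, not a figure from this piece):

```python
# Back-of-the-envelope check of the comparisons above.
# All inputs are rough estimates, not measured figures.
TRAINING_KWH = 1_300_000        # ~1,300 MWh to train GPT-3, in kWh
NETFLIX_KWH_PER_HOUR = 0.8      # estimated electricity per streaming hour
US_HOME_KWH_PER_YEAR = 10_000   # ballpark annual usage of one US household

streaming_hours = TRAINING_KWH / NETFLIX_KWH_PER_HOUR
streaming_years = streaming_hours / (24 * 365)
homes_powered = TRAINING_KWH / US_HOME_KWH_PER_YEAR

print(f"{round(streaming_hours):,} hours of streaming")  # about 1.6 million
print(f"about {int(streaming_years)} years non-stop")    # about 185 years
print(f"about {homes_powered:.0f} US homes for a year")  # about 130 homes
```

Change any of the three input estimates and the headline numbers shift accordingly, which is exactly why the figures in this story should be read as orders of magnitude rather than precise measurements.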
But it's not just about training. The International Energy Agency predicts that by 2026, data centres, AI, and cryptocurrencies combined could consume as much electricity as Japan—the world's third-largest economy. This dramatic increase is driven by the rapid adoption of AI across industries, the growing complexity of AI systems, and the massive amounts of data they process.
For context, consider this: When the song "Despacito" went viral and hit 5 billion views, it was reported to have used as much energy as 40,000 American homes use in a year. Now imagine the energy consumption of AI models that are constantly running and learning, not just being played back like a video.
Swedish researcher Anders Andrae has estimated that information and communication technology could account for more than 20 percent of global energy use by 2025.
“We have a tsunami of data approaching,” he said. “Everything which can be is being [digitalised]. It is a perfect storm. 5G is coming … and all cars and machines, robots and artificial intelligence are being [digitalised], producing huge amounts of data which is stored in data [centres].”
And it’s not just about direct energy use. The production of specialised AI hardware, like GPUs, is also carbon-intensive, and major manufacturers like Nvidia are secretive about their footprint data. And while the tech industry has made significant strides in energy efficiency over the years, the rapid growth of AI threatens to offset those gains.
How the tech industry is responding
The good news? The tech industry isn't blind to this challenge. Here are some ways they're trying to make AI greener:
Renewable energy: Many tech giants are investing heavily in renewable energy sources to power their data centres, and major players like Amazon and Meta have pledged to reach net-zero emissions in the coming decades.
Efficient algorithms: Researchers are developing more energy-efficient AI algorithms that can do more with less computing power. For example, TinyML techniques allow machine learning models to run on small, low-power edge devices like microcontrollers.
Specialised hardware: Companies are designing AI-specific chips that are more energy-efficient than general-purpose GPUs. Microsoft is even exploring nuclear energy as a way to power its data centres.
Despite these efforts, there's an ongoing debate about whether the potential benefits of AI—such as optimising energy grids or accelerating climate change solutions—outweigh its energy costs.
One thing though. Maybe… companies can just be more transparent and give those who care (is that you?) more choice? As climate researcher Sasha Luccioni said:
I think that we should be providing information so that people can make choices, at a minimum. Eventually being able to choose a model, for example, that is more energy efficient, if that’s something that people care about, or that was trained on [non copyrighted] data. Something I’m working on now is kind of an Energy Star rating for AI models. Maybe some people don’t care, but other people will choose a more efficient model.
Global regulatory push
The energy consumption of data centres and AI has not gone unnoticed by policymakers:
In Ireland, nearly a fifth of the country's electricity is used up by data centres, and this figure is expected to grow significantly in the next few years. As a result, there's currently a moratorium preventing the construction of new data centres in Dublin.
In the US, several states have put in legislation and even moratoriums to prevent companies from constructing data centres indiscriminately. For instance, Virginia's General Assembly recently launched a study of the industry's impacts after a host of bills seeking to further regulate data centres were introduced.
And there have been protests against data centres in various countries, including the US, Chile, and Uruguay, highlighting growing public concern about their environmental impact.
The bottomline: The challenge ahead is clear: How do we harness the transformative power of AI while ensuring it doesn't accelerate climate change? Can we make AI smarter without making our planet suffer? It's a balancing act that will require innovation, policy changes, and a collective commitment to sustainability.
As consumers and citizens, we have a role to play too. Being aware of the energy costs of our digital activities, supporting companies that prioritise sustainability, and advocating for greener tech policies are all ways we can contribute to a more sustainable AI future.
Reading List
The Verge and The Guardian have great primers on AI and its energy cost. Jacobin looks at how regulatory policy has lagged behind the rapid development of AI data centres. The Guardian also has an insightful feature on the cost of Ireland’s data centre “boom”. Yale Environment 360 looks at what could be done about the high energy demands. Vox has a good interview with climate scientist Sasha Luccioni on what’s next for AI in terms of climate change.