In January, the International Energy Agency (IEA) issued its forecast for global energy use over the next two years. Included for the first time were projections for electricity consumption associated with data centers, cryptocurrency, and artificial intelligence.
The IEA estimates that, added together, these uses accounted for almost 2 percent of global electricity demand in 2022 — and that demand for these uses could double by 2026, which would make it roughly equal to the amount of electricity used by the entire country of Japan.
We live in the digital age, where many of the processes that guide our lives are hidden from us inside computer code. We are watched by machines behind the scenes that bill us when we cross toll bridges, guide us across the internet, and deliver us music we didn’t even know we wanted. All of this takes material to build and run — plastics, metals, wiring, water — and all of that comes with costs. Those costs require trade-offs.
None of these trade-offs matters more than the one over energy. As the world heats up toward increasingly dangerous temperatures, we need to conserve as much energy as we can to reduce the amount of climate-heating gases we put into the air.
That’s why the IEA’s numbers are so important, and why we need to demand more transparency and greener AI going forward. And it’s why right now we need to be conscientious consumers of new technologies, understanding that every bit of data we use, save, or generate has a real-world cost.
One of the areas with the fastest-growing demand for energy is the form of machine learning called generative AI, which requires a lot of energy both to train and to answer queries. Training a large language model like OpenAI’s GPT-3, for example, uses nearly 1,300 megawatt-hours (MWh) of electricity, roughly the annual consumption of 130 US homes. According to the IEA, a single Google search takes 0.3 watt-hours of electricity, while a ChatGPT request takes 2.9 watt-hours. (For comparison, a 60-watt incandescent bulb uses that much electricity in about three minutes.) If ChatGPT were integrated into the 9 billion searches done each day, the IEA says, electricity demand would increase by 10 terawatt-hours a year — the amount consumed by about 1.5 million European Union residents.
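To see roughly where that figure comes from, here is a back-of-envelope check using only the per-query numbers quoted above. The assumptions behind the IEA’s own scenario aren’t public, so treat this as a sanity check rather than an official calculation.

```python
# Rough check of the IEA scenario cited above, using the article's figures:
# 0.3 Wh per conventional search, 2.9 Wh per ChatGPT-style request,
# and 9 billion searches per day.

searches_per_day = 9e9      # daily searches
wh_per_search = 0.3         # watt-hours, conventional search
wh_per_chatgpt = 2.9        # watt-hours, ChatGPT-style request

extra_wh_per_day = searches_per_day * (wh_per_chatgpt - wh_per_search)
extra_twh_per_year = extra_wh_per_day * 365 / 1e12  # convert Wh to TWh

print(f"Additional electricity: about {extra_twh_per_year:.1f} TWh per year")
# Prints roughly 8.5 TWh per year, in the same ballpark as the
# ~10 TWh figure the IEA cites.
```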
I recently spoke with Sasha Luccioni, lead climate researcher at an AI company called Hugging Face, which provides an open-source online platform for the machine learning community that supports the collaborative, ethical use of artificial intelligence. Luccioni has researched AI for more than a decade, and she understands how data storage and machine learning drive energy consumption and contribute to climate change — and how they are set to contribute even more in the future.
I asked her what any of us can do to be better consumers of this ravenous technology. This conversation has been edited for length and clarity.
Brian Calvert
AI seems to be everywhere. I’ve been in meetings where people joke that our machine overlords might be listening. What exactly is artificial intelligence? Why is it getting so much attention? And why should we worry about it right now — not in some distant future?
Sasha Luccioni
Artificial intelligence has actually been around as a field since the ’50s, and it’s gone through various “AI winters” and “AI summers.” Every time some new technique or approach gets developed, people get very excited about it, and then, inevitably, it ends up disappointing people, triggering an AI winter.
We’re going through a bit of an AI summer when it comes to generative AI. We should definitely stay critical and reflect upon whether or not we should be using AI, or generative AI specifically, in applications where it wasn’t used before.
Brian Calvert
What do we know about the energy costs of this hot AI summer?
Sasha Luccioni
It’s really hard to say. With an appliance, you plug it into your socket and you know what energy grid it’s using and roughly how much energy it’s using. But with AI, it’s distributed. When you’re doing a Google Maps query, or you’re talking to ChatGPT, you don’t really know where the process is running. And there’s really no transparency with regard to AI deployment.
From my own research, what I’ve found is that switching from a nongenerative, good old-fashioned quote-unquote AI approach to a generative one can use 30 to 40 times more energy for the exact same task. So, it’s adding up, and we’re definitely seeing the big-picture repercussions.
Brian Calvert
So, in material terms, we’ve got a lot of data, we’re storing a lot of data, we’ve got language models, we’ve got models that need to learn, and that takes energy and chips. What kind of things need to be built to support all this, and what are the environmental real-world impacts that this adds to our society?
Sasha Luccioni
Static data storage [like thumb drives] doesn’t, relatively speaking, consume that much energy. But the thing is that nowadays, we’re storing more and more data. You can search your Google Drive at any moment. So, connected storage — storage that’s connected to the internet — does consume more energy, compared to nonconnected storage.
Training AI models consumes energy. Essentially you’re taking whatever data you want to train your model on and running it through your model thousands of times. It’s going to be something like a thousand chips running for a thousand hours. Every generation of GPUs — the specialized chips for training AI models — tends to consume more energy than the previous generation.
They’re more powerful, but they’re also more energy intensive. And people are using more and more of them because they want to train bigger and bigger AI models. It’s kind of this vicious circle. When you deploy AI models, you have to have them always on. ChatGPT is never off.
Brian Calvert
Then, of course, there’s also a cooling process. We’ve all felt our phones heat up, or had to move off the couch with our laptops — which are never truly on our laps for long. Servers at data centers also heat up. Can you explain a little bit how they are cooled down?
Sasha Luccioni
With a GPU, or with any kind of data center, the more intensely it runs, the more heat it’s going to emit. And so in order to cool those data centers down, there are different kinds of techniques. Sometimes it’s air cooling, but mostly, it’s essentially circulating water. And so as these data centers get more and more dense, they also need more cooling, and so that uses more and more water.
Brian Calvert
We have an AI summer, and we have some excitement and some hype. But we also have the possibility of things scaling up quite a bit. How might AI data centers be different from the data centers that we already live with? What challenges will that present from an ecological or environmental perspective going forward?
Sasha Luccioni
Data centers need a lot of energy to run, especially the hyperscale ones that AI tends to run on. And they need to have reliable sources of energy.
So, often they’re built in places where you have nonrenewable energy sources, like natural gas-generated energy or coal-generated energy, where you flip a switch and the energy is there. It’s harder to do that with solar or wind, because there are often weather factors and things like that. And so what we’ve seen is that the big data centers are built in places where the grid is relatively carbon intensive.
Brian Calvert
What kinds of practices and policies should we be considering to either slow AI down or green it up?
Sasha Luccioni
I think that we should be providing information so that people can make choices, at a minimum. Eventually, people could choose a model that is more energy efficient, for example, if that’s something they care about, or one that was trained on noncopyrighted data. Something I’m working on now is kind of an Energy Star rating for AI models. Maybe some people don’t care, but other people will choose a more efficient model.
Brian Calvert
What should I think about before upgrading my data plan? Or why should I hold off on asking AI to solve my kid’s math homework? What should any of us consider before getting more gadgetry or getting more involved with a learned machine?
Sasha Luccioni
In France, they have this term, “digital sobriety.” Digital sobriety could be part of the actions that people can take as 21st-century consumers and users of this technology. I’m definitely not against having a smartphone or using AI, but asking yourself, “Do I need this new gadget?” “Do I really need to use ChatGPT for generating recipes?” “Do I need to be able to talk to my fridge or can I just, you know, open the door and look inside?” Things like that, right? If it ain’t broke, don’t fix it with generative AI.