(Bloomberg) -- Artificial intelligence has become the tech industry’s shiny new toy, with expectations it’ll revolutionize trillion-dollar industries from retail to medicine. But the creation of every new chatbot and image generator requires a lot of electricity, which means the technology may be responsible for a massive and growing amount of planet-warming carbon emissions.
Microsoft Corp., Alphabet Inc.’s Google and ChatGPT maker OpenAI use cloud computing that relies on thousands of chips inside servers in massive data centers across the globe to train AI algorithms called models, analyzing data to help them “learn” to perform tasks. The success of ChatGPT has other companies racing to release their own rival AI systems and chatbots, or to build products that use large AI models to deliver features to everyone from Instacart shoppers to Snap users to CFOs.
AI uses more energy than other forms of computing, and training a single model can gobble up more electricity than 100 US homes use in an entire year. Yet the sector is growing so fast — and has such limited transparency — that no one knows exactly how much total electricity use and carbon emissions can be attributed to AI. The emissions could also vary widely depending on what type of power plants provide that electricity; a data center that draws its electricity from a coal or natural gas-fired plant will be responsible for much higher emissions than one that draws power from solar or wind farms.
While researchers have tallied the emissions from the creation of a single model, and some companies have provided data about their energy use, there is no overall estimate of the total amount of power the technology consumes. Sasha Luccioni, a researcher at AI company Hugging Face Inc., wrote a paper quantifying the carbon impact of her company’s BLOOM, a rival of OpenAI’s GPT-3. She has also tried to estimate the same for OpenAI’s viral hit ChatGPT, based on a limited set of publicly available data.
“We’re talking about ChatGPT and we know nothing about it,” she said. “It could be three raccoons in a trench coat.”
Researchers like Luccioni say we need transparency on the power usage and emissions for AI models. Armed with that information, governments and companies may decide that using GPT-3 or other large models for researching cancer cures or preserving indigenous languages is worth the electricity and emissions, but writing rejected Seinfeld scripts or finding Waldo is not.
Greater transparency might also bring more scrutiny; the crypto industry could provide a cautionary tale. Bitcoin has been criticized for its outsized power consumption, using as much annually as Argentina, according to the Cambridge Bitcoin Electricity Consumption Index. That voracious appetite for electricity prompted China to outlaw mining and New York to pass a two-year moratorium on new permits for crypto-mining powered by fossil fuels.
Training GPT-3, a single general-purpose AI program that can generate language and has many different uses, took 1.287 gigawatt hours, according to a research paper published in 2021, or about as much electricity as 120 US homes consume in a year. That training generated 502 tons of carbon emissions, according to the same paper, or about as much as 110 US cars emit in a year. And that’s for just one program, or “model.” While training a model carries a huge upfront power cost, researchers found that in some cases it accounts for only about 40% of the power the model ultimately burns; the rest goes to actually running it, as billions of requests pour in for popular programs. Plus, the models are getting bigger. OpenAI’s GPT-3 uses 175 billion parameters, or variables, that the AI system has learned through its training and retraining. Its predecessor, GPT-2, used just 1.5 billion.
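The household and car comparisons above can be sanity-checked with simple arithmetic. The baselines below are assumptions, not figures from the paper: roughly 10,600 kilowatt-hours per year for an average US home (a commonly cited EIA figure) and roughly 4.6 metric tons of CO2 per year for a typical US passenger car (a commonly cited EPA figure).

```python
# Back-of-the-envelope check of the GPT-3 training figures cited above.
# The per-home and per-car baselines are assumed, not from the article.

TRAINING_ENERGY_KWH = 1.287e6    # 1.287 gigawatt-hours, per the 2021 paper
TRAINING_EMISSIONS_TONS = 502    # metric tons of CO2, same paper

HOME_KWH_PER_YEAR = 10_600       # assumed average US household consumption
CAR_TONS_PER_YEAR = 4.6          # assumed average US passenger car emissions

homes_equivalent = TRAINING_ENERGY_KWH / HOME_KWH_PER_YEAR
cars_equivalent = TRAINING_EMISSIONS_TONS / CAR_TONS_PER_YEAR

print(f"~{homes_equivalent:.0f} home-years of electricity")  # ~121
print(f"~{cars_equivalent:.0f} car-years of emissions")      # ~109
```

Both results land close to the article’s round numbers of 120 homes and 110 cars.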
OpenAI is already working on GPT-4, plus models must be retrained regularly in order to remain aware of current events. “If you don’t retrain your model, you’d have a model that didn’t know about Covid-19,” said Emma Strubell, a professor at Carnegie Mellon University who was among the first researchers to look into AI’s energy issue.
Another relative measure comes from Google, where researchers found that artificial intelligence made up 10 to 15% of the company’s total electricity consumption, which was 18.3 terawatt hours in 2021. That would mean that Google’s AI burns around 2.3 terawatt hours annually, about as much electricity each year as all the homes in a city the size of Atlanta.
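The Google figure follows directly from the two numbers in the paragraph above; only the use of the range’s midpoint is an assumption here.

```python
# Rough reproduction of the Google AI electricity estimate above.
# The 10-15% share and 18.3 TWh total are from the article; taking
# the midpoint of the range is an assumption.

GOOGLE_TOTAL_TWH_2021 = 18.3
AI_SHARE_LOW, AI_SHARE_HIGH = 0.10, 0.15

low = GOOGLE_TOTAL_TWH_2021 * AI_SHARE_LOW    # 1.83 TWh
high = GOOGLE_TOTAL_TWH_2021 * AI_SHARE_HIGH  # 2.745 TWh
midpoint = (low + high) / 2                   # ~2.29 TWh

print(f"Google AI electricity: {low:.2f}-{high:.2f} TWh, midpoint ~{midpoint:.1f} TWh")
```

The midpoint rounds to the roughly 2.3 terawatt hours cited in the article.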
While the models are getting larger in many cases, the AI companies are also constantly working on improvements that make them run more efficiently. Microsoft, Google and Amazon — the biggest US cloud companies — all have carbon negative or neutral pledges. Google said in a statement that it’s pursuing net-zero emissions across its operations by 2030, with a goal to run its office and data centers entirely on carbon-free energy. The company has also used AI to improve energy efficiency in its data centers, with the technology directly controlling cooling in the facilities.
OpenAI cited work it has done to make the application programming interface for ChatGPT more efficient, cutting electricity usage and prices for customers. “We take our responsibility to stop and reverse climate change very seriously, and we think a lot about how to make the best use of our computing power,” an OpenAI spokesperson said in a statement. “OpenAI runs on Azure, and we work closely with Microsoft’s team to improve efficiency and our footprint to run large language models.”
Microsoft noted it is buying renewable energy and taking other steps to meet its previously announced goal of being carbon negative by 2030. “As part of our commitment to create a more sustainable future, Microsoft is investing in research to measure the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application,” the company said in a statement.
“Obviously these companies don’t like to disclose what model they are using and how much carbon it emits,” said Roy Schwartz, professor at Hebrew University of Jerusalem, who partnered with a group at Microsoft to measure the carbon footprint of a large AI model.
There are ways to make AI run more efficiently. Since AI training can happen at any time, developers or data centers could schedule the training for times when power is cheaper or at a surplus, thereby making their operations more green, said Ben Hertz-Shargel of energy consultant Wood Mackenzie. AI companies that train their models when power is at a surplus could then tout that in their marketing. “It can be a carrot for them to show that they’re acting responsibly and acting green,” Hertz-Shargel said.
“It’s going to be bananas”
Most data centers use graphics processing units, or GPUs, to train AI models, and those components are among the most power-hungry the chip industry makes. Training large models requires tens of thousands of GPUs running for weeks to months, according to a report published by Morgan Stanley analysts earlier this month.
One of the bigger mysteries in AI is the total accounting of carbon emissions associated with the chips being used. Nvidia, the biggest manufacturer of GPUs, said its chips complete AI tasks more quickly than alternatives, making them more efficient overall.
“Using GPUs to accelerate AI is dramatically faster and more efficient than CPUs — typically 20x more energy efficient for certain AI workloads, and up to 300x more efficient for the large language models that are essential for generative AI,” the company said in a statement.
While Nvidia has disclosed its direct emissions and the indirect ones related to energy, it hasn’t revealed all of the emissions it is indirectly responsible for, said Luccioni, who asked for that data for her research.
When Nvidia does share that information, Luccioni thinks, it’ll turn out that GPUs burn as much power as a small country. “It’s going to be bananas,” she said.