As misinformation and the threat of AI replacing human jobs continue to dominate the conversation about the dangers of artificial intelligence, a Boston University professor is sounding the alarm on another possible downside—the potentially large environmental impact of generative AI tools.
“As an AI researcher, I often worry about the energy costs of building artificial intelligence models,” wrote Kate Saenko, associate professor of computer science at Boston University, in an article on The Conversation. “The more powerful the AI, the more energy is needed.”
While the energy consumption of blockchains like Bitcoin and Ethereum is studied and debated from Twitter to the halls of Congress, the impact of the rapid development of AI on the planet has not yet received the same spotlight.
Professor Saenko aims to change that. The article acknowledges that there is limited data on the carbon footprint of a single generative AI query, but she said research puts the figure at four to five times that of a simple search engine query.
Citing a 2019 study, Saenko said that training a generative AI model called Bidirectional Encoder Representations from Transformers (or BERT)—which has 110 million parameters—on graphics processing units (GPUs) consumed roughly as much energy as one person's round-trip transcontinental flight.
In AI models, parameters are variables identified from data that guide the model’s predictions. More parameters in the mix often mean more complex models, requiring more data and computing power as a result. The parameters are adjusted during training to minimize errors.
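The adjustment described above can be sketched in a few lines of code. This is an illustrative toy (not from the article): a model with a single parameter `w` is trained by gradient descent to minimize squared error on made-up data following y = 3x.

```python
# Toy example: "parameters" are the numbers a model learns from data.
# Here one parameter, w, is repeatedly adjusted to minimize squared error.

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]  # data generated by y = 3 * x

w = 0.0    # the model's single parameter, starting from a guess
lr = 0.01  # learning rate: how large each adjustment is

for step in range(1000):
    # gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # nudge the parameter in the error-reducing direction

print(round(w, 3))  # converges toward 3.0
```

A model like GPT-3 repeats this kind of update across 175 billion parameters and vast datasets, which is where the energy cost comes from.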
Saenko noted in comparison that OpenAI’s GPT-3 model—with 175 billion parameters—consumed about 1,287 megawatt-hours of electricity, equivalent to the energy used by 123 gasoline-powered passenger cars driven for a year, and produced 552 tons of carbon dioxide. She added that these figures cover only training the model, before any consumers start using it.
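The two figures above can be cross-checked with simple arithmetic: dividing the reported emissions by the reported electricity use gives the carbon intensity of the power that was implicitly assumed.

```python
# Quick arithmetic check on the GPT-3 training figures Saenko cites:
# 1,287 MWh of electricity and 552 metric tons of CO2.

energy_mwh = 1287
co2_tons = 552

# Implied grid carbon intensity, in grams of CO2 per kWh.
g_per_kwh = (co2_tons * 1_000_000) / (energy_mwh * 1_000)
print(round(g_per_kwh))  # ~429 gCO2/kWh
```

Roughly 429 grams of CO2 per kilowatt-hour corresponds to a fairly fossil-heavy grid mix, which foreshadows the renewable-energy point Saenko makes later.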
“If chatbots become as popular as search engines, the energy costs of deploying AIs may increase,” said Saenko, referring to Microsoft’s integration of ChatGPT technology into its Bing search engine earlier this month.
Not helping matters is the fact that more and more AI chatbots, such as Perplexity AI and OpenAI’s very popular ChatGPT, are releasing mobile applications, making them easier to use and exposing them to a wider audience.
Saenko highlighted a Google study that found that using a more efficient model architecture and processor and a greener data center could significantly reduce the carbon footprint.
“While a large AI model will not destroy the environment,” writes Saenko, “if a thousand companies develop slightly different AI bots for different purposes, each one used by millions of customers, then energy use becomes an issue.”
Ultimately, Saenko concludes that more research is needed to make generative AI more efficient—but she’s optimistic.
“The good news is that AI can run on renewable energy,” she wrote. “By moving the calculation to where there is more green energy, or scheduling the calculation for the hours of the day when renewable energy is more available, emissions can be reduced by a factor of 30 to 40 compared to using a grid dominated by fossil fuels.”
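The scheduling idea Saenko describes can be sketched simply. This is a minimal illustration with hypothetical hourly carbon-intensity numbers (not data from the article): a job is scheduled for the hour of day when the grid is greenest.

```python
# Carbon-aware scheduling sketch: pick the hour with the lowest grid carbon
# intensity to run a compute job. Hourly values below are hypothetical,
# in grams of CO2 per kWh, dipping midday when solar output peaks.

intensity = [420, 410, 400, 390, 380, 350, 300, 250,
             200, 170, 150, 140, 145, 160, 190, 240,
             300, 360, 410, 440, 450, 445, 435, 425]

def greenest_hour(hourly):
    """Return the hour of day whose grid carbon intensity is lowest."""
    return min(range(len(hourly)), key=lambda h: hourly[h])

best = greenest_hour(intensity)
worst = max(range(len(intensity)), key=lambda h: intensity[h])

print(best)                                     # hour 11 in this toy data
print(round(intensity[worst] / intensity[best], 1))  # ~3.2x difference
```

Even in this toy data the dirtiest hour emits over three times as much per kilowatt-hour as the cleanest one; real-world gains like the 30–40x Saenko cites also depend on relocating workloads to regions with greener grids.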