By Helen Sanders

Study Reveals AI's Energy Consumption Could Rival That of a Nation

A recent study published in the journal Joule suggests that the environmental impact of artificial intelligence (AI) might be more substantial than previously anticipated. The research indicates that AI could potentially counteract carbon emissions reduction efforts by consuming energy equivalent to that of an entire country, possibly on the scale of Sweden, in the coming years.

The study's author, Alex de Vries, a PhD candidate at the VU Amsterdam School of Business and Economics, argues that the rapid advancement of AI technology could lead to this energy-intensive scenario. He emphasizes that large language models (LLMs), such as ChatGPT, require vast amounts of data for training. "If you're going to be expending a lot of resources and setting up these really large models and trying them for some time, that's going to be a potential big waste of power," de Vries explains.

While AI training is well known for its energy demands, de Vries highlights that the "inference" phase, in which trained AI models generate outputs in response to new inputs, also contributes substantially to overall consumption. Google, for instance, reported that 60% of its AI-related energy use between 2019 and 2021 was attributable to inference.

The study underscores the need to consider the entire life cycle of AI, not just the training phase.

It's worth noting that, according to the International Energy Agency, more than three-quarters of global greenhouse gas emissions result from energy production. This has profound implications for climate change, and the urgency of addressing it was underscored by the most recent Intergovernmental Panel on Climate Change report.

De Vries predicts that by 2027, newly manufactured AI devices could consume as much electricity as countries like the Netherlands, Sweden, or Argentina. The increasing demand for AI chips, driven by the widespread adoption of AI products, is exemplified by Nvidia's substantial earnings, which reached $13.5 billion in the second quarter of 2023.

The study suggests that as AI technology continues to evolve, it may become more energy-efficient. Nevertheless, de Vries stresses that AI developers should be deliberate about how and when they deploy the technology. While a worst-case scenario could see Google's AI alone consuming as much electricity as a country like Ireland, it is more likely that hardware and software will improve in energy efficiency as AI matures. In any case, a thoughtful approach to AI utilization is essential.
