The Verge reports on a study in which researchers calculated the carbon emissions of using AI models for different tasks.
Machine learning’s energy consumption is a growing concern, yet precise figures on its environmental impact remain elusive due to a lack of transparency from companies like Meta, Microsoft, and OpenAI. Training AI models is especially energy-intensive; for instance, training GPT-3 is estimated to have used around 1,300 megawatt-hours (MWh) of electricity, equivalent to the annual consumption of 130 US homes. By comparison, streaming an hour of Netflix uses just 0.0008 MWh, meaning you’d have to watch Netflix for over 1.5 million hours to consume the same amount of power.
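The comparison above is simple arithmetic on the article’s cited figures. A minimal sketch, using only the numbers as reported (the per-home annual consumption is implied by the 130-homes equivalence, not stated directly):

```python
# Back-of-envelope check of the article's figures; values are as cited,
# not independently measured.
GPT3_TRAINING_MWH = 1_300      # estimated energy to train GPT-3
US_HOME_ANNUAL_MWH = 10        # implied by 1,300 MWh ≈ 130 US homes/year
NETFLIX_HOUR_MWH = 0.0008      # energy per streamed hour of Netflix

homes_per_year = GPT3_TRAINING_MWH / US_HOME_ANNUAL_MWH
netflix_hours = GPT3_TRAINING_MWH / NETFLIX_HOUR_MWH

print(f"US homes powered for a year: {homes_per_year:.0f}")   # 130
print(f"Equivalent Netflix hours: {netflix_hours:,.0f}")      # 1,625,000
```

The 1,625,000-hour result is where the article’s “over 1.5 million hours” figure comes from.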
Sasha Luccioni, a researcher at Hugging Face who studies AI’s impact on society, conducted a study with colleagues at Hugging Face and Carnegie Mellon University that provides insights into the energy consumption of AI inference tasks. The study reveals that generating images consumes notably more energy than processing text. However, the variability in energy usage across different AI models highlights the complexity of accurately assessing their environmental impact.
Alex de Vries, a PhD candidate at VU Amsterdam, warns of a potential surge in AI’s energy consumption, projecting that by 2027 the sector could consume between 85 and 134 terawatt-hours annually, comparable to the energy demand of a country like the Netherlands. Calls for energy efficiency ratings for AI models, and for reassessing whether AI is even necessary for certain tasks, highlight the need for broader discussions on mitigating the environmental impact of AI technology.
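To put de Vries’s projection on the same scale as the training figure cited earlier, one can convert the terawatt-hour range to megawatt-hours and divide by the GPT-3 training estimate. This is a rough illustration using only the article’s numbers, not an independent estimate:

```python
# Comparing the projected 2027 sector-wide range (85–134 TWh/yr, as cited)
# to the GPT-3 training estimate (1,300 MWh). Article figures only.
LOW_TWH, HIGH_TWH = 85, 134
GPT3_TRAINING_MWH = 1_300
MWH_PER_TWH = 1_000_000        # 1 TWh = 1,000,000 MWh

low_equiv = LOW_TWH * MWH_PER_TWH / GPT3_TRAINING_MWH
high_equiv = HIGH_TWH * MWH_PER_TWH / GPT3_TRAINING_MWH
print(f"Range ≈ {low_equiv:,.0f} to {high_equiv:,.0f} GPT-3 trainings/year")
```

In other words, the projected annual consumption would be on the order of tens of thousands of GPT-3 training runs, underscoring why inference at scale, not just training, drives the concern.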