The Hidden Cost of AI: Massive Energy Consumption in Advanced Computing

AI’s voracious appetite for electricity is not only substantial, but expanding rapidly as the technology advances.
Photo: A general view of the CERN Computer/Data Centre and server farm, with its 1,450 m² main room, during a behind-the-scenes tour at CERN, the world's largest particle physics laboratory, in Meyrin, Switzerland, on April 19, 2017. Dean Mouhtaropoulos/Getty Images
By Shawn Lin and Sean Tseng
As artificial intelligence continues to evolve, its impact on society grows more profound. Yet beyond the innovations lies a less discussed aspect: AI’s voracious appetite for electricity. This demand for power is not just substantial; it is expanding rapidly as AI technologies, like those from OpenAI, advance.

Energy Consumption of AI Training

Training a sophisticated AI model requires a combination of big data, advanced computing power, and robust algorithms. Consider OpenAI’s GPT-3, a model with 175 billion parameters. It was trained using 1,024 graphics processing units (GPUs) running non-stop for just over a month.
Mosharaf Chowdhury, an associate professor of electrical engineering and computer science at the University of Michigan, estimates that training GPT-3 consumed about 1,287 megawatt-hours (MWh) of electricity. To put this into perspective, that is roughly equivalent to the energy an average American household consumes over 120 years.
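As a rough back-of-envelope check (not from the article), the sketch below reproduces these figures in Python. It assumes an all-in draw of about 1.5 kW per GPU (including servers and cooling), a training run of about 34 days, and an average U.S. household consumption of roughly 10.7 MWh per year; all three values are illustrative assumptions, not numbers cited by the researchers.

```python
# Back-of-envelope check of the figures cited above.
# All constants below are illustrative assumptions, not sourced from the article.

GPUS = 1_024                   # GPUs reportedly used to train GPT-3
DAYS = 34                      # "just over a month" of non-stop training (assumed)
KW_PER_GPU_ALL_IN = 1.5        # assumed draw per GPU incl. server and cooling overhead (kW)
HOUSEHOLD_MWH_PER_YEAR = 10.7  # assumed average U.S. household consumption (MWh/year)

gpu_hours = GPUS * DAYS * 24                            # total GPU-hours of the run
training_mwh = gpu_hours * KW_PER_GPU_ALL_IN / 1_000    # kWh -> MWh

print(f"Estimated training energy: {training_mwh:,.0f} MWh")                    # ~1,250 MWh
print(f"Household-years at 1,287 MWh: {1_287 / HOUSEHOLD_MWH_PER_YEAR:.0f}")    # ~120 years
```

Under these assumptions the estimate lands near the cited 1,287 MWh, and dividing that figure by annual household use gives the roughly 120-year equivalence mentioned above.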
Shawn Lin is a Chinese expatriate living in New Zealand. He has contributed to The Epoch Times since 2009, with a focus on China-related topics.