As artificial intelligence continues to evolve, its impact on society grows more profound. Yet beyond the innovations lies a less discussed aspect: AI's voracious appetite for electricity. This demand for power is not just substantial; it is expanding rapidly as AI technologies, like those from OpenAI, advance.
Energy Consumption of AI Training
Training a sophisticated AI model requires a combination of big data, advanced computing power, and robust algorithms. Consider OpenAI’s GPT-3, a model with 175 billion parameters. It was trained using 1,024 graphics processing units (GPUs) running non-stop for a bit over a month. Mosharaf Chowdhury, an associate professor of electrical engineering and computer science at the University of Michigan, estimates that the training run of GPT-3 consumed about 1,287 megawatt-hours (MWh) of electricity. To put this into perspective, that’s equivalent to the energy consumption of an average American household over 120 years.
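A quick back-of-envelope calculation shows how the 120-year comparison follows from the 1,287 MWh estimate. The household figure below is an assumption on my part (roughly the average annual US residential usage commonly cited by the EIA), not a number from the article:

```python
# Sanity-check the "120 years of household electricity" comparison.
# Assumed inputs:
#   TRAINING_MWH            -- Chowdhury's estimate for GPT-3's training run (quoted above)
#   HOUSEHOLD_KWH_PER_YEAR  -- assumed average US household usage, ~10,650 kWh/year

TRAINING_MWH = 1_287
HOUSEHOLD_KWH_PER_YEAR = 10_650  # assumption, not from the article

training_kwh = TRAINING_MWH * 1_000          # convert MWh to kWh
household_years = training_kwh / HOUSEHOLD_KWH_PER_YEAR

print(f"Equivalent household-years: {household_years:.0f}")  # roughly 120
```

With these assumed inputs, the result lands close to the 120-year figure cited in the text, which suggests the comparison uses a similar household baseline.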