AI Training to Drive 50 GW Power Demand by 2030
The accelerating pace of artificial intelligence development, particularly the intensive training of the large-scale models that underpin many popular applications, is poised to create an unprecedented surge in electricity demand. A new report jointly published by the Electric Power Research Institute (EPRI) and Epoch AI projects that by 2030, training a single leading AI model could necessitate over 4 gigawatts (GW) of power—an amount sufficient to electrify millions of U.S. households.
Training sophisticated AI models has long carried a substantial energy footprint, demanding immense, concentrated power supplies. Despite significant strides in computational efficiency, the power required to train a cutting-edge model has more than doubled annually over the past decade. This escalating demand is driven by the AI industry’s pursuit of better performance through ever-larger and more complex models, which require more computing power and, consequently, more electricity. The report indicates that this scaling trend is likely to persist in the coming years, even as efficiency breakthroughs continue.
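The arithmetic behind the doubling trend is simple compounding. As a rough sketch (the starting figure of 0.13 GW for a current frontier training run is an illustrative assumption, not a number from the report), annual doubling carries today's scale past the 4 GW mark by 2030:

```python
def projected_training_power(base_gw: float, base_year: int,
                             target_year: int,
                             annual_growth: float = 2.0) -> float:
    """Compound a power-demand trend forward from base_year to target_year."""
    return base_gw * annual_growth ** (target_year - base_year)

# Hypothetical base: ~0.13 GW for a 2025 frontier training run.
# Doubling each year for five years gives 0.13 * 2**5 = 4.16 GW.
power_2030 = projected_training_power(0.13, 2025, 2030)
print(f"Projected 2030 training power: {power_2030:.2f} GW")
```

Under these assumptions the projection lands just above 4 GW, consistent with the report's headline figure, though the real starting point and growth rate are of course uncertain.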
However, the overall power demand for artificial intelligence extends far beyond the training of these colossal models. A substantial portion of future power capacity will also be allocated to the deployment of AI services for end-users, the training of smaller, specialized models, and ongoing AI research. Current estimates place the total AI power capacity in the U.S. at approximately 5 GW. This figure is projected to skyrocket to more than 50 GW by 2030, a demand level that would equal the entire global power consumption of data centers today and represent a rapidly expanding share of overall data center energy needs.
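As a back-of-the-envelope check (not a calculation from the report), growing total U.S. AI capacity from roughly 5 GW today to more than 50 GW by 2030 implies a compound annual growth rate of nearly 60% over five years:

```python
def implied_cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by a start and end value."""
    return (end / start) ** (1 / years) - 1

# 5 GW -> 50 GW over the five years from 2025 to 2030.
rate = implied_cagr(5.0, 50.0, 5)
print(f"Implied annual growth: {rate:.1%}")
```

A tenfold increase in five years works out to about 58% growth per year, slower than the doubling seen for individual training runs but still far faster than overall data center growth.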
Jaime Sevilla, director of Epoch AI, underscored the gravity of these projections, noting that the energy demand for training advanced AI models is doubling year over year and will soon approach the output of the largest nuclear power plants. He emphasized the report’s rigorous, data-driven analysis of these trends and affirmed Epoch AI’s commitment to continued investigation of AI’s energy footprint. Arshad Mansoor, President and CEO of EPRI, highlighted AI’s growing ubiquity and its anticipated pivotal role in the future energy landscape. To meet these burgeoning energy demands, he noted, data center developers and power providers alike are adopting innovative “build-to-balance” strategies: constructing new infrastructure while building flexibility into data center designs, an approach seen as critical for accelerating grid connections, minimizing costs, and bolstering system reliability.
In response to these challenges, EPRI launched the DCFlex collaborative last year. This initiative aims to develop and demonstrate the technologies, policies, and tools necessary to realize the potential of data center flexibility. The concept of data center flexibility, particularly through geographically distributed training data centers, envisions transforming these facilities from passive consumers of electricity into active grid assets. This transformation promises to enhance grid reliability, reduce costs, and expedite new connections. The DCFlex effort has garnered significant industry support, bringing together over 45 companies, including foundational members like Google, Meta, NVIDIA, and various utility providers. The collaborative recently initiated its first real-world field demonstrations in key locations such as Lenoir, North Carolina; Phoenix, Arizona; and Paris, France, marking a tangible step towards a more resilient and responsive energy infrastructure for the AI era.