By Laila Kearney
NEW YORK (Reuters) – U.S. data-center power demand could nearly triple in the next three years, and consume as much as 12% of the country’s electricity, as the industry undergoes an artificial-intelligence transformation, according to a Department of Energy-backed study that was first reported by Reuters on Friday.
The Lawrence Berkeley National Laboratory produced the report as the U.S. power industry and government attempt to understand how Big Tech’s data-center demand will affect electrical grids, power bills and the climate.
By 2028, data centers could require between 74 and 132 gigawatts of power, or 6.7% to 12% of total U.S. electricity consumption, according to the Berkeley Lab report.
The report’s ranges depend partly on the availability of, and demand for, the AI chips known as GPUs. Data centers currently account for a little more than 4% of the country’s power load.
“This really signals to us where the frontier is in terms of growing energy demand in the U.S.,” said Avi Shultz, director of the DOE’s Industrial Efficiency and Decarbonization Office.
Swelling data-center electricity needs come alongside rising power consumption from the onshoring of U.S. manufacturing and the electrification of buildings and transportation. Overall U.S. power demand hit a record high in 2024 and is expected to set another record next year.
“What this report is highlighting is what’s actually growing the fastest, and the leading edge of demand growth in the U.S. is the very new growth in artificial-intelligence data centers,” Shultz said.
Findings may inform DOE efforts to increase the flexibility and resiliency of the grid, including construction of long-duration battery storage at data-center sites and commercialization of new technologies such as small nuclear reactors and advanced geothermal, Shultz said.
POWER DOUBLING
Starting in 2017, the deployment of GPU-accelerated servers more than doubled the sector’s power use over six years, the report said.
AI, which requires increasingly powerful chips and intense cooling systems, is the primary driver of the projected data-center growth.
When the last report was released in 2016, AI servers in data centers accounted for about 2% of total server energy use.
The report’s lead researcher, Arman Shehabi, and his team recommend publishing the report annually or biannually to more closely track data-center trends.
Estimates in the report are based on calculations of electricity use by installed GPUs and other data-center IT equipment, drawing on publicly available information, market-research firms and reviews by power-sector and data-center executives.
“By showing what the energy use is and, more importantly, what’s causing the growth in energy use, it helps us think about what opportunities there are for efficiencies,” Shehabi said.
The report also recommends further research and development of energy-efficiency strategies for the country’s booming AI data centers. New AI data centers are being built with power capacities as large as one gigawatt, enough to power every home in Philadelphia.
(Reporting by Laila Kearney; Editing by Chizu Nomiyama and Rod Nickel)