Imagine a restaurant that’s open 24/7, fully staffed at all times. Outside of weekend nights and the lunch rush, it’s mostly empty—but the owner still pays for staff, rent, and utilities around the clock.
That’s similar to how the U.S. electricity grid works. The grid is built to handle “peak demand”—short periods of extremely high electricity use during scorching summers or freezing winters. Power outages can be dangerous, so utilities must ensure electricity is always available.
The result? Most of the time, the grid is running at only about half its capacity. This inefficiency drives up costs because building and maintaining the grid is expensive.
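To put a number on that, grid planners use the "load factor": average demand divided by peak demand. The sketch below shows the arithmetic in Python; the demand figures are illustrative placeholders chosen to match the "about half" claim, not actual U.S. grid statistics.

```python
# Load factor = average demand / peak demand.
# All figures below are illustrative placeholders, not real grid data.

peak_demand_gw = 740.0     # hypothetical peak the grid must be built to serve
average_demand_gw = 370.0  # hypothetical average demand across the year

load_factor = average_demand_gw / peak_demand_gw
print(f"Load factor: {load_factor:.0%}")  # -> 50%: half the built capacity sits idle on average
```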
But some researchers see an opportunity. Over the next five years, peak electricity demand is expected to jump nearly 24%, a major change after decades of stability. At the same time, utilities are replacing aging infrastructure and serving new demand from AI data centers, both of which push prices up. That unused capacity, experts say, could be put to work for new customers, including data centers.
“Many hours of the day or times of the year, you have a lot of spare capacity on the grid,” said Ryan Hledik, a principal at Brattle Group. “If you can add new electricity use when there’s spare capacity, you can spread the cost of the grid across more customers and bring rates down.”
Grid usage varies with weather, wind and solar output, and location. Utilization can be as low as 30% in rural areas and as high as 60–70% in cities during periods of extreme temperatures.
“We design to meet the peak load,” said Larry Bekkedahl, senior VP at Portland General Electric. “And that peak might only occur five days in the summer or five days in the winter.”
Utilities intentionally build extra capacity to handle rare but extreme events, such as heat waves, cold snaps, and unexpected outages at power plants. That responsibility makes utilities cautious.
“There’s a bias against anything that could put the system at risk,” said Oliver Kerr of Aurora Energy Research.
Low utilization isn't necessarily bad; much infrastructure, like highways, is busy only part of the time. But utilities earn their profits from expanding infrastructure, building new lines or plants, not from simply operating the existing system.
“If they were in the apple business, they get paid for planting new trees, not for growing apples,” said Amit Narayan, CEO of GridCARE. This can lead to overbuilding, raising costs for customers.
Some experts see a solution: add new customers to the grid—but only during non-peak periods. For instance, data centers could use power when demand is low but disconnect during the grid’s hottest days. A Duke University study found the current U.S. grid could supply around 100 gigawatts of extra power for data centers willing to curtail electricity use for short periods during peak events.
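To see why curtailment frees up so much headroom, consider a toy model: a flexible load may run only in hours when existing demand plus the new load stays under the grid's capacity. The sketch below uses a synthetic hourly demand curve and invented numbers; it illustrates the shape of the argument, not the Duke study's actual methodology or data.

```python
import math
import random

random.seed(0)

CAPACITY_GW = 100.0     # hypothetical regional grid capacity
FLEXIBLE_LOAD_GW = 8.0  # hypothetical new data-center load
HOURS = 8760            # hours in a year

def demand_gw(h: int) -> float:
    """Synthetic hourly demand: seasonal swing + daily cycle + noise, peaking near capacity."""
    seasonal = 15 * math.cos(2 * math.pi * h / HOURS)   # highest in deep winter
    daily = 10 * math.sin(2 * math.pi * (h % 24) / 24)  # daily peak
    return 70 + seasonal + daily + random.uniform(-3, 3)

# Count the hours in which the flexible load would have to curtail.
curtailed = sum(1 for h in range(HOURS)
                if demand_gw(h) + FLEXIBLE_LOAD_GW > CAPACITY_GW)

print(f"Hours curtailed: {curtailed} of {HOURS} ({curtailed / HOURS:.1%})")
# Under these made-up numbers, the new load runs most of the year and backs
# off only during a small share of peak hours.
```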
Some companies are already testing this idea. GridCARE helped Portland General Electric identify 80 megawatts of spare capacity for incoming data centers. Google announced similar plans in Indiana and Tennessee, ensuring its data centers draw less power during peak periods. During those times, data centers could shift tasks to other facilities or rely on backup generators. In theory, this could lower rates. More customers using the same grid spreads costs more widely.
“If I didn’t have to pay for that capital, I save that cost, and then I can reduce everybody’s rates,” said Bekkedahl.
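The rate arithmetic behind that claim is simple: if the grid's fixed costs stay put while energy sales grow, the fixed cost recovered per kilowatt-hour falls. A back-of-the-envelope sketch, with all figures invented for illustration:

```python
# Fixed grid costs spread over more kilowatt-hours mean a lower per-kWh rate.
# All figures are invented for illustration.

fixed_costs_usd = 1_000_000_000        # hypothetical annual fixed grid costs
baseline_sales_kwh = 10_000_000_000    # existing annual energy sales
new_offpeak_sales_kwh = 1_500_000_000  # new load served entirely with spare capacity

rate_before = fixed_costs_usd / baseline_sales_kwh
rate_after = fixed_costs_usd / (baseline_sales_kwh + new_offpeak_sales_kwh)

print(f"Fixed-cost share of rates: ${rate_before:.3f}/kWh -> ${rate_after:.3f}/kWh")
# Key assumption: the new load fits under existing peaks, so it triggers
# no new capital spending ("I didn't have to pay for that capital").
```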
However, most AI data centers haven't adopted this strategy. Rising demand has instead forced utilities to expand the grid, often with new natural gas plants, raising both costs and carbon emissions. Ratepayers sometimes question why they are paying for the electricity needed to train new AI models.
Skeptics also note that data centers are multibillion-dollar operations; small electricity savings alone won't sway them, and contracts would likely be needed to ensure participation. Supporters argue that the approach could ease the burden on utilities and consumers alike.
“It takes the burden away from utilities and their ratepayers,” Narayan said.
Source: Yahoo Finance

