The massive energy consumption of artificial intelligence data centers has evolved from a technical concern into a full-blown political controversy. In an unusual alignment, Senator Bernie Sanders and Governor Ron DeSantis have emerged as the most prominent skeptics of the AI industry's infrastructure expansion—a development that signals gathering headwinds for the data center buildout that underpins America's AI ambitions.

The Grid at a Breaking Point

The numbers are staggering. Data centers are projected to consume 6.7% to 12% of U.S. electricity by 2028, up from 4.4% in 2023. Total U.S. power demand is projected to reach record levels in both 2025 and 2026—the first significant growth after years of relatively flat electricity consumption.
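As a rough illustration of what those projections imply, the share figures can be turned into growth multiples. This is a back-of-envelope sketch, not from the article, and it makes the simplifying assumption that total U.S. consumption stays flat (a lower bound, since the article notes demand is now growing):

```python
# Implied growth in data center electricity use from the shares above.
# Simplifying assumption (hypothetical): total U.S. consumption stays
# flat, so growth in share maps directly onto growth in consumption.
share_2023 = 4.4                              # % of U.S. electricity, 2023
share_2028_low, share_2028_high = 6.7, 12.0   # projected range for 2028

low_multiple = share_2028_low / share_2023
high_multiple = share_2028_high / share_2023

print(f"Implied growth: {low_multiple:.1f}x to {high_multiple:.1f}x")
# Implied growth: 1.5x to 2.7x
```

Even under that conservative flat-demand assumption, data center consumption would need to grow by half again to nearly triple in five years.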

But the grid wasn't built for this. Much of America's electrical infrastructure was constructed decades ago, designed for a world where demand grew predictably and load centers were dispersed. The concentrated power requirements of modern AI data centers are straining systems never meant to handle them.

"It's at a crisis stage right now. PJM has never been this short. Data centers' energy needs are so great that PJM Interconnection projects that it will be a full six gigawatts short of its reliability requirements in 2027."

— Joe Bowring, President of Monitoring Analytics, PJM's independent market monitor

PJM Interconnection, which serves more than 65 million people across 13 states and the District of Columbia in the Mid-Atlantic and Midwest, has become the canary in the coal mine for the national grid's data center problem.

The Cost Falls on Consumers

The price to secure power capacity in PJM's territory has exploded: Monitoring Analytics, the market's independent watchdog, attributes $23 billion in capacity costs to data centers. Those costs don't stay with the tech giants building the facilities—they're passed on to residential and commercial electricity customers.

The impact is already visible in utility bills. Residential electricity prices are forecast to rise another 4% on average nationwide in 2026, after climbing roughly 5% in 2025. In data center-heavy regions like Northern Virginia—home to the world's largest concentration of data centers—the increases have been steeper.
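Those year-over-year increases compound. A quick sketch (the baseline monthly bill here is hypothetical; only the percentage increases come from the forecasts above):

```python
# Compounding the forecast residential price increases cited above.
baseline_monthly_bill = 150.00  # hypothetical 2024 bill, in dollars

rise_2025 = 0.05  # ~5% increase in 2025
rise_2026 = 0.04  # forecast ~4% increase in 2026

bill_2026 = baseline_monthly_bill * (1 + rise_2025) * (1 + rise_2026)
cumulative_pct = ((1 + rise_2025) * (1 + rise_2026) - 1) * 100

print(f"2026 bill: ${bill_2026:.2f}")                 # 2026 bill: $163.80
print(f"Cumulative increase: {cumulative_pct:.1f}%")  # Cumulative increase: 9.2%
```

Two years of "modest" single-digit increases amount to a cumulative rise of about 9%, which is what ratepayers actually experience.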

Virginia's Political Earthquake

Rising utility bills played a key role in the landslide victory of Democrat Abigail Spanberger in Virginia's governor's race this year. Her campaign explicitly tied the state's electricity cost increases to data center proliferation, turning what had been a technical issue into a potent political weapon.

The Virginia result has not gone unnoticed by politicians in other states facing similar pressures.

Strange Bedfellows

The alignment of Bernie Sanders and Ron DeSantis as data center skeptics illustrates how the issue scrambles traditional political categories. Sanders, approaching the issue from a progressive framework, has called for a national moratorium on data center construction until the industry addresses its energy and labor market impacts.

DeSantis, coming from a conservative perspective, has questioned whether the benefits of AI development justify the costs being imposed on ordinary ratepayers and has emphasized local control over data center siting decisions.

That a democratic socialist and a conservative Republican can find common ground suggests the political vulnerability of the AI industry's infrastructure demands.

Industry Response

Tech companies and data center operators argue that their facilities bring economic benefits—jobs, tax revenue, and technological advancement—that outweigh the power costs. They point to investments in renewable energy and note that many data centers are signing power purchase agreements that support new solar and wind development.

But critics counter that renewable energy built for data centers could otherwise serve existing customers, and that the pace of data center growth far outstrips new clean energy capacity. The net effect, they argue, is increased strain on the grid and higher bills for everyone else.

Grid Operators Sound Alarms

The warnings from grid operators have grown increasingly urgent. The North American Electric Reliability Corporation (NERC) has cautioned that data center loads are approaching levels the grid cannot safely handle.

"As these data centers get bigger and consume more energy, the grid is not designed to withstand the loss of 1,500-megawatt data centers. At some level, it becomes too large to withstand unless more grid resources are added."

— John Moura, NERC

The solution—building more grid capacity—takes years and requires massive investment. In the meantime, grid operators are improvising. Some data centers are being required to maintain on-site backup generation that can be activated during periods of grid stress.

The 2026 Power Revolution

Industry analysts predict that 2026 will be the year when power becomes the defining constraint on AI growth. As workloads scale from pilots to production, electricity demand is rising faster than infrastructure can accommodate.

Several potential responses are emerging:

  • On-site generation: Data centers increasingly building their own power plants, often using natural gas
  • Nuclear interest: Renewed attention to small modular reactors for dedicated data center power
  • Geographic diversification: Moving facilities to regions with more power headroom
  • Efficiency improvements: New chip designs and cooling technologies reducing power per computation

Regulatory Responses

State and local governments are beginning to respond with new requirements for data center development:

  • Impact assessments for grid effects
  • Requirements for on-site renewable generation
  • Community benefit agreements
  • Caps on data center development in constrained areas
  • Heat waste utilization mandates

Federal action remains limited, but the growing bipartisan concern suggests that could change. Congressional hearings on data center energy impacts are expected in 2026.

What This Means for AI Development

The power crunch could ultimately slow the pace of AI development in the United States. If data center capacity cannot grow fast enough to meet demand, the AI buildout will be constrained regardless of how much capital companies want to invest.

Some observers see this as inevitable market dynamics asserting themselves. Others worry that power constraints in the U.S. could push AI development to other countries with fewer limitations—potentially undermining American technological leadership.

The Consumer Impact

For ordinary Americans, the data center boom means higher electricity bills with no obvious direct benefit. While AI may eventually improve productivity and services, those gains are diffuse and long-term. The higher utility bills arrive monthly.

The political potency of this issue lies in that asymmetry: concentrated costs to consumers, diffuse benefits to society. It's a formula that has fueled backlashes against other industries, and there's no reason to think AI will prove immune.

As one energy analyst summarized: "The AI revolution runs on electricity. If the grid can't keep up, something has to give—and increasingly, it looks like that something might be consumer patience."