The artificial intelligence revolution runs on electricity—and it needs a lot more of it. New projections show data center power consumption is set to explode, potentially reaching 8.6% of all U.S. electricity demand by 2035, more than double today's 3.5% share. For a grid that has seen little meaningful transmission expansion in decades, the AI boom is forcing a reckoning.
The Numbers Are Staggering
U.S. data center power demand will rise to 75.8 gigawatts in 2026—up from roughly 50 GW today—according to S&P Global projections. By 2030, S&P expects demand to reach 134.4 GW, nearly triple today's level. To put this in perspective, a single large nuclear reactor produces about 1 GW of power.
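A quick back-of-envelope calculation, using only the S&P figures and the 1 GW-per-reactor rule of thumb cited above, shows what that trajectory implies in new generating capacity:

```python
# Back-of-envelope arithmetic on the S&P Global projections cited above.
# Inputs (roughly 50 GW today, 134.4 GW in 2030) come straight from the text;
# the 1 GW figure is the large-reactor equivalence mentioned above.
today_gw, gw_2030 = 50.0, 134.4
reactor_gw = 1.0  # approximate output of one large nuclear reactor

added_by_2030 = gw_2030 - today_gw
reactors_needed = added_by_2030 / reactor_gw
growth_multiple = gw_2030 / today_gw

print(f"New capacity needed by 2030: {added_by_2030:.1f} GW")
print(f"Equivalent large reactors:   ~{reactors_needed:.0f}")
print(f"Growth vs. today:            {growth_multiple:.2f}x")
```

On these numbers, the U.S. would need to add on the order of 84 GW—roughly 84 large reactors' worth of capacity—in under a decade, a 2.7x increase over today's level.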
Goldman Sachs Research is even more aggressive, forecasting global power demand from data centers to increase 50% by 2027 and by as much as 165% by the end of the decade. The firm estimates that about $720 billion of grid spending through 2030 may be needed just to keep up.
The International Energy Agency projects global electricity consumption for data centers will double to around 945 terawatt-hours by 2030, representing just under 3% of total global electricity consumption. In some countries, the impact is even more pronounced—Ireland, for example, could see data centers consuming 32% of its electricity by 2026.
Why AI Is Different
Not all computing is created equal. AI workloads are extraordinarily energy-intensive compared to traditional data center operations. Training a large language model like GPT-4 requires orders of magnitude more compute—and therefore electricity—than running a typical cloud application.
Electricity consumption in accelerated servers, mainly driven by AI adoption, is projected to grow by 30% annually, while conventional server electricity consumption grows at just 9% per year. The shift toward AI is fundamentally changing the math of data center power consumption.
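To see how quickly those growth rates diverge, consider how each compounds over five years. The sketch below uses only the two rates quoted above; the starting levels are normalized to 1.0 for illustration, not actual consumption figures:

```python
# Illustrative compounding of the growth rates cited above: 30%/yr for
# AI-accelerated servers vs. 9%/yr for conventional servers.
# Starting levels are normalized to 1.0 (hypothetical baseline).
ai_rate, conventional_rate = 0.30, 0.09
years = 5

ai_multiple = (1 + ai_rate) ** years
conventional_multiple = (1 + conventional_rate) ** years

print(f"AI server consumption after {years} years:           {ai_multiple:.2f}x")
print(f"Conventional server consumption after {years} years: {conventional_multiple:.2f}x")
```

At those rates, accelerated-server consumption grows roughly 3.7x in five years while conventional servers grow only about 1.5x—which is why AI quickly dominates the overall demand picture.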
"The occupancy rate for data center infrastructure is projected to increase from around 85% in 2023 to a potential peak of more than 95% in late 2026," notes one industry analysis. Existing facilities are maxing out even as new construction struggles to keep pace.
The Infrastructure Bottleneck
Here's the problem: the United States hasn't built significant new electricity transmission infrastructure in years. Permitting delays, supply chain bottlenecks, and the sheer complexity of upgrading century-old grid systems mean that expanding capacity is measured in years, not months.
Data center supply has been constrained for the past 18 months specifically because utilities cannot expand transmission capacity fast enough. Some tech companies have gotten creative—one hyperscaler is partnering with a renewable developer and private equity to invest $20 billion in developing an energy park with colocated generation and storage, set to be operational by 2026.
But not every company has the resources for such solutions. OpenAI and SoftBank's Project Stargate, announced in January with plans to spend $500 billion on AI-focused data centers by 2029, has seen a slower-than-expected start due to infrastructure constraints.
The Competitiveness Question
The stakes extend beyond corporate profits. A RAND Corporation analysis warns that failure to address power bottlenecks may compel U.S. companies to relocate AI infrastructure abroad, potentially ceding technological leadership to nations with more available power.
Corporate investments in artificial intelligence, especially data center construction, have driven over one-third of U.S. GDP growth in the first nine months of 2025. If the grid can't support continued expansion, the economic consequences could be significant.
Investment Implications
For investors, the AI power crisis creates opportunities across multiple sectors:
Utilities with exposure to data center demand. Power companies serving regions with high data center concentration stand to benefit from increased demand and the capital investment needed to meet it.
Grid infrastructure plays. Companies manufacturing transformers, switchgear, and other grid components face a demand surge as utilities scramble to upgrade capacity.
Nuclear energy. Tech giants including Microsoft, Google, and Amazon have all signed agreements to power data centers with nuclear energy, viewing it as the only reliable source of carbon-free baseload power at scale.
Natural gas. In the near term, natural gas generation is filling the gap. Despite climate concerns, it remains the fastest way to add dispatchable power to the grid.
Renewable developers. Solar and wind projects co-located with data centers are increasingly attractive to tech companies seeking to claim carbon neutrality.
The Path Forward
By 2035, BloombergNEF expects global electricity demand from data centers to reach 1,200 terawatt-hours, climbing to 3,700 terawatt-hours by 2050. Meeting this demand will require a fundamental transformation of how the world generates and distributes power.
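The BloombergNEF projection above implies a sustained growth rate that can be derived directly from the two figures it gives:

```python
# Implied compound annual growth rate behind the BloombergNEF projection
# cited above: 1,200 TWh of global data center demand in 2035 rising to
# 3,700 TWh by 2050.
twh_2035, twh_2050 = 1_200, 3_700
years = 2050 - 2035

cagr = (twh_2050 / twh_2035) ** (1 / years) - 1
print(f"Implied growth rate, 2035-2050: {cagr:.1%} per year")
```

That works out to roughly 7.8% annual growth sustained for fifteen years—a pace the power sector has rarely had to match.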
The United States leads the world in data centers and AI compute today. Whether it maintains that position depends heavily on its ability to solve the power problem. The AI revolution may be powered by software, but it runs on electricity—and right now, there isn't enough of it.
The Bottom Line
The AI boom's biggest constraint may not be chips or talent—it's power. As data centers consume an ever-larger share of U.S. electricity, the intersection of technology and energy infrastructure becomes one of the most consequential economic stories of the decade. Investors, policymakers, and business leaders who understand this dynamic will be better positioned for what comes next.