Data centers now consume enough electricity to power 13 million homes in the United States. As artificial intelligence drives an unprecedented surge in computing demand, that staggering figure may be just the beginning.
Today, roughly 5,400 data centers in the U.S. account for nearly 3% of the country's overall electricity consumption. In my home state of Utah, grid demand from these data centers currently sits at 4 GW. If every proposed data center in the state were built, demand would soar to 12 GW, tripling today's levels. Meanwhile, data centers in Northern Virginia, the leading market in the United States, require enough electricity to power 800,000 houses, more than a quarter of all homes in the region. The state is now limiting power allocations for new data centers, complicating further expansion.
This is just the beginning; according to the Western Electricity Coordinating Council, the projected growth of new data centers—some of which can be built in as little as 18 months—far outpaces the development of new electrical energy supply and transmission.
As innovation continues, these new data centers will run larger, more sophisticated AI models, which require more power. Already, there's talk within the industry of training models that would require one gigawatt of power—enough to run a small city. As AI models grow and adoption spreads, we're likely to see the need for computational power (and, therefore, electricity) grow exponentially. That's to say nothing of the continued proliferation of IoT devices, streaming services, and smartphones, all of which rely on data centers.
Taken together, these pressures could push data centers to as much as 9% of total U.S. electricity demand by 2030.
The question looms large: Where will this electricity come from? AI and data infrastructure leaders are already sounding alarms about the potential energy supply bottleneck ahead.
Mark Zuckerberg, CEO of Meta, noted on the Dwarkesh podcast, "I actually think before we hit [computing constraints], we'll run into energy constraints." Meanwhile, Marc Ganzi, CEO of DigitalBridge, noted during a Q1 2024 earnings call that the current data center growth rate could cause power depletion "in the next 18 to 24 months."
Sam Altman, CEO of OpenAI, also echoed concern: “I think we still don’t appreciate the energy needs of [AI]. We need fusion, or we need, like, radically cheaper solar plus storage, or something, at massive scale — a scale that no one is really planning for.”
Energy constraints aren't just a topic of conversation for industry leaders. We already see concrete examples of data center construction slowing due to dwindling supply. In 2022, Dominion Energy paused data center connections in Northern Virginia, illustrating that even the leading geographic markets struggle to keep pace with demand. While connections have resumed, the region is now hurriedly investing in new transmission infrastructure and considering stricter zoning requirements for new facilities. Nor is the impact confined to one region: this year, Silicon Valley Power began limiting many proposed data centers in Northern California to a maximum electricity allocation of 2 MW.
Data centers and AI face a clear infrastructure challenge. So, what’s our solution?
At Torus, we see the data center boom as both a challenge and an opportunity to reshape energy infrastructure in ways that support AI-driven growth sustainably. We've concluded that traditional infrastructure alone won't meet rising demand; the solution lies in a distributed, flexible energy infrastructure.
Here's what that looks like: Rather than relying solely on large-scale grid expansions—which are often costly and slow—utilities and energy customers can adopt a faster, incremental approach. This involves deploying energy generation and storage systems directly on commercial properties, transforming each into a decentralized power node.
Key technologies in this distributed strategy include Flywheel Energy Storage Systems (FESS) and Battery Energy Storage Systems (BESS). FESS stores energy kinetically in a spinning rotor, absorbing and releasing power within seconds to smooth short-term fluctuations. BESS, meanwhile, functions as a longer-duration reservoir, much like a water tank that stores rainwater: it banks surplus electricity during low-demand periods and releases it when demand surges. Together, FESS and BESS offer a dynamic solution for balancing grid frequency, preventing congestion, and enhancing resilience.
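To make the division of labor concrete, here is a minimal sketch of how a controller might split a fluctuating net-load signal between the two systems: the battery follows the slow-moving trend while the flywheel absorbs the rapid deviations around it. The function name, the moving-average approach, and the parameters are illustrative assumptions, not a description of any actual Torus control system.

```python
from collections import deque

def split_dispatch(net_load_kw, window=5):
    """Illustrative split of a net-load signal (kW) into a fast
    component for the flywheel (FESS) and a slow component for the
    battery (BESS), using a simple moving average as the trend."""
    history = deque(maxlen=window)
    fess, bess = [], []
    for power in net_load_kw:
        history.append(power)
        slow = sum(history) / len(history)  # battery tracks the trend
        fast = power - slow                 # flywheel absorbs the ripple
        bess.append(slow)
        fess.append(fast)
    return fess, bess
```

In this toy model, the two components always sum back to the original load, so a brief spike shows up almost entirely in the flywheel channel while the battery's setpoint barely moves.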
Artificial intelligence also plays a critical role, not only as a driver of demand but as a tool for intelligent energy management. When linked through advanced virtual systems, the decentralized nodes provide utilities with a scalable, networked energy generation and storage source. This allows them to balance grid demands in real-time, manage peak loads, and alleviate congestion. AI can optimize grid operations through real-time demand forecasting and automated energy distribution, ensuring that every kilowatt is used effectively.
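One of the simplest forms this management can take is peak shaving: given a demand forecast, discharge stored energy whenever demand would exceed a target ceiling, keeping grid draw flat. The sketch below assumes hourly timesteps (so kW and kWh trade one-for-one) and invented parameter names; a production system would add charging, efficiency losses, and price signals.

```python
def shave_peaks(forecast_kw, battery_kwh, max_rate_kw, threshold_kw):
    """Illustrative peak shaving: discharge a battery to cap grid
    draw at threshold_kw across an hourly demand forecast.
    Returns the resulting grid draw per hour and remaining charge."""
    soc = battery_kwh  # state of charge, kWh
    grid = []
    for demand in forecast_kw:
        excess = max(0.0, demand - threshold_kw)
        discharge = min(excess, max_rate_kw, soc)  # hourly step: kW ~ kWh
        soc -= discharge
        grid.append(demand - discharge)
    return grid, soc
```

For example, with a 40 kWh battery and a 110 kW ceiling, a forecast of [100, 150, 120] kW yields a grid draw of [100, 110, 120]: the battery fully covers the first peak, then runs empty before the second.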
While the path forward is complex, these combined innovations in distributed energy, storage, and AI-powered management can enable a sustainable, adaptive grid—one that’s ready to meet the demands of an AI-driven future and secure reliable energy for years to come.