Data centers form an almost unseen backbone of modern life, powering artificial intelligence (AI) systems, answering searches, hosting online chats, and running online services from the essential to the merely entertaining. Advances in AI have spurred concerns, as highlighted in a recent Wall Street Journal article, that AI will drive up electricity prices for everyone. This worry has dashes of both truth and exaggeration.
One problem with the narrative that AI is to blame is its focus on a single factor. It's already the case that certain areas experience price hikes due to concentrated demand from data centers; Data Center Alley in Loudoun County, Virginia, is one example. The omission that turns this into an exaggeration is that these spikes don't happen in a vacuum. Poor policy makes meeting the shared need for affordable power, for average families and new data centers alike, a crisis of our own making.
The way out of this crisis is enabling prices to guide entrepreneurship. High prices signal high value. Yet we've failed to allow entrepreneurs to respond to that potential value. Fundamentally, electricity prices are subject to the same core economic forces as anything else: supply and demand.
Price spikes happen because of limited generation capacity and transmission constraints. If prices were rising for any other good or service, the problem would solve itself as new providers entered the market to earn those high prices. But in electricity markets, the system requires physical infrastructure—transmission lines, substations—that takes time to build.
Framing the data center challenge as one of enabling entrepreneurs paints a clear picture of what policymakers can do. Rather than blame data center developers, local and state policymakers should evaluate their policies and regulations to enable faster responses to the needs of both everyday consumers and industrial energy users. The target for reformers should be the antiquated and unwieldy processes that make the system unresponsive. Red tape across the country is ripe for cutting.
Prices guide entrepreneurs to opportunities
In a typical market, high demand leads to new suppliers entering the market. Think about it like this: there is some price of hamburgers at which you and I would quit our jobs and start flipping burgers. The high prices that draw us into the market are then eased by our new burger truck making the rounds. But electricity doesn't work like that. You can't easily "plug in" more supply, or at least not overnight. The system's complexity and safety requirements mean that infrastructure upgrades are essential but slow.
It's easy to underestimate the signals that electricity markets send. Price differences can be extreme: one substation can be in shortage while a neighboring substation has an abundance. That's the case with the Pleasant View and Goose Creek substations shown below. Prices are negative at Goose Creek because it has access to a lower-voltage supply that Pleasant View does not.
![](https://substackcdn.com/image/fetch/w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F098b90aa-fbda-4691-8c57-42689b5b47c6_886x821.png)
Data Center Alley illustrates exactly this problem of a slow response. Despite seeing the iceberg in our path, the system is too slow to chart a new course. As Grid Status points out in a recent analysis:
> In some ways, data center alley is a peculiar example of congestion because the issue is well understood by the local utility, and was known at least 3 years in advance of becoming an issue. However, due to the difficulty in building transmission right now the required upgrades can’t be completed in time to meet anticipated demand.
If you want to understand why reporting focuses on concerns about data centers raising prices, it comes down to exactly this: we can see a problem approaching, but we can't do anything about it. The analysts at Grid Status suggest that the needed upgrades are likely another three years away, expected in 2027. Together, these points add up to a simple conclusion: it's exactly a crisis of our own making.
Keep your wits about you
The first thing that policymakers should do in data center load growth conversations is keep their wits and skepticism about them. Despite claims suggesting that we are in lots of trouble, AI load growth will likely be far more manageable than the numbers thrown around suggest at first glance.
Some of the confusion reflects how we struggle with percentages. A small change can look like a big percentage change. For example, Goldman Sachs predicts that data centers will move from about 2% of worldwide electricity use to about 4% by 2030. That's just a two-percentage-point increase in share; framed as growth in data center demand, and with allowances for rounding, Goldman calls it a 160% increase. Those sound different, but they describe the same thing once you understand the context. Much like a label reading "90% fat-free" emphasizes something different from, but is equivalent to, a label saying "10% fat." To be fair, however, a shift of even a few percentage points in a global total represents a trend worth considering.
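A quick calculation makes the distinction concrete. This is a back-of-the-envelope sketch: the shares are Goldman's rounded figures, and the growth in total electricity demand is an assumption chosen purely for illustration.

```python
# Back-of-the-envelope: shares are Goldman's rounded estimates; growth in
# total global electricity demand is an assumption chosen for illustration.
share_now = 0.02     # data centers' share of global electricity use today (~2%)
share_2030 = 0.04    # projected share by 2030 (~4%)
total_growth = 0.30  # assumed growth in total electricity demand by 2030

pp_change = (share_2030 - share_now) * 100
pct_change = (share_2030 * (1 + total_growth) / share_now - 1) * 100

print(f"Change in share: {pp_change:.0f} percentage points")  # 2 pp
print(f"Growth in data center demand: {pct_change:.0f}%")     # 160%
```

The same underlying shift reads as "two points" or "160%" depending on which frame the headline writer picks.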
Another point of confusion is that AI companies are playing the odds. They make several interconnection requests for each of their planned data center expansions. This makes the total volume of requests look large, but some proportion of them is duplicative. A single project's developers will submit multiple requests around the country, a practice known as queue shopping. Companies are shopping for the fastest and cheapest power available, which means multiple requests for the same need. This is strategic behavior, but it's an understandable response to the long delays companies are facing.
What can be done? Lots!
Policymakers have several tools to cut through the noise created by queue shopping and reveal the more serious proposals. One option is to raise the cost of entry into the queue. States and localities could require either higher fees for interconnection requests or proof of credit commensurate with the size of the request. Requesting a 1 GW connection, then, would require a much higher application fee or proof of credit than, say, a 100 MW request.
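As a purely illustrative sketch, a capacity-scaled fee might look like the following. The base fee and per-megawatt rate are invented for the example, not drawn from any actual tariff:

```python
def interconnection_fee(requested_mw: float) -> float:
    """Illustrative application fee that scales with requested capacity.

    The base fee and per-MW rate are hypothetical; the point is that a
    1 GW request puts meaningfully more money at stake than a 100 MW one.
    """
    base_fee = 50_000  # flat filing cost in dollars (assumed)
    per_mw = 500       # per-megawatt deposit rate in dollars (assumed)
    return base_fee + per_mw * requested_mw

print(f"${interconnection_fee(100):,.0f}")    # 100 MW request -> $100,000
print(f"${interconnection_fee(1_000):,.0f}")  # 1 GW request -> $550,000
```

Whatever the exact numbers, the design goal is the same: speculative duplicate filings become expensive, while serious projects barely notice the cost.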
Another option is to require that interconnection requests show control of the land the data center would be built on. A speculative developer may submit a request for 250 MW in one area, but if they can't show that they own the land, hold an option to lease it, or otherwise have a legal right to use it, their application could be sent back for more information. Doing this could be as simple as checking that the zoning allows data center development or that the developer has local siting approval. Application reviewers can also become more sophisticated, throwing out proposals that don't make sense, for example by questioning whether the proposed site fits its stated size and power needs.
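A minimal sketch of what that screening could look like, with every field name and threshold invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Application:
    requested_mw: float
    site_acres: float
    has_land_control: bool  # owns, leases, or holds an option on the site
    zoning_allows_dc: bool  # local zoning permits data center use

def screen(app: Application) -> list[str]:
    """Return reasons to send an application back for more information.

    The 10 MW/acre density cap is a made-up sanity check, not an industry rule.
    """
    issues = []
    if not app.has_land_control:
        issues.append("no demonstrated control of the land")
    if not app.zoning_allows_dc:
        issues.append("zoning does not allow a data center")
    if app.site_acres > 0 and app.requested_mw / app.site_acres > 10:
        issues.append("requested power is implausible for the site size")
    return issues

# A speculative 250 MW request on 5 acres with no land control gets flagged.
print(screen(Application(250, 5, has_land_control=False, zoning_allows_dc=True)))
```

Checks this simple won't catch every speculative filing, but they shift the burden of proof onto the applicant rather than the queue.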
Another promising option is to let developers start the process with their own funds. Amazon, Meta, or OpenAI could front the cash for equipment with long lead times, or they could cover the staffing costs of performing the interconnection reviews and studies. Utilities could set the specifications that developers need to meet. Companies willing to make this investment could be rewarded with a fast lane for approval.
Utilities can also rely more on those requesting interconnections for power flow modeling and forecasting demands on their system. There are several tools under development and testing at different scales. Tech companies like Amazon and Google are behind a few of them, likely because of the barriers that they’ve faced to connecting to the wider grid. Amazon Web Services is working with Duke Energy to speed up the power flow studies done before connecting to the local distribution system. Google is developing Tapestry, a platform for projecting energy needs and grid operations. Tapestry’s team is working with Chile to accomplish aggressive decarbonization goals and with AES in Ohio and Indiana on advanced grid modeling projects. New tools like these that enable faster rollout of new generators and better integration of variable sources should be top of mind for utilities looking to attract and enable data center investments.
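To give a sense of what a power flow study involves at its simplest, here is a toy DC power flow on a three-bus network. This is the standard textbook linearization, not a stand-in for the studies AWS, Duke Energy, or Tapestry actually run, and every number in it is invented:

```python
import numpy as np

# Toy DC power flow on a three-bus network. Line susceptances and
# injections are invented per-unit values for illustration only.
lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 5.0)]  # (from bus, to bus, susceptance)
n = 3

# Build the nodal susceptance matrix B.
B = np.zeros((n, n))
for i, j, b in lines:
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b

# Net injections: bus 0 is a generator (and the slack bus),
# bus 2 hosts a new 1.5 pu data center load.
P = np.array([1.5, 0.0, -1.5])

# Solve the reduced system (slack bus angle fixed at zero) for angles.
theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])

# Flows follow from angle differences across each line.
for i, j, b in lines:
    print(f"flow {i}->{j}: {b * (theta[i] - theta[j]):+.2f} pu")
```

Real interconnection studies layer contingencies, reactive power, and thermal limits on top of this, which is exactly why automating them, as AWS and Duke Energy are attempting, can save so much time.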
Utilities also need much more flexible options for interconnecting the data centers in the queue. In 2016, Microsoft set up a deal in Wyoming under which the local utility can call on the 237 MW data center to run on its own backup generation during times of stress on the area's grid. This meant the local utility did not need to build its own generation to serve the data center. Ultimately, that's a win-win for the utility's ratepayers and the data center. These kinds of solutions are far from the norm but have a lot of potential for utilities. Creating interconnection pathways that incorporate a data center's co-located backup generation is a promising way to speed interconnections without creating risks for other consumers.
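A rough sketch of the dispatch logic behind such a deal; the stress signal, trigger threshold, and backup capacity are all hypothetical values invented for the example:

```python
def power_source(grid_stress: float, dc_load_mw: float, backup_mw: float) -> str:
    """Decide where a curtailable data center draws its power.

    grid_stress is a hypothetical 0-to-1 indicator published by the
    utility; the 0.8 trigger threshold is invented for the example.
    """
    if grid_stress >= 0.8 and dc_load_mw <= backup_mw:
        return "on-site backup"  # utility reclaims the grid capacity
    return "grid"

# A 237 MW facility with an assumed 250 MW of on-site backup.
print(power_source(grid_stress=0.9, dc_load_mw=237, backup_mw=250))  # on-site backup
print(power_source(grid_stress=0.3, dc_load_mw=237, backup_mw=250))  # grid
```

The utility gets a dispatchable resource without building a peaker plant; the data center gets a faster interconnection in exchange for tolerating occasional curtailment.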
A final option is to set entrepreneurs free. Glen Lyons and Travis Fisher have a proposal to allow sophisticated customers to set up their own electricity grid to power data centers or other uses. Data centers are an obvious use case for a separate grid, since many developers are already throwing up their hands in exasperation at the existing system and doing it themselves. Allowing a setup like this at scale would, of course, likely require states to tweak their electricity monopoly rules so these ventures could operate apart from the regulated system the utility runs. In Lyons and Fisher's view, such a change would free entrepreneurs to experiment while insulating residential customers from the grid impacts of data centers (exaggerated and misunderstood as they may be).
The call is coming from inside the house
These solutions boil down to a simple catchphrase: policy should allow entrepreneurs to experiment, evaluate, and evolve their strategies. Barriers to that experimentation should have to clear a high bar, because they slow progress.
Return to Data Center Alley. If, as in this case, the challenges were seen years in advance and yet the solution won't materialize for several more years, that's evidence of a broken system. That is, Data Center Alley's struggles are evidence that the AI load growth concerns are a crisis of our own making. In this light, the call is coming from inside the house: bad policy and too much red tape stand in the way of artificial intelligence's development.
For policymakers, the message is clear: build a system that can grow with demand, and you'll attract significant investment. Don't let this crisis continue. We have the knowledge and capability to meet AI's load growth demands—we just need the will to update the system and let it thrive.