By Thomas Gorbold
Artificial Intelligence (AI), the fastest-growing technology in history, and the data centres that support it are hitting a wall made of steel, concrete and copper: the power grid.
In the United States, AI-related spending accounted for more than two-thirds of annualised GDP growth in the first half of the year, leaving the rest of the economy barely moving. Much of it is going into the data centre infrastructure needed to run AI systems, and demand is rising faster than grids can keep up.
Data centres run 24/7, draw as much electricity as small cities and require constant power in a way renewable energy cannot yet guarantee. What appears to be a race to build better AI models is, underneath, a race to secure the energy needed to power and cool these facilities. Estimates suggest the global build-out of AI and data centre infrastructure could reach $5-7 trillion over the next five years, with roughly $720 billion needed for the new power infrastructure required to keep them running.
JP Morgan expects around 122GW of new data centre capacity to be added between 2026 and 2030. To understand the true electricity required, analysts use PUE (Power Usage Effectiveness), a measure of how much extra power a data centre needs beyond the computing load itself. A perfectly efficient facility would have a PUE of 1.0, meaning all electricity goes to computing. In practice, modern data centres typically operate around 1.2-1.4, which accounts for the additional energy needed for cooling, lighting and electrical systems.
Applying a PUE of 1.3 to the projected 122GW of computing load brings the total requirement to nearly 160GW. That is a block of new electricity demand roughly equal to Japan's entire national consumption, appearing within a few years. Goldman Sachs forecasts that global data centre electricity demand could rise by 165% by 2030 as AI systems become more intensive.
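The arithmetic behind that figure is simple enough to sketch. The function below is purely illustrative; the 122GW load and the 1.3 PUE are the estimates cited above.

```python
# Back-of-envelope: total grid demand implied by a projected computing load.
# PUE = total facility power / IT (computing) power, so total = IT load x PUE.

def total_power_gw(it_load_gw: float, pue: float) -> float:
    """Total facility power, in GW, for a given IT load and PUE."""
    return it_load_gw * pue

projected_it_load_gw = 122   # J.P. Morgan's projected new computing load, 2026-2030
typical_pue = 1.3            # modern facilities typically run around 1.2-1.4

print(round(total_power_gw(projected_it_load_gw, typical_pue), 1))  # 158.6
```

A PUE of 1.3 means that for every watt of computing, another 0.3 watts go to cooling, lighting and electrical losses, which is why the grid-level figure lands near 160GW rather than 122GW.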
But how do we power that? The problem is not funding or ambition, but time. New electricity capacity cannot be built at the speed technology companies expand.
These constraints are already visible. In Silicon Valley, two completed data centre campuses stand idle because the local grid cannot supply the power they need, and they will remain so until new transmission capacity arrives. China faces fewer such bottlenecks: its onshore wind and solar projects can be built in roughly two years, compared with five years or more in the US, giving China's digital infrastructure a clear time-to-power advantage.
China is now by far the largest energy investor in the world, spending almost as much as the EU and the United States combined. With fewer electricity constraints, China can scale AI by simply adding more hardware, even if those chips are older versions and less efficient. The United States, by contrast, must squeeze more performance out of each generation of processors because grid capacity cannot grow at the same speed.
With efficiency improvements levelling off, the US is constrained by its grid while China can continue expanding power supply far more quickly.
Gas on the rise
Electricity demand from AI is rising faster than most grids can expand. New transmission lines are slowed by planning processes and local objections. Nuclear projects typically take more than a decade to build, coal is being phased out, and renewables without feasible battery storage cannot sustain the steady, high-intensity loads demanded by large AI facilities. This has pushed companies toward on-site power generation, with many new data centre projects planning their own gas-fired turbines because grid connections often take years to secure. Natural gas remains one of the few energy sources that can be deployed quickly while reliably covering the continuous baseload these facilities require.
Natural gas is therefore becoming the default bridge fuel in the AI build-out. Major oil-and-gas producers argue that meeting the surge in electricity demand will require all available energy sources, including gas, as countries compete to attract data centre investment. Gas is cleaner than coal, faster to install than nuclear and more dependable than wind and solar without large-scale storage. It is also cheaper to build at scale: EIA data shows that the average construction cost of a natural gas plant is roughly $820,000 per megawatt, compared with $1.45 million for wind and $1.6 million for solar. Given its lower cost and ability to operate continuously, gas remains more practical than wind or solar for meeting today’s data centre needs, which renewables alone cannot yet supply cost-effectively.
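Those per-megawatt figures can be scaled up to show what the gap means at the size of the build-out. The sketch below applies the EIA costs cited above to the roughly 160GW of PUE-adjusted demand estimated earlier; it is an illustrative nameplate comparison only, and deliberately ignores capacity-factor differences (gas runs continuously, wind and solar do not), which understates the true renewables gap.

```python
# Illustrative nameplate construction costs, using the EIA per-MW figures
# cited in the text. Capacity factors are ignored: this compares build cost
# only, not delivered energy.

COST_PER_MW = {"gas": 820_000, "wind": 1_450_000, "solar": 1_600_000}

def build_cost_billions(capacity_gw: float, source: str) -> float:
    """Nameplate construction cost in billions of dollars."""
    return capacity_gw * 1_000 * COST_PER_MW[source] / 1e9

for source in COST_PER_MW:
    print(f"{source}: ${build_cost_billions(160, source):.0f}bn")
# gas: $131bn, wind: $232bn, solar: $256bn
```

Even before accounting for intermittency, building the same nameplate capacity from wind or solar costs roughly 1.8 to 2 times as much as gas, which is part of why gas has become the default bridge fuel.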
Emissions from major operators reflect the scale of this shift. Microsoft reported an emissions increase of about a third in 2023, and Google's emissions grew nearly 50 per cent between 2019 and 2023 as data centre activity expanded. In parallel, climate goals are increasingly being delayed or softened as countries prioritise reliable power and competitiveness in the AI race. In the long run, AI could help reduce emissions in other sectors by optimising power systems, industrial processes and transport. For now, though, the energy required to run advanced models is adding to the carbon footprint of the tech sector.
Local resistance is growing in areas where data centre development is accelerating. In several parts of the United States, county authorities have paused new projects over concerns about how rising electricity demand will affect local grids and future bills. Data centres now use about 5 per cent of the United States' electricity, up from 2 per cent a decade ago, and their share could approach 10 per cent by 2030. Supporting this growth requires new substations, transformers and transmission lines, with the expense often absorbed through general rate structures. In practice, the debt taken on to expand local networks is often passed on to local consumers as higher energy bills. Higher energy costs raise operating expenses, weaken local competitiveness and cut into data centre margins, making further development more expensive in those areas.
Who wins the AI race?
The power grid was built for steady demand and is not ready for the surge created by AI. If the United States wants to compete with China, it will need to upgrade and, more importantly, optimise the system. Many parts of the grid still hold latent capacity that smarter coordination and new smart-grid solutions could bring into use. This is a clear opening for technology companies. Their ability to innovate at scale, combined with a more flexible regulatory environment, could modernise a power sector that has changed little in decades. If the US can streamline policies and let competitive markets drive new grid technologies, it will be better placed to outpace China in the next phase of the AI race.
AI may reshape economies but deploying it at scale will still come down to how fast nations can produce, transmit and afford reliable energy. Those that fall behind risk slower productivity gains and greater dependence on foreign technology. Power grids may end up determining who leads the next era of global growth.
The views expressed in this article are the author’s own and may not reflect the opinions of The St Andrews Economist.
Image Credit: PickPik

