Data Centers & Energy

Data centers consumed roughly 460 TWh of electricity in 2024 — about 1.7% of global power demand. The IEA projects this could double by 2030 as AI training and inference loads scale. The binding constraint is no longer capital or land but grid interconnection and reliable power supply.
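The doubling projection implies a specific growth rate. A quick sketch (Python; the 460 TWh base and the doubling are the figures from the paragraph above, the rest is arithmetic):

```python
# Implied compound annual growth rate if 2024 demand (460 TWh)
# doubles by 2030, per the IEA projection quoted above.
base_twh, years = 460.0, 6           # 2024 -> 2030
target_twh = 2 * base_twh            # "could double"

cagr = (target_twh / base_twh) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # ~12.2%/yr
```

A doubling over six years works out to roughly 12% compound annual growth, which is the rate grid planners are being asked to absorb.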

- 460 TWh: global data-center electricity demand (2024)
- 1.7%: share of world electricity
- 1.20: average hyperscale PUE
- ≈2×: IEA-projected demand growth, 2030 vs 2024

Key insights

🤖 AI is bending the curve

Through 2018 data-center energy demand was roughly flat — efficiency gains offset traffic growth. The post-2022 shift to large transformer training and inference reverses that. The IEA estimates AI-specific compute could consume 200+ TWh by 2030, more than current data-center demand in any country except the US and China.

🔌 Grid interconnection is the bottleneck

In northern Virginia (the world's largest data-center cluster) Dominion Energy quotes 4–7 year interconnection waits for new gigawatt-scale loads. Similar queues in Dublin, Frankfurt and Singapore have triggered de facto moratoriums on new builds. Microsoft, Amazon, Google and Oracle are signing long-term PPAs for solar+storage and increasingly for nuclear (small modular reactors, restarts of retired plants).

💧 Water use is locally significant

Cooling consumes 1.7 to 9 litres of water per kWh of IT load depending on climate and design. A single hyperscale campus can use as much water as a town of 50,000. Direct-to-chip liquid cooling and air-side economisers reduce water use but increase electricity demand — there's no free lunch.
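To see why a campus can rival a town, the water-intensity range above can be turned into an annual figure. The 100 MW campus size and full utilisation below are illustrative assumptions, not figures from the text:

```python
# Hypothetical 100 MW campus at full utilisation, year round.
# Water intensities (1.7-9 L per kWh of IT load) are the range quoted above.
it_load_mw = 100
annual_kwh = it_load_mw * 1_000 * 8_760   # MW -> kW, times hours per year

for litres_per_kwh in (1.7, 9.0):
    megalitres = annual_kwh * litres_per_kwh / 1e6
    print(f"{litres_per_kwh:>3} L/kWh -> {megalitres:,.0f} ML/year")
```

At these assumptions the range spans roughly 1,500 to 7,900 megalitres a year, a factor-of-five spread driven entirely by climate and cooling design.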

Global data-center electricity demand, 2010–2030 (chart: TWh per year, IEA central case)

Key Finding: Demand was flat 2010–2018 despite massive traffic growth; from 2023 the AI inflection bends the curve sharply upward.

Average PUE, hyperscale vs enterprise, 2015–2026 (chart: Power Usage Effectiveness; lower is better)

Key Finding: Hyperscale operators sit near the practical floor of 1.10–1.20; enterprise data centers lag at 1.6+ on average.

Methodology & caveats

PUE explained

Power Usage Effectiveness = total facility power / IT equipment power. PUE 1.0 is the theoretical lower bound (no overhead). Hyperscalers report 1.10–1.20 at modern sites; the global enterprise average sits near 1.55. PUE is a useful efficiency proxy but ignores water and carbon.
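The definition above is a single ratio; a minimal sketch, with the 10 MW site as a hypothetical example:

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    if it_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_kw

# Hypothetical site: 10 MW of IT load plus 1.5 MW of cooling and overhead.
print(round(pue(11_500, 10_000), 2))  # 1.15, inside the hyperscale 1.10-1.20 band
```

Note that the ratio says nothing about where the electricity comes from or how much water the cooling plant draws, which is why PUE alone understates environmental footprint.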

What 'AI energy use' includes

A single training run of a frontier model can consume 50–100 GWh. Inference (serving queries) is more distributed but in aggregate often exceeds training. Public figures generally cover training only; total AI energy also includes inference, data ingestion, fine-tuning and supporting infrastructure, and that aggregate is poorly measured.
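The training-versus-inference comparison can be made concrete. The per-query cost and serving volume below are purely illustrative assumptions (neither appears in the text); only the 50–100 GWh training range comes from the paragraph above:

```python
# Purely illustrative: shows how aggregate inference can overtake a one-off
# training run. Per-query energy and query volume are assumed, not measured.
training_wh = 75e9        # 75 GWh: midpoint of the 50-100 GWh range above
wh_per_query = 0.3        # assumed inference energy per query (hypothetical)
queries_per_day = 500e6   # assumed serving volume (hypothetical)

days_to_match = training_wh / (wh_per_query * queries_per_day)
print(f"Inference matches the training run after ~{days_to_match:.0f} days")
```

Under these assumptions inference catches up in about 500 days; a heavier per-query cost or larger user base shortens that to months, which is why serving, not training, usually dominates lifetime energy.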

Reporting gaps

Data-center energy is reported voluntarily through programmes like the EU Code of Conduct and operator ESG disclosures. National statistical agencies do not separate the sector cleanly from broader commercial demand. IEA estimates therefore rely on operator surveys plus modelled adjustments — confidence intervals are ±20% at the global level.