Social chatter about AI and power swings between denial and disaster. Both skip the boring middle: electricity is now part of the product boundary, and there is finally a public, model-based read from an authority that exists to track energy systems. In April 2025 the International Energy Agency published Energy and AI, a special report with global and regional modeling plus input from governments, utilities, and tech firms. This post is not a substitute for the full document; it is what I kept after reading the executive summary through a builder's lens.
Scale: small globally, loud locally
The IEA states that data centers accounted for about 1.5% of world electricity use in 2024 (415 TWh), with the United States at roughly 45% of that, China 25%, and Europe 15%. Consumption has grown at nearly 12% per year since 2017, well above the growth rate of total power demand. None of that makes data centers the only story on the grid, but it explains why certain metro clusters suddenly care about transformer lead times.
By 2030, the same analysis puts data center demand at about 945 TWh, slightly above Japan's total electricity use today. AI is the main driver of that rise, alongside other digital load. In the United States, data centers account for nearly half of expected electricity demand growth through 2030; by decade's end the country could use more power for data centers than for aluminium, steel, cement, chemicals, and the rest of energy-intensive industry combined (again per the IEA base case, not a thread on X).
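Those two point estimates imply an acceleration worth checking for yourself. A quick sketch: the 415 TWh (2024) and 945 TWh (2030) figures are the IEA's; the compound-growth arithmetic is mine.

```python
# Implied compound annual growth rate from the IEA's point estimates:
# ~415 TWh in 2024 -> ~945 TWh in 2030 (base case).
base_2024_twh = 415
base_2030_twh = 945
years = 2030 - 2024

cagr = (base_2030_twh / base_2024_twh) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")
```

That works out to roughly 15% per year, faster than the ~12% historical rate since 2017, which is the quiet claim buried in the headline numbers.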
Grids and timelines matter more than slogans
The report is blunt about bottlenecks: long interconnection queues, multi-year transmission builds, stretched equipment suppliers. It estimates that roughly 20% of planned data center projects could face delays unless grid risks are managed. In the IEA's projections, half of the growth in data center demand through 2035 is met by renewables; dispatchable capacity (notably gas) and nuclear also play named roles. The point for software people is not to pick favorites in that mix; it is that hosting geography and contract shape are now strategic, not administrative.
Uncertainty and efficiency are engineering knobs
The IEA runs scenarios. In their high efficiency case, stronger gains in hardware and model efficiency push 2035 data center demand to about 20% below the base case. Across cases the band for 2035 spans roughly 700 to 1,700 TWh. That is not trivia for futurists; it is the difference between "we can ship bigger batches" and "we throttle every afternoon" if your team ignores power and cooling envelopes.
I read that the same way I read latency and dollar budgets: a constraint you design for up front, not after the rack is full.
Why this sits next to your other AI posts
None of this replaces threat modeling, protocol choice, or harness design. It is the physical floor underneath all of them. If you already think in cost tiers and fallbacks for model calls, add site power and fleet efficiency to the same spreadsheet. The IEA publication is the closest thing today to a shared reference for arguing about that floor with finance and facilities without relying on recycled infographic numbers.
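What "add site power to the spreadsheet" looks like in practice can be sketched in a few lines. Every figure below is a hypothetical placeholder, not IEA data; substitute your own rack envelope, utilization, site PUE, and contracted tariff.

```python
# A minimal sketch of putting site power next to model-call costs.
# All numbers are hypothetical placeholders -- plug in your own.
rack_power_kw = 12.0    # nameplate power per rack (hypothetical)
utilization = 0.65      # average draw as a fraction of nameplate
pue = 1.3               # site power usage effectiveness (cooling overhead)
price_per_kwh = 0.11    # contracted electricity price, USD (hypothetical)
hours_per_month = 730

kwh_per_rack_month = rack_power_kw * utilization * pue * hours_per_month
cost_per_rack_month = kwh_per_rack_month * price_per_kwh
print(f"{kwh_per_rack_month:,.0f} kWh -> ${cost_per_rack_month:,.0f}/rack/month")
```

Multiply by fleet size and the line item stops looking administrative, which is the report's point restated in your own numbers.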
If you want one line to remember: there is no AI without electricity. The open question is whether teams plan for that constraint before the grid planner calls them, not after.