Jan 10, 2026
6 min read
We have a problem. The latest foundation models are smarter, faster, and more capable than ever. They are also power-hungry: the energy consumption of global AI datacenters has surpassed that of the entire aviation industry.
Jevons Paradox is in full effect: as AI becomes more energy-efficient per token, we find exponentially more uses for it, driving total consumption up.
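The dynamic can be sketched with a toy model (all numbers below are hypothetical, chosen only to illustrate the shape of the curve): if energy per token halves each hardware generation but token demand triples, total energy still climbs.

```python
# Toy illustration of Jevons Paradox applied to AI inference.
# Every value here is made up for illustration; only the trend matters.

def total_energy(generations: int,
                 base_tokens: float = 1e12,      # tokens served in gen 0 (hypothetical)
                 base_j_per_token: float = 1.0,  # joules per token in gen 0 (hypothetical)
                 efficiency_gain: float = 2.0,   # energy/token halves each generation
                 demand_growth: float = 3.0) -> float:  # demand triples each generation
    """Total joules consumed in a given hardware generation."""
    tokens = base_tokens * demand_growth ** generations
    j_per_token = base_j_per_token / efficiency_gain ** generations
    return tokens * j_per_token

for gen in range(4):
    # Per-token efficiency improves every generation, yet the total rises.
    print(f"gen {gen}: {total_energy(gen):.2e} J")
```

Efficiency per token improves by 2x per generation, but because demand grows 3x, total consumption grows by 1.5x each generation anyway.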
The recent breakthroughs in commercial fusion power (specifically the ignition event at NIF-Commercial) offer a glimmer of hope, and several tech giants have already signed exclusivity deals with fusion startups.
In the meantime, "Green AI" is the new buzzword. Developers are optimizing for "inference-per-watt," and new hardware architectures are ditching the general-purpose GPU for highly specialized, low-power ASICs (Application-Specific Integrated Circuits).
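The core metric is simple enough to compute yourself. A minimal sketch, assuming the common definition of inference-per-watt as tokens generated per joule consumed (the hardware figures below are hypothetical, not vendor specs):

```python
# Minimal sketch of an "inference-per-watt" comparison.
# Numbers are hypothetical placeholders, not real benchmark results.

def tokens_per_joule(tokens: int, avg_power_watts: float, seconds: float) -> float:
    """Tokens generated per joule of energy consumed (higher is greener)."""
    joules = avg_power_watts * seconds  # watts * seconds = joules
    return tokens / joules

# Hypothetical workload: 10k tokens on a general-purpose GPU vs. a low-power ASIC.
# The ASIC is slower but draws far less power.
gpu  = tokens_per_joule(tokens=10_000, avg_power_watts=700, seconds=20)
asic = tokens_per_joule(tokens=10_000, avg_power_watts=60,  seconds=40)

print(f"GPU : {gpu:.3f} tokens/J")
print(f"ASIC: {asic:.3f} tokens/J")
```

The tradeoff the metric captures: raw speed no longer wins if it costs ten times the energy per token.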