What the AI Revolution Means for the Future of Energy

An age of convergence of bits and atoms has just begun.

“Eventually, the cost of intelligence, the cost of AI, will converge to the cost of energy. How much you can have—the abundance of it—will be limited by the abundance of energy. So in terms of long-term strategic investments for the U.S. to make, I can’t think of anything more important than energy.”

That observation, made on May 8, 2025, in testimony before the Senate Energy and Natural Resources Committee, didn’t come from an energy executive. It came from Sam Altman, CEO of OpenAI, the firm that developed ChatGPT and launched the age of artificial intelligence.

While predictions that AI will “change everything” are often overwrought, few doubt the technology’s significance. One of its most credible promises, as Altman put it, is to unlock “great productivity gains.” And productivity is what drives long-term economic growth.

Altman is far from alone in observing that deploying AI at scale will lead to massive increases in electricity demand. The same conclusion appears in an ever-expanding array of reports from government analysts, investment banks, and news organizations, including, to note only a handful, those from the Federal Energy Regulatory Commission, the International Energy Agency (IEA), Morgan Stanley, Goldman Sachs, the Washington Post, and the New York Times. Every such report acknowledges an unavoidable reality: the digital economy runs on hardware, and hardware consumes a lot of energy.

AI’s massive appetite for power should finally put to rest the trope that we’re transitioning from an “old” economy of atoms to a “new” one of bits. That idea, and the phrase “from atoms to bits,” originated with Nicholas Negroponte, co-founder of the MIT Media Lab, in his book Being Digital, published 30 years ago.

While Negroponte presciently predicted many of the effects of the digital revolution, his framing that “bits are weightless and virtual and allow for instant global movement” created a misdirection. All bits exist in physical machines that have real weight and require real energy to build and operate. The astronomical number of bits produced, moved, manipulated, and stored entails truly staggering quantities of hardware.

Nowhere is that more evident than with the latest giga-scale data centers. Building just one requires about 200,000 tons of concrete, 100,000 tons of steel, and 10,000 miles of power cables: more materials than are used to build a skyscraper. (Then there are the hundreds of tons of silicon microprocessors, costing billions of dollars, per data center.) Once built, operating that single data center consumes as much natural gas every day as a single SpaceX rocket launch. Converting all that natural gas into electrons, of course, requires megatons of machinery. Generating that electricity with solar panels instead entails a radical increase in the tonnage of materials and machinery.

Just how much material and energy will be needed to power an AI-infused future is one of the questions of the decade. The digital cognoscenti have long known that building and sustaining information infrastructure uses massive amounts of energy; now, they must add AI to that accounting. The recent IEA study on the subject, a 300-page tome titled “Energy and AI,” notes that a single large AI data center can use as much electricity as 2 million households. Lighting up digital infrastructures will soon entail demand equivalent to reliably powering hundreds of millions of new households.
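For a sense of scale, the household comparison is simple arithmetic. A minimal sketch, assuming a multi-gigawatt AI campus, a 90 percent load factor, and roughly 10,500 kilowatt-hours of annual use per U.S. household (all illustrative assumptions, not figures taken from the IEA report):

```python
# Back-of-the-envelope only; every input below is an illustrative assumption.
CAMPUS_POWER_GW = 2.5            # assumed draw of a very large AI data-center campus
LOAD_FACTOR = 0.90               # assumed average utilization of that capacity
HOUSEHOLD_KWH_PER_YEAR = 10_500  # assumed annual electricity use per U.S. household
HOURS_PER_YEAR = 8_760

campus_kwh_per_year = CAMPUS_POWER_GW * 1_000_000 * LOAD_FACTOR * HOURS_PER_YEAR  # GW -> kW -> kWh/yr
household_equivalents = campus_kwh_per_year / HOUSEHOLD_KWH_PER_YEAR
print(f"~{household_equivalents / 1e6:.1f} million household-equivalents")  # prints roughly 1.9 million
```

Change the assumed campus size or household figure and the result moves, but a single multi-gigawatt campus lands in the millions of households either way.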

But set aside the forecasts. Last year, the U.S. already saw nearly 7,000 MW of new data center construction. (For context, New York City’s peak summer power demand is about 10,000 MW.) That 2024 build rate was roughly double the 2023 rate and more than five times the annual average of the prior decade, the era that marked the beginning of the cloud-infrastructure buildout. The trillion-dollar question is how much more will get built in the next half-dozen years.

The leading Big Tech firms have announced a staggering $300 billion in AI infrastructure spending this year alone. BlackRock forecasts that annual spending could approach $1 trillion by 2030. No one knows whether the real number will be higher or lower, but even if spending flatlines in the $300 billion-to-$400 billion-per-year range, the power implications are sobering. Every $100 billion spent on new data centers will result in something like $100 billion spent on power over one decade of operation. That reality doubtless animated Altman’s observation about the convergence of the costs of energy and AI.
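The arithmetic behind that rule of thumb is rough but instructive. A minimal sketch, assuming roughly $12 million of capital cost per megawatt of IT capacity, 80 percent utilization, a 1.3 power-usage-effectiveness overhead for cooling and distribution, and delivered electricity at $100 per megawatt-hour (all illustrative assumptions, not figures from the article):

```python
# Back-of-the-envelope only; every input below is an illustrative assumption.
CAPEX_USD = 100e9        # the data-center construction spend under consideration
CAPEX_PER_MW = 12e6      # assumed all-in build cost per MW of IT capacity
UTILIZATION = 0.80       # assumed average utilization of that capacity
PUE = 1.3                # assumed overhead for cooling and power distribution
PRICE_PER_MWH = 100.0    # assumed delivered electricity price, dollars per MWh
HOURS_PER_YEAR = 8_760
YEARS = 10

capacity_mw = CAPEX_USD / CAPEX_PER_MW                           # roughly 8,300 MW of IT load
mwh_per_year = capacity_mw * UTILIZATION * PUE * HOURS_PER_YEAR  # annual consumption
power_spend = mwh_per_year * PRICE_PER_MWH * YEARS
print(f"~${power_spend / 1e9:.0f} billion on power over {YEARS} years")  # prints roughly $76 billion
```

Nudge the assumed power price or utilization and the total swings, but it stays on the same order of magnitude as the construction bill, which is the point of the rule of thumb.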

How much power AI will consume will depend on the intersection of two trends: the growth in demand for bits and the pace of energy-efficiency gains in digital hardware.

Some forecasters dismiss predictions of soaring AI power use as hype, arguing that efficiency gains are coming fast and will tamp down rising power demand. They’re half right. Energy efficiency is improving rapidly—but that’s exactly why demand will grow.

We’ve seen this movie before. The Internet, smartphones, and all the businesses built atop them owe their existence to extraordinary gains in compute-energy efficiency. If a smartphone operated at 1984 energy-efficiency levels, it would consume more electricity than a city block. A single data center would require the entire U.S. grid. It was exponential efficiency gains that enabled today’s world of billions of smartphones and thousands of data centers—hardware that, collectively, now uses more electricity than the country of Japan.

When it comes to the prospect for vastly more efficient AI chips and data centers, history won’t just rhyme—it will repeat. This time, the efficiency gains are coming faster, meaning we’ll see a net increase in power demands sooner.

Society’s demand for harvesting and consuming bits—the other variable—is the prime driver of all things digital. The nature of demand for bits puts us in unique territory compared with demand in the world of atoms. In the domain of atoms (buildings, food, cars), there are clear limits to how much society can consume as wealth rises. But in the domain of bits, of information, there are no such bounds.

There is no limit to how much we want to and need to know about everything in our society, in our infrastructures, in our machines, in our bodies, or in nature. The declining cost and increasing sophistication and sensitivity of sensors enable data acquisition at ever-deeper levels of granularity, at greater volume and frequency. Society’s consumption of information will continue to expand exponentially.

None of this is the result of government incentives or diktats. Rather, it’s driven by the fact that data can be used to improve lives, whether in the serious businesses of health care and supply chains or in ostensibly frivolous uses in travel and entertainment. In short, it’s about the unlimited demand to improve the productivity of everything.

Google recently issued its own assessment and roadmap regarding the implications of AI and the power needed to sustain it. Google asserts that AI presents the United States with “a generational opportunity to build a new era of American innovation and growth.” To that, I say, “Amen.” If AI delivers on its promises of a productivity boom, the consequences will be tectonic.

If AI boosts the annual U.S. productivity growth rate merely to the long-run average since 1950—i.e., higher than the current anemic rate assumed in all government forecasts—that would add a cumulative $10 trillion above projections to U.S. GDP over the coming decade. That added growth would go a long way toward solving many so-called intractable problems, not least the deficit.
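The order of magnitude is easy to check with compound growth. A minimal sketch, assuming a starting GDP of roughly $28 trillion, a 1.8 percent baseline growth rate, and a 2.5 percent rate with a productivity revival (my illustrative assumptions, not official projections):

```python
# Back-of-the-envelope only; every input below is an illustrative assumption.
GDP_TODAY = 28e12        # assumed current U.S. GDP, in dollars
BASELINE_GROWTH = 0.018  # assumed growth rate embedded in official forecasts
BOOSTED_GROWTH = 0.025   # assumed growth rate with an AI-driven productivity revival
YEARS = 10

# Sum, year by year, the gap between the boosted GDP path and the baseline path.
cumulative_gain = sum(
    GDP_TODAY * ((1 + BOOSTED_GROWTH) ** t - (1 + BASELINE_GROWTH) ** t)
    for t in range(1, YEARS + 1)
)
print(f"~${cumulative_gain / 1e12:.0f} trillion of extra GDP over {YEARS} years")  # prints roughly $12 trillion
```

Less aggressive assumptions shrink the gap, but a fraction of a percentage point of extra growth, compounded for a decade, plausibly lands in the $10 trillion range.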

If that materializes, it will also have another, largely ignored, effect: it will spur growth in energy demand. Wealthier people buy bigger homes, travel more, and spend more on all kinds of products and services. People in enervated economies do the opposite.

That brings us back to Altman’s thesis on the convergence of energy and AI. At a recent conference, an Nvidia executive said that the tech community wants “all options on the table” because at “the end of the day, we need power. We just need power.” We are witnessing the end of the past decade’s monomaniacal obsession with wind and solar as the only options.

Artificial intelligence is the latest and most dramatic example of the truism that history rhymes. Analogies are never perfect, but a comparable pivot came 150 years ago, with the dawn of what was then termed “artificial illumination.” Light bulbs are no longer seen as quasi-magical, but both early bulbs and modern LEDs produce photons that are indeed weightless. Yet today, roughly one-fifth of the world’s electricity is used to produce those weightless photons.

The IEA estimates that the world’s data centers today use barely 2 percent of global electricity, a share it forecasts will double by 2030. But because bits have far more uses than photons, it’s a safe bet that the energy used for artificial intelligence will eventually outpace that used for artificial illumination. The age of convergence of bits and atoms has just begun.

 

*  Mark P. Mills is a City Journal contributing editor and the executive director of the National Center for Energy Analytics.