The Biological Baseline
Imagine comparing the electricity required to light up a server rack with the twenty years of sandwiches, school runs, and evolutionary trial-and-error required to produce a single adult human. That is exactly the audacious calculus OpenAI CEO Sam Altman is using to defend the energy demands of artificial intelligence.
At the India AI Impact Summit in Delhi on February 20, Altman threw out the standard playbook for evaluating machine learning’s environmental footprint. During a Q&A hosted by The Indian Express, he pushed back against critics who routinely weigh the immense energy loads of frontier AI models against the mere 20 watts a human brain uses to process a thought.
Altman argues this math is completely backwards. Once you account for the total systemic cost of human development, he claims, AI inference is already vastly more efficient than human cognition.
The 17-Gallon Fiction
Before getting philosophical, Altman tackled the viral misinformation surrounding generative AI's physical footprint. The internet loves to claim that a single ChatGPT prompt guzzles 17 gallons of water. Altman didn't mince words, dismissing the viral stat as pure fiction. Those bloated figures rely on outdated data center designs heavily dependent on evaporative cooling—a technique the industry has largely abandoned.
Isolate the actual electrical draw of modern AI inference, and the numbers look entirely different. A typical ChatGPT query burns roughly 0.34 watt-hours of electricity, an average figure, not an exact one. That is about the energy needed to keep a high-efficiency lightbulb illuminated for a couple of minutes, or to run a standard kitchen oven for roughly one second.
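As a sanity check, the comparison holds up on the back of an envelope. A minimal sketch, assuming a 10-watt LED bulb and a 1,200-watt oven element (illustrative wattages, not figures from the article):

```python
# Back-of-envelope check of the per-query energy comparison.
# Appliance wattages below are assumptions for illustration.

QUERY_WH = 0.34                     # reported energy per ChatGPT query
query_joules = QUERY_WH * 3600      # 1 Wh = 3,600 J -> ~1,224 J

LED_WATTS = 10.0                    # assumed high-efficiency bulb
OVEN_WATTS = 1200.0                 # assumed kitchen oven element

bulb_seconds = query_joules / LED_WATTS    # ~122 s, about two minutes
oven_seconds = query_joules / OVEN_WATTS   # ~1 s

print(f"Per query: {query_joules:.0f} J")
print(f"LED bulb runtime: {bulb_seconds / 60:.1f} min")
print(f"Oven runtime: {oven_seconds:.1f} s")
```

Under those assumed wattages, 0.34 Wh lights the bulb for about two minutes and runs the oven for about one second, matching the comparison above.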
The Caloric Cost of Childhood
To make his case for silicon over synapses, Altman laid out a sprawling biological accounting system. "Training" a human takes about two decades. It absorbs twenty years of housing, clothing, and thousands of calorie-dense meals just to reach baseline intellectual maturity.
And that is just the immediate overhead. Altman factors in the evolutionary sunk cost of the hundred billion humans who lived before us—the immense generational energy spent learning to avoid predators and invent basic science. When you ask a human a question today, you are tapping into millennia of evolutionary debt. AI models simply compress that sprawling training phase into a single, massive burst of data center electricity. Once trained, the model hums along at fractions of a watt-hour per query, entirely free from the biological maintenance costs keeping human workers alive.
The Evolutionary Debt
The tech world might love a radical reframing, but ethicists and industry analysts immediately tore into Altman’s biological calculus. The most glaring blind spot? Human evolutionary energy built the very infrastructure keeping Altman's models afloat. The blueprints for early computing systems like ENIAC, and the subsequent decades of grueling semiconductor advancements, were paid for in human sweat and intellect. Comparing AI directly to biological costs conveniently ignores that artificial intelligence is a downstream byproduct of our own evolutionary investment.
But reducing human development to raw caloric math has understandably hit a nerve. Labor advocates and sociologists see the framing as deeply dehumanizing. Equating the caloric needs of a growing child with the gigawatt draw of a hyperscale server farm isn't just a quirky thought experiment—it provides philosophical cover. It justifies the massive diversion of physical resources like land, capital, and power grids away from human-centric needs and directly into machine infrastructure.
The Nuclear Reality
Even as he defends AI's per-query efficiency, Altman doesn't deny the staggering scale of the industry's aggregate power consumption. Plunging computational costs and rapid capability gains are triggering an infrastructure explosion that traditional power grids simply cannot handle.
The projections are impossible to ignore. McKinsey models suggest data centers will swallow a massive 14% of total United States power demand by 2050. Tech giants are already spending accordingly. Meta alone earmarked up to $135 billion in 2026 AI capex to aggressively build out 30 new data centers.
To keep the servers running without blacking out the suburbs, the industry is looking past wind and solar, aiming straight for nuclear base-load generation. Altman’s personal portfolio tells the whole story: his heavy backing of Helion fusion and his former chairmanship at Oklo reveal the practical reality behind his summit rhetoric. Whatever the philosophical math says about brains versus bots, Silicon Valley knows powering the next decade of computation requires a total reconstruction of global energy.
