The AI productivity narrative has a blind spot the size of a data center. Everyone's talking about 10x output. Nobody's talking about what it costs: in kilowatt-hours, in water, and in the human energy you burn to keep up with a machine that never gets tired.

Two pieces of writing, published within weeks of each other, frame the problem from opposite ends. Simon Willison quantified the electrical cost. Steve Yegge named the human one. Together, they describe a paradox that every company adopting AI coding tools is about to discover the hard way.

## The Electrical Bill Nobody's Reading

Simon Willison, the Django co-creator and open-source veteran, looked at what AI coding agents actually consume. Not the per-query napkin math that ChatGPT users cite, but the real cost of agentic coding, where tools like Claude Code burn through thousands of tokens across dozens of tool calls per task.

Simon P. Couch ran the numbers on his own usage. The result: approximately 4,400 typical LLM queries' worth of computation per day, at roughly $15-20 in API costs. In energy terms, that's equivalent to running a dishwasher once or keeping a refrigerator running for a day.

That sounds manageable until you multiply it. The IEA projects global data center electricity consumption will hit 1,050 TWh by 2026. U.S. data centers alone consumed 183 TWh in 2024 and are projected to reach 426 TWh by 2030, a 133% increase. AI data centers generated 105 million metric tons of CO2 in the twelve months ending August 2026, surpassing the aviation industry's carbon footprint.

And the water. U.S. data centers consumed 17 billion gallons for cooling in 2023. That figure could quadruple by 2028.

The dishwasher comparison is reassuring at the individual level. At civilization scale, we're building the most energy-intensive industry in history and calling it efficiency.

## The Human Vampire

Steve Yegge's "AI Vampire" identifies the cost that doesn't show up on any utility bill.
He borrows from "What We Do in the Shadows": Colin Robinson, the energy vampire who drains people just by being in the same room. That's what AI does to developers. Not through malice or design, but through the sheer cognitive intensity of directing a system that can execute faster than you can think.

Yegge's central finding: building with AI at full intensity is sustainable for about 3-4 hours per day. Not because the tools stop working, but because the human does. The concentration required to properly direct, review, and iterate with an AI agent (catching its mistakes, maintaining architectural coherence, making judgment calls it can't) is exhausting in a way that traditional coding never was.

The productivity paradox: AI makes you dramatically more productive per hour, but you can sustain fewer productive hours. The net output might be higher. The experience of the workday is fundamentally different.

## The Value Extraction Problem

Here's where the vampire metaphor turns from clever to structural. Companies see the 10x output number. They don't see the 3-4 hour ceiling. So they expect developers to maintain AI-boosted intensity across an 8-hour day, an impossibility that leads to the kind of burnout that doesn't announce itself until it's too late.

Yegge's recommendation is radical by corporate standards: change expectations about how many hours constitute a workday. If AI genuinely delivers 10x per hour, then a 4-hour workday at AI intensity produces more than an 8-hour day without it. The math works. The culture doesn't.

This is the pattern that repeats across every productivity revolution. The gains accrue to the organization. The costs accrue to the individual. The AI vampire doesn't drain your electricity. It drains you.

## The Infrastructure Paradox

Zoom out further and the energy story gets stranger. Morgan Stanley is telling clients to invest in AI energy infrastructure: the companies building power capacity for data centers.
Goldman Sachs found that massive AI investment contributed "basically zero" to U.S. economic growth. The market is simultaneously betting that AI needs more energy and questioning whether AI produces anything worth the energy it consumes.

In Ireland, data centers could consume 32% of national electricity by 2026. In Virginia, they already take 26%. These aren't projections from AI doomers. They're from the International Energy Agency.

The AI industry's answer is nuclear. Every major lab and cloud provider is exploring nuclear power deals. The industry that promised to make everything more efficient is now proposing to build nuclear reactors to power itself.

## The Contrarian Position

The honest position is uncomfortable: AI coding agents are genuinely transformative, genuinely energy-intensive, and genuinely draining to use at full capacity.

The dishwasher comparison cuts both ways. Yes, one developer's daily AI usage equals one dishwasher cycle. But we don't have 50 million dishwashers spinning up simultaneously in data centers that need nuclear reactors to stay online.

The human vampire cuts deeper. The productivity gains are real; Willison, Yegge, and every developer who's shipped with Claude Code or Cursor can attest to that. But the assumption that humans can sustain AI-augmented intensity indefinitely is the kind of thinking that precedes every burnout epidemic in tech history.

The companies that figure this out (shorter, more intense workdays with genuine recovery time) will keep their best developers. The ones that treat AI productivity as a reason to demand more hours will lose them.

The vampire doesn't care about your quarterly targets.
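An appendix for the skeptics: the two back-of-envelope claims in this piece fit in a few lines. This is a minimal sketch, not anyone's published model. The query count comes from Couch's estimate cited above; the per-query energy figure and the dishwasher-cycle figure are rough assumptions of mine, not numbers from Willison, Couch, or Yegge.

```python
# Back-of-envelope math for the energy claim and the workday claim.

QUERIES_PER_DEV_DAY = 4_400  # Couch's estimate of agentic usage, per the text
WH_PER_QUERY = 0.3           # ASSUMPTION: ~0.3 Wh per typical LLM query
DISHWASHER_WH = 1_500        # ASSUMPTION: ~1.5 kWh per dishwasher cycle

# One developer-day of agentic coding, in dishwasher cycles.
dev_day_wh = QUERIES_PER_DEV_DAY * WH_PER_QUERY
print(f"One developer-day: {dev_day_wh / 1000:.2f} kWh, "
      f"about {dev_day_wh / DISHWASHER_WH:.1f} dishwasher cycles")

# The workday math: 10x output per hour, sustainable for 4 hours,
# versus baseline output across a full 8-hour day.
ai_day_output = 10 * 4
baseline_day_output = 1 * 8
print(f"4h at AI intensity: {ai_day_output} units; "
      f"8h without: {baseline_day_output} units")
```

Under these assumptions the individual numbers land where the essay says they do: under one dishwasher cycle per developer-day, and a 4-hour AI-intensity day that out-produces an 8-hour baseline day several times over. The scaling problem only appears when you multiply the first line by millions of developers.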