
Every industry has its favourite shorthand for what AI feels like right now. The default answer is the dot-com bubble: inflated valuations, money appearing and disappearing, companies stapling a buzzword to their pitch deck and calling it a strategy. It is a tidy comparison, and it is not entirely wrong. But it may be leading executives, strategists, and creative leaders to the wrong conclusions about what actually needs to change.
Data thinker and Substack writer Joe Reis published an essay in April 2026 arguing that the dot-com frame is a distraction. The analogy that actually maps to the current AI moment, he writes, is electrification, specifically the period around 1905, not the roaring productivity boom of the 1920s. That gap between motor and outcome is the whole argument.
The Research Behind the Frame
Reis grounds his case in work by the late Stanford economist Paul David, whose landmark 1990 paper "The Dynamo and the Computer" traced why electric power took so long to lift productivity. Electric motors arrived in factories in the 1880s. Meaningful productivity acceleration did not follow until the 1920s. David and fellow Stanford economic historian Gavin Wright later quantified the discontinuity: trend productivity growth rose from 1.5% per year over 1899–1914 to 5.1% per year over 1919–1929. Same technology in both periods, forty years apart in outcome. The motor was never the problem.
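To see how stark that discontinuity is, it helps to compound the two trend rates over a single decade. The sketch below is an illustrative calculation, not part of Reis's essay or the David–Wright dataset; it simply assumes each rate holds steady for ten years.

```python
# Compound the two trend productivity growth rates over a decade
# to illustrate the scale of the 1899-1914 vs 1919-1929 discontinuity.

def decade_growth(annual_rate: float, years: int = 10) -> float:
    """Cumulative growth factor from a constant annual rate."""
    return (1 + annual_rate) ** years

pre_war = decade_growth(0.015)   # 1.5% per year, 1899-1914 trend
twenties = decade_growth(0.051)  # 5.1% per year, 1919-1929 trend

print(f"Old trend, ten years:  +{(pre_war - 1):.0%}")   # roughly +16%
print(f"1920s trend, ten years: +{(twenties - 1):.0%}") # roughly +64%
```

At the old trend, a decade of the same motors delivers about a sixth more output; at the 1920s trend, nearly two-thirds more. The gap is the rebuilt factory, not the technology.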
The Real Constraint Was the Building
Factory owners in the 1880s and 1890s swapped steam engines for electric motors and left everything else untouched. Multi-story buildings, central power shafts, belt-and-pulley systems running to every floor. The new motor sat inside the old architecture. Real gains arrived only when factories were demolished and rebuilt: single-story layouts, individual motors per machine (the arrangement historians call "unit drive"), workflows redesigned around electricity rather than steam constraints. Reis identifies this shift as the hinge moment. The layout changed. The results followed.
Data Has Been Running the Same Play for Decades
Reis traces the same failure pattern across the history of data infrastructure. Moving an on-premises warehouse to the cloud while keeping the same data models and batch ETL pipelines is a motor swap with the old factory left standing. Hadoop added a second motor to the same layout and created more complexity without changing enterprise decision-making. The modern data stack replaced ETL with ELT and monoliths with modular tooling, but the organisational reality stayed constant: a centralised team, a centralised pipeline, a centralised consumption model. Shinier equipment, same factory floor from the 1990s.
The Copilot Mandate Is Not Electrification
Reis describes a pattern he hears repeatedly inside large enterprises. Executives purchase Copilot subscriptions, adoption is minimal, a top-down mandate follows, and suddenly everyone reports using AI. He frames this as "the 2026 equivalent of bolting an electric motor onto your steam engine and declaring you've electrified the factory." He extends the critique to code migration projects that use AI to rewrite COBOL into Java or move legacy pipelines to modern tooling. Those efforts produce measurable wins while encoding every assumption of the old architecture into the new codebase. He calls it paving a cow path with better asphalt.
The 95% Failure Rate Explained
Reis cites a finding from the MIT NANDA report: roughly 95% of enterprise GenAI pilots fail to deliver measurable P&L impact. His reading of that number is not that AI is overhyped. It is that most organisations are still bolting motors onto old layouts. The technology functions. The architecture and organisational design will not accommodate it. The analytics request queue, he argues, is the modern equivalent of the central power shaft. One data team, one warehouse, a belt system of dashboards distributing insight to downstream consumers. The constraint is structural, not technical.
Who Actually Rebuilds the Factory
David's research pointed to a telling social dynamic: the factory owners who finally redesigned in the 1920s were often new entrants or next-generation managers who had never internalised the old layout as normal. Reis applies that directly to AI. Today's data leaders grew up in the warehouse paradigm. He estimates the real transformation timeline at 10 to 20 years, not 40, partly because the physical barrier of demolishing brick buildings does not apply, and partly because David's paper and Carlota Perez's work on techno-economic paradigm shifts are widely read. The trap is visible. Stepping out of it is still slow.
The essay, published on April 19, 2026, through Reis's Substack newsletter, attracted attention for anchoring a structural argument in a specific historical dataset rather than sentiment or loose analogy. The productivity numbers from David and Wright give the claim a concrete anchor that most AI commentary lacks.
If the pattern holds, the organisations that close the gap between motor and outcome may not be today's incumbents at all. Reis suggests large companies could be eroded by smaller competitors who build the single-story factory from scratch, having no prior layout to defend. Whether creative and media agencies, which have their own version of the centralised pipeline problem, move faster than traditional enterprise data teams is an open question. The argument implies that the agencies already rethinking workflow and decision structure, rather than layering AI onto existing production pipelines, may be the ones who end up on the right side of the productivity gap.