Welcome to the age of industrial code
An industry of artisans tries to brute force its way into running factories
Over the last 10 years, I've baked somewhere between 500 and 1,000 loaves of sourdough bread. In addition to consuming the end result, I take immense pleasure from the slow, deliberate, and hands-on nature of the craft, as well as the now-intuitive skills I've gained by pursuing it for so long. I've gotten good enough that I can bake many different types of bread entirely by feel, without measuring ingredients or glancing at a clock.
Those of you in the tech world will read this with a knowing grin. I'm far from alone when it comes to technologists turning to baking as a pastime. Much of coding, too, is best understood as an artisanal craft, and many elements of it, particularly the acquisition of intuitive knowledge of a codebase over long periods of time, bear a striking similarity to baking with sourdough. It should come as no surprise that so many engineers have embraced baking as a hobby that translates their hard-won problem-solving skills to the fascinating and delicious world of culinary microorganisms.
Beyond the end result, there is one key difference between the socioeconomics of bread and software. The bread that I, and many of my peers, bake at home or in the increasing number of craft and micro bakeries, is a completely different product than the bread you find at most supermarkets. As with virtually any other activity, when baking moves from craft to industrial scale, the knowledge of the artisan – how to modify the process in response to variables like temperature, humidity, and starter temperament – becomes a hindrance, rather than a help. Instead of adjusting to variables, at industrial scales you need to turn variables into constants. Doing so requires tools like climate-controlled environments, high-speed mixing and kneading machines, conveyor-belt ovens, dough-stabilizing additives, and, most importantly, commercial yeast, which grows at predictable rates at the expense of texture and flavor.
Unlike industrial bakeries, the code shipped by the Googles and Amazons of the world is not all that different from that produced by a hobbyist for a personal project. In fact, big tech companies routinely rely upon open-source projects that are essentially maintained by hobbyist volunteers. I'm fairly certain Bimbo has never asked a home baker to borrow a cup of sourdough starter.
Perhaps uniquely among mega-industries, programming has by and large remained an artisanal pursuit even as it has scaled to dominate the global economy and, increasingly, global politics. Even at the biggest firms, the process of producing code has until now borne a closer resemblance to the production line at a neighborhood bakery, with small teams of experts producing specialized pieces of output one at a time, than to a factory line producing thousands of loaves a day under the supervision of just a few workers.
There are structural reasons for this difference, of course. Unlike loaves of bread, the freshest artisanal code can be instantly copied and shipped across the world, no preservatives or cargo containers required. Big tech became so successful, so quickly, in part because it never needed to adopt industrial production methods, with their intensive capital costs, to sell its products at industrial scales.
This feature of the tech business has recently turned into a bug for its owner class. Workers, not capital, have historically been big tech's biggest cost, which gave software developers the power to command high salaries and even demand values-driven changes to their company's lines of business. After the 2024 election, when it became clear that the federal government would operate on a mob-like favors system, workers exercising collective power became untenable for big tech businesses. In response, Silicon Valley's elite started cracking down, turning cultures that were once renowned for their lavish perks, openness, and transparency into more traditional command-and-control enterprises, particularly where political expression and government contracts are concerned.
Tech leaders were able to change the power dynamics so quickly for a few reasons. A major one is that tech workers, who lean liberal but with a libertarian streak, never solidified their power by unionizing. But Silicon Valley's elite also benefitted from good timing. The election coincided with an ongoing series of layoffs in the technology and service sectors, which, despite what you may have heard, were mostly caused by rising interest rates creating a hangover from pandemic-era hiring surges.
Finally, of course, these layoffs overlapped with the increasing adoption of AI, particularly LLM coding tools. These tools have changed the daily job of coding in myriad ways. But they've had a much bigger impact on big tech's balance sheets, which makes me (and plenty of other critics) wonder if the tail is starting to wag the dog.
Capital expenditures, mostly in the form of datacenter build-outs to power AI model training and serving, have exploded among all the major tech companies, as well as the new frontier AI labs and "hyperscalers." This "mother of all pivots" comes with the enormous risk that, by transforming into businesses that resemble heavy industries like manufacturing, big tech will begin to erode the incredible profit margins that have long justified their premium valuations.
To put these expenditures in perspective, projected AI capital spending for 2026 alone now exceeds the amount spent on the entire 1850s railroad boom and the late-20th-century telecom boom, both in real terms and as a percentage of GDP. You can debate whether these comparisons are apt on a technical level given that tech companies are (maybe?) largely re-investing their own revenues rather than debt-financing the buildouts, but one thing is certain: for the projected multi-trillion-dollar capital investments to ultimately pay off, LLM-based tools will need to have impacts similar to those of railroads or broadband. They'll need to become more than just productivity-enhancing: they'll need to transform the actual work of coding – and many other types of knowledge work – from bespoke artisanal pursuits into predictable industrial processes that justify price tags many times their current heavily subsidized rates, not to mention the potential future costs of enormous environmental externalities.
Like all good industrialists, a mere trillion-dollar question hasn't stopped tech oligarchs from repeatedly using the specter of job automation, and the need to invest in it, to justify poor decision-making and cow their workers into submitting to the new industrial order. The great Cory Doctorow covered this trend at Amazon, where he compared the company's shift from producing code with AI assistance to reviewing AI-generated code to the shift from manual to automated textile production that spawned the failed Luddite uprising at the dawn of the industrial revolution.
In the piece, Doctorow identifies a dichotomy that many of us in the tech industry have witnessed first-hand. Depending on how you use them, LLM coding tools can be amazing at removing tedium and increasing time spent on more creative parts of the craft, or heinous generators of bug-filled slop that must be painstakingly debugged by ever-vigilant programmers, assembly-line style. Hacker News, the homepage of choice for many professional developers, is full of threads debating this very topic.
Doctorow attributes this divide to whether workers have the power to decide how to deploy these tools, and that's certainly a major component of it. But I wonder if there's a bigger issue lurking in the way code LLMs have become so polarizing.
Like many technologists, I was fairly skeptical about the value of early code-generation LLMs. Sure, they could help save some time hunting for documentation or writing boilerplate code, but when you asked them to do anything new, they typically failed in ways that were, at best, amusing and, at worst, dangerously difficult to catch. Then, in September 2025, Claude Code 2.0 was released, powered by the Sonnet 4.5 model, and I was forced to change my perspective.
Since adopting the latest versions of Claude Code and similar tools, the small development team at Knowledge Futures, led by our intrepid founder and Executive Director Travis Rich, has increased its velocity many times over. LLMs have helped us tackle everything from identifying and fixing costly memory leaks and database queries we'd spent years fighting, to shipping long-backlogged features in days or weeks that would previously have taken months or even quarters.
In the hands of our dedicated team, these tools have been so impactful in helping us cut infrastructure costs and build features to automate some of our most manual backend processes that we were able to save our main product, PubPub, from being retired due to funding cuts. Code LLMs still make plenty of mistakes and need lots of handholding, but by and large, the team seems to be enjoying spending more time solving higher level problems and less time in the weeds. For us, and many others like us, the value of code LLMs has suddenly become very real.
Despite these positive changes, I wouldn't say the work we're producing is all that different than before. We're still craftspeople, shipping code finely tuned to our product, our users, our values, and our financial priorities. If you swapped out our developers for ones who had no hard-won, intuitive understanding of our codebase or ecosystem, they would struggle mightily to produce production-quality work. It would take months or years of learning before they could come up with solutions that fit our established patterns and conventions, prompt LLMs to follow them, identify and correct the models when they go astray, and catch bugs unique to our codebase. It would be equally difficult to rebuild our product from scratch, as we've learned when trying to produce greenfield tools. The established codebase, full of carefully reviewed examples for these next-token-predictors to draw on, is critical to their success at generating new outputs.
In other words, our process is still artisanal, we're just using mixers now instead of kneading by hand. The same is true for friends I've talked to at companies big and small whose teams have successfully implemented LLMs. Their work has often been vastly accelerated, perhaps similar to the way garbage collection accelerated development in early programming languages. But even after all this investment, even at the largest scales, the fundamental work of coding, where LLMs have proven most successful, remains a craft.
There are good reasons to believe that coding, and knowledge work, will continue to be fundamentally artisanal for a long time to come. Industrial-scale production processes require extremely high levels of predictability to succeed, and their lower-quality outputs must be significantly cheaper for customers to accept them as replacements for craft alternatives. LLMs are unpredictable by design and extremely expensive to build and run. The degree to which big tech can square these circles will determine whether their gamble pays off.
Attempts to force the conversion to industrial code, like the ones at Amazon, have mostly resulted in predictable disaster. The dark vision Doctorow sketches, where poorly-trained and paid coders hunker over a virtual assembly line of automated pull requests trying to catch as many bugs per hour as possible before generated code hits production, has not yet come to pass. All the while, the clock is ticking as those investments start to depreciate, and far more rapidly than in most other capital-intensive industries.
When the AI bubble bursts, I suspect we will look upon this era with great regret. Unlike other products, code has never needed industrialization to scale, fabulously profitably, to the entire world. Code LLMs and other specialized models could have been developed in economically and environmentally responsible ways to improve the work of programmers, lawyers, scientists, writers, and other knowledge crafters, rather than with the explicit purpose of replacing them. It is only because the lords of Silicon Valley, like the robber barons before them, became too powerful and too greedy to accept a future of working with creative humans who could limit their own power that we have been forced to trudge once again down this particular path.
Luckily, there is solace to be taken all around us, probably even in your neighborhood. A century and a half after the invention of mass-produced yeast, home and craft bakeries are on the rise, joined by farmers' markets and handicrafts, as consumers seek authenticity in an age of slop. Many of these artisans use scaled-down versions of tools originally developed for industry, like spiral mixers and steam ovens, to improve their own work at much smaller scales.
Whether big tech succeeds in its gamble to industrialize code or destroys itself in the attempt, my guess is there will still be plenty of room for artisanal alternatives. The real question is how long it will take us to rebuild the economy enough to support them.