The rise of industrial software

(chrisloy.dev)

136 points | by chrisloy 5 hours ago

32 comments

  • WJW 3 hours ago
    This essay, like so many others, confuses the task of "building" software with the task of "writing" software. Anyone in the world can already get cheap, mass-produced software to do almost anything they want their computer to do. Compilers spit out new builds of any program on demand within seconds, and you can usually get both source code and pre-compiled copies over the internet. The "industrial process" (as TFA puts it) of production and distribution is already handled perfectly well by CI/CD systems and CDNs.

    What software developers actually do is closer to the role of an architect in construction or a design engineer in manufacturing. They design new blueprints for the compilers to churn out. Like any design job, this needs some actual taste and insight into the particular circumstances. That has always been the difficult part of commercial software production and LLMs generally don't help with that.

    It's like thinking the greatest barrier to producing the next great Russian literary novel is not speaking Russian. That is merely the first and easiest barrier, but after learning the language you are still no Tolstoy.

    • nchmy 3 hours ago
      You're getting hung up on the technical meaning of terms rather than on what the author actually wrote.

      They're explicitly saying that most software will no longer be artisanal - a great literary novel - and will instead become industrialized - mass-produced paperback garbage. But they're also saying that good software, like literature, will continue to exist.

      • WJW 1 hour ago
        Yes, I read the article. I still think it's incorrect. Most software (especially by usage) is already not artisanal. You get the exact same browser, database server and (whatsapp/signal/telegram/whatever) messenger client as basically everyone else. Those are churned out by the millions from a common blueprint and designed by teams and teams of highly skilled specialists using specialized tooling, not so different from the latest iPhone or car.

        As such, the article's argument fails right at the start, when it tries to claim that software production is not already industrial. It is. But if you look at actual industrial design processes, their equivalent of "writing the code" is relatively small. Quality assurance, compliance with various legal requirements, balancing different requirements for the product at hand, endless meetings with customer representatives to figure out requirements in the first place: those are where most of the time goes, and those are exactly the places where LLMs are not very good. So the part that is already fast will get faster and the slow part will stay slow. That is not a recipe for revolutionary progress.

        • bsenftner 1 hour ago
          I think the author of the post envisions more code-authoring automation, more generated code/tests/deployment, exponentially more - to the degree that what we have now would seem "quaint", as he says.

          Your point that most software uses the same browsers, databases, tooling and internal libraries points to a weakness: a sameness that can be exploited by current AI to push that automation capability much further. Hell, why even bother with any of the generated code and infrastructure being "human readable" anymore? (Of course, there are all kinds of reasons that is bad, but just watch that "innovation" get a marketing push and take off. Which would only mean we'd need viewing software to make whatever was generated readable - as if anyone would read to understand hundreds of thousands or millions of lines of generated complex anything.)

          • yobbo 2 minutes ago
            LLMs produce human-readable output because they learn from human-readable input. It's a feature. It allows the output to be much less precise than, say, byte code, which wouldn't help here at all.
      • bruce511 3 hours ago
        I guess two things can be true at the same time. And I think AI will likely matter a lot more than detractors think, and nowhere near as much as enthusiasts think.

        Perhaps a good analogy is the spreadsheet. It was a complete shift in the way that humans interacted with numbers. From accounting to engineering to home budgets - there are few people who haven't used a spreadsheet to "program" the computer at some point.

        It's a fantastic tool, but has limits. It's also fair to say people use (abuse) spreadsheets far beyond those limits. It's a fantastic tool for accounting, but real accounting systems exist for a reason.

        Similarly, AI will allow lots more people to "program" their computer. But making the programming task go away just exposes limitations in other parts of the "development" process.

        To your analogy, I don't think AI does mass-produced paperbacks. I think it is the equivalent of writing a novel for yourself. People don't sell spreadsheets, they use them. AI will allow people to write programs for themselves, just like digital cameras turned us all into photographers. But when we need it "done right" we'll still turn to people with honed skills.

        • nchmy 2 hours ago
          > your analogy I don't think AI does mass-produced paperbacks

          It's the article's analogy, not mine.

          And, are you really saying that people aren't regularly mass-vibing terrible software that others use...? That seems to be a primary use case...

          Though, yes, I'm sure it'll become more common for many people to vibe their own software - even if just tiny, temporary, fit-for-purpose things.

      • pydry 2 hours ago
        This was already true before LLMs. "Artisanal software" was never the norm. The tsunami of crap just got a bit bigger.

        Unlike clothing, software always scaled. So, it's a bit wrongheaded to assume that the new economics would be more like the economics of clothing after mass production. An "artisanal" dress still only fits one person. "Artisanal" software has always served anywhere between zero people and millions.

        LLMs are not the spinning jenny. They are not an industrial revolution, even if the stock market valuations assume that they are.

        • cryptica 2 hours ago
          Agreed, software was always kind of mediocre. This is expected given the massive first mover advantage effect. Quality is irrelevant when speed to market is everything.
          • pydry 50 minutes ago
            Unlike speed to market, it doesn't manifest in an obvious way, but I've watched several companies lose significant market share because they didn't appreciate software quality.
    • furyofantares 52 minutes ago
      > It's like thinking the greatest barrier to producing the next great Russian literary novel is not speaking Russian.

      The article is very clearly not saying anything like that. It's saying the greatest barrier to making throwaway comments on Russian social media is not speaking Russian.

      Roughly the entire article is about LLMs making it much cheaper to make low quality software. It's not about masterpieces.

      And I think it's generally true of all forms of generative AI, what these things excel at the most is producing things that weren't valuable enough to produce before. Throwaway scripts for some task you'd just have done manually before is a really positive example that probably many here are familiar with.

      But making stuff that wasn't worth making before isn't necessarily good! In some cases it is, but it really sucks if we have garbage blog posts and readmes and PRs flooding our communication channels because producing them is suddenly cheaper than whatever minimal value someone gets out of foisting them on us.

    • weslleyskah 1 hour ago
      > It's like thinking the greatest barrier to producing the next great Russian literary novel is not speaking Russian. That is merely the first and easiest barrier, but after learning the language you are still no Tolstoy.

      And what do you feel is the role of universities? Certainly not just to learn the language, right? I'm going through a computer engineering degree and sometimes I feel completely lost, with an urge to give up on everything, even though I am still interested in technology.

      • estimator7292 1 hour ago
        One can go to school to learn the literary arts. Many do. A lot of authors do not.

        A lot of engineers and programmers did not go to school.

  • physicsguy 1 hour ago
    I've just done my first almost fully vibe-coded hobby project from start to near completion: a village history website with a taxonomy. It took so much poking and prodding and cajoling to get the software to do exactly what I wanted. Having built plenty of production stuff, I knew what I wanted it to look like, and the data model was really clear, yet even trying every trick in the book to constrain them, I found the LLMs went off and did totally random things, particularly as the project got further from the start.

    Maybe there'll be an enormous leap again, but I just don't quite see how this gets you to 'industrial' software. It made things a lot faster, don't get me wrong, but you still needed the captain driving the ship.

    • Havoc 1 hour ago
      > you still needed the captain driving the ship.

      The question is more what becomes of all the rowers when you’re switching from captain + 100 rowers to captain + steam engine

      They’re not all going to get their own boat and captain hat

      • kj4211cash 52 minutes ago
        But were there ever 100 "rowers"? In this case, the commenter would have developed the website him- or herself instead of using AI. And it would have taken a little longer but probably been higher quality. In my experience, most developers are already capable captains and most of their job is "captaining." One of their main complaints is managers who treat them like rowers. AI just shifts what it means to captain?
      • gverrilla 47 minutes ago
        They could start a revolution. 100 rowers vs 1 captain - easy. Labor vs capital.
  • huevosabio 4 hours ago
    I've been thinking about this for a while, and largely agree that industrialization of software development is what we are seeing. But the emphasis on low quality is misplaced.

    Take this for example:

    > Industrial systems reliably create economic pressure toward excess, low quality goods.

    Industrial systems allow for low quality goods, but also they deliver quality way beyond what can be achieved in artisanal production. A mass produced mid-tier car is going to be much better than your artisanal car.

    Scale allows you not only to produce more cheaply, but also to take quality control to the extreme.

    • nchmy 3 hours ago
      I generally agree. Industrialization puts a decent floor on quality, at low cost. But it also has a ceiling.

      Perhaps an industrial car is better than your or my artisanal car, but I'm sure there are people who build cars by hand of very high quality (over the course of years). Likewise fine carpentry vs mass-produced stuff vs IKEA.

      Or I make sourdough bread and it would be very impractical/uncompetitive to start selling it unless I scaled up to make dozens, maybe hundreds, of loaves per day. But it's absolutely far better than any bread you can find on any supermarket shelf. It's also arguably better than most artisanal bakeries who have to follow a production process every day.

      • pydry 2 hours ago
        The difference between an artisanal car and a mass-produced car is that the former can only be used by one person.

        This has never been true for "artisanal" software. It could be used by nobody or by millions. This is why the economic model OP proposes falls apart.

    • swiftcoder 3 hours ago
      > but also they deliver quality way beyond what can be achieved in artisanal production

      I don't think this is true in general, although it may be in certain product categories. Hand-built supercars are still valued by the ultra-wealthy. Artisanal bakeries consistently make better pastries than anything mass produced... and so on

      • nchmy 3 hours ago
        Lol, I now see your comment already used the exact same examples that my sibling comment did
    • cinntaile 3 hours ago
      Better along which dimensions? Most luxury cars are made the artisanal way.
      • bluGill 2 hours ago
        That doesn't make them better. It makes them exclusive since only a few could have one.
        • cinntaile 2 hours ago
          I still don't know what better means in this context, so I don't understand what your comment adds to the discussion?
    • tosapple 4 hours ago
      How does that apply to Amish furniture?
  • bolangi 5 hours ago
    This thought-provoking essay does not consider one crucial aspect of software: the cost to a user of developing facility with a given software product. Historically, monopolistic software producers could force these costs to be borne because the user had no alternative to upgrading to the latest version of, for example, Windows, or Gmail, or the latest version of the GitHub GUI. A significant portion of the open source / free software movement is software providing stable interfaces (including for the user), so that resources otherwise spent on compulsory retraining to use the latest version of something proprietary can be invested in configuring existing resources to better suit the user's problem domain. For example, programs like mutt or vim, or my latest discovery, talon.
    • Etheryte 4 hours ago
      I don't think the division line runs on the open-source software front here. Windows has historically offered some of the most stable APIs, meanwhile there's plenty of examples of popular open-source software with a lot of breaking changes.
      • xorcist 3 hours ago
        The comment you replied to said "significant portion of" and I believe it is clear which portion that refers to: the culture around c, linux, vim and bash, not things like nodejs, java and (semi-open-source) elasticsearch which are culturally separate.
    • npunt 4 hours ago
      I've never found a term I liked for this particular concept at the intersection of education & business so I made one up a while back:

      A Knowledge Pool is the reservoir of shared knowledge that a group of people have about a particular subject, tool, method, etc. In product strategy, knowledge pools represent another kind of moat, and a form of leverage that can be used to grow or maintain market share.

      Usage: Resources are better spent on other things besides draining the knowledge pool with yet another new interface to learn and spending time and money filling it up again with retraining.

      • bsenftner 1 hour ago
        The formal term that business people use is "institutional knowledge".
  • motbus3 5 hours ago
    I'm not through yet but I don't know.

    As a developer for almost 30 years now, if I think about where most of my code went, I would say, quantitatively: to the bin.

    I processed much data, dumps and logs over the years. I collected statistical information, mapped flows, created models of the things I needed to understand. And this was long before any "big data" thing.

    Nothing changed with AI. I keep doing the same things, but maybe the output has colours.

    • samiv 3 hours ago
      Heh... I've worked for 25 years and I'm basically yet to put code into production. Mostly projects that were cancelled or scrubbed either during development or shortly after, or just downright never used since they were POCs/prototypes.

      I think I've overall had just 2 or 3 projects where anyone has actually even tried the thing I've been working on.

    • Lumocra 3 hours ago
      That holds true for a tailor too: even expensive clothing items eventually wear out and get thrown away. They are cared for better, repaired a few times, but in the end, disposed of. I'd say that analogy holds up for 'traditionally' created software vs. AI-created software: handmade clothes vs. fast fashion.
  • philipallstar 5 hours ago
    These damn articles. Software moved into an industrial revolution when you could write in a high level language, and not in assembly. This has already happened.
    • jamesdhutton 5 hours ago
      The article makes this very point. From the article: “software has been industrialising for a long time: through reusable components (open source code), portability (containerisation, the cloud), democratisation (low-code / no-code tools), interoperability (API standards, package managers) and many other ways”
    • anthk 11 minutes ago
      Partially, but no. First:

      - Other input than a deck of cards: terminals and teletypes were a revolution.

      - Assembly was much better than hardware switches.

      - Also, a proper keyboard input against some "monitor" software was zillions better than, again, a deck of cards/hardware toggles. When you can have a basic line editor and dump your changes to a paper tape or print your output, you now have live editing instead of suffering batch jobs.

    • baq 5 hours ago
      You either see what codex and opus are capable of and extrapolate the trendline or you don’t; the author clearly saw and extrapolated.

      Not that I disagree: I'm on record agreeing with the article months ago. Folks in labs probably saw it coming for years.

      Yes we’ve seen major improvements in software development velocity - libraries, OSes, containers, portable bytecodes - but I’m afraid we’ve seen nothing yet. Claude Code and Codex are just glimpses into the future.

      • anthk 10 minutes ago
        Portable bytecodes predate Windows and Macintosh, and maybe DOS too. (The Z Machine).
      • ReptileMan 4 hours ago
        And if we extrapolate 5% economic growth per year we will consume all the energy in our light cone in 1000 years.
        • K0balt 3 hours ago
          Huh. Your statement was probably hyperbole? But just back of the napkin:

          If we use about 20 TW today, in a thousand years of 5% growth we’d be at about 3x10^34. I think the sun is around 3.8x10^26 watts? That gives us about 8x10^7 suns worth of energy consumption in 1000 years.

          If we figure 0.004 stars per cubic light-year, we end up in that ballpark in a thousand years of uniform spherical expansion at c.

          But that assumes millions ( billions?) of probes traveling outward starting soon, and no acceleration or deceleration or development time… so I think your claim is likely true, in any practical sense of the idea.
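
          For anyone who wants to redo the napkin math, a minimal Python sketch using only the round figures already quoted above (20 TW today, 5% growth, ~3.8e26 W per sun, 0.004 stars per cubic light-year):

          ```
          import math

          P0 = 20e12                  # current consumption, ~20 TW
          P = P0 * 1.05 ** 1000       # 5% growth for 1000 years -> ~3.1e34 W
          suns = P / 3.8e26           # ~8e7 suns' worth of output

          # stars reachable in 1000 years of spherical expansion at c,
          # at 0.004 stars per cubic light-year
          stars = 0.004 * (4 / 3) * math.pi * 1000 ** 3   # ~1.7e7 stars

          print(f"{P:.1e} W needed, ~{suns:.1e} suns, ~{stars:.1e} stars in reach")
          ```

          Demand (~8e7 suns) lands within an order of magnitude of the ~1.7e7 stars inside a 1000-light-year sphere, which is the "ballpark" above.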

          Time to short the market lol.

        • elzbardico 1 hour ago
          If we extrapolated the rise in the standard of living of a Black blue-collar factory worker in Detroit from the early 60s to the current day, most of them would own 64ft yachts by now.
        • baq 1 hour ago
          of course, but no need to look that far into the future - 400 years at 2.3% pa is enough to boil oceans.

          AI capabilities are growing exponentially thanks to exponential compute/energy consumption, but also thanks to algorithmic improvements. we've got a proof that human-level intelligence can run at 20W of power, so we've got plenty of room to offset the currently-missing compute.
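
          A back-of-the-envelope check of that 2.3% figure (a rough Python sketch under simple assumptions: ~20 TW today, Earth as a blackbody absorbing ~240 W/m^2 of sunlight over ~5.1e14 m^2 of surface, greenhouse effects ignored):

          ```
          import math

          sigma = 5.67e-8                   # Stefan-Boltzmann constant, W/m^2/K^4
          flux_boil = sigma * 373 ** 4      # ~1100 W/m^2 holds the surface at 100 C
          extra = flux_boil - 240           # waste-heat flux we would have to add
          P_needed = extra * 5.1e14         # ~4.4e17 W in total
          years = math.log(P_needed / 20e12) / math.log(1.023)
          print(f"~{years:.0f} years at 2.3%/yr")   # roughly 440 years
          ```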

        • nchmy 3 hours ago
          Economic growth is not directly proportional to energy consumption. A major feature of any useful tool is that it (often dramatically) reduces energy consumption.
          • ReptileMan 2 hours ago
            Economic growth tracks almost 100% with energy consumption. The earth at night map is quite telling on the matter.
            • nchmy 1 hour ago
              Newsflash! Less happens while people are asleep!

              Correlation doesn't say anything about the sensitivity/scaling. (I recognize that my original comment didn't quite make this point, though the correlation is definitely not 100%, so that point does still stand.)

              Can you note the difference between the earth being lit by torches, candles, kerosene lamps and incandescent bulbs, versus LED lights? An LED isn't glowing harder, it just wastes less energy.

              A rocket stove, or any efficient furnace, can extract vastly more energy from the same fuel source than an open fire. I assume combustion engines have seen significant efficiency improvements since they were first introduced. And electric motors are almost completely efficient - especially when fed by an efficient, clean/renewable source.

              How about the computing power of a smartphone versus a supercomputer from 1980?

              What is more energy efficient, a carpenter working with crude stones or with sharp chisels?

              And we can, of course, put aside whether any measurement of economic value is actually accurate/useful... A natural disaster is technically good for many economic measures, since the destruction doesn't get measured and the wealth invested in rebuilding just counts as economic activity.

              And, of course, then there are creeptocurrencies, which use an immense amount of energy to do something that was previously trivial. And worse when used in place of cash. But even there, some are more efficient than others - not that anyone who uses them actually cares.

              • ReptileMan 29 minutes ago
                The facts are that you can absolutely tell how developed a region is by looking from above, and that there has not been a year in which humanity has used less energy than the previous one and still grown.
  • yosefk 3 hours ago
    You could say the same things about assemblers, compilers, garbage collection, higher level languages etc. In practice the effect has always been an increase in the height of a mountain of software that can be made before development grinds to a halt due to complexity. LLMs are no different
    • physicles 3 hours ago
      In my own experience (and from everything I’ve read), LLMs as they are today don’t help us as an industry build a higher mountain of software because they don’t help us deal with complexity — they only help us build the mountain faster.
  • zkmon 4 hours ago
    A question that the article does not address, and that contrasts software with industrialized products from the past, is: who are the consumers of software produced at industrial scale? Stitching of clothes by machines accelerated garment production only because there was demand and consumption tied to population. But software is not tied to population the way food and clothes are. It doesn't depreciate, and it is not exclusively consumed by persons.

    Another common misconception is that it is now easier to compete with big products, as the cost of building those products will go down. Maybe you think you can build your own office suite and compete with MS Office, or build an SAP with better features and quality. But what went into this software is not just code, but decades of feedback, tuning and fixing. The industrialization of software cannot provide that.

    • perlgeek 1 hour ago
      > who are the consumers of the software produced at industrial scale?

      Basically every company that does anything non-trivial could benefit from tailor-made software that supports their specific workflow. Many small companies don't have that, either they cannot afford their own development team, or they don't know that/how software could improve their workflow, or they are too risk-averse.

      Heck, even my small family of 4 persons could benefit from some custom software, but only in small ways, so it's not worth it for me to pursue it.

      Once we're at the point where a (potentially specialized) LLM can generate, securely operate and maintain software to run a small to medium-sized business, we'll probably find that there are far more places that could benefit from custom software.

      Usually if you introduce, say, an ERP system into a company that doesn't use one yet, you need to customize it and change workflows in the company, and maybe even restructure it. If it were cheap enough to build a custom ERP system that caters to the existing workflows, that would be less disruptive and thus less risky.

    • tossandthrow 4 hours ago
      > but decades of feedback, tuning and fixing

      On the contrary, this is likely the reason why we can disrupt these large players.

      Experience from 2005 just doesn't hold that much value in 2025 in tech.

      • zkmon 3 hours ago
        Software was never coded in a big-bang, one-shot fashion. It evolves through years of interacting with the field. That evolution takes almost the same time with or without AI. Remember that a version release has many tasks that need to go at human speed.
        • tossandthrow 1 hour ago
          On that we agree.

          But taking out features is difficult - even when they have near-zero value.

          Which is why it sometimes makes sense for new players to enter the market and start over - without the legacy.

          This is indeed one of the value propositions in the startup I work in.

      • swiftcoder 3 hours ago
        > Experience from 2005 just don't hold that much value in 2025 in tech

        That would be why a significant portion of the world's critical systems still run on Windows XP, eh?

        • tossandthrow 1 hour ago
          No, that is likely because there is no economic benefit to doing anything about it - and definitely not because of UX concerns.
    • bodegajed 3 hours ago
      Code has no use-value. It is like being a baker on an island: the value comes from the user base.
      • zkmon 2 hours ago
        User base comes from the value you provide. Value comes from the product features. Features come from code. If code is easy, anyone with 10K bucks in their pocket can provide those features and product. The only thing missing is, is the product battle-tested? That fortunately remains out of reach for AI.
        • jeltz 1 hour ago
          I would say unfortunately out of reach, since so far it seems AI will mostly fill our world with bad code which is not battle-tested.
  • choeger 3 hours ago
    Thing is: Industrialization is about repeating manufacturing steps. You don't need to repeat anything for software. Software can be copied arbitrarily for no practical cost.

    The idea of automation creating a massive amount of software sounds ridiculous. Why would we need that? More games? Those can only be consumed at the pace of the player. Agents? Those can be reused once they fulfill a task sufficiently.

    We're probably going to see a huge amount of customization, where existing software is adapted to a specific use case or user via LLMs, but why would anyone waste energy re-creating the same algorithms over and over again?

    • alexjurkiewicz 3 hours ago
      The "industrialisation" concept is an analogy to emphasize how the costs of production are plummeting. Don't get hung up pointing out how one aspect of software doesn't match the analogy.
      • mgh95 2 hours ago
        > The "industrialisation" concept is an analogy to emphasize how the costs of production are plummeting. Don't get hung up pointing out how one aspect of software doesn't match the analogy.

        Are they, though? I am not aware of any indicators that software costs are precipitously declining. As far as I know, we aren't seeing the complements of software developers (PMs, sales, other adjacent roles) growing rapidly, which would indicate a corresponding supply increase. We aren't seeing companies like Microsoft or Salesforce or Atlassian or any major software company reduce prices due to a supply glut.

        So what are the indicators (beyond blog posts) this is having a macro effect?

    • willtemperley 3 hours ago
      People re-create the same algorithms all the time for different languages or because of license incompatibility.

      I'm personally doing just that, because I want an algorithm written in C++ in an LGPL library working in another language.

      • nuancebydefault 2 hours ago
        In fact this is a counter-argument to the point of the article. You're not making 'just more throwaway software' but instead building usable software while standing on the shoulders of existing algos and libraries.
        • willtemperley 1 hour ago
          Well yes. To me industrial software is hardened algorithms, not throwaway slop like the author is arguing. LLMs are very good at porting existing algorithms and as you say it’s about standing on the shoulders of giants. I couldn’t write these from scratch but I can port and harden an algo with basic engineering practices.

          I like the article except the premise is wrong - industrial software will be high value and low cost as it will outlive the slop.

    • pyrale 3 hours ago
      > You don't need to repeat anything for software. Software can be copied arbitrarily for no practical cost.

      ...Or so think devs.

      People responsible for operating software, as well as people responsible for maintaining it, may have different opinions.

      Bugs must be fixed, the underlying software/hardware changes, vulnerabilities get discovered, and so versions must be bumped. The surrounding ecosystem changes too, and so even if your particular stack doesn't require new features, it must be adapted (a simple example: your React front end breaks because the nginx proxy changed its subdirectory).

  • ciconia 4 hours ago
    Hmm, I'm not sure I see the value in "disposable software". In any commercial service people are looking for software solutions that are durable, dependable, extensible, maintainable. This is the exact opposite of disposable software.

    The whole premise of AI bringing democratization to software development and letting any layperson produce software signals a gross misunderstanding of how software development works and the requirements it should fulfill.

    • alexjurkiewicz 2 hours ago
      I play several sports across several teams and leagues. Each league has their own system for delivering fixtures. Each team has its own system of communication.

      What I want is software that can glue these things together. Each week, announce the fixture and poll the team to see who will play.

      So far, the complete fragmentation of all these markets (fixtures, chat) has made software solutions uneconomic. Any solution's sales market is necessarily limited to a small handful of teams, and will quickly become outdated as fixtures move and teams evolve.

      I'm hopeful AI will let software solve problems like this, where disposable code is exactly what's needed.

      • gnz11 2 hours ago
        That sounds more like a bureaucratic problem (access to data) than a software problem.
    • cryptica 4 hours ago
      Yes, software needs to be secure. If we accept the premise that software is going to be churned out in bulk, then the mechanisms for securing software must evolve rapidly... I don't see a world where there is custom software for everything, but all of it insecure in different ways.
      • swiftcoder 3 hours ago
        Not only secure. It needs to be reliable (don't corrupt my data). It needs to be durable (I need to be able to access my data 10 years from now). etc.
        • cryptica 1 hour ago
          Yes, agreed. Producing reliable software is one of those things which sounds trivial but is actually extremely difficult.

          With my last side project, I became frustrated with my non-technical founder because he would have a lot of vague ideas, and in his mind he was sure that he had a crystal-clear vision of what he wanted... But with every idea he had, I kept finding massive logical holes and contradictions... Like, he wanted one feature and some other feature, but it was physically impossible to have both without making the UX terrible.

          And it wasn't just one time, it was constantly.

          He would get upset at me for pointing out the many hurdles ahead of time... When in fact he should have been thanking me for saving us from ramming our heads into one wall after another.

      • speeder 3 hours ago
        This only means you haven't interacted enough with IoT or the junky viral games market...
  • xorcist 3 hours ago
    Too many articles are written comparing LLMs to high-level languages. Sure, if you squint enough, both have to do with computers. But that comparison misses everything that is important about LLMs.

    High-level languages are about higher abstractions for deterministic processes. LLMs are not necessarily higher abstractions but instead about non-deterministic processes, a fundamentally different thing altogether.

    • tgv 3 hours ago
      Perhaps you mean reliability rather than determinism/reproducibility?
  • sriku 1 hour ago
    I find it hard to think of code as being the output of programming. I keep re-reading Naur's "Programming as theory building" paper and it still feels relevant and closer to how the activity feels to me, AI or no AI.

    https://pages.cs.wisc.edu/~remzi/Naur.pdf

    • sriku 1 hour ago
      The frame set by the OP completely put me off and dissuaded me from reading the rest of the article beyond the first paragraph. It didn't feel like much thought was given to what was being said.
  • djantje 3 hours ago
    It is just going to mean even more, even less important software.

    There is a difference between writing about mainstream software and someone's idea/hope for the future.

    Software that is valued highly enough will be owned and maintained.

    Like most things in our world, I think ownership/stewardship is, like money and world hunger, a social issue/question.

  • MORPHOICES 3 hours ago
    What I've been pondering is what makes the user interface of some software "industrial" versus merely "complicated."

    The difference I return to again and again isn't tech depth. It's constraints.

    Rough framework I'm using lately:

    Consumer software aims at maximizing joy.

    Enterprise software is all about coordination.

    Industrial software operates in the real-world "mess", and appears to be more concerned with:

    - failure modes

    - long-term maintenance

    - predictable behavior over cleverness

    But as soon as software is involved with physical processes, the tolerance for ambiguity narrows quickly.

    Curious how others see it: What's your mental line between enterprise and industrial? What constraints have affected your designs? Any instances where nice abstractions failed the test of reality?

    • alexjurkiewicz 2 hours ago
      The article isn't talking about "industrial" in relation to user interfaces. It isn't talking about user interfaces at all.

      Your consumer/enterprise/industrial framework is orthogonal to the article's focus: how AI is massively reducing the cost of software.

  • japhyr 2 hours ago
    > Previous industrial revolutions externalised their costs onto environments that seemed infinite until they weren't. Software ecosystems are no different: dependency chains, maintenance burdens, security surfaces that compound as output scales. Technical debt is the pollution of the digital world, invisible until it chokes the systems that depend on it. In an era of mass automation, we may find that the hardest problem is not production, but stewardship. Who maintains the software that no one owns?

    This whole article was interesting, but I really like the conclusion. I think the comparison to the externalized costs of industrialization, which we are finally facing without any easy out, is a good one to make. We've been on the same path for a long time in the software world, as evidenced by the persistent relevance of that one XKCD comic.

    There's always going to be work to do in our field. How appealing that work is, and how we're treated as we do that work, is a wide open question.

  • Abh1Works 4 hours ago
    But I think the important part of this is the reach that the Industrial Revolution had: the end users who were able to "benefit" from it had individual needs for all of these mass-produced goods.

    The important thing is that goods ≠ software. I, as an end user of software, rarely need specialized software. I don't need an entire app generated on the spot to split the bill and remember the difference if I have a calculator.

    So, yes, we are industrializing software, but this reach that people talk about (I believe) will be severely limited.

  • zx8080 1 hour ago
    > This website uses anonymous cookies to enhance the user experience.

    This sounds weird, or wrong. Do anonymous stats need cookies at all?

    • te7447 51 minutes ago
      If you want to track how many times users revisit the site, you could do that anonymously by setting a visit-counter cookie, e.g. VISITS: 1, VISITS: 2, etc. This would track the user across different IPs, but since the cookie only holds a counter, it doesn't tell you whether two people with "VISITS: 2" set are the same user.

      That's the first example I can think of off the top of my head.
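
      A minimal sketch of that idea using only Python's standard library (the handler class and port are arbitrary choices of mine): the server reads the VISITS cookie if present, increments it, and sends it back, storing nothing about who the visitor is.

      ```
      from http import cookies
      from http.server import BaseHTTPRequestHandler, HTTPServer

      class VisitCounter(BaseHTTPRequestHandler):
          def do_GET(self):
              # Parse whatever cookies the browser sent back
              jar = cookies.SimpleCookie(self.headers.get("Cookie", ""))
              visits = int(jar["VISITS"].value) + 1 if "VISITS" in jar else 1
              self.send_response(200)
              # The only state kept anywhere is this anonymous counter
              self.send_header("Set-Cookie", f"VISITS={visits}")
              self.send_header("Content-Type", "text/plain")
              self.end_headers()
              self.wfile.write(f"Visit number {visits}\n".encode())

      HTTPServer(("", 8000), VisitCounter).serve_forever()
      ```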

  • Deukhoofd 4 hours ago
    The industrial revolution was constrained by access to the means of production, leaving only those with capital able to actually produce, which led to new economic situations.

    What are the constraints with LLMs? Will an Anthropic, Google, OpenAI, etc, constrain how much we can consume? What is the value of any piece of software if anyone can produce everything? The same applies to everything we're suddenly able to produce. What is the value of a book if anyone can generate one? What is the value of a piece of art, if it requires zero skill to generate it?

  • pyrale 3 hours ago
    The article kind of misses that cost has two axes : development cost and maintenance cost.

    Low cost / low value software tagged as disposable usually means development cost was low but maintenance cost is high; that's why you get rid of it.

    On the other hand, the difference between good and bad traditional software is that, while development cost is always going to be high, you want maintenance cost to be low. This is what industrialization is about.

  • memoriuaysj 5 hours ago
    Steve Yegge called it "factory farmed code"
  • empiko 4 hours ago
    Not convinced. There is an obvious value in having more food or more products for almost anybody on Earth. I am not sure this is the case for software. Most people's needs are completely fulfilled with the amount and quality of software they already have.
    • tossandthrow 4 hours ago
      > There is an obvious value in having more food or more products for almost anybody on Earth

      Quite the opposite is true. For a large proportion of people, they would increase both the amount of years they live and quality of life by eating less.

      I think the days where more product is always better are coming to an end - we just need to figure out how the economy should work.

    • npodbielski 4 hours ago
      But how about some silly software just for a giggle, like 'write a website that plays a fart sound when you push a button'? That could be a thing for the kids at school.
  • dustinboss 3 hours ago
    "Technical debt is the pollution of the digital world, invisible until it chokes the systems that depend on it." Such a great line.
    • skydhash 2 hours ago
      And also false. Good programmers are always aware of the debt. It's just not easily quantifiable, as part of it can only be estimated once a change request has been made, and truly known when implementing the change.

      It’s always a choice between taking more time today to reduce the cost of changes in the future, or get result fast and be less flexible later. Experience is all about keeping the cost of changes constant over time.

  • vincnetas 4 hours ago
    I would say comparing the making of software to a working factory is an analogy mistake. Completed software is the analogy to a running factory; making software is the making of the factory - the specialised tooling, layouts, supply chain, etc. When you have all this, your factory runs at industrial scale and produces things, just as your software produces value once it is completed and used by the end user.
  • torginus 3 hours ago
    So many fallacies here: imprecise, reaching arguments, attempts at creating moral panic, insistence that most people create poor-quality garbage code in stark contrast to the poster - while the difference between his bespoke excellence and the dreck produced by the soulless masses is gracefully omitted.

    First, the core of the argument - that 'industrialization' produces low-quality slop - is not true: industrialization is about precisely controlled and repeatable processes. A table cut by a CNC router is likely dimensionally more accurate than one cut by hand; in fact many industrial processes and machines have trickled back into the toolboxes of master craftsmen, where they increased productivity and quality.

    Second, from my experience of working at large enterprises, and smaller teams, the 80-20 rule definitely holds - there's always a core team of a handful of people who lay down the foundations, and design and architect most of the code, with the rest usually fixing bugs, or making bullet point features.

    I'm not saying the people who fall into the 80% don't contribute, or are somehow lesser devs, but they're mostly not well-positioned in the org to make major contributions. Another invariable aspect is that as features are added and complexity grows, along with legacy code, the effort needed to make a change or to understand and fix a bug grows superlinearly, meaning the 'last 10%' often takes as much or more effort than everything that came before.

    This is hardly an original observation, and in today's ever-ongoing iteration environment what counts as the last 10% is hard to define, but most modern software development is highly incremental, and often focused on building unneeded features or sidegrade redesigns.

  • npodbielski 4 hours ago
    If that is true, we will live in a funny world where you lose all your money because you were running some outdated, riddled-with-holes software written by an LLM on some old router or cheap camera. Or some software will stop working after an update because a fix was written by an LLM and nobody checked or tested it. Or there will be 3 outages of big internet services in 2 months.

    Oh wait. It is already a thing.

  • spiderfarmer 4 hours ago
    “industrialisation of agriculture led to ultraprocessed junk food“

    The mass production of unprocessed food is not what led to the production of hyper processed food. That would be a strange market dynamic.

    Shareholder pressure, aggressive marketing and engineering for super-palatable foods are what led to hyper processed foods.

    • fuzzfactor 1 hour ago
      Shelf life is one of the major factors allowing treatment more similar to nonperishable commodities.

      I think some people do instinctively feel like all different kinds of software have different shelf lives or useful lifetimes for different reasons.

      But there's always so much noise it's not very easy to get the expiration date correct.

      Mass production is pretty much a given when it comes to commodities, and things like long shelf life are icing on the cake.

      The inversion comes when mass production makes the highly processed food more affordable than the unprocessed. After both have scaled maximally, market forces mean more than the amount of labor that was put in.

      Strange indeed.

  • ofalkaed 4 hours ago
    Personally I think AI is going to turn software into a cottage industry; it will make custom software something the individual can afford. AI is a very long way off from letting the average person create the software they want unless they are willing to put a great deal of time into it, but it is almost good enough that a programmer can take the average person's idea and execute it at an affordable price. We're probably only a year or two from when a capable programmer will be able to offer any small buisness a completely customized POS setup for what a canned industrial offering costs today: I will design your website and build you a POS system tailored to your needs and completely integrated with the website, and for a little more I can throw in the accounting and tax software. A bright dishwasher realizing they can make things work better for their employer might be the next billionaire revolutionizing commerce and the small buisness.

    I have some programming ability and a lot of ideas, but would happily hire someone to realize those ideas for me. The idea I have put the most time into took me the better part of a year to sort out in all its details, even with the help of AI; most programmers could have probably done it in a night, and with AI could write the software in a few nights. I would have my software for an affordable price and they could stick it in their personal store so others could buy it. If I am productive with it and show its utility, they will sell more copies of it, so they have an incentive to work with people like me and help me realize my ideas.

    Programming is going to become a service instead of an industry, the craft of programming will be for sale instead of software.

    • elAhmo 4 hours ago
      > and for a little more I can throw in the accounting and tax software

      As someone who has worked in two companies that raised millions of dollars and had hundred people tackling just half of this, tax software, you are in for a treat.

      • ofalkaed 4 hours ago
        Sure, that is still a ways off, but being able to hire a programmer to meet my personal, modest software needs is almost there. Also, the needs of any company that required a hundred people and millions of dollars are very different from the needs of a small restaurant or the like; anyone with enough ambition to run a small restaurant can manage the accounting and taxes for that restaurant, and the same cannot be said for the sort of buisness you are describing. You are comparing an apple to an orange orchard.

        Edit: Just noticed I said "any buisness", that was supposed to be "any small buisness." Edited the original post as well.

        • tosapple 2 hours ago
          Business*. If your "tax-accounting" manager made THAT mistake with numbers, you would be screwed.
          • ofalkaed 1 hour ago
            Occasionally when right-clicking on a misspelled word to correct it, I bump the track pad and accidentally add the misspelled word to my dictionary. Business is one of those words I apparently did that with. I have never been able to figure out how to remove words from my dictionary, but honestly I never looked that hard; for some ridiculous reason I think people will focus on what was said instead of looking for nits to pick, despite all the evidence suggesting otherwise.
            • tosapple 1 hour ago
              I considered that it may be a dictionary-correction issue, but I am sort of railing against the suggestion that current-level LLMs be used for tax software and POS design.

              Edit: And if I were using C or C++ above, my lack of capitalization would either cause an error OR passably continue forward referencing the wrong variable and result in a similar error to your transposition.

              • ofalkaed 1 hour ago
                I said that was something which would happen in the future, as in not with current-level LLMs. But this is what people will pay the programmer for: the programmer will (hopefully) know when and where the LLM can be used to offload the grunt work, and where they should just skip the LLM and hand-code it. The average person will not know those things, or the full system, and this applies to current-level LLMs.
                • tosapple 1 hour ago
                  In the future, this will all be voice controlled, making most of our user interfaces, and our expenditures on adapting to this intermediate stage, moot.
    • ofalkaed 2 hours ago
      I find it interesting but not surprising that this got downvoted. Sure, my idea of the craft is different from the article's and from many people's, but if the craft is only there when the code is 100% hand-written, then it is a craft which the vast majority cannot afford. I can pay a luthier a few thousand and get my dream guitar, and I would happily spend that sort of money on getting custom software, but that is not going to happen if I insist on 100% hand-written code, just as getting my dream guitar would not happen if I insisted on the luthier only using hand tools.
  • bgwalter 5 hours ago
    Another AI entrepreneur who writes a long article about inevitability, lists some downsides in order to remain credible, but all in all just uses neurolinguistic programming on the reader so that the reader, too, will think the "AI" revolution is inevitable.
    • motbus3 4 hours ago
      Tl;dr: initially I thought we might be onto something, but now I don't see much of a revolution.

      I won't ascribe intention to the text because I did not check any other posts from the same guy.

      That said, I think this revolution is not revolutionary yet. Not sure if it will be, but maybe?

      What is happening is that companies are going back to a "normal" number of people in software development. At first it grew because of the adoption of custom software, later because of a labour shortage; then we had a boom because people caught on to it as a viable career, and then it started scaling down again because one developer can (technically) do more with AI.

      There are huge red flags with "fully automated" software development that are not being fixed, but to those outside the area of expertise they don't seem relevant. With newer restrictions related to cost and hardware, AI will be an even worse option unless there is some sort of magic that fixes everything related to how it writes code.

      The economy (all around the world) is bonkers right now. Honestly, I saw some Jr Devs earning 6-figure salaries (in USD) and doing less than what my friends and I did when we were Jrs. There is inflation and all, but the numbers do not seem to add up.

      Part of it all is a re-normalisation, but part of it is certainly a lack of understanding of software and/or engineering.

      Current tools, and I include even the likes of kiro, anti-gravity and whatever, do not solve my problems; they just make my work faster: easier to look for code, find data and read through blocks of code I haven't seen in a while. Writing code, not so much. If it is simple and easy they certainly can do it, but for anything more complex it seems faster and more reliable to do it myself (and probably cheaper).

  • eitally 2 hours ago
    I spent 15 years writing literal industrial software (manufacturing, test, and quality systems for a global high-tech manufacturing company, parts of which operated in regulated industries).

    One of the things that happened around 2010, when we decided to effect a massive corporate change away from both legacy and proprietary platforms (on the one hand, away from AIX & Progress, and on the other hand, away from .Net/SQL Server), was a set of necessary decisions about the fundamental architecture of systems, and which -- if any -- third party libraries we would use to accelerate software development going forward.

    On the back end side (mission critical OLTP & data input screens moving from Progress 4GL to Java+PostgreSQL) it was fairly straightforward: pick lean options and as few external tools as possible in order to ensure the dev team all completely understand the codebase, even if it made developing new features more time consuming sometimes.

    On the front end, though, where the system config was done, as well as all the reporting and business analytics, it was less straightforward. There were multiple camps in the team, with some devs wanting to lean on 3rd party stuff as much as possible, others wanting to go all-in on TDD and using 3rd party frameworks and libraries only for UI items (stuff like Telerik, jQuery, etc), and a few having strong opinions about one thing but not others.

    What I found was that in an organization with primarily junior engineers, many of whom were offshore, the best approach was not to focus on ideally "crafted" code. I literally ran a test with a senior architect once where he & I documented the business requirements completely and he translated the reqs into functional tests, then handed the tests over to the offshore team to write code to pass them. They mostly didn't even know what the code was for or what the overall system did, but they were competent enough to write code to pass tests. This ensured the senior architect received something that helped him string everything together, but it also meant we ended up with a really convoluted codebase that was challenging to holistically interpret if you hadn't been on the team from the beginning. I had another architect, a lead in one of the offshore teams, who felt very strongly that code should be as simple as possible: descriptive naming, single-function classes, etc. I let him run with his paradigm on a different project, to see what would happen. In his case, he didn't focus on TDD but instead just on clearly written requirements docs. But his developers had a mix of talents & experience and the checked-in code was all over the place. Because of how atomically abstract everything was, almost nobody understood how pieces of the system interrelated.

    Both of these experiments led to a set of conclusions and an approach as we moved forward: clearly written business requirements, followed by technical specifications, are critical, and so is a set of coding standards the whole group understands and has the confidence to follow. We set up an XP system to coach junior devs who were less experienced, ran regular show & tell sessions where individuals could talk about their work, and moved from a waterfall planning process to an iterative model. All of this sounds like common sense now that it's been standard in the tech industry for an entire generation, but it was not obvious or accepted in IT "Enterprise Apps" departments in low-margin industries until far more recently.

    I left that role in 2015 to join a hyperscaler, and only recently (this year) have moved back to a product company, but what I've noticed now is that the collaborative nature of software engineering has never been better ... but we're back to a point where many engineers don't fully understand what they're doing, either because there's a heavy reliance on code they didn't write (common 3P libraries) or because of the compartmentalization of product orgs where small teams don't always know what other teams are doing, or why. The more recent adoption of LLM-accelerated development means even fewer individuals can explain resultant codebases. While software development may be faster than ever, I fear as an industry we're moving back toward the era of the early naughts when the graybeard artisans had mostly retired and their replacements were fumbling around trying to figure out how to do things faster & cheaper and decidedly un-artisanally.

  • constantcrying 4 hours ago
    I think the idea is interesting, but immensely flawed.

    The following is just disingenuous:

    >industrialisation of printing processes led to paperback genre fiction

    >industrialisation of agriculture led to ultraprocessed junk food

    >industrialisation of digital image sensors led to user-generated video

    Industrialization of printing was the necessary precondition for mass literacy and mass education. The industrialization of agriculture ended hunger in all parts of the world able to practice it, and even allows for the export of food into countries which aren't (without it, most of humanity would still be plowing fields in order not to starve). The digital image sensor allows for accurate representations of the world around us.

    The framing here is that industrialization degrades quality and makes products into disposable waste. While there is some truth to that, I think it is pretty undeniable that there are massive benefits which came with it. Mass produced products often are of superior quality and superior longevity and often are the only way in which certain products can be made available to large parts of the population.

    >This is not because producers are careless, but because once production is cheap enough, junk is what maximises volume, margin, and reach.

    This just is not true and goes against all available evidence, as well as basic economics.

    >For example, prior to industrialisation, clothing was largely produced by specialised artisans, often coordinated through guilds and manual labour, with resources gathered locally, and the expertise for creating durable fabrics accumulated over years, and frequently passed down in family lines. Industrialisation changed that completely, with raw materials being shipped intercontinentally, fabrics mass produced in factories, clothes assembled by machinery, all leading to today’s world of fast, disposable, exploitative fashion.

    This is just pure fiction. The author is comparing the highest-quality goods at one point in time, which people took immense care of, with the lowest-quality stuff people buy today, which is not even close to the mean of the clothing people buy. The truth is that fabrics have become far better, far more durable, and more versatile. The products have become better; what has changed is people's attitude towards their clothing.

    Lastly, the author is ignoring the basic economics which separate software from physical goods. Physical goods need to be produced, and that is almost always the most expensive part. This is not the case for software: distributing software millions of times is not expensive, only a minuscule part of the total costs. For fabrics, industrialization meant that development costs increased immensely while per-unit production costs fell sharply. What we are seeing with software is a slashing of development costs.

    • PaulRobinson 4 hours ago
      I agree with you on all of this, and found myself wondering if the author had actually studied the Industrial Revolution at all.

      The Industrial Revolution created a flywheel: you built machines that could build lots of things better and for less cost than before, including the parts to make better machines that could build things even better and for less cost than before, including the parts to make better machines... and on and on.

      The key part to industrialisation in the 19th-century framing, is that you have in-built iterative improvement: by driving down cost, you increase demand (the author covers this), which increases investment in driving down costs, which increases demand, and so on.

      Critically, this flywheel has exponential outputs, not linear. The author shows Jevons paradox, and the curve is right there - note the lack of straight line.

      I'm not sure we're seeing this in AI software generation yet.

      Costs are shifting in people's minds, from developer salaries to spending on tokens, so there's a feeling of cost reduction, but that's because a great deal of that seems to be heavily subsidised today.

      It's also not clear that these AI tools are being used to produce exponentially better AI tools - despite the jump we saw around GPT-3.5, quantitative improvement in output seems to remain linear as a function of cost, not exponential. Yet investment input seems to be exponential (this makes it feel more like a bubble).

      I'm not saying that industrialisation of the type the author refers to isn't possible (and I'd even say most industrialisation of software happened back in the 1960s/70s), or that the flywheel can't pick up with AI, just that we're not quite where they think it is.

      I'd also argue it's not a given that we're going to see the output of "industrialisation" drive us towards "junk" as a natural order of things - if anything we'll know it's not a junk bubble when we do in fact see the opposite, which is what optimists are betting on being just around the corner.

  • pharrington 4 hours ago
    This is written by the same guy who proudly blogged about not knowing how computers work. [https://chrisloy.dev/post/2013/04/27/no-i-can't-fix-your-com...]