We're in the wrong moment

(ezrichards.github.io)

50 points | by chilipepperhott 4 hours ago

21 comments

  • doug_durham 3 hours ago
    This seems to romanticize the past. I've been doing this for 40 years and I don't see that much has changed. I would code even if I didn't get paid for it. That said, I've always seen writing code as a means to an end. I use GenAI every day to write code, and it brings pure joy when there's boilerplate that I don't need to write so I can focus on the fun stuff. There is zero value in me writing yet another Python argparse routine. I've done it and I've learned everything I'm ever going to learn about it. Let me get on to the stuff that I don't know how to do.
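
    For context, the kind of argparse boilerplate in question looks roughly like this minimal sketch (the flags here are purely illustrative, not from the comment above):

      import argparse

      def parse_args() -> argparse.Namespace:
          # Typical CLI boilerplate: declare flags, defaults, and help strings by hand.
          parser = argparse.ArgumentParser(description="Example command-line tool")
          parser.add_argument("--input", required=True, help="path to the input file")
          parser.add_argument("--retries", type=int, default=3, help="number of retry attempts")
          parser.add_argument("--verbose", action="store_true", help="enable verbose logging")
          return parser.parse_args()

      if __name__ == "__main__":
          args = parse_args()
          print(args.input, args.retries, args.verbose)
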
    • spockz 2 hours ago
      Okay, I get the desire to avoid repetitive stuff. It appears doing this with an LLM scratches your itch. Before, the same thing - focusing on the intrinsic complexity instead of the accidental - could be achieved by using libraries, toolkits, frameworks, better compilers (or compiler plugins), or “better” languages.

      What plagues me about LLMs is that all that generated code is still around in the project, making reviews harder, as well as understanding the whole program source. What is it that makes you prefer this mechanism over the abstractions that have been increasingly available since forever?

      • seer 1 hour ago
        Isn't this the compiled-languages-versus-machine-code argument all over again?

        The compiler produces a metric shit ton of code that I don't see when I'm writing C++ code. And don't get me started on TypeScript/Clojure - the amount of code that gets written underneath is mindbogglingly staggering, yet I don't see it, for me the code is "clean".

        And I'm old enough to remember the tail end of the MachineCode -> CompiledCode transition, and have certainly lived through CompiledCode -> InterpretedCode -> TranspiledCode ones.

        There were certainly people who knew the ins and outs of the underlying technology who produced some stunningly fast and beautiful code, but the march of progress was inevitable and they were gradually driven to obscurity.

        This recent LLM step just feels like more of the same. *I* know how to write an optimized routine that the LLM will struggle to produce cleanly, but back in the day lots of assembler wizards were doing some crazy stuff, stuff that I admired but didn't have the time to replicate.

        I imagine in the next 10-20 years we will have devs that _only_ know English, are trained in classical logic, and have flame wars about exactly what code their tools would generate given various sentence invocations. And people would benchmark and investigate the way we currently do with JIT compilation and CPU caching - very few know how it actually works, but the rest don't have to, as long as the machine produces the results we want.

        Just one more step on the abstraction ladder.

        The "Mars" trilogy by Kim Stanley Robinson had very cool extrapolations where this all could lead, technologically, politically, social and morally. LLMs didn't exists when he was writing it, but he predicted it anyway.

    • hinkley 1 hour ago
      I've seen a lot change. I used to have a seemingly bottomless list of things we were doing wrong, and about half of them have dropped off in the last twenty years. Did they all turn out as well as we hoped they would? No. I don't think a single one did. We are half-assing a lot of things that we used to laugh off entirely. In most of these cases some is better than none, but it could be a lot better.

      What I worry about is that my list has gotten shorter not because everything is as it should be but because I have slowed down.

      Quite a lot of things on that list were of the "The future is here but it's not evenly distributed" sort. XP was about a bunch of relatively simple actions that were force multipliers with a small multiple on them. What was important was that they composed. So the benefit of doing eight of them was more than twice the benefit of doing four. Which means there's a lot of headroom still from adding a few more things.

    • mattikl 1 hour ago
      That's certainly a more positive way to look at this. Working software has always relied on having people who grok the code, and this happens by spending a lot of time thinking about the code while writing it. And it's undocumented, because the nature of it is something you cannot really document.

      If AI is writing all the code, how do we keep the quality good? With the current GenAI tools it's so obvious that they're getting great at producing code but don't really understand it.

      We don't really know how this story unfolds, so it's good to keep a positive mindset.

    • imiric 1 hour ago
      Code generation tools of today are pretty good at writing the boring boilerplate. I think the author is aware of this.

      But what happens when they get really good at generating the not-so-boring bits? They're much better at this than they were a year or two ago, so it's not unthinkable that they will continue to improve.

      I'm a firm "AI" skeptic, and don't buy into the hype. It's clear that the brute force approach of throwing more data and compute at the problem has reached diminishing returns. And yet there is ample room for improvements by doing good engineering alone. Most of what we've seen in the past year is based on this: MCP, agents, "skills", etc.

      > I would code even if I didn't get paid for it.

      That's great, but once the market value of your work diminishes, it's no longer a career—it's a hobby. Which doesn't mean there won't be demand for artisanal programming, but it won't power the world anymore. It will be a niche skill we rely on for very specific tasks, but our jobs will be relegated to steering and assisting the "AI" into producing reliable software. At least in the short-term. It's doubtful whether the current path will get us to a place where these tools are fully self-sufficient.

      This is the bleak present and future the article is talking about. Being an assistant to code generation tools is far removed from the practice of programming. I personally find it tedious, unengaging, and extremely boring. There's little joy in the experience beyond having a working product. The road to get there is not a journey of discovery, serendipity, learning, and dopamine hits. It is a slog of writing software specs, balancing contextual information and prompts, and coaxing a human facsimile using natural language into producing working software. I dislike every part of this process.

  • beej71 3 hours ago
    The fun is still there. I'm relearning Rust and generative AI is really useful to help with understanding concepts and improving code. But I'm still the one understanding and improving.

    Still an infinite amount to learn and do. It's still not hard to have more skill than an AI. Of course AI can solve all the dumbbell problems you get in school. They're just there to build muscle. Robots can lift weights better than you, too, but that doesn't mean there's no value in you doing it.

  • weavejester 3 hours ago
    "I’m not sure if anyone else feels this way, but with the introduction of generative AI, I don’t find coding fun anymore. It’s hard to motivate myself to code knowing that a model can do it much quicker. The joy of coding for me was literally the process of coding."

    I experimented with GPT-5 recently and found its capabilities to be significantly inferior to those of a human, at least when it came to coding.

    I was trying to give it an optimal environment, so I set it to work on a small JavaScript/HTML web application, and I divided the task into small steps, as I'd heard it did best under those circumstances.

    I was impressed overall by how far the technology has come, but it produced a number of elementary errors, such as putting JavaScript outside the script tags. As the code grew, there was also no sense that it had a good idea of how to structure the codebase, even when I suggested it analyze and refactor.

    So unless there are far more capable models out there, we're not at the stage where generative AI can match a human.

    In general I find current models to have broad but shallow thinking. They can draw on many sources, which is extremely useful, but they seem to have problems reasoning things through in depth.

    All this is to say that I don't find the joy of coding to have gone at all. In fact, there have been a number of really thorny problems I've had to deal with recently that I'd love to have side-stepped, but due to the current limitations of LLMs I had to solve them the old-fashioned way.

    • Finbel 2 hours ago
      It's so strange. I do all the things you mention and it works brilliantly well 10 times out of 11.
      • EagnaIonat 2 hours ago
        You are probably doing something others have frequently done before.

        I find the LLMs struggle constantly with languages where there is little documentation or it is out of date. RAG, LoRA and multiple agents help, but they have their own issues as well.

        • nl 1 hour ago
          The OP was working on a "a small JavaScript/HTML web application"

          This is a particular sweetspot for LLMs at the moment. I'll regularly one-shot entire NextJS codebases with custom styling in both Codex and Claude.

          But it turns out the OP is using Copilot. That just isn't competitive anymore.

    • wseqyrku 2 hours ago
      > and found its capabilities to be significantly inferior to those of a human, at least when it came to coding.

      I think we should step back and ask: do we really want that? What does that imply? Until recently nobody would use a tool and think, yuck, that was inferior to a human.

    • CamperBob2 3 hours ago
      > I experimented with GPT-5 recently

      GPT-5 what? The GPT-5 models range from goofily stupid to brilliant. If you let it select the model automatically, which is the case by default, it will tend to lean towards the former.

      • weavejester 2 hours ago
        I was using GitHub Copilot Pro with VS Code, and the agent was labelled "GPT-5". Is this a particularly poor version of the model?

        I also briefly tried out some of the other paid-for models, but mostly worked with GPT-5.

        • nl 1 hour ago
          Try OpenAI Codex with GPT5-codex medium

          The technology is progressing very fast, and that includes both the models and the tooling around it.

          For example, Gemini 2.5 was considered a great model for coding when it launched. Now it is far inferior to Codex and Claude code.

          The GitHub Copilot tooling is (currently) mediocre. It's OK as a better autocomplete but can't really compete with Codex or Claude or even Jules (Gemini) when using it as an agent.

        • spenczar5 2 hours ago
          Frankly, yes.

          The models are one part of the story. But the software around them matters at least as much: what tools does the model have access to - bash, just file reading, or (as in your example!) just a cache of files visited by the IDE? How does the software decide what extra context to provide to the model? How does it record past learnings from conversations and failed test runs (if at all!), and how are those fed back in? And of course, what are the system prompts?

          None of this is about the model; it's all "plain old" software, and is the stuff around the model. Increasingly, that's where the quality differences lie.
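
          To make that concrete, here is a minimal sketch of what that "software around the model" can look like, assuming a hypothetical call_model client and two tools; this is not how Copilot, Claude, or Codex actually implement it, just the general shape of such a harness:

            import subprocess

            SYSTEM_PROMPT = "You are a coding agent. Reply with a tool call or a final answer."

            def run_bash(command: str) -> str:
                # Tool 1: shell access, output truncated so it fits in the context window.
                result = subprocess.run(command, shell=True, capture_output=True, text=True, timeout=30)
                return (result.stdout + result.stderr)[:4000]

            def read_file(path: str) -> str:
                # Tool 2: plain file reading, also truncated.
                with open(path) as f:
                    return f.read()[:4000]

            def agent_loop(task: str, call_model, max_steps: int = 10) -> str:
                # call_model is a stand-in for whatever LLM API the product wraps.
                history = [{"role": "system", "content": SYSTEM_PROMPT},
                           {"role": "user", "content": task}]
                for _ in range(max_steps):
                    reply = call_model(history)  # e.g. {"tool": "bash", "arg": "pytest -q"} or {"answer": "..."}
                    if "answer" in reply:
                        return reply["answer"]
                    tool = run_bash if reply["tool"] == "bash" else read_file
                    observation = tool(reply["arg"])
                    # What gets fed back, and how much of it, is a product decision, not a model one.
                    history.append({"role": "assistant", "content": str(reply)})
                    history.append({"role": "user", "content": observation})
                return "gave up"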

          I am sorry to say but Copilot is just sort of shoddy in this regard. I like Claude, some people like Codex, there are a bunch of options.

          But my main point is - it's probably not about the model, but about the products built on the models, which can vary wildly in quality.

          • noduerme 2 hours ago
            In my experience with both Copilot and Claude, Claude makes subtler mistakes that are harder to spot, which also gobbles up time. Yes, giving it CLI access is pretty cool and helps with scaffolding things. But unless you know exactly what you want to write, and exactly how it should work, to the degree that you will notice the footguns it can add deep in your structures, I wouldn't recommend anyone use it to build something professional.
  • snayan 3 hours ago
    Having gone through a bit of a crisis of meaning personally lately, this article resonates deeply. I would encourage the author to look inward and question the beliefs that got them here.

    I'd argue you didn't lose the joy of coding, you lost the illusion that coding made you real, that it made you you.

    • anonzzzies 2 hours ago
      I came to the same conclusion after 40+ years of programming: better if you come to that realisation earlier. Still love coding though, but I leave the paid work to my colleagues and llms: I just code for fun these days. I also write for fun and find it pretty similar, feeling and satisfaction wise.
      • dimator 2 hours ago
        But what about the graduating senior who, yeah, started because they love the craft, but also needs a way to pay the bills for a few decades of their life?
    • hinkley 1 hour ago
      While there's truth in what you say, I don't think anyone should ever lose feeling for an act of creation.

      It is never everything, but it should also never be nothing.

    • leptons 1 hour ago
      There definitely are times that I lose the "joy of coding" and it has nothing to do with any illusions, it has everything to do with the kind of programming tasks I have to work on. Greenfield projects are the best, tech debt is the worst. Working on fun stuff is just fun.
    • uhhhd 3 hours ago
      This is wise
  • zkmon 3 hours ago
    It may not be that hard to see where this is all going, at least with some precision. Think of the global arms race, or industrialization. Humans and this planet did not need any of that. The planet did not need it, because when you look at these cities from a flight, they look exactly like wounds that disrupt the continuity of greenery and terrain. Cities and industries don't belong to this planet. And no need to say much about the silly arms race and the business-driven tech that humans have.

    AI is just one of those arms races that we imposed on ourselves, out of a desire to dominate others or to protect ourselves from such domination. It is irreversible, just like the other things. It survives by using the same tactic as a cheap salesman - tell the first buyer that they can dominate the world, and then tell the next buyers that they need to protect themselves from the first one.

    We transformed our lifestyles to live with those unnecessary, business/politics driven "advancements". The saga continues.

    BTW, electronic calculators, when they came along, did a similar thing, erasing the fun of doing calculations by hand.

    • noduerme 1 hour ago
      All of life is an arms race. Look at fungi vs. bacteria. All those grasses in the fields and trees in the forests got there by outcompeting other organisms. We're actually the only species which can reason about our resource consumption as a whole, and which has a chance to do something about it. But while we find forests beautiful, they're a blight on grasslands, which are a blight on mosses, which are a blight on plain old rocks.

      What's beautiful is complexity, what's ugly is the destruction of complexity. That's why we find the destruction of forests to be repellent. Because we appreciate the more complex over the less complex. Possibly because complexity is the universe's way of observing itself. None of that means that our own complexity is necessarily wicked or irrelevant. It may just be a natural stage in the evolution of a planet. Grassland had 3 billion years to change, and it largely stayed the same. What's a couple thousand years of us blowing shit up, really?

      • zkmon 43 minutes ago
        Great points. Thank you. I realized (just after posting) that the wound part is not well-defined. Any abrupt change could be seen as a wound.

        But we need to define "progress" as species. Grasslands, trees and dolphins seem to have defined their progress as better adaptation helped by their organic evolution, which contributed to their ultimate goal of reproduction via survival.

        How is human race defining their progress? Since we are just one of the animal species, the root goal remains as reproduction. Instead of waiting for our biological evolution to enhance our survival (and thus reproduction), maybe we are augmenting human abilities with artificial means which is quicker.

        But then the artificial augmentations could become you, replacing whatever your essence was. A weapon in your hand and an AI chip in your head could make you a different beast. We can argue that even without such tools, a human is mostly made up of bacterial colonies dictating human thought and life. But we accepted that as our identity. Now the artificial implements are taking over our identity. This is not natural, and that is what is wicked.

        Also, our arms race is not the same as how species out-competed each other. Our arms race and most of what we call tech progress is spawned by competition internal to our species, not by competition with other species.

        The universe did not favor complexity. The universe destroys order and moves towards more entropy. Life is something that goes against this. Life probably was required to trap the Sun's energy so that Earth can cool itself.

        In geologic time scale, yes, a couple thousand years is puny. But it also indicates a rapid change. Most rapid changes lead to extinction events.

        • noduerme 18 minutes ago
          I don't know if dolphins define their purpose or progress. More likely, we look at them in a state of equilibrium now, and think "that looks nice", but at various points over the past hundreds of millions of years, virtually every species experienced drastic shifts within a brief few millennia which also nearly wiped them out. We are clearly in the middle of something like that, which means we're in a poor position to make predictions. We went from something like equilibrium, more or less the same from 1M years ago to 5,000 years ago, to a radically different state. It may very well not last another 5,000 years.

          I think maybe we need to see these arms races as short ramps, periods of chaos, which lead either to long plateaus or very quick collapses.

          >> Universe did not favor complexity. Universe destroys order and moves towards more entropy. Life is something that goes against this. Life probably was required to trap Sun's energy so that Earth can cool itself.

          I think if the Universe were not programmed to generate complexity, there would only be one or two elements. I think the tendency toward entropy is a necessary condition to force complexity and life to evolve. The Universe is slowly trading global energy everywhere for local complexity somewhere. This is how energy is turned into information. If energy is nearly infinite, then clearly it is cheaper and less valuable to whoever is reading the output than the valuable limited information it can produce (with tons of wasted energy). I believe this because I believe that converting energy to information is not a side-effect of the Universe, but its ultimate purpose.

          So yeah, forests are beautiful. Beehives are beautiful. Colonies of fungus are beautiful. Kansas City from the air at night is... well, we shouldn't underrate ourselves.

  • analog31 3 hours ago
    >>> The joy of coding for me was literally the process of coding.

    Maybe I was lucky. For me, the joy was the power of coding. Granted, I'm not employed as a coder. I'm a scientist, and I use coding as a problem solving tool. Nothing I write goes directly into production.

    What's gone is the feeling that coding is a special elite skill.

    With that said, I still admire and respect the real software developers, because good software is more than code.

  • miellaby 1 hour ago
    I share with the author this youth where a child learns coding before everything else. I really loved coding and made it my career. Yet I don't think I would have been among the recognized geniuses if born earlier. I don't think any of them spent most of their time smashing keys. They were rather conceptualizing and planning stuff, and had human skills I could only dream of.

    That being said, we untalented programmers are experiencing what most jobs suffered in the last two centuries: massive automation of their everyday activities. I especially identify with the traditional farmers who took their own lives as their way of life was wiped out by artificial fertilizers, mechanization, chemicals and hyperscaling.

  • bob1029 1 hour ago
    Simply being interested in the tools and technologies is no longer sufficient for success or happiness.

    There was a time when you could walk in the door with a handful of proper nouns printed on a piece of paper. The low hanging fruit has all been collected by now. But, there is always fruit available higher up in the tree. It's just harder to get to. Most people don't know how to climb the tree. They say they can, or that they do it all the time, but they're usually full of shit. It takes a lot of practice and discipline to do this safely.

    To be clear, the tree is the customer in this analogy. Your tech and tools are only useful in so far as they complete a valuable job for some party. Reselling value-added tools to other craftsmen is also a viable path, but you have to recognize that the most wizened operators tend to use the older and more boring options. Something that looks incredibly clever to a developer with 3 years of experience is often instantly disregarded by someone with 4 years of experience. The rate at which you stop being a noob is ideally exponential.

    I often look back on the things I thought were absolutely mandatory from a technology standpoint and feel really silly about much of it. I wish there was a better way to ramp developers without them causing total destruction. Right now it's like we're training electrician apprentices by having them work on HV switch gear at a nuclear power plant.

    There is still a huge gap in ideas like apprenticeship in technology. Being able to code is such a tiny piece of the pie. Being able to engage in dialog with the non technical business owners such that your code has effect on target is ~ the rest of the pizza. A laser guided munition delivered from 60k feet will not be very useful if you don't know where it needs to go or how many targets there are. A lot of what I see on the HN front page is tantamount to carpet bombing the jungle non-stop in hopes of jostling an apple out of a tree somewhere.

  • yes_man 1 hour ago
    Putting aside the bold assumption that LLMs do make coders obsolete or coding unnecessary, it is possible to find similar joy in the end result as one does (or did, given the article) in programming itself: focusing on what kind of tools or products are being created and what problems are being solved, achieving that goal better and faster with LLMs than without them, and finding joy in solving the problems this world has. That’s typically why anyone would have paid you to code anyway, even before LLMs.

    Of course in reality there are weird economic mechanics where making the most money and building something that benefits the world don’t necessarily coincide, but there’s always demand for, and joy in, solving complex problems, even if it’s on a higher abstraction level than coding with your favorite language.

  • koliber 2 hours ago
    Software is still eating the world. It was always about efficiency. It was about getting rid of manual data entry by building CSV based export-import flows. It was about getting rid of hundreds of chatbot operators answering mundane questions with an AI bot that could do a decent job with the easy-but-voluminous conversations. Now it’s about getting rid of the tedious coding jobs and replacing them with code gen tools.

    In each of these cases, lots of relatively low-value jobs were no longer needed and a few very-high-value jobs sprang into existence.

    The author of the article loves coding. But software is about solving problems efficiently, not punching the keyboard. The other parts of the job might not be as fun for everyone, but they are even more valuable than typing code. Great programmers could always do both. Now they can focus on the higher value work more by leveraging tools that can do the lower-value work.

    Work is not supposed to be fun. That’s why they pay you to do it. If it was fun, you would have to pay your employer. (Tongue in cheek advice).

  • vydra 2 hours ago
    It's definitely much harder to get into the industry than it was a few years ago, and if it's coding you were after, you may indeed be disappointed. But give Software Engineering a try! We need to rewrite many of our critical systems and we are afraid to do so primarily due to the lack of truly skilled software engineers. IMHO, the AI agents are creating time for us to study what really matters. I would start with Modern Software Engineering by Dave Farley. DM me directly if you want to chat on this topic. https://www.linkedin.com/in/dvydra/
  • muldvarp 1 hour ago
    I genuinely feel like I got bait-and-switched by computer science. If I could go back and study something different I would do it in a heartbeat.

    Sadly, there's very little I can do now. I don't have the financial means to meaningfully change careers now. Pretty much the only thing I could do now that pays somewhat well and doesn't require me to go to university again is teaching. I think I will ride this one out and end it when it ends.

    • heddycrow 1 hour ago
      What if you go back and discover every path you could have taken is a bait-and-switch?
      • muldvarp 1 hour ago
        I did like the (short) LLM-free part of my career. The bait-and-switch refers specifically to the changes due to the introduction of LLMs. Any career where LLMs don't play a big role would not have been a bait-and-switch.

        That said, I don't understand the point of "what if nothing ever works out for you?"-type questions. What do you expect me to answer here? That I'm secretly a wizard and with the flick of my magic wand I'll make something work out?

  • heddycrow 1 hour ago
    Choose your own adventure:

    1) You are using the coding assistant too much - you aren't yet ready for the Senior role that it requires. Advice: chill out with that and get back to coding solo.

    or

    2) You haven't used the coding assistant enough to realize it's an idiot-savant-grade Junior-to-Mid programmer. Advice: use the coding assistant more and then see #1.

    Real talk: all moments suck and all moments are wonderful. Source: have lived through a few computer moments.

    What a time to be alive!

  • jandrewrogers 2 hours ago
    I still enjoy coding. AI mostly doesn’t produce adequate quality or correctness for the type of code I enjoy writing. There are several domains where AI is worse than useless because training data doesn’t exist. Obviously my experience doesn’t generalize but writing software is a vast, unbounded domain.

    If you find coding boring, explore the frontiers. You will find a lot of coding wilderness where no AI has trod.

    • muldvarp 1 hour ago
      > AI mostly doesn’t produce adequate quality or correctness for the type of code I enjoy writing.

      This assumes that companies care about "code quality" and customers care about bugs.

      > If you find coding boring, explore the frontiers. You will find a lot of coding wilderness where no AI has trod.

      There are a lot of software engineers and not a lot of frontier.

    • tonyhart7 2 hours ago
      "If you find coding boring, explore the frontiers. You will find a lot of coding wilderness where no AI has trod."

      This. AI is nothing without a data set.

      So if you're working on bleeding-edge technology, where your tools have only 3 contributors and the only way to reach them is an IRC channel once a day, things get interesting.

  • MiiMe19 2 hours ago
    I relate to this on a level that I have felt before. I went from wanting to program for literally any company just as long as I could write code to just wanting to finish my degree and make enough money to live in the middle of nowhere with no internet access for the rest of my life.
  • Ericson2314 3 hours ago
    I have 2000s web nostalgia, but I think that modern, dot-com-and-onward SV software was, frankly, mostly up to stupid shit. Not something to romanticize.

    Good things to look forward to are:

    - Lean and mathlib revolutionizing math

    - Typst replacing LaTeX and maybe some Adobe products

    - Fuchsia/Redox/WASI replacing Unix

    - non-professional-programmers finally learning programming en masse

    I think the latter is maybe the most profound. Tech may not grow at a break-neck pace, but erasing the programmer vs computer illiterate dichotomy will mean software can shape the world in much less Kafkaesque ways.

    • Pamar 3 hours ago
      I think that what erased the "programmer vs computer illiterate" dichotomy was BASIC in the 80s.

      I've met lots of "digital natives" and they seem to use technology as a black box, clicking/touching stuff at random until it sorta works, but they are not very good at creating a mental model of why something is behaving in a way that was not expected and verifying their own hypotheses (i.e. "debugging").

      • maegul 3 hours ago
        Agreed. And I feel it fair to argue that this is the intended interface between proprietary software and its users, categorically.

        And more so with AI software/tools, and IMO frighteningly so.

        I don’t know where the open models people are up to, but as a response to this I’d wager they’ll end up playing the Linux desktop game all over again.

        All of which strikes at one of the essential AI questions for me: do you want humans to understand the world we live in or not?

        Doesn’t have to be individually, as groups of people can be good at understanding something beyond any individual. But a productivity gain isn’t on its own a sufficient response to this question.

        Interestingly, it really wasn’t long ago that “understanding the full computing stack” was a topic around here (IIRC).

        It’d be interesting to see if some “based” “vinyl player programming” movement evolved in response to AI in which using and developing tech stacks designed to be comprehensively comprehensible is the core motivation. I’d be down.

    • safety1st 3 hours ago
      It's always about what you choose to focus on. I'm a guy who came of age in the middle of the PC revolution and has been daily driving Linux for over a decade.

      In the last few years we've seen first Valve with SteamOS, and now 37signals with Omarchy, release Linux distros which are absolutely great for their target audience and function as a general purpose operating system just fine. Once might just be a fluke... Twice is a pattern starting to emerge.

      Are we witnessing the beginning of a new operating system ecosystem where you only have to be a billion dollar company to produce a viable operating system instead of a trillion dollar one?

      How many of our assumptions about computing are based on the fact that for 30+ years, only Microsoft, Apple and Google got to do a consumer OS?

      And a preponderance of the little components that make up this "new" OS ecosystem were developed by some of the most radical software freedom fighters we've got.

      Is this a long shot I'm thinking about? You bet. But the last time I was this excited about the future I was a teenager and most homes still didn't have a PC.

    • righthand 3 hours ago
      > non-professional-programmers finally learning programming en mass

      I don’t think this is what you think it is. It’s more like non-professional-programmers hacking together all the applications they wanted to hack together before. The LLM is just the glue.

      IMO, they are NOT learning programming.

      • MiiMe19 2 hours ago
        This exactly. No one is getting smarter or learning anything; the bar to make something just no longer requires you to be competent.
    • trenchpilgrim 2 hours ago
      Isn't Fuchsia a dead project?
  • TheDong 3 hours ago
    I identify with this, though I'm further along the path.

    Coding was incredibly fun until working in capitalist companies got involved. It was then still fairly fun, but tinged by some amount of "the company is just trying to make money, it doesn't care that the pricing sucks and it's inefficient, it's more profitable to make mediocre software with more features than really nail and polish any one part"

    Adding AI on top impacts how fun coding is for me exactly as they say, and that compounds with the company's misaligned incentives.

    ... I do sometimes think maybe I'm just burned out though, and I'm looking for ways to rationalize it, rather than doing the healthy thing and quitting my job to join a cult-like anti-technology commune.

    • maegul 3 hours ago
      I resonate.

      For me, I’m vaguely but persistently thinking about a career change, wondering if I can find something of more tangible “real world” value. An essential basis of which is the question of whether any given tech job just doesn’t hold much apparent “real world” value.

  • empressplay 1 hour ago
    LLMs are a useful tool but they are basically idiot savants.

    They still need someone with higher reasoning skills (eg humans) to verify what they cough up. This need is likely to continue for quite some time (since LLMs simply aren't capable of higher reasoning).

    Learning to code effectively using LLMs is probably the best path forward, from a career standpoint.

  • Hannah203 3 hours ago
    [dead]
  • almosthere 4 hours ago
    [flagged]