For my whole life I’ve been trying to make things—beautiful elegant things.
When I was a child, I found a cracked version of Photoshop and made images which seemed like magic.
When I was in college, I learned to make websites through careful, painstaking effort.
When I was a young professional, I used those skills and others to make websites for hospitals and summer camps and conferences.
Then I learned software development and practiced the slow, methodical process of writing and debugging software.
Now, I get to make beautiful things by speaking, guiding, and directing a system which is capable of handling the drudgery while I think about how to make the system wonderful and functional and beautiful.
It was, for me, never about the code. It was always about making something useful for myself and others. And that has never been easier.
I like coding, I really do. But like you, I like building things more than I like the way I build them. I don't find myself missing writing code by hand all that much.
I do find that the developers who focused on "build the right things" mourn less than those who focused on "build things right".
But I do worry. The main question is this: will there be a day when AI knows what "the right things to build" are and has the "agency" (or the illusion of it) to do it better than an AI+human pairing (assuming AI soon gets through the "build things right" phase, which it isn't at yet)?
My main hope is this: AI has been able to beat humans at chess for a while now, yet we still play chess. People earn money playing chess and teaching chess, chess players are still celebrated, and YouTube influencers still get monetized for analyzing games of celebrity chess players, even though the top human chess player would likely lose to a Stockfish engine running on my iPhone. So maybe there is hope.
> I do find that the developers who focused on "build the right things" mourn less than those who focused on "build things right".
I've always been strongly in the first category, but... the issue is that 10x more people will be able to build the right things. And if I build the right thing, it will be easy to copy. The market will get crowded, so distribution will become even harder than it is today. Success will be determined by personal brand, social media presence, social connections.
Yeah, it seems like too many went into this field for money or status, not because they like the process. That's not an issue by itself, but now these people talk about how their AI assistant of choice made them some custom tool in two hours that would have taken them three weeks. And it's getting exhausting.
I went into this field because I love programming. I didn't even know how well these jobs paid until my junior year of college. I constantly programmed and read programming texts in my spare time growing up, in college, and after work.
I love AI tools. I can have AI do the boring parts. I can even have it write polished, usable apps in languages that I don't know.
I miss being able to think so much about architecture, best practices, frameworks/languages, how to improve, etc.
Isn't this like saying that if better woodworking tools come out, and you like woodworking, then woodworking somehow 'isn't your craft' anymore? They said that their craft is about making things.
There are woodworkers on YouTube who use CNC, some who use the best Festool gear but nothing that moves on its own, and some who only use hand tools. Where is the line at which woodworking is not their craft?
The better analogy is you're now a shop manager or even just QA. You don't need to touch, look at, or think about the production process past asking for something and seeing if the final result fits the bill.
You get something that looks like a cabinet because you asked for a cabinet. I don't consider that "woodworking craft", power tools or otherwise.
Woodworking is, like, the quintessential craft. I think it is very useful to bring it in when discussing "craft"!
I am not a woodworker myself, but I understand that part of what makes it "crafty" is that the woodworker reads grain, adjusts cuts, and accepts that each board is different.
We can try to contrast that to whatever Ikea does with wood and mass production of furniture. I would bet that variation in materials is "noise" that the mass production process is made to "reject" (be insensitive to / be robust to).
But could we imagine an automated woodworking system that takes into account material variation, like wood grain, not in an aggregate sense (like I'm painting Ikea to do), but in an individual sense? That system would be making judgements that are woodworker-like.
The craft lives on. The system is informed by the judgement of the woodworker, and the craftsperson enters an apprenticeship role to the automation... perhaps...
Until you can do RL on the outcome of the furniture. But you still need craft in designing the reward function.
It is a different kind of code. Just a lot of programmers can't grok it as such.
I guess I started out as a programmer, then went to grad school and learned how to write and communicate my ideas. That has a lot in common with programming, but at a deeper level. Now I'm doing both with AI and it's a lot of fun. It is just programming at a higher level.
As someone who started with Borland DOS-era IDEs I can tell you that IDEs did get a lot better over the years. I'm still fascinated every day by JetBrains IDEs.
I've seen a hundred AI-generated things, and they are rarely interesting.
Not because the tools are insufficient, it's just that the kind of person that can't even stomach the charmed life of being a programmer will rarely be able to stomach the dull and hard work of actually being creative.
Why should someone be interested in your creations? In what part of your new frictionless life would you have picked up something that sets you apart from a million other vibe-coders?
> stomach the dull and hard work of actually being creative
This strikes me as the opposite of what I experience. When I say I'm "feeling creative", everything comes easily, at least in the context of programming, making music, doing 3D animation, and some other topics. If it's "dull and hard work", it's because I'm not feeling "creative" at all; when "creative mode" is on in my brain, nothing feels dull or hard. Maybe it works differently for others.
I don't think 5 years is necessary. I think after two years of this agentic orchestration, if you rarely touch code yourself, your skills will degrade to the point where you won't be able to write anything non-trivial without assistance.
Depends how long you've done it, and how much the landscape has changed since then. I can still hop back into SQL and it all comes back to me though I haven't done it regularly at all for nearly 10 years.
In the web front-end world I'd be pretty much a newbie. I don't know any of the modern frameworks, everything I've used is legacy and obsolete today. I'd ramp up quicker than a new junior because I understand all the concepts of HTTP and how the web works, but I don't know any of the modern tooling.
Adam Neely has a video on GenAI and its impact on the music industry. There is a section in the video about beauty and taste, and it's pretty different from your conclusions. One example I remember: would an AI find beauty in the sound of a record scratch?
I want to be in your camp, and am trying hard. But the OP's blog entry should at least give us a moment to "respect the dead". That's all he's asking, I think.
> For my whole life I’ve been trying to make things—beautiful elegant things.
Me too, but... The ability to code was a filter. With AI, the pool of people who can build beautiful elegant software products expands significantly. Good for the society, bad for me.
AI agents seem to be a powerful shortcut past the drudgery. But let's not forget that powerful software rests on substance. My hope is that the substance will increase, after all.
So when you "learned software development and practiced the slow, methodical process of writing and debugging software", it wasn't about code? I don't get it. Yes, building useful things is the ultimate goal, but code is the medium through which you do it, and I don't understand how that cannot be an important part of the process.
It's like a woodworker saying, "Even though I built all those tables using precise craft and practice, it was NEVER ABOUT THE CRAFT OR PRACTICE! It was about building useful things." Or a surgeon talking about saving lives and doing brain surgery, but "it was never about learning surgery, it was about making people get better!"
Not the GP, but I feel some of that energy. The parts I most enjoy are the interfaces, the abstractions, the state machines, the definitions. I enjoy the code too, and I would be sad to lose all contact with it, but I've really appreciated AI especially for helping me get over the initial hump on things like:
- infrastructure bs, like scaffold me a JS GitHub action that does x and y.
- porting, like take these kernel patches and adjust them from 6.14 to 6.17.
- tools stuff, like here's a workplace shell script that fetches a bunch of tokens for different services, rewrite this from bash to Python.
- fiddly things like dealing with systemd or kubernetes or ansible
- fault analysis, like here's a massive syslog dump or build failure, what's the "real" issue here?
In all these cases I'm very capable of assessing, tweaking, and owning the end result, but having the bot help me with a first draft saves a bunch of drudgery on the front end, which can be especially valuable for the ADHD types where that kind of thing can be a real barrier to getting off the ground.
In my opinion the relationship between level of detailed care and resulting beauty is proportional. Can you get the same level without getting your hands dirty? Sure, maybe, but I doubt a painter or novelist could really produce beautiful work without being intimately familiar with that work. The distance that heavy use of AI tools creates between you and the output does not really lend itself to beauty. Could you do it, sure, but at that point it's probably more efficient to just do things yourself and have complete intimate control.
To me, you sound more utilitarian. The philosophy you are presenting is a kind of Ikea philosophy. Utility, mass production, and unique beauty are properties that generally do not cohere, and there's a reason for this. I think the use of LLMs in the production of digital goods is very close to the use of automated assembly lines in the production of physical goods. No matter how you try, some of the human charm, and thus beauty, will inevitably be lost; the number of goods will increase, but they'll all be barely differentiable, soulless replications of more or less the same shallow ideas repeated ad infinitum.
It's like learning to cook and regularly making your own meals, then shifting to a "new paradigm" of hiring a personal chef to cook for you. Food's getting made either way, but it's not really the same deal.
No, it's more like moving from line cook, to head chef in charge of 30 cooks.
Food's getting made, but you focus on the truly creative part -- the menu, the concept, the customer experience. You're not boiling pasta or cutting chives for the thousandth time. In the same way, you're now focusing on architecture and design instead of writing your 10,000th list comprehension.
Except the cooks don't exist anymore as they all have become head chefs (or changed careers) and the food is being cooked by magical cooking black boxes
Because such people are not being sincere, either with themselves about who they are or with others. It's really hard for me to take seriously phrases like "I joined this industry to make things, not to write code".
Do painters paint because they just like to see the final picture? Or do they like the process? Yes, painting is an artistic process, not exactly a crafting one. But the point stands.
Woodworkers making nice custom furniture generally enjoy the process.
Writing code is my favorite activity. I hate these takes. You never liked writing code? Who cares. This probably just translates to: you sucked at it. Get out of our trade, please.
I started programming over 40 years ago because it felt like computers were magic. They feel more magic today than ever before. We're literally living in the 1980s fantasy where you could talk to your computer and it had a personality. I can't believe it's actually happening, and I've never had more fun computing.
I can't empathize with the complaint that we've "lost something" at all. We're on the precipice of something incredible. That's not to say there aren't downsides (WOPR almost killed everyone after all), but we're definitely in a golden age of computing.
I feel like we've reached the worst age of computing. Where our platforms are controlled by power hungry megacorporations and our software is over-engineered garbage.
The same company that develops our browsers and our web standards is also actively destroying the internet with AI scrapers. Hobbyists lost the internet to companies and all software got worse for it.
Our most popular desktop operating system doesn't even have an easy way to package and update software for it.
Dystopian cyberpunk was always part of the fantasy. Yes, scale has enabled terrible things.
There are more alternatives than ever though. People are still making C64 games today, cheap chips are everywhere. Documentation is abundant... When you layer in AI, it takes away labor costs, meaning that you don't need to make economically viable things, you can make fun things.
I have at least a dozen projects going now that I would have never had time or energy for. Any itch, no matter how geeky and idiosyncratic, is getting scratched by AI.
I’m not the OP, but my answer is that there’s a big difference between building products and building businesses.
I’ve been programming since 1998 when I was in elementary school. I have the technical skills to write almost anything I want, from productivity applications to operating systems and compilers. The vast availability of free, open source software tools helps a lot, and despite this year’s RAM and SSD prices, hardware is far more capable today at comparatively lower prices than a decade ago and especially when I started programming in 1998. My desktop computer is more capable than Google’s original cluster from 1998.
However, building businesses that can compete against Big Tech is an entirely different matter. Competing against Big Tech means fighting moats, network effects, and intellectual property laws. I can build an awesome mobile app, but when it's time for me to distribute it, I have to deal with app stores unless I build for a niche platform.
Yes, I agree that it’s never been easier to build competing products due to the tools we have today. However, Big Tech is even bigger today than it was in the past.
You don't need billions of dollars to write an app. You need billions of dollars to create an independent platform that doesn't give the incumbent a veto over your app if you're trying to compete with them. And that's the problem.
In some ways, I'd say we're in a software dark age. In 40 years, we'll still have C, bash, grep, and Mario ROMs, but practically none of the software written today will still be around. That's by design. SaaS is a rent seeking business model. But I think it also applies to most code written in JS, Python, C#, Go, Rust, etc. There are too many dependencies. There's no way you'll be able to take a repo from 2026 and spin it up in 2050 without major work.
One question is how AI will factor into this. Will it completely remove the problem? Will local models be capable of finding or fixing every dependency in your 20-year-old project? Or will they exacerbate things by writing terrible code with black-hole dependency trees? We're gonna find out.
Glad to see this already expressed here because I wholly agree. Programming has not brought me this much joy in decades. What a wonderful time to be alive.
We have what I've dreamed of for years: the reverse dictionary.
Put in a word and see what it means? That's been easy for at least a century. Have a meaning in mind and get the word? The only way to get this before was to read a ton of books and be knowledgable or talk to someone who was. Now it's always available.
The "reverse dictionary" is called a "thesaurus". Wikipedia quotes Peter Mark Roget (1852):
> ...to find the word, or words, by which [an] idea may be most fitly and aptly expressed
I briefly investigated LLMs for this purpose, back when I didn't know how to use a thesaurus; but I find thesauruses a lot more useful. (Actually, I'm usually too lazy to crack out a proper thesaurus, so I spend 5 seconds poking around Wiktionary first: that's usually Good Enough™ to find me an answer, when I find an answer I can trust it, and I get the answer faster than waiting for an LLM to finish generating a response.)
There's definitely room to improve upon the traditional "big book of synonyms with double-indirect pointers" thesaurus, but LLMs are an extremely crude solution that I don't think actually is an improvement.
Taking advantage of the fact my passive vocabulary is greater than my active vocabulary: no, no, yes. (I've spuriously rejected "multipurpose" – a decent synonym of "versatile [tool]" – but that doesn't matter.) I'm pretty sure WordHippo is machine-generated from some corpus, and a lot of these words don't mean "very useful", but they're good at playing the SEO game, and I'm lazy. Once we have versatile, we can put that into an actual thesaurus: https://dictionary.cambridge.org/thesaurus/versatile. But none of those really have the same sense as "versatile" in the context I'm thinking of (except perhaps "adaptable"), so if I were writing something, I'd go with "versatile".
Total time taken: 15 seconds. And I'm confident that the answer is correct.
By the way, I'm not finding "multifarious" anywhere. It's not a word I'm familiar with, but that doesn't actually seem to be a proper synonym (according to Wiktionary, at least: https://en.wiktionary.org/wiki/Thesaurus:heterogeneous). There are certainly contexts where you could use this word in place of "versatile" (e.g. "versatile skill-set" → "multifarious skill-set"), but I criticise WordHippo for far less dubious synonym suggestions.
I'm not going to code by hand if it's 4x slower than having Claude do it. Yes, I can do that, but it just feels bad.
The analogy I like is it's like driving vs. walking. We were healthier when we walked everywhere, but it's very hard to quit driving and go back even if it's going to be better for you.
I retired a few years ago, so I have no idea what AI programming is.
But I mourned when CRTs came out; I had just started programming. I quickly learned CRTs were far better.
I mourned when we moved to GUIs. I never liked the move and still do not like dealing with GUIs, but I got used to it.
Went through all kinds of programming methods, too many to remember, but those were easy to ignore and workaround. I view this new AI thing in a similar way. I expect it will blow over and a new bright shiny programming methodology will become a thing to stress over. In the long run, I doubt anything will really change.
I think you're underestimating what AI can do in the coding space. It is an extreme paradigm shift. It's not like "we wrote C, but now we switch to C++, so now we think in objects and templates". It's closer to the shift from assembly to a higher level language. Your goal is still the same. But suddenly you're working in a completely newer level of abstraction where a lot of the manual work that used to be your main concern is suddenly automated away.
If you've never tried Claude Code, give it a try. It's very easy to get into. And you'll soon see how powerful it is.
I agree with you with the caveat that all the "ease of building" benefits, for me, could potentially be dwarfed by job losses and pay decreases. If SWE really becomes obsolete, or even if the number of roles decrease a lot and/or the pay decreases a lot (or even fails to increase with inflation), I am suddenly in the unenviable position of not being financially secure and being stuck in my 30s with an increasingly useless degree. A life disaster, in other words. In that scenario the unhappiness of worrying about money and retraining far outweighs the happiness I get from being able to build stuff really fast.
Fundamentally this is the only point I really have on the 'anti-AI' side, but it's a really important one.
Anybody who says this kind of thing, I assume you weren't very good at programming and you ultimately didn't like doing it, so you probably climbed the business ladder. I can't say for sure, but I'd bet you're upper management and therefore this is magic to you; it's a threat to all of us. You probably despised writing code because you weren't good at it, so you welcome these tools to patch that insecurity of yours.
I didn't imagine I would be sending all my source code directly to a corporation for access to an irritatingly chipper personality that is confidently incorrect the way these things are.
There have been wild technological developments but we've lost privacy and autonomy across basically all devices (excepting the people who deliberately choose to forego the most capable devices, and even then there are firmware blobs). We've got the facial recognition and tracking so many sci-fi dystopias have warned us to avoid.
I'm having an easier time accomplishing more difficult technological tasks. But I lament what we have come to. I don't think we are in the Star Trek future and I imagined doing more drugs in a Neuromancer future. It's like a Snow Crash / 1984 corporate government collab out here, it kinda sucks.
While I'm on the fence about LLMs there's something funny about seeing an industry of technologists tear their own hair out about how technology is destroying their jobs. We're the industry of "we'll automate your job away". Why are we so indignant when we do it to ourselves...
This article isn't really about losing a job. Coding is a passion for some of us. It's similar to artists and diffusion, the only difference being that many people can appreciate human art - but who (outside of us) cares that a human wrote the code?
The people outside of us didn’t care about your beautiful code before. Now we can quickly build their boring applications and spend more time building beautiful things for our community’s sake. Yes, there are economic concerns, but as far as “craft” goes, nothing is stopping us from continuing to enjoy it.
I love programming, but most of that joy doesn't come from the type of programming I get paid to do. I now have more time and energy for the fun type, and I can go do things that were previously inconceivable!
Last night "I" "made" 3D boids swarm with directional color and perlin noise turbulence. "I" "did" this without knowing how to do the math for any of those things. (My total involvement at the source level was fiddling with the neighbor distance.)
Obviously that matters, but how much does it matter? Does it matter if you don't learn anything about computer architecture because you only code in JS all day? Very situational.
I disagree a bit. Coding can remain an artistic passion for you indefinitely; it's just that your ability to demand that everyone craft each line of code artisanally won't be subsidized by your employer for much longer. There will probably always be some demand for handcrafted code, just heavily diminished.
I think this is really it. Being a musician was never a very reliable way to earn a living, but it was a passion. A genuine expression of talent and feeling through the instrument. And if you were good enough, you could pay the bills doing work for studios, commercials, movies, and theater. If you were really good, you could perform as a headliner.
Now, AI can generate any kind of music anyone wants, eliminating almost all the anonymous studio, commercial, and soundtrack work. If you're really good you can still perform as a headliner, but (this is a guess) 80% of the work for musicians is just gone.
Agreed. I've always thought the purpose of all automation was to remove needless toil. I want computers to free people. I guess I subscribe to the theory of creative destruction.
Maybe it comes down to the definition of "toil". Some people find typing to be toiling, so they latch on to not having to type as much when using LLMs. Other people see "chores" as toiling, and so dream of household robots to take on the burden of that toil. Some people hate driving and consider that to be needless toil, so self-driving cars answer that—and the ads for Waymo latch onto this.
Personally, I am not stymied by typing nor chores nor driving. For me, typing is like playing a musical instrument: at some point you stop needing to think about how to play and you just play. The interaction and control of the instrument just comes out of your body. At some point in my life, all the "need to do things around the house" just became the things I do, and I'm not bothered by doing them, such that I barely notice doing them. But it's complex: the concept of "chores" is front and center when you're trying to get a teenager to be responsible for taking care of themselves (like having clean clothes, or how the bathroom is safer if it's not a complete mess) and participating in family/household responsibilities (like learning that if you don't make a mess, there's nothing to clean up). Can you really be effective at directing someone/something else without knowing how to do it yourself? Probably for some things, but not all.
> Maybe it comes down to the definition of "toil".
For sure.
I idealize a future where people can spend more time doing things they want to do, whatever those avocations might be. Freedom from servitude. I guess some kind of Star Trek / The Culture hybrid dream.
The world we have is so far from that imaginary ideal. Implicit in that ideal would be elimination of inequality, and I'm certain there are massive forces that would oppose that elimination.
For me it's because the same tech is doing it to everyone else in a more effective way (i.e. artists especially). I'm an "art enjoyer" since I was a child and to see it decimated by people who I once looked up to is heartbreaking. Also, if it only affected software, I would've been happy to switch to a more artistic career, but welp there goes that plan.
> Now is the time to mourn the passing of our craft.
Your craft is not my craft.
It's entirely possible that, as of now, writing JavaScript and Java frontends (what the author does) can largely be automated with LLMs. I don't know who the author is writing to, but I do not mistake the audience to be "programmers" in general...
If you are making something that exists, or something that is very similar to something that exists, odds are that an LLM can be made to generate code which approximates that thing. The LLM encoding is lossy. How will you adjust the output to recover the loss? What process will you go through mentally to bridge the gap? When does the gap appear? How do you recognize it? In the absolute best case you are given a highly visible error. Perhaps you've even shipped it, and need to provide context about the platform and circumstances to further elucidate. Better hope that platform and circumstance is old-hat.
LLMs are only a threat if you see your job as a code monkey. In that case you're likely already obsoleted by outsourced staff who can do your job much cheaper.
If you see your job as a "thinking about what code to write (or not)" monkey, then you're safe. I expect most seniors and above to be in this position, and LLMs are absolutely not replacing you here - they can augment you in certain situations.
The perks of a senior is also knowing when not to use an LLM and how they can fail; at this point I feel like I have a pretty good idea of what is safe to outsource to an LLM and what to keep for a human. Offloading the LLM-safe stuff frees up your time to focus on the LLM-unsafe stuff (or just chill and enjoy the free time).
I see my job as having many aspects. One of those aspects is coding. It is the aspect that gives me the most joy even if it's not the one I spend the most time on. And if you take that away then the remaining part of the job is just not very appealing anymore.
It used to be I didn't mind going through all the meetings, design discussions, debates with PMs, and such because I got to actually code something cool in the end. Now I get to... prompt the AI to code something cool. And that just doesn't feel very satisfying. It's the same reason I didn't want to be a "lead" or "manager", I want to actually be the one doing the thing.
You won't be prompting AI for the fun stuff (unless laying out boring boilerplate is what you consider "fun"). You'll still be writing the fun part - but you will be able to prompt beforehand to get all the boilerplate in place.
These comments are comical. How hard is it to understand that human beings are experiential creatures? Our experiences matter: to survival, to culture, and to identity.
I mourn the horse masters and stable boys of a century past because of their craft. Years of intuition and experience.
Why do you watch a chess master play, or a live concert, or any form of human creation?
Should we automate parts of our profession? Yes. Should we mourn the loss of our craft? Also yes.
This is what I don't really understand. It's a bit difficult to take "wait x months" at face value because I've been hearing it for so long. Wait x months for what? Why hasn't it happened yet?
Things seem to be getting better from December 2022 (chatgpt launch), sure, but is there a ceiling we don't see?
"Self-driving cars" and Fusion power also come to mind. With the advent of photography, it was widely believed that drawing and painting would vanish as art forms. Radio would obsolete newspapers, becoming obsolete themselves with television, and so on. Don't believe the hype.
Waymos require a highly mapped environment to function safely in. Not to take away from what Waymo has accomplished, but it's a far more bounded problem than what the "self-driving" promise has been.
Um.. Claude Code has been out less than a YEAR.. and the lift in capability in the last year has been dramatic.
It does seem probable, based on progress, that in 1-2 more model generations there will be little need to hand-code in almost any domain. Personally, I already don't hand-code AT ALL, but there are certainly domains/languages that are underperforming right now.
Right now with the changes this week (Opus 4.6 and "teams mode") it already is another step function up in capability.
Teams mode is probably only good for greenfield or "green module" development, but I'm watching a team of 5 AIs collaborating and building out an application module by module. This is net new capability for the tool THIS WEEK (yes, I am aware of earlier examples).
I don't understand how people can look at this and then be dismissive of future progress, but human psychology is a rich and non-logical landscape.
Agree with the author. I like the process of writing code: typing method names and class definitions while at the same time thinking ahead about the overall architecture, the structure, how long a given function will run for, and what kinds of tests are necessary.
I find it unsettling how many people in the comments say that they don't like writing code. Feels alien to me. We went into this field for seemingly very different reasons.
I do use LLMs, and even these past two days I was doing a vibe-coding project that was noticeably faster to set up and get to its current state than if I had written it myself. However, I feel almost dirty about how little I understand the project. Sure, I know the overall structure, decisions, and plan. But I didn't write any of it, and I don't have the deep understanding of the codebase that I usually have when working on a codebase myself.
It's not so much the writing of the code (which I did like), it's the aesthetic of the code. It's solving a problem with the right code and the right amount of code (for now). That's still the case, even with AI writing most of the code. You have to steer it constantly because it has very bad instincts, because most people in the profession aren't good at it, so it has bad training data, mainly thanks to the "learn to code" movement and people getting into this profession just for the money and not the love. Those people are probably screwed.
To the people who are against AI programming, honest question: why do you not program in assembly? Can you really say "you" "programmed" anything at all if a compiler wrote your binaries?
This is a 100% honest question. Because whatever your justification to this is, it can probably be used for AI programmers using temperature 0.0 as well, just one abstraction level higher.
I'm 100% honestly looking forward to finding a single justification that would not fit both scenarios.
But I've seen this conversation on HN already 100 times.
The answer they always give is that compilers are deterministic and therefore trustworthy in ways that LLMs are not.
I personally don't agree at all, in the sense I don't think that matters. I've run into compiler bugs, and more library bugs than I can count. The real world is just as messy as LLMs are, and you still need the same testing strategies to guard against errors. Development is always a slightly stochastic process of writing stuff that you eventually get to work on your machine, and then fixing all the bugs that get revealed once it starts running on other people's machines in the wild. LLMs don't write perfect code, and neither do you. Both require iteration and testing.
I'm not particularly against AI programming, but I don't think these two things are equivalent. A compiler translates code according to a specification in a deterministic way; the same compiler produces the same output from the same code, and it is all completely controlled. AI is not at all deterministic: temperature is built into LLMs, and on top of that there's the lack of specificity in prompts and our spoken languages. The difference in control is significant enough for me not to put compilers and AI coding agents into the same category, even though they are both taking some text and producing some other text/machine code.
I often venerate antiques and ancient things by thinking about how they were made. You can look at a 1000-year-old castle and think: This incredible thing was built with mules and craftsmen. Or look at a gorgeous, still-ticking 100-year-old watch and think: This was hand-assembled by an artist. Soon I'll look at something like the pre-2023 Linux kernel or Firefox and think: This was written entirely by people.
At least with physical works (for now, anyway), the methods the artisans employ leave tell-tale signs attesting to the manner of construction, so that someone at least has the choice of going the "hand made" route, and others, even lay people without special tooling, can tell that it indeed was hand made.
Fully AI generated code has similar artifacts. You can spot them pretty easily after a bit. Of course it doesn't really matter for the business goals, as long as it works correctly. Just like 99% of people don't care if their clothing was machine made vs. handmade. It's going to be a tiny minority that care about handmade software.
One other helpful frame: I consider LLMs simply to be very flexible high-level 'language' Compilers. We've moved up the Abstraction Chain ever since we invented FORTRAN and COBOL (and LISP) instead of using assembly language.
We're 'simply' moving up the abstraction hierarchy again. Good!
> I didn’t ask for the role of a programmer to be reduced to that of a glorified TSA agent, reviewing code to make sure the AI didn’t smuggle something dangerous into production.
This may be the perspective of some programmers. It doesn't seem to be shared by the majority of software engineers I know and read and listen to.
Do you mean the perspective that he is a "glorified TSA agent" or that he doesn't like it? Because in this thread it seems that some people agree but they just like it :)
I don't mourn coding for itself, since I've always kinda disliked that side of my work (numerical software, largely).
What I do mourn is the reliability. We're in this weird limbo where it's like rolling a die for every piece of work. If it comes up 1-5, I would have been better off implementing it myself. If it comes up 6, it'll get it done orders of magnitude faster than doing it by hand. Since the overall speedup is worthwhile, I have to try it every time, even if most of the time it fails. And of course it's a moving target, so I have to keep trying the things that failed yesterday because today's models are more capable.
But I am still quite annoyed at the slopful nature of the code that is produced when you're not constantly nagging it to do better.
We've RLed it to produce code that works by hook or by crook, putting infinite fallback paths and type casts everywhere rather than checking what the semantics should be.
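To make that concrete, here's a small made-up TypeScript illustration (the `parseUser*` names are hypothetical, not from any real model output): the first version "works" by casting and silently falling back, the second actually checks what the input is supposed to mean.

```typescript
interface User {
  id: string;
  email: string;
}

// "By hook or by crook": casts and silent fallbacks paper over every problem.
function parseUserSloppy(raw: unknown): User {
  const u = raw as any;                  // cast instead of checking
  return {
    id: String(u?.id ?? "unknown"),      // fallback masks missing data
    email: String(u?.email ?? ""),       // empty string instead of an error
  };
}

// Checking the semantics: fail loudly when the input isn't what we expect.
function parseUserChecked(raw: unknown): User {
  if (typeof raw !== "object" || raw === null) {
    throw new Error("expected a user object");
  }
  const record = raw as Record<string, unknown>;
  if (typeof record.id !== "string" || typeof record.email !== "string") {
    throw new Error("user requires string id and email");
  }
  return { id: record.id, email: record.email };
}
```

Both compile and "work", but only the second one tells you when your assumptions about the data are wrong.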
I wonder if this is just a matter of degree. In a few years (or less) you may not have to "skillfully guide" anything. The agents will just coordinate themselves and accomplish your goals after you give some vague instruction. Will you still feel proud? Or maybe a bit later, the agents will come up with their own improvements and just ship them without any input at all. How about then?
> We’ll miss the sleepless wrangling of some odd bug that eventually relents to the debugger at 2 AM.
I'll miss it not because the activity becomes obsolete, but because it's much more interesting than sitting till 2am trying to convince an LLM to find and fix the bug for me.
We'll still be sitting till 2am.
> They can write code better than you or I can, and if you don’t believe me, wait six months.
I've been hearing this for the last two years. And yet, LLMs, given abstract description of the problem, still write worse code than I do.
Or did you mean type code? Because in that case, yes, I'd agree. They type better.
This perspective was mine 6 months ago. And god damn, I do miss the feeling of crafting something truly beautiful in code sometimes. But then, as I've been pushed into this new world we're living in, I've come to realize a couple things:
Nothing I've ever built has lasted more than a few years. Either the company went under, or I left and someone else showed up and rewrote it to suit their ideals. Most of us are doing sand art. The tide comes in and it's gone.
Code in and of itself should never have been the goal. I realized that I was thinking of the things I build and the problems I selected to work on from the angle of code quality nearly always. Code quality is important! But so is solving actual problems with it. I personally realized that I was motivated more by the shape of the code I was writing than the actual problems it was written to solve.
Basically the entire way I think about things has changed now. I'm building systems to build systems. That's really fun. Do I sometimes miss the feeling of looking at a piece of code and feeling a sense of satisfaction at how well made it is? Sure. That era of software is sadly done now. We've exited the craftsman era and entered the Ikea era of software development.
The acceleration of AI has thrown into sharp relief that we have long lumped all sorts of highly distinct practices under this giant umbrella called "coding". I use CC extensively, and yet I still find myself constantly editing by hand. Turns out CC is really bad at writing kubernetes operators. I'd bet it's equally bad at things like database engines or most cutting edge systems design problems. Maybe it will get better at these specific things with time, but it seems like there will always be a cutting edge that requires plenty of human thought to get right. But if you're doing something that's basically already been done thousands of times in slightly different ways, CC will totally do it with 95% reliability. I'm ok with that.
It's also important to step back and realize that it goes way beyond coding. Coding is just the deepest tooth of the jagged frontier. In 3 years there will be blog posts lamenting the "death of law firms" and the "death of telemedicine". Maybe in 10 years it will be the death of everything. We're all in the same boat, and this boat is taking us to a world where everyone is more empowered, not less. But still, there will be that cutting edge in any field that will require real ingenuity to push forward.
I think there's clearly a difference in opinion based on what you work on. Some people were working on things that pre-CC models also couldn't handle and then CC could, and it changed their opinions quickly. I expect (but cannot prove of course) that the same will happen with the area you are describing. And then your opinion may change.
The death of a means to an end is the birth of an end itself.
When cameras became mainstream, realism in painting went out of fashion, but this was liberating in a way as it made room for many other visual art styles like Impressionism. The future of programming/computing is going to be interesting.
I fall into the demographic discussed in the article, but I've approached this with as much pragmatism as I can muster. I view this as a tool to help improve me as a developer. Sure, there will be those of us who do not stay ahead (is that even possible?) of the curve and get swallowed up, but technology has had this effect on many careers in the past. They just change into something different and sometimes better. It's about being willing to change with it.
Like other tech disrupted crafts before this, think furniture making or farming, that's how it goes. From hand-made craft, to mass production factories (last couple of decades) to fully automated production.
The craft was dying long before LLMs. Started in dotcom, ZIRP added some beatings, then LLMs are finishing the job.
This is fine, because like in furniture making, the true craftsmen will be even more valuable (overseeing farm automation, high-end handmade furniture, small organic farms), and the factory-worker masses (ZIRP-enabled tech workers) will move on to more fulfilling work.
That’s not how it goes for the worker. If you are a capitalist then it doesn’t matter, you own the means of production. The laborer, however, has to learn new skills, which take time and money. If your profession no longer exists, unless you have enough capital to retool/be a capitalist, then you will personally get poorer.
Where do people find this optimism? I reckon when the software jobs fall, everything else will follow shortly too. That's just the first target because it's what we know, and the manual stuff is a little harder for now. The "good news" is everyone might be in the same boat, so the system will have to adapt.
There's a commercial building under construction next to my office. I look down on the construction site, and those strapping young men are digging with their big excavators they've been using for years and taking away the dirt with truck and trailer.
Why use a spade? Even those construction workers use the right sized tools. They ain't stupid.
I think OP is coming at this more from an artisan angle. Perhaps there were shoveler artisans who took pride in the angle of their dirt-shoveling. Those people perhaps do lament the advent of excavators. But presumably the population who find code beautiful vs the art of shoveling are of different sizes
I feel like we are long into the twilight of mini blogs and personal sites. It's like people trying to protect automotive jobs; the vast majority were already lost.
I thought I'd miss all the typing and syntax, but I really don't. Everyone has their own relationship with coding, but for me, I get satisfaction out of the end product and putting it in front of someone. To the extent that I cared about the code, it mainly had to do with how much it allowed the end product to shine.
Yes, there's clearly a big split in the community where perhaps ~50% are like OP and the other ~50% are like you. But I think we should still respect the views of the other side and try to empathize.
Coding is an abstraction. Your CPU knows nothing of type safety, bloom filters, dependencies, or code reuse.
Mourning the passing of one form of abstraction for another is understandable, but somewhat akin to bemoaning the passing of punch card programming. Sure, why not.
Your entire brain's model of the world is an abstraction over its sensory inputs. By this logic we might as well say you shouldn't mourn anything since all it means is a minor difference in the sensory inputs your brain receives.
You know who else mourned the loss of craft? People that don't like PHP and Wordpress because they lower the barrier to entry to creating useful stuff while also leaving around a fair amount of cruft and problems that the people that use them don't understand how to manage.
Like iambateman said: for me it was never about code. Code was a means to an end, and it didn't stop at code. I'm the kind of software engineer who learned frontends, systems, databases, ETLs, etc. -- whatever was demanded of me to produce something useful, I learned and did it. We're now calling that a "product engineer". The "craft" for me was in creating useful things that were reliable and efficient, not particularly how I styled lines, braces, and brackets. I still do that in the age of AI.
All of this emotional spillage feels for naught. The industry is changing as it always has. The only constant I've ever experienced in this industry is change. I realized long ago that the day I am no longer comfortable with change is my best signal that this industry is no longer for me.
I think it's a bit different when you can opt out. If you didn't want to use PHP you didn't have to. But it's getting increasingly hard to opt out of AI.
Some code is worth transcribing by hand — an ancient practice in writing, art and music.[0] Some isn't even worth looking at.
I find myself, ironically, spending more time typing out great code by hand now. Maybe some energy previously consumed by tedium has been freed up, or maybe the wacky machines brought a bit of the whimsy back into the process for me.
[0] And in programming, for the readers of Zed Shaw's books :)
I get where this is coming from. But at the same time, AI/LLMs are such an exciting development. As in, "maybe I was wrong and the singularity wasn't bullshit". If nothing else, it's an interesting transition to live through.
It makes me sad to read posts like this. If it is a necessary step for you on the journey from denial to acceptance to embracing the new state of the world, then sure, take your time.
But software engineering is the only industry that is built on the notion of rapid change, constant learning, and bootstrapping ourselves to new levels of abstraction so that we don't repeat ourselves and make each next step even more powerful.
Just yesterday we were pair programming with a talented junior AI developer. Today we are treating them as senior ones and can work with several in parallel. Very soon your job will not be pair programming and peer reviewing at all, but teaching a team of specialized coworkers to work on your project. In a year or two we will be assembling factories of such agents that will handle the process from taking your requirements to delivering and maintaining complex software. Our jobs are going to change many more times and much more often than ever.
And yet there will still be people finding solace in hand-crafting their tools, or finding novel algorithms, or adding the creativity aspect into the work of their digital development teams. Like people lovingly restoring their old cars in their garage just for the sake of the process itself.
> software engineering is the only industry that is built on the notion of rapid change, constant learning, and bootstrapping ourselves to new levels of abstraction
Not sure I agree. I think most programming today looks almost exactly the same as it did 40 years ago. You could even have gotten away with never learning a new language. AI feels like the first time a large percentage of us may be forced to fundamentally change the way we work or change careers.
I'm in my forties and it's game over for my career. The grey in my hair means I never get past the first round. The history on my resume means I'm lucky to get a round at all. GPT and Claude have fundamentally changed how I view work, and frankly, I'm over it.
I’m in consulting now and it’s all the same crap. Enterprises want to “unleash AI” so they can fire people. Maximize profits. My nephews who are just starting their careers are blindly using these tools and accepting the PR if it builds. Not if it’s correct.
I’m in awe of what it can do but I also am not impressed with the quality of how it does it.
I’m fortunate to not have any debt so I can float until the world either wises up or the winds of change push me in a new direction.
I liked the satisfaction of building something “right” that was also “useful”. The current state of Opus and Codex can only pretend to do the latter.
People have to stop talking like LLMs solved programming.
If you're someone with a background in Computer Science, you should know that we have formal languages for a reason, and that natural language is not as precise as a programming language.
But anyway, we're at peak AI hype; hitting the top on HN is worth more than a reasonable take, and reasonableness doesn't sell after all.
So here we're seeing yet another text about how the world of software was solved by AI and being a developer is an artifact of the past.
Right? At least on HN, there's a critical mass of people loudly ignoring this these days, but no one has explained to me how replacing formal language with an english-language-specialized chatbot - or even multiple independent chatbots (aka "an agent") - is good tradeoff to make.
One December a few years ago, pre-ChatGPT, I did Advent of Code in Rust. It was very difficult; I had never done the full month before, barely knew Rust, and kept getting my ass kicked by it. I spent a full Saturday afternoon solving one of the last problems of the month, and it was wonderful. My head hurt and I was reading weird Wikipedia articles and it was a blast. Nothing is stopping me from doing that sort of thing again, and I feel like I might need to, to counteract the stagnation I feel at times mentally when it comes to coding. That spark is still in there, I feel, buried under all the slop, and it would reappear if I gave it the chance, I hope. I have been grieving for the last few years, I think, and only recently have I come to terms with the changes to my identity that LLMs have wrought.
Great post. Super sad state of affairs but we move on and learn new things. Programming was always a tool and now the tool has changed from something that required skill and understanding to complaining to a neural net. Just have to focus on the problem being solved more.
This makes me think about the craftsmen whose careers vanished or transformed through the ages due to industrialization, machines, etc. They did not have online voices to write thousands of blog posts every day. Nor did they have people who could read their woes online.
I absolutely disagree with this. All the things the author said will still exist and keep on existing.
Nothing will prevent you from typing “JavaScript with your hands”, from “holding code in our hands and molding it like clay…”, and all the other metaphors. You can still do all of it.
What certainly will change is the way professional code will be produced, and together with that, the avenue of having a very well-paid remuneration, to write software line-by-line.
I’ll not pretend that I don’t get the point, but it feels like the lamentation of a baker, tailor, shoemaker, or smith, missing the days of old.
And yet, most people prefer a world with affordable bread, clothes, footwear, and consumer goods.
Will the world benefit the most from "affordable" software? Maybe yes, maybe not; there are many arguments on both sides. I am more concerned about the impact on the winners and losers: the rich will get more rich and powerful, while the losers will become even more destitute.
Yet, my final point would be: is it better or worse to live in a world in which software is more affordable and accessible?
> All the things the author said will still exist and keep on existing.
Except the community of people who, for whatever reason, had to throw themselves into it and had critical mass to both distribute and benefit from the passion of it. This has already been eroded by the tech industry coopting programming in general and is only going to diminish.
The people who discovered something because they were forced to do some hard work and then ran with it are going to be steered away from that direction by many.
I don’t think it’s that simple. A couple of examples:
Food:
A lot of the processed foods that are easily available make us unhealthy and sick. Even vegetables are less nutritious than they were 50 years ago. Mass agriculture also has many environmental externalities.
Consumer goods:
It has become difficult to find things like reliable appliances. I bought a chest freezer. It broke after a year. The repairman said it would cost more to fix than to buy a new one. I asked him if there was a more reliable model and he said no: they all break quickly.
Clothing:
Fast fashion is terrible for the environment. Do we need as many clothes as we have? How quickly do they end up in landfills?
Would we be better off as a society repairing shoes instead of buying new ones every year?
This. People are way too easily impressed. I don't think this easily-impressedness will generalize to most people in the real world.
If you really buy all that you'd be part of the investor class that crashed various video game companies upon seeing Google put together a rather lame visual stunt and have their AI say, and I quote because the above-the-fold AI response I never asked for has never been more appropriate to consult…
"The landscape of AI video game generation is experiencing a rapid evolution in 2025-2026, shifting from AI-assisted asset creation to the generation of entire interactive, playable 3D environments from text or image prompts. Leading initiatives like Google DeepMind's Project Genie and Microsoft's Muse are pioneering "world models" that can create, simulate physics, and render games in real-time."
And then you look at what it actually is.
Suuuure you will, unwanted AI google search first response. Suuure you will.
2. The tools still need a lot of direction. I still fight Claude with Opus to do basic things, and the best experiences are when I provide very specific prompts.
3. Being idealistic in a capitalist system where you have to pay your bills every month is something I could do when my parents paid my bills.
These apocalyptic posts about how everything is shit really don't match my reality at all. I use these tools every day to be more productive and improve my code, but they are nowhere close to doing my actual job, which is figuring out WHAT to do. How to do it is mostly irrelevant, as once I get to that point I already know what needs to be done, and it doesn't matter if it is me or Opus producing the code.
I'll believe it when I start seeing examples of good and useful software being created with LLMs or some increase in software quality. So far it's just AI doom posting, hype bloggers that haven't shipped anything, anecdotes without evidence, increase in CVEs, increase in outages, and degraded software quality.
It would be helpful if you could define “useful” in this context.
I’ve built a number of team-specific tools with LLM agents over the past year that save each of us tens of hours a month.
They don’t scale beyond me and my six coworkers, and were never designed to, but they solve challenges we’d previously worked through manually and allow us to focus on more important tasks.
The code may be non-optimal and won’t become the base of a new startup. I’m fine with that.
It’s also worth noting that your evidence list (increased CVEs, outages, degraded quality) is exclusively about what happens when LLMs are dropped into existing development workflows. That’s a real concern, but it’s a different conversation from whether LLMs create useful software.
My tools weren’t degraded versions of something an engineer would have built better. They’re net-new capability that was never going to get engineering resources in the first place. The counterfactual in my case isn’t “worse software”—it’s “no software.”
Well, on the surface it may seem like there’s nothing being created of value, but I can assure you that companies from seed stage to unicorns are heavily using Claude Code, Cursor, and the like to produce software. At this point, most software you touch has been modified and enhanced with the use of LLMs. The difference in pace of shipping with and without AI assistance is staggering.
Some people say that working with an agent or an agent orchestrator is like being a technical lead. But I've been a technical lead for quite a while, and the experience of working with an agent doesn't even come close. I think that when people talk about the agents' coding abilities they're talking about the average ability. But as a team lead, I don't care about average ability. I care only about the worst case. If I have any doubt that someone might not complete a task, or at least accurately explain why it's proving difficult, with at least 95% certainty, I won't assign them the task. If I have any doubt that the code they produce might not be up to snuff, I don't assign them the task. I don't need to review their code; they review each other's. When I have to review code I'm no longer a team lead but a programmer.
I often have one programming project I do myself, on the side, and recently I've been using coding agents. Their average ability is no doubt impressive for what they are. But they also make mistakes that not even a recent CS graduate with no experience would ever make (e.g. I asked the agent for its guess as to why a test is failing; it suggested it might be due to a race condition with an operation that is started after the failing assertion). As a lead, if someone on the team is capable of making such a mistake even once, then that person can't really code, regardless of their average performance (just as someone who sometimes lands a plane in the wrong airport, or even crashes without there being a catastrophic condition outside their control, can't really fly regardless of their average performance). I wish the agent could work like a team of programmers and I would be doing my familiar role of a project lead, but it doesn't.
The models do some things well. I believe that programming is an interesting mix of inductive and deductive thinking (https://pron.github.io/posts/people-dont-write-programs), and the models have the inductive part down. They can certainly understand what a codebase does faster than I can. But their deductive reasoning, especially when it comes to the details, is severely lacking (e.g. I asked the agent to document my code. It very quickly grasped the design and even inferred some important invariants, but when it saw an `assert` in one subroutine it documented it as guarding a certain invariant. The intended invariant was correct, it just wasn't the one the assertion was guarding). So I still (have to) work as a programmer when working with coding assistants, even if in a different way.
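To make the assert example concrete, here's a contrived Python sketch of the kind of mix-up I mean (the function and all names are invented purely for illustration): the assert genuinely holds, but it guards a narrower condition than the design-level invariant a reader might attribute to it.

```python
def transfer(accounts: dict[str, int], src: str, dst: str, amount: int) -> None:
    """Move `amount` from src to dst."""
    # This assert only guards non-negativity of the requested amount.
    # A plausible-sounding but wrong description would be "ensures the
    # source balance never goes negative" -- that invariant is real in
    # the overall design, but it is enforced by the balance check below,
    # not by this assert.
    assert amount >= 0, "amount must be non-negative"
    if accounts[src] < amount:
        raise ValueError("insufficient funds")
    accounts[src] -= amount
    accounts[dst] += amount
```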
I've read about great successes at using coding agents in "serious" software, but what's common to those cases is that the people using the agents (Mitchell Hashimoto, antirez) are experts in the respective codebase. I don't know what the future will bring, but at the moment, the craft isn't dead. When AI can really program, i.e. the experience is really like that of a team lead, I don't think that the death of programming would concern us, because once they get to that point, the agents will also likely be able to replace the team lead. And middle management. And the CTO, the CFO, and the CEO, and not just at software companies.
> If I have any doubt that someone might not complete a task, or at least accurately explain why it's proving difficult, with at least 95% certainty, I won't assign them the task
It gets hard to compare AI to humans. You can ask the AI to do things you would never ask a human to do, like retry 1000 times until it works, or assign 20 agents to the same problem with slightly different prompts. Or re-do the entire thing with different aesthetics.
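As a rough sketch of what that kind of brute-force orchestration looks like (purely illustrative; `run_agent` and `passes_tests` are hypothetical stand-ins for whatever agent harness and test suite you actually use):

```python
import concurrent.futures

def run_agent(prompt: str) -> str:
    """Hypothetical call into an agent harness; returns a candidate patch."""
    raise NotImplementedError

def passes_tests(patch: str) -> bool:
    """Hypothetical check: apply the patch and run the test suite."""
    raise NotImplementedError

def solve(task: str, attempts: int = 20) -> str | None:
    # Fan the same task out to many agents with slightly different prompts,
    # and keep the first candidate that actually passes the tests.
    variants = [f"{task}\n\nVariation {i}: try a different approach." for i in range(attempts)]
    with concurrent.futures.ThreadPoolExecutor(max_workers=attempts) as pool:
        for patch in pool.map(run_agent, variants):
            if passes_tests(patch):
                return patch
    return None  # not something you would ever ask a human team to redo 20 times
```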
No doubt, I'm just saying that working with a coding agent is not even remotely similar to being a team lead. If a member of your team can't complete a task and can't accurately explain what the difficulty is, you're in trouble.
But you have to admit it loses a certain shine in the cases where you know that what you're doing is no longer solving a problem that could be solved simpler and cheaper another way.
If you want to build a house you still need plans. Would you rather cut boards by hand, or have a power saw? Would you rather pound nails by hand, pilot-hole with a bit and brace and drive flat-head screws... or would you want a nail gun and an impact driver?
And you still need plans.
Can you write a plan for a sturdy house, and verify that it meets the plan, that your nails went all the way in and in the right places?
You sure can.
Your product person, your directors, your clients might be able to do the same thing; it might look like a house, but it's a fire hazard, or in the case of most LLM-generated code, a security one.
The problem is that we moved to scrum and agile, where your requirements are pantomime and post-it notes if you're lucky, interpretive dance if you aren't. Your job is figuring out how to turn that into something... and a big part of what YOU as an engineer do is tell other people "no, that's dumb" without hurting their feelings.
IF AI coding is going to be successful then some things need to change: requirements need to make a comeback. GOOD UI needs to make a comeback (your dark pattern around cancellation is now going to be at odds with an agent). Your hide-the-content-behind-a-login-or-paywall trick won't work any more because, again, end users have access too... the open web is back, and by force. If a person can get in, we have code that can get in now.
There is a LOT of work that needs to get done, more than ever. Stop looking back and start looking forward, because once you get past the hate and the hype there is a ton of potential to right some of the ills of the last 20 years of tech.
LLMs have made a lot of coding challenges less painful: navigating terrible documentation, Copilot detecting typos, setting up boilerplate frontend components, high-effort but technically unchallenging code completions. Whenever I attempted to use LLMs for tools I'm not familiar with, I found them useful for setting things up, but I felt like I still had to do good old-fashioned learning of the tool and apply developer knowledge to it. I wonder if senior developers could use LLMs in ways that work with them and not against them, i.e., create useful code that has guardrails to avoid slop.
Ephemeralization: the ability thanks to technological advancement to do "more and more with less and less until eventually you can do everything with nothing." —Buckminster Fuller
Dunno, LLMs writing code still feels like they memorized a bunch of open source code and vomited it back out in worse condition.
It's not that impressive that Claude wrote a C compiler when GitHub has the code to a bunch of C compilers (some SOTA) just sitting there.
I'm using an LLM to write a compiler in my spare time (for fun) for a "new" language. It feels more like a magical search engine than a coding assistant. It's great for bouncing ideas off, for searching the internet without the clutter of SEO-optimized sites and ads; it's definitely been useful, just not that useful for code.
Like, I have used some generated code in a very low stakes project (my own Quickshell components) and while it kind of worked, eventually I refactored it myself into 1/3 of the lines it produced and had to squash some bugs.
It's probably good enough for the people who were gluing React components together but it still isn't on the level where I'd put any code it produces into production anywhere I care about.
That is my experience from a year ago but I no longer feel that way. I write a few instructions, guide an agent to create a plan, and rarely touch the code myself. If I don’t like something, I ask the agent to fix it.
I'm surprised so many people are only waking up to this now. It should have been obvious as soon as ChatGPT came out that even with only incremental improvements, LLMs would kill programming as we knew it. And the fact that these utterances, however performative, from developers expressing grief or existential despair have become commonplace tells me as much about the power of these systems as whatever demo Anthropic or OpenAI has cooked up.
I would also point out that the author, and many AI enthusiasts, still make certain optimistic assumptions about the future role of "developer," insisting that the nature of the work will change, but that it will somehow, in large measure, remain. I doubt that. I could easily envision a future where the bulk of software development becomes something akin to googling--just typing the keywords you think are relevant until the black box gives you what you want. And we don't pay people to google, or at least, we don't pay them very much.
Speak for yourself. I don't miss writing code at all. Agentic engineering is much more fun.
And this surprises me, because I used to love writing code. Back in my early days I can remember thinking "I can't believe I get paid for this". But now that I'm here I have no desire to go back.
I had that same epiphany when I discovered AI is great at writing complicated shell command lines for me. I had a bit of an identity crisis right there because I thought I was an aspiring Unixhead neckbeard but in truth I hated the process. Especially the scavenger hunt of finding stuff in man pages.
Speak for yourself. If you find the agentic workflow to be more fun, more power to you.
I for one think writing code is the rewarding part. You get to think through a problem and figure out why decision A is better than B. Learning about various domains and solving difficult problems is in itself a reward.
Same here. I'm a decade-plus in this field; writing code was by far number one, and the discussion surrounding system design was a distant second. Take away the coding and I don't think I will make it to retirement being a code/LLM PR auditor for work. So I'm already planning on exiting the field in the next decade.
>You get to think through a problem and figure out why decision A is better than B. Learning about various domains and solving difficult problems is in itself a reward.
So just tell the LLM about what you're thinking about.
Why do you need to type out a for loop for the millionth time?
I'm that 40 year old now. Been writing code since grade 5. Loved it so much I got a PhD, was an academic, then moved into industry.
I don't mourn or miss anything. No more than the previous generation mourned going from assembly to high-level languages.
The reason why programming is so amazing is getting things done. Seeing my ideas have impact.
What's happening is that I'm getting much much faster and better at writing code. And my hands feel better because I don't type the code in anymore.
Things that were a huge pain before are nothing now.
I didn't need to stay up at night writing code. I can think. Plan. Execute at a scale that was impossible before. Alone I'm already delivering things that were on the roadmap for engineering months worth of effort.
I can think about abstractions, architecture, math, organizational constraints, product. Not about what some lame compiler thinks about my code.
And if someone that's far junior to me can do my job. Good. Then we've empowered them and I've fallen behind. But that's not at all the case. The principals and faculty who are on the ball are astronomically more productive than juniors.
> They can write code better than you or I can, and if you don’t believe me, wait six months.
No, they cannot. And an AI bro squeezing every talking point into a think piece while pretending to have empathy doesn't change that. You just want an exit, and you want it fast.
I wonder whether, in the end, it was simply poor accessibility that made programmers special, and whether that is what some of them are missing: being special by "talking" a special language their customers can't comprehend.
Sure, they are still needed for debugging and for sneering at all those juniors and non-programmers who will finally be able to materialise their fantasies, but there is no way back anymore, and like riding horses, you can still do it while owning a car.
It definitely sucks, to be honest, and there's a lot of cope out there.
Fact of the matter is, being able to churn out bash one-liners was objectively worth $100k/year, and now it just isn't anymore. Knowing the C++ STL inside-out was also worth $200k/year; now it has very questionable utility.
A lot of livelihoods are getting shaken up as programmers get retroactively turned into the equivalent of librarians, whose job is to mechanically index and fetch cognitive assets to and from a digital archive-brain.
Yeah, I notice a lot of the optimism is from people who have been in the field for decades. I'm newish to the field, half a decade out of undergrad. It definitely feels like almost all of what I learned has been (or will soon be) completely devalued. I'm sure this stuff feels a lot less threatening if you've had decades to earn a great salary and save a bunch of money. If money wasn't a concern I'd be thrilled about it too.
I mean go ahead and cry if you want. You are losing time best spent caring about stuff, and overlooking many alarming gotchas through blindly accepting SV hype. I'd have thought crypto would teach people something, but apparently not.
Do what isn't replaceable. You're being told literally everything is replaceable. Note who's telling you that and follow the money.
I feel bad for this essayist, but can't really spare more than a moment to care about his grief. I got stuff to do, and I am up and doing. If he was in any way competing with the stuff I do? One less adversary.
I would rather bring him into community and enjoy us all creating together… but he's acting against those interests and he's doomering and I have no more time for that.
I do not mourn.
For my whole life I’ve been trying to make things—beautiful elegant things.
When I was a child, I found a cracked version of Photoshop and made images which seemed like magic.
When I was in college, I learned to make websites through careful, painstaking effort.
When I was a young professional, I used those skills and others to make websites for hospitals and summer camps and conferences.
Then I learned software development and practiced the slow, methodical process of writing and debugging software.
Now, I get to make beautiful things by speaking, guiding, and directing a system which is capable of handling the drudgery while I think about how to make the system wonderful and functional and beautiful.
It was, for me, never about the code. It was always about making something useful for myself and others. And that has never been easier.
I like coding, I really do. But like you, I like building things more than I like the way I build them. I do not find myself miss writing code by hand as much.
I do find it that the developers that focused on "build the right things" mourn less than those who focused on "build things right".
But I do worry. The main question is this - will there be a day that AI will know what are "the right things to build" and have the "agency" (or illusion of) to do it better than an AI+human (assuming AI will get faster to the "build things right" phase, which is not there yet)
My main hope is this - AI can beat a human in chess for a while now, we still play chess, people earn money from playing chess, teaching chess, chess players are still celebrated, youtube influencers still get monetized for analyzing games of celebrity chess players, even though the top human chess player will likely lose to a stockfish engine running on my iPhone. So maybe there is hope.
> I do find it that the developers that focused on "build the right things" mourn less than those who focused on "build things right".
I've always been strongly in the first category, but... the issue is that 10x more people will be able to build the right things. And if I build the right thing, it will be easy to copy. The market will get crowded, so distribution will become even harder than it is today. Success will be determined by personal brand, social media presence, social connections.
>It was, for me, never about the code.
Then it wasn't your craft.
Yeah, seems like too many went into this field for money or status not because they like the process. Which is not an issue by itself, but now these people talk about how their AI assistant of choice made them some custom tool in two hours that would have taken them three weeks. And it's getting exhausting.
I went into this field because I love programming. I didn't even know how well these jobs paid until my junior year of college. I constantly programmed and read programming texts in my spare time growing up, in college, and after work.
I love AI tools. I can have AI do the boring parts. I can even have to write polished, usable apps in languages that I don't know.
I miss being able to think so much about architecture, best practices, frameworks/languages, how to improve, etc.
Isn't this like saying that if better woodworking tools come out, and you like woodworking, that woodworking somehow 'isn't your craft'. They said that their craft is about making things.
There are woodworkers on YouTube who use CNC, some who use the best Festool stuff but nothing that moves on its own, and some who only use handtools. Where is the line at which woodworking is not their craft?
The better analogy is you're now a shop manager or even just QA. You don't need to touch, look at, or think about the production process past asking for something and seeing if the final result fits the bill.
You get something that looks like a cabinet because you asked for a cabinet. I don't consider that "woodworking craft", power tools or otherwise.
Woodworking is, like, the quintessential craft. I think it is very useful to bring it in when discussing "craft"!
I am not myself a woodworker, however I have understood that part of what makes it "crafty" is that the woodworker reads grain, adjusts cuts, and accepts that each board is different.
We can try to contrast that to whatever Ikea does with wood and mass production of furniture. I would bet that variation in materials is "noise" that the mass production process is made to "reject" (be insensitive to / be robust to).
But could we imagine an automated woodworking system that takes into account material variation, like wood grain, not in an aggregate sense (like I'm painting Ikea to do), but in an individual sense? That system would be making judgements that are woodworker-like.
The craft lives on. The system is informed by the judgement of the woodworker, and the craftsperson enters an apprenticeship role for the automation... perhaps...
Until you can do RL on the outcome of the furniture. But you still need craft in designing the reward function.
Perhaps.
The only confusion is in the use of the term "woodworking".
For the power tool user, "woodworking with hand tools" isn't their craft.
For the CNC user, "woodworking with manual machines" isn't their craft.
It's feeling much closer to hiring a woodworker to make you something, not woodworking tools
It is a different kind of code. Just a lot of programmers can't grok it as such.
I guess I started out as a programmer, then went to grad school and learned how to write and communicate my ideas, it has a lot in common with programming, but at a deeper level. Now I’m doing both with AI and it’s a lot of fun. It is just programming at a higher level.
That's just gatekeeping.
It was and is my craft. I've been doing it since grade 5. Like 30 years now.
Writing tight assembly for robot controllers all the way to AI on MRI machines to security for the DoD and now the biggest AI on the planet.
But my craft was not typing. It's coding.
If you're a typist, you're going to mourn the printer. But if you're a writer, you're going to see how it improves your life.
Right, there is a non-zero overlap between the Vim Andys and the AI naysayers.
No true programmer is excited for the future.
And no true scotsman puts sugar in his porridge
Yes, that was the reference!
Possibly too obscure. I can't tell whether I'm being downvoted by optimists who missed the joke, or by pessimists who got it.
This is so true.
Never once in my life have I seen anything get better. Except for Metal Gear Solid on PSX and Gears of War.
As someone who started with Borland DOS-era IDEs I can tell you that IDEs did get a lot better over the years. I'm still fascinated every day by JetBrains IDEs.
> https://www.amazon.co.uk/WD_BLACK-SN850X-2280-Gaming-speed/d...
Have you used them recently?
Terrible, is the word I would use.
I've seen a hundred ai-generated things, and they are rarely interesting.
Not because the tools are insufficient, it's just that the kind of person that can't even stomach the charmed life of being a programmer will rarely be able to stomach the dull and hard work of actually being creative.
Why should someone be interested in your creations? In what part of your new frictionless life would you have picked up something that sets you apart from a million other vibe-coders?
> stomach the dull and hard work of actually being creative
This strikes me as the opposite of what I experience: when I say I'm "feeling creative", everything comes easy. At least in the context of programming, making music, doing 3D animation and some other topics. If it's "dull and hard work", it's because I'm not feeling "creative" at all; when "creative mode" is on in my brain, nothing feels dull or hard. Maybe it works differently for others.
What sets you apart from millions of manual programmers?
I've been a professional programmer for 8+ years now. I've stomached that life. I've made things people used and paid for.
If I can do that typing one line at a time, I can do it _way_ faster with AI.
You may be mistaking some AI-assisted dev work for non-AI work, because it doesn't have telltale signs.
I love building things too, but for me, the journey is a big part of what brings me joy. Herding an LLM doesn't give me joy like writing code does.
Well said. This sums up my own feeling. I joined this craft and love this craft for the simple ability to build beautiful and useful things.
This new world makes me more effective at it.
And this new world doesn’t prevent me from crafting elegant architectures either.
Wait 5 years and your skills are down
What infrastructure has gone through over the last 15 years would like a word.
Half the people I work with can't do imperative jQuery interfaces. So what, I guess. I can't code assembly.
A programming language is still an additional language with all the benefits of being multilingual.
AI will kill that.
I don't think 5 years is necessary. I think after two years of this agentic orchestration, if you rarely touch code yourself, your skills will degrade to the point where you won't be able to write anything non-trivial without assistance.
Depends how long you've done it, and how much the landscape has changed since then. I can still hop back into SQL and it all comes back to me though I haven't done it regularly at all for nearly 10 years.
In the web front-end world I'd be pretty much a newbie. I don't know any of the modern frameworks, everything I've used is legacy and obsolete today. I'd ramp up quicker than a new junior because I understand all the concepts of HTTP and how the web works, but I don't know any of the modern tooling.
How much do you think Linus Torvalds has coded over the last decade? Why is he still able to do his job?
Adam Neely has a video on GenAI and its impact on the music industry. There is a section in the video about beauty and taste, and it's pretty different from your conclusions. One example I remember: would an AI find beauty in a record scratch sound?
https://youtu.be/U8dcFhF0Dlk
I want to be in your camp, and am trying hard. But the OP's blog entry should at least give us a moment to "respect the dead". That's all he's asking, I think.
> For my whole life I’ve been trying to make things—beautiful elegant things.
Me too, but... The ability to code was a filter. With AI, the pool of people who can build beautiful elegant software products expands significantly. Good for the society, bad for me.
AI agents seem to be a powerful shortcut to the drudgery. But let's not forget, that powerful software rests on substance. My hope is the substance will increase, after all.
So when you "learned software development and practiced the slow, methodical process of writing and debugging software", it wasn't about code? I don't get it. Yes, building useful things is the ultimate goal, but code is the medium through which you do it, and I don't understand how that cannot be an important part of the process.
It's like a woodworker saying, "Even though I built all those tables using precise craft and practice, it was NEVER ABOUT THE CRAFT OR PRACTICE! It was about building useful things." Or a surgeon talking about saving lives and doing brain surgery, but "it was never about learning surgery, it was about making people get better!"
I mean sure yeah but also not really.
Not the GP, but I feel some of that energy. The parts I most enjoy are the interfaces, the abstractions, the state machines, the definitions. The code I enjoy too, and I would be sad to lose all contact with it, but I've really appreciated AI especially for helping me get over the initial hump on things like:
- infrastructure bs, like scaffold me a JS GitHub action that does x and y.
- porting, like take these kernel patches and adjust them from 6.14 to 6.17.
- tools stuff, like here's a workplace shell script that fetches a bunch of tokens for different services, rewrite this from bash to Python.
- fiddly things like dealing with systemd or kubernetes or ansible
- fault analysis, like here's a massive syslog dump or build failure, what's the "real" issue here?
In all these cases I'm very capable of assessing, tweaking, and owning the end result, but having the bot help me with a first draft saves a bunch of drudgery on the front end, which can be especially valuable for the ADHD types where that kind of thing can be a real barrier to getting off the ground.
I couldn't agree more.
In my opinion, the resulting beauty is proportional to the level of detailed care. Can you get the same level without getting your hands dirty? Sure, maybe, but I doubt a painter or novelist could really produce beautiful work without being intimately familiar with that work. The distance that heavy use of AI tools creates between you and the output does not really lend itself to beauty. Could you do it? Sure, but at that point it's probably more efficient to just do things yourself and have complete, intimate control.
To me, you sound more utilitarian. The philosophy you are presenting is a kind of Ikea philosophy. Utility, mass production, and unique beauty are generally properties that do not cohere together, and there's a reason for this. I think the use of LLMs in the production of digital goods is very close to the use of automated assembly lines in the production of physical goods. No matter how you try, some of the human charm, and thus the beauty, will inevitably be lost; the number of goods will increase, but they'll all be barely differentiable, soulless replications of more or less the same shallow ideas repeated ad infinitum.
But you don’t make.
You order it.
Right.
It's like learning to cook and regularly making your own meals, then shifting to a "new paradigm" of hiring a personal chef to cook for you. Food's getting made either way, but it's not really the same deal.
No, it's more like moving from line cook, to head chef in charge of 30 cooks.
Food's getting made, but you focus on the truly creative part -- the menu, the concept, the customer experience. You're not boiling pasta or cutting chives for the thousandth time. The same way now you're focusing on architecture and design now instead of writing your 10,000th list comprehension.
Except the cooks don't exist anymore as they all have become head chefs (or changed careers) and the food is being cooked by magical cooking black boxes
Sure, but the point is you're now doing the most creative and satisfying part. Not the drudgery.
It's not that you've stopped doing anything at all, like the other commenter claimed in their personal chef analogy.
Because such people are not being honest, either with themselves about who they are or with others. It's really hard for me to take seriously phrases like "I joined this industry to make things, not to write code".
Do painters paint because they just like to see the final picture? Or do they like the process? Yes, painting is an artistic process, not exactly a crafting one. But the point stands.
Woodworkers making nice custom furniture generally enjoy the process.
The transition is from author to editor/publisher. Both play an important role in bringing something new into the world.
It's true, but ask an author and 99% of them will say they don't want to be an editor.
Writing code is my favorite activity. I hate these takes: you never liked writing code, who cares. This probably just translates to "you sucked at it." Get out of our trade, please.
I started programming over 40 years ago because it felt like computers were magic. They feel more magic today than ever before. We're literally living in the 1980s fantasy where you could talk to your computer and it had a personality. I can't believe it's actually happening, and I've never had more fun computing.
I can't empathize with the complaint that we've "lost something" at all. We're on the precipice of something incredible. That's not to say there aren't downsides (WOPR almost killed everyone after all), but we're definitely in a golden age of computing.
> golden age of computing
I feel like we've reached the worst age of computing. Where our platforms are controlled by power hungry megacorporations and our software is over-engineered garbage.
The same company that develops our browsers and our web standards is also actively destroying the internet with AI scrapers. Hobbyists lost the internet to companies and all software got worse for it.
Our most popular desktop operating system doesn't even have an easy way to package and update software for it.
Dystopian cyberpunk was always part of the fantasy. Yes, scale has enabled terrible things.
There are more alternatives than ever though. People are still making C64 games today, cheap chips are everywhere. Documentation is abundant... When you layer in AI, it takes away labor costs, meaning that you don't need to make economically viable things, you can make fun things.
I have at least a dozen projects going now that I would have never had time or energy for. Any itch, no matter how geeky and idiosyncratic, is getting scratched by AI.
It’s never been easier for you to make a competitor
So what is stopping you other than yourself?
I’m not the OP, but my answer is that there’s a big difference between building products and building businesses.
I’ve been programming since 1998 when I was in elementary school. I have the technical skills to write almost anything I want, from productivity applications to operating systems and compilers. The vast availability of free, open source software tools helps a lot, and despite this year’s RAM and SSD prices, hardware is far more capable today at comparatively lower prices than a decade ago and especially when I started programming in 1998. My desktop computer is more capable than Google’s original cluster from 1998.
However, building businesses that can compete against Big Tech is an entirely different matter. Competing against Big Tech means fighting moats, network effects, and intellectual property laws. I can build an awesome mobile app, but when it’s time for me to distribute it, I have to deal with app stores unless I build for a niche platform.
Yes, I agree that it’s never been easier to build competing products due to the tools we have today. However, Big Tech is even bigger today than it was in the past.
Billions of dollars?
You don't need billions of dollars to write an app. You need billions of dollars to create an independent platform that doesn't give the incumbent a veto over your app if you're trying to compete with them. And that's the problem.
In some ways, I'd say we're in a software dark age. In 40 years, we'll still have C, bash, grep, and Mario ROMs, but practically none of the software written today will still be around. That's by design. SaaS is a rent seeking business model. But I think it also applies to most code written in JS, Python, C#, Go, Rust, etc. There are too many dependencies. There's no way you'll be able to take a repo from 2026 and spin it up in 2050 without major work.
One question is how will AI factor in to this. Will it completely remove the problem? Will local models be capable of finding or fixing every dependency in your 20yo project? Or will they exacerbate things by writing terrible code with black hole dependency trees? We're gonna find out.
Glad to see this already expressed here because I wholly agree. Programming has not brought me this much joy in decades. What a wonderful time to be alive.
We have what I've dreamed of for years: the reverse dictionary.
Put in a word and see what it means? That's been easy for at least a century. Have a meaning in mind and get the word? The only way to get this before was to read a ton of books and be knowledgeable, or talk to someone who was. Now it's always available.
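A minimal sketch of that reverse-dictionary use, assuming an OpenAI-style Python client (the model name and prompt wording are just placeholders):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def reverse_lookup(meaning: str) -> str:
    # Describe the meaning, get candidate words back.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "You are a reverse dictionary. "
             "Given a description of a meaning, reply with a few candidate words."},
            {"role": "user", "content": meaning},
        ],
    )
    return response.choices[0].message.content

print(reverse_lookup("the pleasant smell of rain on dry ground"))
```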
> Now it's always available.
And often incorrect! (and occasionally refuses to answer)
Sure, but it's easy to check if it's incorrect and try again.
Forgive me if "just dig your way out of the hole" doesn't sound appealing.
Surely you, a programmer, can imagine a way to automate this process
It is! But you can then verify it via a correct, conventional forward dictionary.
The scary applications are the ones where it's not so easy to check correctness...
Right. Except the dictionary analogy only goes so far and we reach the true problem.
This is a great description of how I use Claude.
The "reverse dictionary" is called a "thesaurus". Wikipedia quotes Peter Mark Roget (1852):
> ...to find the word, or words, by which [an] idea may be most fitly and aptly expressed
I briefly investigated LLMs for this purpose, back when I didn't know how to use a thesaurus; but I find thesauruses a lot more useful. (Actually, I'm usually too lazy to crack out a proper thesaurus, so I spend 5 seconds poking around Wiktionary first: that's usually Good Enough™ to find me an answer, when I find an answer I can trust it, and I get the answer faster than waiting for an LLM to finish generating a response.)
There's definitely room to improve upon the traditional "big book of synonyms with double-indirect pointers" thesaurus, but LLMs are an extremely crude solution that I don't think actually is an improvement.
A thesaurus is not a reverse dictionary
Really?
"What's a word that means admitting a large number of uses?"
That seems hard to find in a thesaurus without either versatile or multifarious as a starting point (but those are the end points).
"Admitting a large number of uses" -> "very useful" -> https://en.wiktionary.org/wiki/useful -> dead end. Give up, use a thesaurus.
https://www.wordhippo.com/what-is/another-word-for/very_usef..., sense 2 "Usable in multiple ways", lists:
> useful multipurpose versatile flexible multifunction adaptable all-around all-purpose all-round multiuse multifaceted extremely useful one-size-fits-all universal protean general general-purpose […]
Taking advantage of the fact my passive vocabulary is greater than my active vocabulary: no, no, yes. (I've spuriously rejected "multipurpose" – a decent synonym of "versatile [tool]" – but that doesn't matter.) I'm pretty sure WordHippo is machine-generated from some corpus, and a lot of these words don't mean "very useful", but they're good at playing the SEO game, and I'm lazy. Once we have versatile, we can put that into an actual thesaurus: https://dictionary.cambridge.org/thesaurus/versatile. But none of those really have the same sense as "versatile" in the context I'm thinking of (except perhaps "adaptable"), so if I were writing something, I'd go with "versatile".
Total time taken: 15 seconds. And I'm confident that the answer is correct.
By the way, I'm not finding "multifarious" anywhere. It's not a word I'm familiar with, but that doesn't actually seem to be a proper synonym (according to Wiktionary, at least: https://en.wiktionary.org/wiki/Thesaurus:heterogeneous). There are certainly contexts where you could use this word in place of "versatile" (e.g. "versatile skill-set" → "multifarious skill-set"), but I criticise WordHippo for far less dubious synonym suggestions.
> We're on the precipice of something incredible.
Total dependence on a service?
Same.
I was born in 84 and have been doing software since 97
there’s never been an easier, better, or more accessible time to make literally anything - by far.
Also if you prefer to code by hand literally nobody is stopping you AND even that is easier.
Cause if you wanted to code for console games in the 90s, you literally couldn’t without a $100k specialized dev machine.
It’s not even close.
This “I’m a victim because my software engineering hobby isn’t profitable anymore” take is honestly baffling.
I'm not going to code by hand if it's 4x slower than having Claude do it. Yes, I can do that, but it just feels bad.
The analogy I like is it's like driving vs. walking. We were healthier when we walked everywhere, but it's very hard to quit driving and go back even if it's going to be better for you.
I walk all the time
During the summer I’ll walk 30-50 miles a week
However I’m not going to walk to work, ever, and I’m damn sure not going to walk in the rain or snow if I can avoid it.
I retired a few years ago, so I have no idea what AI programming is.
But I mourned when CRTs came out; I had just started programming. I quickly learned CRTs were far better.
I mourned when we moved to GUIs, I never liked the move and still do not like dealing with GUIs, but I got used to it.
Went through all kinds of programming methods, too many to remember, but those were easy to ignore and workaround. I view this new AI thing in a similar way. I expect it will blow over and a new bright shiny programming methodology will become a thing to stress over. In the long run, I doubt anything will really change.
I think you're underestimating what AI can do in the coding space. It is an extreme paradigm shift. It's not like "we wrote C, but now we switch to C++, so now we think in objects and templates". It's closer to the shift from assembly to a higher level language. Your goal is still the same. But suddenly you're working in a completely newer level of abstraction where a lot of the manual work that used to be your main concern is suddenly automated away.
If you've never tried Claude Code, give it a try. It's very easy to get into. And you'll soon see how powerful it is.
OT but I see your account was created in 2015, so I'm assuming very late in your career. Curious what brought you to HN at that time and not before?
I agree with you with the caveat that all the "ease of building" benefits, for me, could potentially be dwarfed by job losses and pay decreases. If SWE really becomes obsolete, or even if the number of roles decrease a lot and/or the pay decreases a lot (or even fails to increase with inflation), I am suddenly in the unenviable position of not being financially secure and being stuck in my 30s with an increasingly useless degree. A life disaster, in other words. In that scenario the unhappiness of worrying about money and retraining far outweighs the happiness I get from being able to build stuff really fast.
Fundamentally this is the only point I really have on the 'anti-AI' side, but it's a really important one.
Anybody who says this kind of thing, I assume you weren't very good at programming, you ultimately didn't like doing it, and so you probably climbed the business ladder. I can't say for sure, but I'd bet you're upper management and therefore this is magic to you; it's a threat to all of us. You probably despised writing code because you weren't good at it, so you welcome these tools to patch that insecurity of yours.
I didn't imagine I would be sending all my source code directly to a corporation for access to an irritatingly chipper personality that is confidently incorrect the way these things are.
There have been wild technological developments but we've lost privacy and autonomy across basically all devices (excepting the people who deliberately choose to forego the most capable devices, and even then there are firmware blobs). We've got the facial recognition and tracking so many sci-fi dystopias have warned us to avoid.
I'm having an easier time accomplishing more difficult technological tasks. But I lament what we have come to. I don't think we are in the Star Trek future and I imagined doing more drugs in a Neuromancer future. It's like a Snow Crash / 1984 corporate government collab out here, it kinda sucks.
While I'm on the fence about LLMs there's something funny about seeing an industry of technologists tear their own hair out about how technology is destroying their jobs. We're the industry of "we'll automate your job away". Why are we so indignant when we do it to ourselves...
This article isn't really about losing a job. Coding is a passion for some of us. It's similar to artists and diffusion, the only difference being that many people can appreciate human art - but who (outside of us) cares that a human wrote the code?
The people outside of us didn’t care about your beautiful code before. Now we can quickly build their boring applications and spend more time building beautiful things for our community’s sake. Yes, there are economic concerns, but as far as “craft” goes, nothing is stopping us from continuing to enjoy it.
I love programming, but most of that joy doesn't come from the type of programming I get paid to do. I now have more time and energy for the fun type, and I can go do things that were previously inconceivable!
Last night "I" "made" 3D boids swarm with directional color and perlin noise turbulence. "I" "did" this without knowing how to do the math for any of those things. (My total involvement at the source level was fiddling with the neighbor distance.)
https://jsbin.com/ququzoxete/edit?html,output
Then I turned them into weird proteins
https://jsbin.com/hayominica/edit?html,output
(As a side note, the loss of meaning of "self" and "doing" overlaps weirdly with my meditation practice...)
Yes but did you learn anything?
Obviously that matters, but how much does it matter? Does it matter if you don't learn anything about computer architecture because you only code in JS all day? Very situational.
I disagree a bit. Coding can remain an artistic passion for you indefinitely; it's just that your ability to demand that everyone craft each line of code artisanally won't be subsidized by your employer for much longer. There will probably always be some demand for handcrafted code, just a heavily diminished one.
I think this is really it. Being a musician was never a very reliable way to earn a living, but it was a passion. A genuine expression of talent and feeling through the instrument. And if you were good enough you could pay the bills doing work for studios, commercials, movies, theater. If you were really good you could perform as a headliner.
Now, AI can generate any kind of music anyone wants, eliminating almost all the anonymous studio, commercial, and soundtrack work. If you're really good you can still perform as a headliner, but (this is a guess) 80% of the work for musicians is just gone.
Huge tangent but curiosity is killing me: By any chance is your username based on the Egyptian football club Zamalek?
How do you read this article and hear indignation? It’s clearly someone grieving something personal about their own relationship with the technology.
I'm very confident in saying the majority of developers didn't get into it saying "we'll automate your job away"
I never thought or felt myself as or my work as someone or something that "will automate your job away".
Agreed. I've always thought the purpose of all automation was to remove needless toil. I want computers to free people. I guess I subscribe to the theory of creative destruction.
Maybe it comes down to the definition of "toil". Some people find typing to be toiling, so they latch on to not having to type as much when using LLMs. Other people see "chores" as toiling, and so dream of household robots to take on the burden of that toil. Some people hate driving and consider that to be needless toil, so self-driving cars answer that—and the ads for Waymo latch onto this.
Personally, I am not stymied by typing nor chores nor driving. For me, typing is like playing a musical instrument: at some point you stop needing to think about how to play and you just play. The interaction and control of the instrument just comes out of your body. At some point in my life, all the "need to do things around the house" just became the things I do, and I'm not bothered by doing them, such that I barely notice doing them. But it's complex: the concept of "chores" is front and center when you're trying to get a teenager to be responsible for taking care of themselves (like having clean clothes, or how the bathroom is safer if it's not a complete mess) and participating in family/household responsibilities (like learning that if you don't make a mess, there's nothing to clean up). Can you really be effective at directing someone/something else without knowing how to do it yourself? Probably for some things, but not all.
> Maybe it comes down to the definition of "toil".
For sure.
I idealize a future where people can spend more time doing things they want to do, whatever those avocations might be. Freedom from servitude. I guess some kind of Star Trek / The Culture hybrid dream.
The world we have is so far from that imaginary ideal. Implicit in that ideal would be elimination of inequality, and I'm certain there are massive forces that would oppose that elimination.
Computers are definitely on the path to freeing programmers from programming
For me it's because the same tech is doing it to everyone else in a more effective way (i.e. artists especially). I'm an "art enjoyer" since I was a child and to see it decimated by people who I once looked up to is heartbreaking. Also, if it only affected software, I would've been happy to switch to a more artistic career, but welp there goes that plan.
> Now is the time to mourn the passing of our craft.
Your craft is not my craft.
It's entirely possible that, as of now, writing JavaScript and Java frontends (what the author does) can largely be automated with LLMs. I don't know who the author is writing to, but I do not mistake the audience to be "programmers" in general...
If you are making something that exists, or something that is very similar to something that exists, odds are that an LLM can be made to generate code which approximates that thing. The LLM encoding is lossy. How will you adjust the output to recover the loss? What process will you go through mentally to bridge the gap? When does the gap appear? How do you recognize it? In the absolute best case you are given a highly visible error. Perhaps you've even shipped it, and need to provide context about the platform and circumstances to further elucidate. Better hope that platform and circumstance is old-hat.
LLMs are only a threat if you see your job as a code monkey. In that case you're likely already obsoleted by outsourced staff who can do your job much cheaper.
If you see your job as a "thinking about what code to write (or not)" monkey, then you're safe. I expect most seniors and above to be in this position, and LLMs are absolutely not replacing you here - they can augment you in certain situations.
The perks of a senior is also knowing when not to use an LLM and how they can fail; at this point I feel like I have a pretty good idea of what is safe to outsource to an LLM and what to keep for a human. Offloading the LLM-safe stuff frees up your time to focus on the LLM-unsafe stuff (or just chill and enjoy the free time).
I see my job as having many aspects. One of those aspects is coding. It is the aspect that gives me the most joy even if it's not the one I spend the most time on. And if you take that away then the remaining part of the job is just not very appealing anymore.
It used to be I didn't mind going through all the meetings, design discussions, debates with PMs, and such because I got to actually code something cool in the end. Now I get to... prompt the AI to code something cool. And that just doesn't feel very satisfying. It's the same reason I didn't want to be a "lead" or "manager", I want to actually be the one doing the thing.
You won't be prompting AI for the fun stuff (unless laying out boring boilerplate is what you consider "fun"). You'll still be writing the fun part - but you will be able to prompt beforehand to get all the boilerplate in place.
These comments are comical. How hard is it to understand that human beings are experiential creatures? Our experiences matter: to survival, to culture, and to identity.
I mourn the horse masters and stable boys of a century past because of their craft. Years of intuition and experience.
Why do you watch a chess master play, or a live concert, or any form of human creation?
Should we automate parts of our profession? Yes. Should he mourn the loss of our craft? Also yes.
"Wait 6 months" has been the call for 3-4 years now. You can't eulogize a profession that hasn't been killed, that's just mean.
Just a couple more trillion dollars, we are so close!
This is what I don't really understand. It's a bit difficult to take "wait x months" at face value because I've been hearing it for so long. Wait x months for what? Why hasn't it happened yet?
Things seem to be getting better from December 2022 (chatgpt launch), sure, but is there a ceiling we don't see?
"Self-driving cars" and Fusion power also come to mind. With the advent of photography, it was widely believed that drawing and painting would vanish as art forms. Radio would obsolete newspapers, becoming obsolete themselves with television, and so on. Don't believe the hype.
My car has driven me back and forth with no issues for 6 months now. But yes it's been a long time coming.
And yet.. my car was surrounded by 5 self-driving cars with no people in them on the way to work on Thursday.
Waymos require a highly mapped environment to function safely in. Not to take away from what Waymo has accomplished, but it's a far more bounded problem than what the "self driving" promise has been.
Just like in "I, Robot?"
Um.. Claude Code has been out less than a YEAR.. and the lift in capability in the last year has been dramatic.
It does seem probable based on progress that in 1-2 more model generations there will be little need to hand code in almost any domain. Personally I already don't hand code AT ALL, but there are certainly domains/languages that are under performing right now.
Right now with the changes this week (Opus 4.6 and "teams mode") it already is another step function up in capability.
Teams mode is probably only good for greenfield or "green module" development but I'm watching a team of 5 AI's collaborating and building out an application module by module. This is net new capability for the tool THIS WEEK (Yes I am aware of earlier examples).
I don't understand how people can look at this and then be dismissive of future progress, but human psychology is a rich and non-logical landscape.
Agree with the author. I like the process of writing code, typing method names and class definitions while at the same time thinking ahead about overall architecture, structure, how much time given function would run for, what kind of tests are necessary.
I find it unsettling how many people in the comments say that they don't like writing code. Feels alien to me. We went into this field for seemingly very different reasons.
I do use LLMs, and even these past two days I was doing a vibe coding project which was noticeably faster to set up and get to its current state than if I wrote it myself. However I feel almost dirty about how little I understand the project. Sure, I know the overall structure, decisions and plan. But I didn't write any of it, and I don't have the deep understanding of the codebase which I usually have when working on a codebase myself.
It's not so much the writing of the code (which I did like), it's the aesthetic of the code. It's solving a problem with the right code and the right amount of code (for now). That's still the case, even with AI writing most of the code. You have to steer it constantly because it has very bad instincts, because most people in the profession aren't good at it, so it has bad training data. Mainly because of the "learn to code" movement and people getting into this profession just for the money and not the love. Those people are probably screwed.
To the people who are against AI programming, honest question: why do you not program in assembly? Can you really say "you" "programmed" anything at all if a compiler wrote your binaries?
This is a 100% honest question. Because whatever your justification to this is, it can probably be used for AI programmers using temperature 0.0 as well, just one abstraction level higher.
I'm 100% honestly looking forward to finding a single justification that would not fit both scenarios.
I'm all for AI programming.
But I've seen this conversation on HN already 100 times.
The answer they always give is that compilers are deterministic and therefore trustworthy in ways that LLMs are not.
I personally don't agree at all, in the sense I don't think that matters. I've run into compiler bugs, and more library bugs than I can count. The real world is just as messy as LLMs are, and you still need the same testing strategies to guard against errors. Development is always a slightly stochastic process of writing stuff that you eventually get to work on your machine, and then fixing all the bugs that get revealed once it starts running on other people's machines in the wild. LLMs don't write perfect code, and neither do you. Both require iteration and testing.
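Concretely, the guard is the same whether a human or a model wrote the function: pin the behaviour down with tests and run them on every change. A tiny sketch, with `slugify` standing in as a made-up example of model-written code under test:

```python
import re
import unittest

def slugify(title: str) -> str:
    """Candidate implementation (imagine it came back from an agent)."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return slug or "untitled"

class TestSlugify(unittest.TestCase):
    # The same suite guards against compiler bugs, library bugs,
    # my own bugs, and model-generated bugs alike.
    def test_basic(self):
        self.assertEqual(slugify("Hello, World!"), "hello-world")

    def test_degenerate_input_still_yields_a_slug(self):
        self.assertEqual(slugify("???"), "untitled")

if __name__ == "__main__":
    unittest.main()
```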
I just answered exactly that. I think that AI agents code better than humans and are the future.
But the parent argument is pretty bad, in my opinion.
I'm not particularly against AI programming, but I don't think these two things are equivalent. A compiler translates code according to a specification in a deterministic way; the same compiler produces the same output from the same code, and it is all completely controlled. AI is not at all deterministic: temperature is built into LLMs, and on top of that there's the lack of specificity in prompts and our spoken languages. The difference in control is significant enough to me not to put compilers and AI coding agents into the same category, even though they are both taking some text and producing some other text/machine code.
There's a big difference between deterministic abstraction over machine code, and probabilistic translation of ambiguous language into machine code.
Compiler is your interface.
If you treat LLM as your interface... Well, I wouldn't want sharing codebase with you.
Compilers are deterministic.
I on the other hand await the coming of the Butlerian Jihad.
I often venerate antiques and ancient things by thinking about how they were made. You can look at a 1000-year-old castle and think: This incredible thing was built with mules and craftsmen. Or look at a gorgeous, still-ticking 100-year-old watch and think: This was hand-assembled by an artist. Soon I'll look at something like the pre-2023 Linux kernel or Firefox and think: This was written entirely by people.
At least with physical works (for now, anyway), the methods the artisans employ leave tell-tale signs attesting to the manner of construction, so that someone at least has the choice of going the "hand made" route, and others, even lay people without special tooling, can tell that it indeed was hand made.
Fully AI generated code has similar artifacts. You can spot them pretty easily after a bit. Of course it doesn't really matter for the business goals, as long as it works correctly. Just like 99% of people don't care if their clothing was machine made vs. handmade. It's going to be a tiny minority that care about handmade software.
One other helpful frame: I consider LLMs simply to be very flexible compilers for a high-level 'language'. We've been moving up the abstraction chain ever since we invented FORTRAN and COBOL (and LISP) instead of using assembly language.
We're 'simply' moving up the abstraction hierarchy again. Good!
> I didn’t ask for the role of a programmer to be reduced to that of a glorified TSA agent, reviewing code to make sure the AI didn’t smuggle something dangerous into production.
This may be the perspective of some programmers. It doesn't seem to be shared by the majority of software engineers I know and read and listen to.
Do you mean the perspective that he is a "glorified TSA agent" or that he doesn't like it? Because in this thread it seems that some people agree but they just like it :)
I don't mourn coding for itself, since I've always kinda disliked that side of my work (numerical software, largely).
What I do mourn is the reliability. We're in this weird limbo where it's like rolling a die for every piece of work. If it comes up 1-5, I would have been better off implementing it myself. If it comes up 6, it'll get it done orders of magnitude faster than doing it by hand. Since the overall speedup is worthwhile, I have to try it every time, even if most of the time it fails. And of course it's a moving target, so I have to keep trying the things that failed yesterday because today's models are more capable.
I do not mourn typing in code.
But I am still quite annoyed at the slopful nature of the code that is produced when you're not constantly nagging it to do better.
We've RLed it to produce code that works by hook or by crook, putting infinite fallback paths and type casts everywhere rather than checking what the semantics should be.
Sadly I don't know how we RL taste.
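A small, hypothetical illustration of the pattern being described (not taken from any particular model's output): the first version casts and falls back its way past every problem, while the second checks what the value is actually supposed to mean and fails loudly when it doesn't.

    # "By hook or by crook": fallbacks and casts instead of semantics.
    def get_port(config: dict) -> int:
        try:
            return int(config.get("port", 8080) or 8080)
        except Exception:
            return 8080  # swallow the error and hope for the best

    # Checking the intended semantics: a missing or malformed port is a
    # configuration bug, so surface it instead of papering over it.
    def get_port_checked(config: dict) -> int:
        port = int(config["port"])  # a KeyError/ValueError here is informative
        if not (0 < port < 65536):
            raise ValueError(f"port out of range: {port}")
        return port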
“We’ll miss creating something we feel proud of”
I still feel proud when skillfully guiding a set of AI agents to build from my imagination. Especially when it was out of my reach just six months ago.
I'm a 49-year-old veteran who started at just 10 years old and has continued to find pure passion in it.
I wonder if this is just a matter of degree. In a few years (or less) you may not have to "skillfully guide" anything. The agents will just coordinate themselves and accomplish your goals after you give some vague instruction. Will you still feel proud? Or maybe, a bit later, the agents will come up with their own improvements and just ship them without any input at all. How about then?
> We’ll miss the sleepless wrangling of some odd bug that eventually relents to the debugger at 2 AM.
I'll miss it not because the activity becomes obsolete, but because it's much more interesting than sitting till 2am trying to convince LLM to find and fix the bug for me.
We'll still be sitting till 2am.
> They can write code better than you or I can, and if you don’t believe me, wait six months.
I've been hearing this for the last two years. And yet, LLMs, given abstract description of the problem, still write worse code than I do.
Or did you mean type code? Because in that case, yes, I'd agree. They type better.
This perspective was mine 6 months ago. And god damn, I do miss the feeling of crafting something truly beautiful in code sometimes. But then, as I've been pushed into this new world we're living in, I've come to realize a couple things:
Nothing I've ever built has lasted more than a few years. Either the company went under, or I left and someone else showed up and rewrote it to suit their ideals. Most of us are doing sand art. The tide comes in and it's gone.
Code in and of itself should never have been the goal. I realized that I was thinking of the things I build and the problems I selected to work on from the angle of code quality nearly always. Code quality is important! But so is solving actual problems with it. I personally realized that I was motivated more by the shape of the code I was writing than the actual problems it was written to solve.
Basically the entire way I think about things has changed now. I'm building systems to build systems. That's really fun. Do I sometimes miss the feeling of looking at a piece of code and feeling a sense of satisfaction at how well made it is? Sure. That era of software is done now, sadly. We've exited the craftsman era and entered the Ikea era of software development.
Interesting, I still have code I wrote 20 years ago being used in production.
“Most of us are doing sand art. The tide comes in and it’s gone.”
I’m putting that on my wall.
The acceleration of AI has thrown into sharp relief that we have long lumped all sorts of highly distinct practices under this giant umbrella called "coding". I use CC extensively, and yet I still find myself constantly editing by hand. Turns out CC is really bad at writing kubernetes operators. I'd bet it's equally bad at things like database engines or most cutting edge systems design problems. Maybe it will get better at these specific things with time, but it seems like there will always be a cutting edge that requires plenty of human thought to get right. But if you're doing something that's basically already been done thousands of times in slightly different ways, CC will totally do it with 95% reliability. I'm ok with that.
It's also important to step back and realize that it goes way beyond coding. Coding is just the deepest tooth of the jagged frontier. In 3 years there will be blog posts lamenting the "death of law firms" and the "death of telemedicine". Maybe in 10 years it will be the death of everything. We're all in the same boat, and this boat is taking us to a world where everyone is more empowered, not less. But still, there will be that cutting edge in any field that will require real ingenuity to push forward.
I think there's clearly a difference in opinion based on what you work on. Some people were working on things that pre-CC models also couldn't handle and then CC could, and it changed their opinions quickly. I expect (but cannot prove of course) that the same will happen with the area you are describing. And then your opinion may change.
The death of a means to an end is the birth of an end itself.
When cameras became mainstream, realism in painting went out of fashion, but this was liberating in a way as it made room for many other visual art styles like Impressionism. The future of programming/computing is going to be interesting.
I hope that "our craft", which now produces largely vulnerable, buggy bloatware, actually dies.
Perhaps people or machines will finally figure out how to make software that actually works without needing weekly patching.
That's why I'll only read source code written until 2024.
I fall into the demographic discussed in the article, but I've approached this with as much pragmatism as I can muster. I view this as a tool to help improve me as a developer. Sure, there will be those of us who do not stay ahead (is that even possible?) of the curve and get swallowed up, but technology has had this effect on many careers in the past. They just change into something different and sometimes better. It's about being willing to change with it.
Like other tech-disrupted crafts before this (think furniture making or farming), that's how it goes: from hand-made craft, to mass-production factories (the last couple of decades), to fully automated production.
The craft was dying long before LLMs. Started in dotcom, ZIRP added some beatings, then LLMs are finishing the job.
This is fine, because like in furniture making, the true craftsmen will be even more valuable (overseeing farm automation, high-end handmade furniture, small organic farms), and the factory-worker masses (ZIRP-enabled tech workers) will move on to more fulfilling work.
That’s not how it goes for the worker. If you are a capitalist then it doesn’t matter, you own the means of production. The laborer, however, has to learn new skills, which take time and money. If your profession no longer exists, unless you have enough capital to retool/be a capitalist, then you will personally get poorer.
Where do people find this optimism? I reckon when the software jobs fall, everything else will follow shortly too. That's just the first target because it's what we know and the manual stuff is a little harder for now. The "good news" is everyone might be in the same boat, so the system will have to adapt.
There's a commercial building under construction next to my office. I look down on the construction site, and those strapping young men are digging with their big excavators they've been using for years and taking away the dirt with truck and trailer.
Why use a spade? Even those construction workers use the right sized tools. They ain't stupid.
Them using excavators and trucks to move dirt is the same as us using a compiler to compile code into an executable.
An LLM would be if the digging and hauling of the dirt happened without any people involved except for the planning of the logistics.
I think OP is coming at this more from an artisan angle. Perhaps there were shoveler artisans who took pride in the angle of their dirt-shoveling. Those people perhaps do lament the advent of excavators. But presumably the population who find code beautiful vs the art of shoveling are of different sizes
I feel like we are long into the twilight of mini blogs and personal sites. It's like people trying to protect automotive jobs; the vast majority were already lost.
Perhaps I'm a cynic, but I don't know.
I thought I'd miss all the typing and syntax, but I really don't. Everyone has their own relationship with coding, but for me, I get satisfaction out of the end product and putting it in front of someone. To the extent that I cared about the code, it mainly had to do with how much it allowed the end product to shine.
Yes, there's clearly a big split in the community where perhaps ~50% are like OP and the other ~50% are like you. But I think we should still respect the views of the other side and try to empathize.
Coding is an abstraction. Your CPU knows nothing of type safety, bloom filters, dependencies, or code reuse.
Mourning the passing of one form of abstraction for another is understandable, but somewhat akin to bemoaning the passing of punch card programming. Sure, why not.
Your entire brain's model of the world is an abstraction over its sensory inputs. By this logic we might as well say you shouldn't mourn anything since all it means is a minor difference in the sensory inputs your brain receives.
Write a blog post promoting inevitability of AI in software development while acknowledging feelings of experienced software engineers.
> If you would like to grieve, I invite you to grieve with me.
I think we should move past this quickly. Coding itself is fun, but it is also labour; building something is what is rewarding.
You know who else mourned the loss of craft? People that don't like PHP and Wordpress because they lower the barrier to entry to creating useful stuff while also leaving around a fair amount of cruft and problems that the people that use them don't understand how to manage.
Like iambateman said: for me it was never about code. Code was a means to an end, and it didn't stop at code. I'm the kind of software engineer who learned frontends, systems, databases, ETLs, etc. -- whatever was demanded of me to produce something useful, I learned it and did it. We're now calling that a "product engineer". The "craft" for me was in creating useful things that were reliable and efficient, not particularly how I styled lines, braces, and brackets. I still do that in the age of AI.
All of this emotional spillage feels for naught. The industry is changing, as it always has. The only constant I've ever experienced in this industry is change. I realized long ago that the day I am no longer comfortable with change is my best signal that this industry is no longer for me.
I think it's a bit different when you can opt out. If you didn't want to use PHP you didn't have to. But it's getting increasingly hard to opt out of AI.
And all that time spent doing leetcode? Yeah, THAT was time Well Spent.... ;-)
Some code is worth transcribing by hand — an ancient practice in writing, art and music.[0] Some isn't even worth looking at.
I find myself, ironically, spending more time typing out great code by hand now. Maybe some energy previously consumed by tedium has been freed up, or maybe the wacky machines brought a bit of the whimsy back into the process for me.
[0] And in programming, for the readers of Zed Shaw's books :)
I get where this is coming from. But at the same time, AI/LLMs are such an exciting development. As in "maybe I was wrong and the singularity wasn't bullshit". If nothing else, it's an interesting transition to live through.
It makes me sad to read posts like this. If it is a necessary step for you on the journey from denial to acceptance to embracing the new state of the world, then sure, take your time.
But software engineering is the only industry that is built on the notion of rapid change, constant learning, and bootstrapping ourselves to new levels of abstraction so that we don't repeat ourselves and make each next step even more powerful.
Just yesterday we were pair programming with a talented junior AI developer. Today we are treating them as senior ones and can work with several in parallel. Very soon your job will not be pair programming and peer reviewing at all, but teaching a team of specialized coworkers to work on your project. In a year or two we will be assembling factories of such agents that will handle the process from taking your requirements to delivering and maintaining complex software. Our jobs are going to change many more times and much more often than ever.
And yet there will still be people finding solace in hand-crafting their tools, or finding novel algorithms, or adding the creativity aspect into the work of their digital development teams. Like people lovingly restoring their old cars in their garage just for the sake of the process itself.
And everything will be just fine.
> software engineering is the only industry that is built on the notion of rapid change, constant learning, and bootstrapping ourselves to new levels of abstraction
Not sure I agree. I think most programming today looks almost exactly the same as it did 40 years ago. You could even have gotten away with never learning a new language. AI feels like the first time a large percentage of us may be forced to fundamentally change the way we work or change careers.
I'm in my forties and it's game over for my career. The grey in my hair makes it so that I never get past the first round. The history on my resume makes it so I'm lucky to get a round at all. The GPTs and Claude have fundamentally changed how I view work and, frankly, I'm over it.
I’m in consulting now and it’s all the same crap. Enterprises want to “unleash AI” so they can fire people. Maximize profits. My nephews who are just starting their careers are blindly using these tools and accepting the PR if it builds. Not if it’s correct.
I’m in awe of what it can do but I also am not impressed with the quality of how it does it.
I’m fortunate to not have any debt so I can float until the world either wises up or the winds of change push me in a new direction.
I liked the satisfaction of building something “right” that was also “useful”. The current state of Opus and Codex can only pretend to do the latter.
People have to stop talking like LLMs solved programming.
If you're someone with a background in Computer Science, you should know that we have formal languages for a reason, and that natural language is not as precise as a programming language.
But anyway, we're at peak AI hype; hitting the top on HN is worth more than a reasonable take, and reasonableness doesn't sell after all.
So here we're seeing yet another text about how the world of software was solved by AI and being a developer is an artifact of the past.
> we have formal languages for a reason
Right? At least on HN, there's a critical mass of people loudly ignoring this these days, but no one has explained to me how replacing a formal language with an English-language-specialized chatbot - or even multiple independent chatbots (aka "an agent") - is a good tradeoff to make.
"Glorified TSA agent" is a rather gloomy, low-agency take on it. You both ask for what you want and verify the results.
December a few years ago, pre-ChatGPT, I did Advent of Code in Rust. It was very difficult; I had never done the full month before, barely knew Rust, and kept getting my ass kicked by it. I spent a full Saturday afternoon solving one of the last problems of the month, and it was wonderful. My head hurt and I was reading weird Wikipedia articles and it was a blast. Nothing is stopping me from doing that sort of thing again, and I feel like I might need to, to counteract the stagnation I feel at times mentally when it comes to coding. That spark is still in there, I feel, buried under all the slop, and it would reappear if I gave it the chance, I hope. I have been grieving for the last years, I think, and only recently have I come to terms with the changes to my identity that LLMs have wrought.
Great post. Super sad state of affairs but we move on and learn new things. Programming was always a tool and now the tool has changed from something that required skill and understanding to complaining to a neural net. Just have to focus on the problem being solved more.
> Programming was always a tool
This is the narrow understanding of programming that is the whole point of contention.
This makes me think about the craftsmen whose careers vanished or were transformed through the ages due to industries, machines, etc. They did not have online voices to write thousands of blog posts every day. Nor did they have people who could read their woes online.
The thing he has spent his whole career doing unto others he finally did unto himself.
I absolutely disagree with this. All the things the author said will still exist and keep on existing.
Nothing will prevent you from typing “JavaScript with your hands”, from “holding code in our hands and molding it like clay…”, and all the other metaphors. You can still do all of it.
What certainly will change is the way professional code will be produced, and together with that, the avenue of having a very well-paid remuneration, to write software line-by-line.
I’ll not pretend that I don’t get the point, but it feels like the lamentation of a baker, tailor, shoemaker, or smith, missing the days of old.
And yet, most people prefer a world with affordable bread, clothes, footwear, and consumer goods.
Will the world benefit the most from "affordable" software? Maybe yes, maybe not; there are many arguments on both sides. I am more concerned about the impact on the winners and losers: the rich will get richer and more powerful, while the losers will become even more destitute.
Yet my final point would be: is it better or worse to live in a world in which software is more affordable and accessible?
> All the things the author said will still exist and keep on existing.
Except the community of people who, for whatever reason, had to throw themselves into it and had critical mass to both distribute and benefit from the passion of it. This has already been eroded by the tech industry coopting programming in general and is only going to diminish.
The people who discovered something because they were forced to do some hard work and then ran with it are going to be steered away from that direction by many.
I don’t think it’s that simple. A couple of examples:
Food:
A lot of the processed foods that are easily available make us unhealthy and sick. Even vegetables are less nutritious than they were 50 years ago. Mass agriculture also has many environmental externalities.
Consumer goods:
It has become difficult to find things like reliable appliances. I bought a chest freezer. It broke after a year. The repairman said it would cost more to fix than to buy a new one. I asked him if there was a more reliable model and he said no: they all break quickly.
Clothing:
Fast fashion is terrible for the environment. Do we need as many clothes as we have? How quickly do they end up in landfills?
Would we be better off as a society repairing shoes instead of buying new ones every year?
You can still do your craft as you did it before, but you can't expect to be paid for it as much as before.
> We’ll miss the feeling of holding code in our hands and molding it like clay in the caress of a master sculptor.
Oh come on. 95% of the folks were gluing together shitty React components and slathering them with Tailwind classes.
For what it’s worth I’ve followed the author for a long time and that does not describe the type of work he has done
This. People are way too easily impressed. I don't think this easily-impressedness will generalize to most people in the real world.
If you really buy all that, you'd be part of the investor class that crashed various video game companies upon seeing Google put together a rather lame visual stunt and have their AI say - and I quote, because the above-the-fold AI response I never asked for has never been more appropriate to consult…
"The landscape of AI video game generation is experiencing a rapid evolution in 2025-2026, shifting from AI-assisted asset creation to the generation of entire interactive, playable 3D environments from text or image prompts. Leading initiatives like Google DeepMind's Project Genie and Microsoft's Muse are pioneering "world models" that can create, simulate physics, and render games in real-time."
And then you look at what it actually is.
Suuuure you will, unwanted AI google search first response. Suuure you will.
how elitist
1. It isn't that bad.
2. The tools still need a lot of direction; I still fight Claude with Opus to do basic things, and the best experiences are when I provide very specific prompts.
3. Being idealistic in a capitalist system where you have to pay your bills every month is something I could do when my parents paid my bills.
These apocalyptic posts about how everything is shit really don't match my reality at all. I use these tools every day to be more productive and improve my code, but they are nowhere close to doing my actual job, which is figuring out WHAT to do. How to do it is mostly irrelevant, as once I get to that point I already know what needs to be done and it doesn't matter if it is me or Opus producing the code.
> They can write code better than you or I can
Speak for yourself. They produce shit code and have terrible judgment. Otherwise we wouldn't need to babysit them so much.
I'll believe it when I start seeing examples of good and useful software being created with LLMs or some increase in software quality. So far it's just AI doom posting, hype bloggers that haven't shipped anything, anecdotes without evidence, increase in CVEs, increase in outages, and degraded software quality.
It would be helpful if you could define “useful” in this context.
I’ve built a number of team-specific tools with LLM agents over the past year that save each of us tens of hours a month.
They don’t scale beyond me and my six coworkers, and were never designed to, but they solve challenges we’d previously worked through manually and allow us to focus on more important tasks.
The code may be non-optimal and won’t become the base of a new startup. I’m fine with that.
It’s also worth noting that your evidence list (increased CVEs, outages, degraded quality) is exclusively about what happens when LLMs are dropped into existing development workflows. That’s a real concern, but it’s a different conversation from whether LLMs create useful software.
My tools weren’t degraded versions of something an engineer would have built better. They’re net-new capability that was never going to get engineering resources in the first place. The counterfactual in my case isn’t “worse software”—it’s “no software.“
Well, on the surface it may seem like there’s nothing being created of value, but I can assure you every company from seed stage to unicorns are heavily using claude code, cursor, and the like to produce software. At this point, most software you touch has been modified and enhanced with the use of LLMs. The difference in pace of shipping with and without AI assistance is staggering.
Some people say that working with an agent or an agent orchestrator is like being a technical lead. But I've been a technical lead for quite a while, and the experience of working with an agent doesn't even come close. I think that when people talk about the agents' coding abilities they're talking about the average ability. But as a team lead, I don't care about average ability. I care only about the worst case. If I have any doubt that someone might not complete a task, or at least accurately explain why it's proving difficult, with at least 95% certainty, I won't assign them the task. If I have any doubt that the code they produce might not be up to snuff, I don't assign them the task. I don't need to review their code; they review each other's. When I have to review code I'm no longer a team lead but a programmer.
I often have one programming project I do myself, on the side, and recently I've been using coding agents. Their average ability is no doubt impressive for what they are. But they also make mistakes that not even a recent CS graduate with no experience would ever make (e.g. I asked the agent for its guess as to why a test was failing; it suggested it might be due to a race condition with an operation that is started after the failing assertion). As a lead, if someone on the team is capable of making such a mistake even once, then that person can't really code, regardless of their average performance (just as someone who sometimes lands a plane at the wrong airport, or even crashes without there being a catastrophic condition outside their control, can't really fly regardless of their average performance). I wish the agent could work like a team of programmers, so I would be doing my familiar role of project lead, but it doesn't.
The models do some things well. I believe that programming is an interesting mix of inductive and deductive thinking (https://pron.github.io/posts/people-dont-write-programs), and the models have the inductive part down. They can certainly understand what a codebase does faster than I can. But their deductive reasoning, especially when it comes to the details, is severely lacking (e.g. I asked the agent to document my code. It very quickly grasped the design and even inferred some important invariants, but when it saw an `assert` in one subroutine it documented it as guarding a certain invariant. The intended invariant was correct, it just wasn't the one the assertion was guarding). So I still (have to) work as a programmer when working with coding assistants, even if in a different way.
I've read about great successes at using coding agents in "serious" software, but what's common to those cases is that the people using the agents (Mitchell Hashimoto, antirez) are experts in the respective codebase. I don't know what the future will bring, but at the moment, the craft isn't dead. When AI can really program, i.e. the experience is really like that of a team lead, I don't think that the death of programming would concern us, because once they get to that point, the agents will also likely be able to replace the team lead. And middle management. And the CTO, the CFO, and the CEO, and not just at software companies.
> If I have any doubt that someone might not complete a task, or at least accurately explain why it's proving difficult, with at least 95% certainty, I won't assign them the task
It gets hard to compare AI to humans. You can ask the AI to do things you would never ask a human to do, like retry 1000 times until it works, or assign 20 agents to the same problem with slightly different prompts. Or re-do the entire thing with different aesthetics.
No doubt, I'm just saying that working with a coding agent is not even remotely similar to being a team lead. If a member of your team can't complete a task and can't accurately explain what the difficulty is, you're in trouble.
If you're programming for the art, you can continue. Someone who enjoys painting can still do so, even after the camera.
But you have to admit it loses a certain shine in the cases where you know that what you're doing is no longer solving a problem that could be solved simpler and cheaper another way.
If you want to build a house you still need plans. Would you rather cut boards by hand or have a power saw? Would you rather pound nails, pilot-hole with a bit and brace, and put in flat-head screws... or would you want a nail gun and an impact driver?
And you still need plans.
Can you write a plan for a sturdy house, verify that it meets the plan that your nails went all the way in and in the right places?
You sure can.
Your product person, your directors, your clients might be able to do the same thing; it might look like a house, but it's a fire hazard, or in the case of most LLM-generated code, a security one.
The problem is that we moved to scrum and agile, where your requirements are pantomime and post-it notes if you're lucky, interpretive dance if you aren't. Your job is figuring out how to turn that into something... and a big part of what YOU as an engineer do is tell other people "no, that's dumb" without hurting their feelings.
If AI coding is going to be successful then some things need to change: requirements need to make a comeback. GOOD UI needs to make a comeback (your dark pattern around cancellation is now going to be at odds with an agent). Your hide-the-content-behind-a-login-or-a-paywall won't work any more because, again, end users have access too... the open web is back, and by force. If a person can get in, we have code that can get in now.
There is a LOT of work that needs to get done, more than ever. Stop looking back and start looking forward, because once you get past the hate and the hype there is a ton of potential to right some of the ills of the last 20 years of tech.
LLMs have made a lot of coding challenges less painful: navigating terrible documentation, Copilot detecting typos, setting up boilerplate frontend components, high-effort but technically unchallenging code completions. Whenever I attempted to use LLMs for tools I'm not familiar with, I found them useful for setting things up, but I still had to do the good old work of learning the tool and applying developer knowledge to it. I wonder if senior developers could use LLMs in ways that work with them and not against them, i.e. create useful code that has guardrails to avoid slop.
Ephemeralization: the ability thanks to technological advancement to do "more and more with less and less until eventually you can do everything with nothing." —Buckminster Fuller
Dunno, LLMs writing code still feels like they memorized a bunch of open source code and vomited them out in worse condition.
It's not that impressive that Claude wrote a C compiler when GitHub has the code to a bunch of C compilers (some SOTA) just sitting there.
I'm using an LLM to write a compiler in my spare time (for fun) for a "new" language. It feels more like a magical search engine than a coding assistant. It's great for bouncing ideas off, for searching the internet without the clutter of SEO-optimized sites and ads; it's definitely been useful, just not that useful for code.
Like, I have used some generated code in a very low stakes project (my own Quickshell components) and while it kind of worked, eventually I refactored it myself into 1/3 of the lines it produced and had to squash some bugs.
It's probably good enough for the people who were gluing React components together but it still isn't on the level where I'd put any code it produces into production anywhere I care about.
That is my experience from a year ago but I no longer feel that way. I write a few instructions, guide an agent to create a plan, and rarely touch the code myself. If I don’t like something, I ask the agent to fix it.
I'm surprised so many people are only waking up to this now. It should have been obvious as soon as ChatGPT came out that, even with only incremental improvements, LLMs would kill programming as we knew it. And the fact that these utterances, however performative, from developers expressing grief or existential despair have become commonplace tells me as much about the power of these systems as whatever demo Anthropic or OpenAI has cooked up.
I would also point out that the author, and many AI enthusiasts, still make certain optimistic assumptions about the future role of "developer," insisting that the nature of the work will change, but that it will somehow, in large measure, remain. I doubt that. I could easily envision a future where the bulk of software development becomes something akin to googling--just typing the keywords you think are relevant until the black box gives you what you want. And we don't pay people to google, or at least, we don't pay them very much.
Speak for yourself. I don't miss writing code at all. Agentic engineering is much more fun.
And this surprises me, because I used to love writing code. Back in my early days I can remember thinking "I can't believe I get paid for this". But now that I'm here I have no desire to go back.
I, for one, welcome our new LLM overlords!
I had that same epiphany when I discovered AI is great at writing complicated shell command lines for me. I had a bit of an identity crisis right there because I thought I was an aspiring Unixhead neckbeard but in truth I hated the process. Especially the scavenger hunt of finding stuff in man pages.
Speak for yourself. If you find the agentic workflow to be more fun, more power to you.
I for one think writing code is the rewarding part. You get to think through a problem and figure out why decision A is better than B. Learning about various domains and solving difficult problems is in itself a reward.
Same here. I'm a decade-plus in this field; writing code was by far number one, and the discussion surrounding system design was a far second. Take away the coding and I don't think I will make it to retirement being a code/LLM PR auditor for work. So I'm already planning on exiting the field in the next decade.
>You get to think through a problem and figure out why decision A is better than B. Learning about various domains and solving difficult problems is in itself a reward.
So just tell the LLM about what you're thinking about.
Why do you need to type out a for loop for the millionth time?
I'm that 40 year old now. Been writing code since grade 5. Loved it so much I got a PhD, was an academic, then moved into industry.
I don't mourn or miss anything. No more than the previous generation mourned going from assembly to high-level languages.
The reason why programming is so amazing is getting things done. Seeing my ideas have impact.
What's happening is that I'm getting much much faster and better at writing code. And my hands feel better because I don't type the code in anymore.
Things that were a huge pain before are nothing now.
I don't need to stay up at night writing code. I can think. Plan. Execute at a scale that was impossible before. Alone, I'm already delivering things that were on the roadmap as months' worth of engineering effort.
I can think about abstractions, architecture, math, organizational constraints, product. Not about what some lame compiler thinks about my code.
And if someone that's far junior to me can do my job. Good. Then we've empowered them and I've fallen behind. But that's not at all the case. The principals and faculty who are on the ball are astronomically more productive than juniors.
> They can write code better than you or I can, and if you don’t believe me, wait six months.
No, they cannot. And an AI bro squeezing every talking point into a think piece while pretending to have empathy doesn't change that. You just want an exit, and you want it fast.
I wonder whether, in the end, it was simply poor accessibility that made programmers special, and whether that is what some of them are missing: being special by "talking" a special language their customers can't comprehend.
Sure, they are still needed for debugging and for sneering at all those juniors and non-programmers who will finally be able to materialise their fantasies, but there is no way back anymore, and like riding horses, you can still do it while owning a car.
It definitely sucks, to be honest, and there's a lot of cope out there.
Fact of the matter is, being able to churn out bash one-liners was objectively worth $100k/year, and now it just isn't anymore. Knowing the C++ STL inside-out was also worth $200k/year; now it has very questionable utility.
A lot of livelihoods are getting shaken up as programmers get retroactively turned into the equivalent of librarians, whose job is to mechanically index and fetch cognitive assets to and from a digital archive-brain.
Yeah, I notice a lot of the optimism is from people who have been in the field for decades. I'm newish to the field, half a decade out of undergrad. It definitely feels like almost all of what I learned has been (or will soon be) completely devalued. I'm sure this stuff feels a lot less threatening if you've had decades to earn a great salary and save a bunch of money. If money wasn't a concern I'd be thrilled about it too.
This just in: people who expect things to stay the same should steer clear of careers in technology. Art, too, come to think of it.
dude needs to chill
also:
> We’ll miss the sleepless wrangling of some odd bug that eventually relents to the debugger at 2 AM.
no we won't lol wtf
but also: we will probably still have to do that anyways, but the LLM will help us and hopefully make it take less time
I mean go ahead and cry if you want. You are losing time best spent caring about stuff, and overlooking many alarming gotchas through blindly accepting SV hype. I'd have thought crypto would teach people something, but apparently not.
Do what isn't replaceable. You're being told literally everything is replaceable. Note who's telling you that and follow the money.
I feel bad for this essayist, but can't really spare more than a moment to care about his grief. I got stuff to do, and I am up and doing. If he was in any way competing with the stuff I do? One less adversary.
I would rather bring him into community and enjoy us all creating together… but he's acting against those interests and he's doomering and I have no more time for that.