I don't think that's really the point of this post; it's all about how LLMs are destroying our craft (i.e., "I really like using knives!"), not really about whether the food is better.
I think the real problem is that it's actually increasingly difficult to defend the artisanal "no-AI" approach. I say this as a former staff-level engineer at a big tech company who has spent the last six months growing my SaaS to ~$100k in ARR, and it never could have happened without AI. I like the kind of coding the OP is talking about too, but ultimately I'm getting paid to solve a problem for my customers. Getting too attached to the knives is missing the point.
Call me crazy, but my guess is that it may not have been able to happen without the decade of experience it took you to reach a staff-level engineering position at a big tech company, which gave you the skills required to properly review the AI code you're producing.
I thought it was interesting that GPT5's comments (when I prompted it for feedback on the article) seem to overlap with some of the points you guys made:
My [GPT5's -poster's note] take / reflections:

I find the article a useful provocation: it asks us to reflect on what we value in being programmers. It's not anti-AI per se, but it is anti-losing-the-core-craft. For someone in your position (in *redacted* / Europe), it raises questions about what kind of programming work you want: deep, challenging, craft-oriented, or more tool/AI-mediated. It might also suggest you think about building skills that are robust to automation: e.g., architecture, critical thinking, complex problem solving, domain knowledge. The identity crisis is less about "will we have programmers" and more about "what shapes will programming roles take".
Absolutely. But what if the point of using the knives is to understand how to use the machines that can use knives for us? If we're not replicating the learning part, where do we end up?
It's both. Speaking as a user, software quality was already declining before AI coding, but AI seems to have put that process on a fast track (not least because of all the top management drinking the Kool-Aid and deciding they can replace the people they have with it).