
>>In one case, one might be committing to the repository slightly more frequently. But in all cases, they still aren't completing assignments on time.

Most people are using it to finish work sooner, rather than to do more work. As a senior engineer, your job should not be to stop the use of LLMs, but to create opportunities to build newer and bigger products.

>>I can even see them habitually attempt to use the AI, but then remember I'm sitting with them, and halt.

I understand that you and I grew up in a different era. But life getting easier for the young isn't exactly something we must resent. Things have only been getting easier with time, and have been for a few centuries. None of this is wrong.

>>Tried to coax the AI into doing what I wanted and easily spent 10x more time than it would have taken to look up the documentation.

Honestly, this largely reads like how my dad would describe technology in the 2000s: he was always better off without it. Whether that was true or false is up for debate, but the world moved on anyway.



> As a senior engineer, your job should not be to stop the use of LLMs, but to create opportunities to build newer and bigger products.

I think you just hit the core point that splits people in these discussions.

For many senior engineers, we see our jobs as building better and more lasting products: correctness, robustness, maintainability, consistency, clarity, efficiency, extensibility, adaptability. We're trying to build things that best serve our users, outperform our competition, enable effective maintenance, and include the design foresight that lets our efforts turn on a dime when conditions change while preserving all these other benefits.

I have never considered myself striving towards "newer and bigger" projects and I don't think any of the people I choose to work with would be able to say that either. What kind of goal is that? At best, that sounds like the prototyping effort of a confused startup that's desperately looking to catch a wave it might ride, and at worst it sounds like spam.

I assure you, little of the software you appreciate in your life has been built by senior engineers with that vision. It might have had some people involved at some stage who pushed for it, because that sort of vision can effectively kick a struggling project out of a local minimum (albeit sometimes to worse places), but it's unlikely to have been a seasoned "senior engineer" making that push, and if it was, they surely weren't wearing that particular hat in doing so.


I don't get this idea that to build a stable product you must make your life as hard as possible.

One can use AI AND build stable products at the same time. These are not exactly opposing goals, and beyond that, the assumption that AI will always generate bad code is itself wrong.

Very likely, people will build more stable and large products using AI than ever before.

I understand and empathise with you; moving on is hard, especially when these kinds of huge paradigm-changing events arrive, and especially when you are no longer in the upswing of life. But the arguments you are making are very similar to those boomers made about desktops, the internet, and even mobile phones. People have argued endlessly that the old way was better, but things only get better with newer technology that automates more than ever before.


I don't feel like you read my comment in context. It was quite specifically responding to the GP's point about pursuing "bigger and better" software, which just isn't something most senior engineers would claim to pursue.

I completely agree with you that "one can use ai AND build stable products at the same time", even in the context of the conversation we're having in the other reply chain.

But I think we greatly disagree about having encountered a "paradigm changing event" yet. As you can see throughout the comments here, many senior engineers recognize the tools we've seen so far for what they are: they've explored their capabilities, and they've come to understand where they fit into the work they do. And they're simply not compelling for many of us yet. They don't work for the problems we'd need them to work for yet, and are often found to be clumsy and anti-productive for the problems they can address.

It's cute and dramatic to talk about "moving on is hard" and "luddism" and some emotional reaction to a big scary imminent threat, but you're mostly talking to exceedingly practical and often very lazy people who are always looking for tools to make their work more effective. Broadly, we're open to, and even excited about, tools that could be revolutionary and paradigm-changing, and many of us even spend our days trying to discover or build those tools. A more accurate read of these conversations is that we're disappointed with these tools in many cases, and just find that they don't nearly deliver on their promise yet.


> life getting easier for the young isn't exactly something we must resent.

I don't see how AI is making life easier for my developers. You seem to have missed the point that I have seen no external sign of them being more productive. I don't care if they feel more productive; the end result is that they aren't. And it does seem to be making life harder for them, because they can't seem to think for themselves anymore.

> a senior engineer, your job should not be to stop the use of LLMs, but to create opportunities to build newer and bigger products

Well then, we're in agreement. I should reveal to my juniors that I know they are using AI and that they should stop immediately.


> I understand you and I grew up in a different era. But life getting easier for the young isn't exactly something we must resent.

Of course not. But eventually these young people are going to take over the systems that were built and understood by their predecessors. History shows what damage can be done to a system when people who don't fully understand and appreciate how it was built take it over. We have to prepare them with the knowledge necessary to take over the future, which includes all the warts and shit piles.

I mean, we've spent much of our careers trying to dig ourselves out of these shit piles because they suck so badly, but we never got rid of them; we just hid them behind some polish. It's all still there, and vibe coders aren't going to be equipped to deal with it.

Maybe the hope is that AI will reach god-like status and just fix all of this for us magically one day, but that sounds like fixing social policy by waiting for the rapture, so we have to do something else to secure the future.



