That has already happened. Companies are hiring fewer engineers and treating AI tooling as a multiplier for existing developers.
Edit: No direct citations, my own experience, sorry! Interesting to see how many people are throwing up the pitchforks in defense of engineering. Even if you are only getting a 5% increase in productivity via AI assistants, that's ultimately fewer people you may have to hire.
Investors are not making waves with PR announcements, but this is where a lot of the PE/VC space is talking and heading: how can we unlock more productivity with fewer heads by using AI? It's quite similar to the Klarna story, reduce headcount and increase productivity. Even ignoring my comments, it's quite easy to see how AI can multiply an individual engineer, and that multiplier is naturally going to be applied at scale by investors.
I like Sam Altman's quote on this. "I think the world is gonna find out that if you can have 10 times as much code at the same price, you can just use even more. So write even more code."
I agree that it will have an impact, but I'm wary of predicting something specific, whether negative or positive.
This. Current AI tech is comparable to GitHub and npm when they were introduced (though actually less dramatic in terms of productivity improvement). What do you think happened when we got millions of open source libraries to do our jobs much more effectively? The number of developers did not shrink as a result; instead it skyrocketed!
I guess my point is I am not predicting anything; this has already been happening in the past year. Smaller companies are seeing a crunch, and have we not continued to see layoffs at the big tech companies? The name of the game at the moment is reducing costs and headcount. There are certainly outliers, like OpenAI, that have massive demand, but for the vast majority of companies this is not true.
I don't think we need to see numbers from FAANG to see that, as an industry as a whole, AI assistants/AI autocomplete can save time. I am probably just not as smart as a distinguished engineer like yourself, but I find time savings using these tools.
Maybe this is a me problem, but if I could figure out a way to use AI to significantly accelerate or replace developers or my own work, I would do it.
I have experimented a bunch with LLMs, Copilot, etc. The current offering is useful in a limited scope: people google a bit less, and the tools are a bit better than existing IDE snippet features. I see potential, but what is on the market doesn't give me a 10% improvement.
If you ask an LLM to write you a story, it will write you a story. If you want a very specific story, you have to write a very detailed prompt. Code generation is also like this. A seasoned developer can write code as fast as they can write a detailed prompt, and a newbie may be able to work faster in unfamiliar technologies but is susceptible to following bad suggestions (e.g. the LLM will tell you to write your own email validation instead of using the team's preferred library).
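To make that concrete, here is a minimal sketch of the kind of divergence I mean (TypeScript; the validator.js package stands in for whatever a team's preferred library might be, and the function names are made up):

    // Assumes the `validator` package is the team's preferred library.
    import isEmail from "validator/lib/isEmail";

    // What an LLM often suggests: a hand-rolled regex that quietly
    // under-specifies what counts as a valid email address.
    const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
    function validateEmailHandRolled(input: string): boolean {
      return EMAIL_RE.test(input);
    }

    // What the team convention actually calls for: the maintained library.
    function validateEmail(input: string): boolean {
      return isEmail(input);
    }

Both versions compile and both "work", which is exactly why a newbie can't tell that the first suggestion is the wrong one for the codebase.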
The vibe I get is like low-code technologies: initially they look promising and you wonder if you need skilled people anymore, but hit any non-trivial problem and you're just coding in diagram form, realising that text is better.
What are you / they using?
I'd really like to try it if it is publicly available.
What I see anecdotally is that, now that debt costs money, a lot of business cases for tech investment just don't make sense. Borrowing to buy future growth made a lot of sense when interest rates were negative. Now there is a lot of pressure to deliver profits today.
I use Copilot within the IDE extensively, and only for the autocomplete. It is not always correct, but it honestly is correct often enough that it is like having a second brain complete what I was thinking. If I want to write a unit test for a function, I can do it at lightspeed compared to the past.
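To give a feel for it, here is roughly what that looks like (Jest-style test syntax; `slugify` is just a made-up example function):

    // The function I just wrote by hand:
    function slugify(title: string): string {
      return title
        .toLowerCase()
        .trim()
        .replace(/[^a-z0-9]+/g, "-")
        .replace(/^-+|-+$/g, "");
    }

    // I type the test name; the assistant fills in the assertions,
    // and I only have to check they match what I had in my head:
    test("slugify turns a title into a URL slug", () => {
      expect(slugify("Hello, World!")).toBe("hello-world");
      expect(slugify("  Multiple   Spaces ")).toBe("multiple-spaces");
    });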
I will use some flavor of chat interface (Mistral Chat, ChatGPT, Gemini) when I am trying to figure out something I don't have domain expertise in. For example, I have a lot of trouble digesting AWS docs; I often get permissioning wrong, or end up with a configuration that is not well outlined for me. I use a chat interface to walk through the problem and, more often than not, get to a solution a lot quicker than if I had tried to step through all the docs.
I am still doing most of the thinking; I don't find LLMs to be that amazing for engineering solutions. I think that will happen in the future, though, as they become perhaps more opinionated, especially about software engineering.
> developer can write code as fast as they can write a detailed prompt
That is misleading. Usually what happens for me is I write a line of code, then I wait a few seconds and Copilot writes the next 5-10 lines. I have in my head what I expect it to write, so I can immediately tell if it is good. It is much less mentally draining as well; it is easy for me to code 12 hours a day, and with higher productivity than before. I have done so many interesting side and hobby projects because of that productivity boost.
But overall it hasn't made me code less; it has made me spend more time coding, because it is much faster to get the same value.
The same might be true for companies. Projects that weren't worth doing before will be now, because they are cheaper and faster to do.
> Even ignoring my comments, it's quite easy to see how AI can multiply an individual engineer, and that multiplier is naturally going to be applied at scale by investors.
Yeah sure I'm getting a ~10% productivity boost personally from those tools but it's not like you can give those to non-devs and expect them to replace a developer with it.
Let's not forget that we have had code generators usable by non-developers since the 90s. It's not like it's a particularly new addition.
> Yeah sure I'm getting a ~10% productivity boost personally from those tools but it's not like you can give those to non-devs and expect them to replace a developer with it.
> Let's not forget that we have had code generators usable by non-developers since the 90s. It's not like it's a particularly new addition.
I never said anything about non-developers. If you hire 10 developers and on average the AI assistants give a 10% productivity boost, that potentially means you don't have to hire the 11th developer. I am not suggesting that engineers are gone, only that headcount reduction via AI tooling is already happening.
Modern developer tooling already had a much larger impact on productivity than AI probably will, and the only thing it did was increase the number of developers even more.
If I were the CEO of a company reducing headcount with AI, I would be more worried about my company itself than about the jobs of the people I'm firing.
It is not even worth comparing. The industry as a whole is seeing productivity gains from AI assistants. I am not here to speculate about the medium term, just the immediate term, which is that companies see it as a multiplier, meaning they might not need to hire that 11th developer.
I'm speculating on the medium term precisely because I don't see much of an impact in the short term. While that logic sounds okay, I'm not seeing it right now; the productivity gains seem to be used to produce even more stuff.
I've never been in a company where the roadmap isn't full to the brim; there doesn't seem to be a limiting factor on that side.
I generally agree, but I think it's a different story now that we are no longer in 0% interest rate territory. Roadmaps are probably still full, but money is no longer free.
Yes, this is the economist's mindset. The question is how much new productive/creative work can be created with AI, or whether it just replaces common QA tasks.
Yes, it's only a matter of time until you can present an AI with a specification and a mockup and it will give you the code/app.
It would need some human touch, but most of the work will already be done.
Edit: I just had the thought that my dev job has become less about coding and more about process and tooling over the years, which is why I don't enjoy it. It feels like tedium that should be automated.
The issue isn't implementing mockups; it's integrating with poorly documented existing systems, anticipating what could go wrong given a user's path through the site, and making it efficient, understandable, and maintainable.
Most of those tasks will be heavily changed by AI, but not replaced.
If you honestly think that statistics-based AI can replace software engineers, then you either have no software engineering experience, don't understand how AI works under the hood, or haven't worked anywhere that does anything more than CRUD API development.
> If you honestly think that statistics-based AI can replace software engineers, then you either have no software engineering experience, don't understand how AI works under the hood, or haven't worked anywhere that does anything more than CRUD API development.
I don't understand how brains work under the hood (does anyone?), but zoom into the brain and you get chemistry, zoom into the chemistry and you get quantum mechanics, and that quantum mechanics is statistical in nature.
I don't know if that truth matters or not, because I don't know which layer of abstraction is the most relevant one for our intelligence. And without knowing that, I don't know if these models we have now can or can't be scaled up to do what we do: if what we are really does depend on some microtubule quantum computation, then no, no classical computer can ever be like us (though it is, still, statistics); on the other hand, if everything we are comes from the strengths of synaptic connections and internal bias of our neurons, then any sufficiently complex model can absolutely do all that we can do, and much faster too.
> I didn’t even need to read the rest to know it was all nonsense.
So, you're pattern matching without using careful logical analysis? Yes, this is a totally convincing demonstration of how humans are not at all like LLMs.
> LLMs aren’t magic.
Are humans?
I really liked the occult when I was a teenager. Despite trying, I never found any real magic.
> Are you comparing using your brain to using an LLM.
Do you know where the name "neural network" comes from?
'course, the person I'm replying to probably isn't reading this anyway, given they said they stopped reading too soon the last time. This made me think: https://news.ycombinator.com/item?id=39504270
Those already exist, but they are well understood and provide professional support (in the form of engineers who extend the platform for a customer).
(Hybris, CrystalCommerce, phpBB, Discourse, etc.)
That engineer probably can't even be replaced by AI, since every new business is a snowflake once the low-hanging fruit is gone.
I don’t think our current form of statistical models will ever be able to generalize and get into specifics at the same time.
AI will change how individual engineers work by acting as a more proactive search engine, but engineers will not rely on it to write code entirely.
> It would need some human touch, but most of the work will already be done.
By that very loose standard, the "matter of time" ran out 2 years, 6 months, and 18 days ago. On 10th August 2021, OpenAI published its blog post about the Codex model, with a chat interface producing functional JavaScript: https://openai.com/blog/openai-codex
Right now, what I see coming out of these tools (and what I see in the jobs market) gives me the vibe that these tools are very much at the level of "why do we need to hire interns, and possibly also junior developers, anyway?", but mid and senior levels are still better at seeing bigger pictures and subtle issues that both juniors and LLMs have a harder time with… and, indeed, at standard new-programmer questions like "why doesn't my code compile?".
There's this joke about clients not knowing what they want, so they couldn't possibly explain it to an AI and therefore developers will keep their jobs.
Honestly AI works for writing a small function, and it's definitely superior to Google / SO when searching for code examples.
But in the context of a large app with more exceptions than business rules, where you have to take into account legacy code & constraints ... I don't see an AI figuring all of that out, for the simple reason that it's too hard to explain the big picture to it.
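For the small-function case, I mean something like this (a debounce helper in TypeScript, purely illustrative):

    // The kind of small, self-contained helper these tools write well:
    // delay calls to `fn` until `waitMs` ms have passed with no new call.
    function debounce<T extends (...args: any[]) => void>(
      fn: T,
      waitMs: number
    ): (...args: Parameters<T>) => void {
      let timer: ReturnType<typeof setTimeout> | undefined;
      return (...args: Parameters<T>) => {
        if (timer !== undefined) clearTimeout(timer);
        timer = setTimeout(() => fn(...args), waitMs);
      };
    }

It's self-contained, has no legacy constraints attached, and is trivially checkable: the opposite of the large-app case.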
Sounds like the ultimate business idea. Sell customers a magic shovel that eliminates their need to hire shovelers. If it doesn't work, just tell them to be more specific with their requests to the Shovel.
Agreed as well. We are at the very beginning; AI right now is the worst it will ever be.
Those who keep saying "I can't see how my job is at risk" are living under a rock.
I remember watching streaming video in RealPlayer over a 56k modem. It was a complete joke, but you could envision things getting much better.
Is it fair to say TV broadcast and production professionals were in danger of losing their jobs at that moment? Kind of, but they were also the people in the best position to take advantage of the new advances.
Of course, that was predicated on being open to change and not clinging to the past.
Surely, anyone on HN talking about AI right now is in good shape.
We are inside the bubble. There is a huge percentage of even young people who have no interest in any of this; something like 50% of 18-29 year olds in the US haven't even used ChatGPT themselves.
Would love to hear an intelligent reply as to why. This is just my experience in my corner of the universe.
I am not saying it is the end of engineering or that layoffs are happening because of AI. There are huge incentives for investors to make AI tooling as good as possible, because the potential cost reductions are significant. It does not apply to all areas of software engineering, but it is certainly being implemented in acquired companies.