He’s laid out five stages where he says specific changes will occur. He’s extrapolating this all based on evidence from just the past few months. This is nonsense. I’m not saying I know we’re not going to be replaced, but neither does he.
For example, he links to an article that he says shows “OpenAI has openly stated its intent to replace developers”, but when you read that article, it’s actually about a report by another company (not OpenAI itself) saying that “OpenAI is pursuing an avenue to make basic coding practices essentially obsolete”. That’s _very_ different than making _programmers_ obsolete.
Listen, if all you do is plug shit together and don’t contribute at a higher level than that, yeah your job might be at risk. But if you use this as a tool to release you from the drudgery of “basic coding”, this could be an incredible tool to make you far more effective.
Almost every time there’s a new tool that gives us new capabilities, instead of firing programmers (except for this one period of COVID hangover layoffs), companies make the choice to do _more_ with the same number of people. (If we don’t, the competition will!) And so the level of complexity ratchets up higher. And who are the people that can manage that complexity? The CEO? The product managers? No. Software engineers.
Unless they actually create AGI. In which case, everyone is fucked, regardless of the industry. At that point, we all have to become massage therapists or waiters.
AGI is a zero-dimensional mythology, like flying cars: more HAL 9000 or The Jetsons science fiction.
Some forms of it will continue to arrive in stages, but it will be difficult to point to exactly when it arrived, or exactly what deserves to be called "AGI".
PS: Waiter robots that are self-propelled drink carts have been done before. It's busboys, maids, and physical lift assistance that will be especially difficult to automate. By that time, we better have such amazing BCI that we've shuffled off the mortal coil and melded with tech. The days of carbon-based life are numbered not just by existential threats like climate change, but because it will likely be inferior to computational life. I'm curious how inorganic life will own physical property including itself, expand, reproduce, etc. Will it jump on the zetacoin mining rage to make money or refactor code made by other programs? Will inorganic life need entertainment?
> Unless they actually create AGI. In which case, everyone is fucked, regardless of the industry. At that point, we all have to become massage therapists or waiters.
GPT-4 has already shown sparks of AGI. It can solve complex problems with high accuracy. AGI is inevitable and will arrive soon, most likely within the next 3 years.
And the jobs you've listed can be replaced by AI too; the only limiting factor at this point is the lack of a physical body. Though massages will still be performed by humans, as some people would prefer that; others don't care and would likely prefer AI over a human, so AI will still be deployed to act as a massage therapist.
We don’t know what’s inevitable in foresight, only in hindsight. Sparks of AGI are sparks, and do not imply an inevitable progression to a particular end state.
I don’t see GPT-* replacing us unless they endow it with volition. If it requires prompts to get things done, some human has to understand the system well enough to provide the correct prompt. Writing code is only a small part of the job. Knowing what code to write, what code to refactor, what abstraction to use that fits with your other abstractions — these things are the real job. Giving GPT volition would make it “self-prompting” and maybe allow it to make these leaps. But volition seems qualitatively different to me from what it’s doing now, and just improving its current capabilities won’t necessarily lead to it.
I’ve been really worried about my future, career-wise, for the last month or more because of ChatGPT and the incoming AI wave of innovation. The issue I have is: what’s the obvious choice for a career pivot? Do I switch to something like nursing, where physical labor is involved? Do I get into networking and deal with setting up physical servers? I don’t have a great answer, so it seems to me that the best course of action is to continue trying to become a better software engineer and see what happens. I don’t know if anyone else feels more or less in the same boat or not…
Carpentry would be my recommendation. It even feels a bit like working on a React app, because you are constantly making new tools to make your work easier.
The simple, brutal truth is this: The age of human accomplishment is almost over. It could be as early as 10 years from now, or it could take another 10 years after that, but sooner or later, "people doing things" is going to end. There will be no field of expertise where even the best humans can compete with AIs.
Instead of worrying how you can avoid the inevitable and win tomorrow's rat race one more time, I recommend you prepare yourself mentally for this. The vast majority of people will no doubt be in denial long after they have already been made obsolete. Defying mass sentiment and recognizing the truth is perhaps the last great truly human act left for anyone to do.
Hmm, AI may be poised to make software devs, copywriters, and accountants redundant, but it's not likely to make a good salad, play a killer live concert, or set a broken arm. At a certain level I think I'm fine with being made redundant (I don't hate my job, but I wouldn't mind retiring early - unless that was to sleep in a cardboard box). So for me the question is can our society survive AI replacing a lot of lucrative white-collar gigs - maybe?
I don't think it'll be 10 or 20 years. I think it'll be more like 2% of good jobs (or 10%+ in random sectors) will disappear on a yearly basis for the next 50 years.
I don't know, honestly. I had a period recently where I wasn't sure if my business was worth continuing, or able to, and I spent some free time thinking about what I could do next and what seemed like a good call for the next 10-20 years. I have to admit all of this would have changed my thoughts significantly. For now I feel slightly more comfortable over the short term than I did several months ago, so I've decided to watch and see how things shake out. I have to say I am pessimistic about a healthier relationship between labor and capital coming as a result of this, but that is just a feeling and not backed up by much.
This is extrapolating some amazing things without evidence. People thought we’d have moon bases and flying cars by now, too. There have been some amazing emergent behaviors as models get bigger, but there is no guarantee that the trend will continue.
> People thought we’d have moon bases and flying cars by now, too.
Yup. And the reason we don't have them is because they don't make economic sense – not because they are technologically impossible, or even particularly difficult. Flying car prototypes that work have been around for decades, and moon bases would have been feasible in the 1970s.
But replacing most knowledge workers at 0.1% of their cost certainly does make economic sense, so I have no doubt it will happen even sooner than the most daring predictions suggest.
This article is garbage. Look at it carefully. Where is the reasoning? Where is the evidence? It's just one magical claim after another.
Compilers didn't replace coders, they just changed the nature of coding. AI won't replace coders either, but it may change the nature of coding in some ways.
A CEO of a company could theoretically manage directly every single person in his workforce. But he doesn't. Why? Because managing people requires that you understand what they do and how they might be doing it wrong. You need to track their status. Otherwise, things get completely out of control and your plans go nowhere.
Now imagine a perfect AI "coder." How do you manage it? You will have to know how to talk to it and how to understand its behavior. But this is exactly how human coders deal with compilers and libraries today. We will not see ANY replacement of engineers with AI. What we will see is a new category of high-level programming via prompt engineering.
Programmer tools have not, up until now, ever led to a contraction of the software industry. There are more programmers out there than ever before. Advances in technology merely expand what can be done with tech.
Imagine one more thing. Let's say you find a genie lamp and the genie inside gives you three wishes. You COULD make those wishes yourself. But there are often unhappy side effects. What you need is to go find a Genie Whisperer-- essentially a coder who understands genies; how they think; what they can do; what the pitfalls of dealing with them are. The Genie Whisperer will translate your naive desires into carefully couched language that maximizes the probability of getting what you want.
You see? EVEN IF AI IS LITERALLY MAGIC business people will discover that they should not deal directly with it.
> Compilers didn't replace coders, they just changed the nature of coding. AI won't replace coders either, but it may change the nature of coding in some ways.
This is a poor comparison. AI itself does the coding. GPT-4 can already write code from scratch with high accuracy. At this stage, it still needs a person to review it, but since AI is growing exponentially and can learn how to review code, this will definitely change soon.
And no, this isn't limited to simple coding problems. GPT-4 has been shown to be able to solve complex problems. We're still in the very early stages of AI and it can already do at least 50% of the work by itself, and that's a low estimate.
You are using "coding" as a magic word. Stop that.
My first job was writing machine language without an Assembler, man. THAT'S coding. Then I used an Assembler, and the machine code was written for me; all I had to do was specify it in a language called "6502 Assembly." Then I moved on to C, and the Assembly code was written FOR ME by an amazing thing called a COMPILER. Then I moved on again.
The meaning of "coding" is simply the act of precisely specifying what you want the computer to do, at some level of abstraction. GPT-4 is not "writing code" in the same sense that a human does. It is an automaton performing an elaborate algorithm. Functionally, it's doing what a compiler does, just at a different level of abstraction.
In order to address my argument, you will have to understand the social role of a programmer in a human system. That's the crux of the matter, not arranging text in a file.
>GPT-4 can already write code from scratch with high accuracy
>GPT-4 has been shown to be able to solve complex problems
In GPT-4's own paper, which as always will reflect it in the best light possible, it passes a whopping 1/4th of LeetCode "medium" problems, still messes up a lot of the easy ones, and basically fails all the hard ones.
Those are toy examples mostly aimed at students.
Why do so many of you just rampantly make shit up when talking about this?
This is a poor comparison. Unlike crypto, AI today already has real-world uses. And AI is growing exponentially; it's getting better and better each month. Models can also be trained for specific purposes, resulting in more accurate outcomes. Many jobs today can already be completely replaced with AI.
OP sets up the article by claiming that everyone is myopic, but then falls way short himself by looking only at programmers.
> Engineers are myopic and don’t realize they constitute a massive cost center that management is salivating to slash.
What is the value of a software company where these managers work when everyone can write their own OS/office suite/browser at home?
And if image/video generation keeps up with GPT, we will see Game of Thrones contenders being made by a one-man operation with a good imagination, essentially killing Netflix and the like.
So what are we left with but compute and hardware? The risk of the future, in my eyes, is that everyone is without a job and the true power is with the few mega companies that provide us the hardware and compute.
Apple and AWS are probably good companies to bet on. If there isn't a complete revolution by then.
My biggest fear is ChatGPT will do all the coding, and we’ll all be pushed further into being testers and QA. The end stage of development where a human eye is a necessity…
It feels like we are all about to become dirt poor at the expense of this system cranking out products as cheaply as possible.
I feel like nobody ever takes these predictions to the next level. If no company will need us, why does anyone need the services/products of all these companies that no longer need software developers?
Why do I need to use the products of a company if I can just tell my AI what software I need?
> Why do I need to use the products of a company if I can just tell my AI what software I need?
Because the AI software can’t produce it itself; it requires access to the capital and raw materials to produce it (or to someone who already has them). The companies, even if they ditch any of their human laborers not needed for resource extraction and production, will retain capital and property interests in raw materials to extract.
(Unfortunately, unless you are profiting from capital, or are one of the residual laborers needed for production/extraction/delivery, you won’t have money for your AI to give to those companies for the things you want, so telling the AI won’t help.)
I see your point about physical resources, but I was thinking more about software products. Why pay for, say, MS Word when I can just list all the word-processing features I want to an AI and, tada, there is my custom version of Word?
Why is Microsoft (insert any software company) necessary if they can just replace all developers with AI, is what I was getting at.
> Why pay for, say, MS Word when I can just list all the word-processing features I want to an AI and, tada, there is my custom version of Word?
I mean, if AI gets to that point, you don’t need to pay for Word, sure. (Of course, if AI gets to that point, what are you going to be using Word for, anyway?)
> Why is Microsoft (insert any software company) necessary if they can just replace all developers with AI, is what I was getting at.
Microsoft is a heavy investor in OpenAI, and incorporating GPT-4 in its offerings. The thing they plan on you paying them for in the future is “access to the AI”.
The very first time I ever was paid to write software, in 1982, I asked my client to tell me what he wanted. Then I gave that to him, and he complained he didn't like it. He really wanted something else. He was bad at saying what he wanted, and he was bad at knowing what he wanted. There is an old saying (I think from Jerry Weinberg): Make it possible for people to write programs in plain English, and you will find they are not able to write plain English.
The reason you need an intermediary is that you don't have the patience or training to completely specify your system. And if you don't completely specify it, then you are just going to get what everyone else gets. (If you DO have the patience and training, then congratulations, you are an engineer.)
If you are content with getting something generic, then you don't even need to write software in the first place, just download Word!
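To illustrate with a made-up example (mine, not my 1982 client's): even a request as simple as "sort the customers by name" hides decisions that the plain English never made.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct customer { const char *name; int id; };

/* "Sort the customers by name." Each comment below is a decision
 * the plain-English request left unmade. */
static int by_name(const void *a, const void *b) {
    const struct customer *ca = a, *cb = b;
    /* Case-sensitive? Locale-aware? Surname or full string? */
    int cmp = strcmp(ca->name, cb->name);
    /* Ties? Decide: fall back to id so the order is deterministic. */
    return cmp != 0 ? cmp : (ca->id - cb->id);
}

int main(void) {
    struct customer list[] = {
        { "Weinberg", 2 }, { "Dijkstra", 1 }, { "Hopper", 3 },
    };
    size_t n = sizeof list / sizeof list[0];
    qsort(list, n, sizeof list[0], by_name);
    for (size_t i = 0; i < n; i++)
        printf("%s (%d)\n", list[i].name, list[i].id);
    return 0;
}
```

Making those choices on purpose, instead of by accident, is the patience and training I mean.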
Another common prediction is that articles like these will be trivial for ChatGPT to author.
Will tears flow at the loss of tech bloggers? How about SEO content writers? In the latter case, copy-paste drivel writers will be the first to lose their jobs.
Programmers may be under the pump in many circumstances. But their skills often involve human decisions beyond producing valid code: decisions taking into account technical debt, business reasons, and UI concerns, or making calculated compromises, such as not pissing off the existing codebase, the people, or legacy systems that can't be moved easily.
ChatGPT literally doesn't care if it gets those decisions wrong. If your AI software engineer turns out to be a Shatbot, where is the accountability?
I am about to finish my bachelor's degree in applied computer sciences and am worried about my future.
Sure, this whole process will take some time, like 10-20 years, in which I can build up some skills.
But as life in general becomes more expensive and I want to make good money (let's be honest, we all want that), I am considering doing a master's degree in either
- Business Informatics / (IT) Project Management, or
- Artificial Intelligence
As I am not the strongest in maths and I like the idea of leading a project, the first one would be my preference.
Do you have any suggestions on this? Will project managers also be replaced to the same extent as devs?
I think ultimately what I am looking for is "some cool tech job that gets me paid 90K but without having to be Dijkstra-smart"
Leverage what you have (Bach. C.Sci) to learn orthogonal things that you don't already know.
Eg: Mineral resources, energy, and farming will be important in the near and not-so-near future.
Rio Tinto and others are running robot fleets that move > 800 million tonnes per annum of raw materials - there are raw processing circuits, refinement, and concentrate processing to learn - a mixture of Comp Sci + mech engineering | chemistry + GIS, etc.
I hear things are happening in renewable energy at a great pace, and farming has more data streams and Ag-bots than ever before.
I'd suggest getting a foot in the workforce for the pragmatic exposure to new domains, then after four or five years picking up whatever further AI (or other) skills you need - keep in touch with CS as you pivot.
IF all this were to come to pass as described, on that ten year timeline, I wonder what would be the fate of programming languages? New dominant coding languages come along roughly every five years. Would generative AI stick with one language? Would it learn to prefer new languages over time? Would it write new languages? If not, who would?
I work on a 17.5 million line code base across 4 different languages that I know is not something ChatGPT has trained on, and it won't be, due to commercial interests. Not worried about it.
A lot of the older code isn’t a great example of how to do things. Either too enterprisey, or C/C++ written by newbies coming from a mainframe RPG background.
Agreed, and right from the get-go (the first section on "Addressing AI Fallacies"). Cherry-picking failed predictions (from a far less sophisticated era) makes for a catchy-sounding argument, but has no bearing on the nuts-and-bolts discussion of whether a generative model can really compete with what a skilled human does, in the big-picture sense.
It was a pretty sad write-up, overall. Literally, ChatGPT itself could do better, per one of its answers to a variant of the same question:
“It is unlikely that ChatGPT or Alphacode will replace programmers” because they are “not capable of fully replacing the expertise and creativity of human programmers...programming is a complex field that requires a deep understanding of computer science principles and the ability to adapt to new technologies.”
I can't wait to stop writing repetitive code and get things done faster, with time to market approaching zero. But by this time, I was supposed to be sitting in my self-driving car instead of ordering a taxi.