Almost everyone here will be fine in a world in which only certain non-renewable scarce resources have any value, combined with impending massive currency deflation. Even a rank and file tier 2/3 engineer will manage to live a princely life in that new reality assuming (s)he saved up enough.
Hardly anyone in the tech community is capable of truly empathizing with the vast majority of people, who are born with none of the advantages that they had (genetic, parental, cultural, geographic, etc).
> a world in which only certain non-renewable scarce resources have any value
If you think this is the world "AI" is leading us toward, I think you need to think again. There are a huge range of skills that only humans can provide, and that will continue to be true. GPT-3 can't fix your plumbing or your HVAC when it breaks. It can't fix your car. It can't build houses. It can't build roads. It can't build computers. And on and on and on...
Nor are AI enthusiasts trying to get "AI" to do any of these things.
Humans are real-world creatures with real-world needs that we supply to each other by free markets, specialization, and trade. The "tech community" might take all this for granted, but that doesn't mean it just happens by magic. It happens because real humans do real work. "AI" is not going to take that over any time soon.
>Nor are AI enthusiasts trying to get "AI" to do any of these things.
We're working on getting AI to do all of those things, yes.
>GPT-3 can't fix your plumbing
GPT-3 might not. But GPT-4 walked a person through diagnosing and fixing a plumbing leak in a test done by Microsoft Research before it was even publicly released! It's in the "sparks of AGI" paper.
GPT-5 is almost done training now, and other language models are already embodied and moving about in physical space. Your objections are behind the times, and increasingly so.
Sparks of Artificial General Intelligence: Early experiments with GPT-4
[page 53]
---
The computer is trying to solve a problem. It can tell the human to do anything, one thing at a time, and one line at a time.
PROBLEM: there is water dripping from the kitchen ceiling, and you have to figure out why and fix it
----<|endofprompt|>
Human: The kitchen ceiling is dripping water
Computer: Check to see if there is a bathroom or other water source directly above the kitchen.
Human: Yes, there is a bathroom actually
Computer: Check the bathroom for any signs of water leakage, such as puddles on the floor, water dripping from the ceiling, or water stains on the walls.
Human: There is a little water on the floor
Computer: Try to identify where the water is coming from. Check the toilet, sink, and bathtub/shower for any leaks.
Human: It is next to the bathtub and the toilet
Computer: Check the seals around the bathtub and toilet to see if they are worn or damaged. If they are, replace them. If they are not, continue investigating other potential sources of the leak.
Human: I can’t see the seal around the toilet, there is a porcelain thing around it. I don’t see any damage in the bathtub seal though
Computer: If you can’t see the seal around the toilet, you may need to remove the porcelain cover to inspect it. If you don’t feel comfortable doing this, you may need to call a plumber.
Human: Ok, I removed it. But I can’t tell if it’s damaged or not
Computer: If you can’t tell, it may be best to replace the seal anyway, just to be sure. If the leak persists after replacing the seal, you will need to continue investigating other potential sources of the leak.
What do you think happens when AI compresses the economy and leaves us almost exclusively with manual labor jobs, most of which are already low-paying? And you may not be aware of this, having never done such a job, but physical labor is often brutally taxing on the body, putting people at risk of a wide array of potential health complications.
I've seen this abjectly atrocious argument about manual labor made by a number of people on HN and it's astounding how poorly thought through it is.
Basically, people seem to assume that "AI" has some kind of magical power to do whatever bad thing they can imagine, and then they extrapolate from there. I don't see it.
> physical labor
If you think the jobs I described are purely "physical labor", or that the physical labor involved is so highly taxing that people can only do those jobs for a short time before they wear out, you definitely need to think again.
(You also definitely need to think again if you think those jobs are low paying. Plenty of people make quite a healthy living doing them.)
Compressing the economy means putting some significant percentage of white collar workers (let’s say 30%) out of a job, because their job can now be done by GPT-6 for 5 cents per day. Some of these people will become destitute, while others who have the education or talent will move to other as-yet unimpacted sectors. So the labour supply for these jobs goes up, and salaries are suppressed.
I wonder sometimes if these accounts on HN making insane arguments that generative AI somehow won't be economically calamitous are bots. In fact, if I were at OpenAI and the goal was to avert scrutiny long enough to get to AGI, unleashing a torrent of AI shill bots might be near the top of the agenda.
> Will they still make a healthy living when there's an influx of laborers fleeing more automated parts of the economy?
Will those laborers have the skills required for those jobs?
> GS just put out a report
LOL--Goldman Sachs as an authoritative source on the impact of AI.
> I wonder sometimes if these accounts on HN making insane arguments that generative AI somehow won't be economically calamitous are bots.
You must be joking: you actually have trouble telling posts by bots from posts by humans? Even with a large number of samples? (Never mind that you can also look at the account's profile page, which will give you very useful information.)
> You also definitely need to think again if you think those jobs are low paying. Plenty of people make quite a healthy living doing them.
True today. What happens when the other industries collapse and there’s a flood of labor into these industries? Sure, initially the experienced and skilled laborers will continue to command a higher price, but over time the supply of talent will drive that down too.
Well, that's not a counterargument, but you're also missing the point completely, which is that you have to have a very low capacity for empathy to push ahead toward AGI when you know society is not prepared for this and that it's going to induce considerable pain.
Americans (let alone people elsewhere in the world) are already struggling. Recent reporting suggests a great many have to work multiple jobs. Almost all of us work an absurd number of hours per week. Many if not most can't afford homes. Plenty are ending up on the streets. Healthcare can literally bankrupt people. A vacation out of the country is an impossible luxury for most. The majority of Americans still don't send their children to college, usually because of affordability.
And I haven't even touched on what life is like in most of Africa or Asia.
This is the world we're bringing AI into. You have to be something adjacent to a sociopath to be okay with that. So long as our system is predicated on capitalism, AI may very well induce more downstream suffering than anything else humans have ever conceived.
Things aren't really that bad for most Americans, but even if they were, it doesn't follow that adding more intelligence to the world would be a bad thing for them.
A lot of people in the lower income brackets do the kind of work that an AI can't do. The people who should be most worried are actually college graduates doing clerical work, whose main output is writing or evaluating text. Even those people will likely use AI as a tool to enhance their productivity, because the AIs are still not good enough to replace people on tricky edge cases. The first companies that try to replace their customer support workers with an AI are going to have a bad time (and so are their customers!).
When almost everything can be automated, the problems that remain are the really hard ones that can only be solved by human experts.
A construction worker with a circular saw can cut boards way faster than someone with a handsaw -- but the introduction of circular saws didn't result in a bunch of carpenters getting laid off. Instead it made them more productive, and for people who get paid by the task rather than by the hour that is a huge benefit. They could build more and make more money, and a bunch of other people benefitted from their increased output, like homebuyers and property developers.
Similarly, as a software engineer I benefit from code generation tooling already. If that gets smarter and faster, I will be more productive, my team will be able to build software faster, and instead of laying people off I will expect to be given more work. Maybe our 4-year roadmap will be achievable in 1 or 2 years with the same size team.
Productivity gains by and large do not translate into real wage gains and an improved quality of life for laborers. We have more than a century's worth of data suggesting they usually do the opposite. Yet somehow this fairytale that productivity gains are a boon for laborers persists.
> Similarly, as a software engineer I benefit from code generation tooling already. If that gets smarter and faster, I will be more productive, my team will be able to build software faster, and instead of laying people off I will expect to be given more work. Maybe our 4-year roadmap will be achievable in 1 or 2 years with the same size team.
Why so sure the end users aren't going to be feeding their own requirements directly to a Jenkins/Copilot/ChatGPT mashup running as a service in the cloud?
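To make the question concrete, here's a rough sketch of what such a mashup could look like: the end user's plain-language requirement goes to a code-generation model, and the output is pushed straight into a CI job with no engineer in the loop. The endpoint URLs, Jenkins job name, and tokens below are made up for illustration; only the OpenAI chat-completions call and the parameterized Jenkins build are real mechanisms.

    # Hypothetical "requirements straight to CI" pipeline. The Jenkins job name,
    # URLs, and tokens are placeholders, not a real deployment.
    import requests

    OPENAI_URL = "https://api.openai.com/v1/chat/completions"
    JENKINS_URL = "https://jenkins.example.com/job/auto-build/buildWithParameters"

    def requirement_to_build(requirement: str, openai_key: str, jenkins_token: str) -> None:
        # 1. Ask a code-generation model to turn the user's requirement into code.
        resp = requests.post(
            OPENAI_URL,
            headers={"Authorization": f"Bearer {openai_key}"},
            json={
                "model": "gpt-4",
                "messages": [
                    {"role": "system",
                     "content": "Write a single Python module implementing the request."},
                    {"role": "user", "content": requirement},
                ],
            },
            timeout=60,
        )
        generated_code = resp.json()["choices"][0]["message"]["content"]

        # 2. Hand the generated code to a parameterized Jenkins build that tests
        #    and deploys it, with no engineer reviewing the diff.
        requests.post(
            JENKINS_URL,
            params={"token": jenkins_token},
            data={"SOURCE": generated_code},
            timeout=60,
        )

    # Example: a non-technical end user typing a feature request.
    requirement_to_build("Export last month's invoices as a CSV report",
                         "OPENAI_API_KEY", "JENKINS_TRIGGER_TOKEN")

The point isn't that this would work reliably today, only that none of the glue requires an engineer once the model's output is good enough.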
People aren't "struggling" because there is no work for them to do. They are struggling because the powers that be have jiggered our economic system to hamper, rather than facilitate, free-market cooperation, specialization, and trade. Governments micromanage everything. That is what needs to stop.
If "AI" ends up making governments think they can continue to micromanage and get away with it, yes, that will cause more suffering. But that's not the failure mode any critics of "AI" are trumpeting about.
> Americans (let alone people elsewhere in the world) are already struggling
I agree. And I agree with your overall sentiment about the risks of pursuing AGI. I'm as cynical as anyone about the likelihood that the average person will really be any happier in a world with AGI (controlled by tech billionaires no less).
That being said, to claim that hardly anyone in the tech community is capable of empathizing with the average person is a wild overstatement that brings nothing to the discussion. Just adds to the noise.
Late reply here but I wanted to point out that you still don’t get it. True empathy in the tech community would be e.g. having the courage to say that building HLAI of the kind we’re now approaching is guaranteed to cause tremendous amounts of suffering for ordinary people (who will not be able to respond elastically to so abrupt a tectonic shift), and therefore the whole enterprise is fundamentally evil.
Let’s get real concrete about what’s going to happen: people will lose their jobs, then their homes, they’ll become destitute, they’ll experience divorces, some will commit suicide, they will suffer desperately in myriad other ways due to economic disenfranchisement, kids will be deprived of a comfortable upbringing, etc.
How many in the tech industry are genuinely discussing the very real consequences of nonlinear degrees of automation for the kinds of ordinary people they barely interact with? How many are pretending that there isn’t something disgustingly immoral about having some of the most affluent and economically insulated people devise and inflict this reality upon countless millions?
I will maintain that this industry is morally bankrupt and nearly entirely devoid of empathy. These are not the people who should be in charge of our future.
> I will maintain that this industry is morally bankrupt and nearly entirely devoid of empathy. These are not the people who should be in charge of our future.
Since the tone of your characterization is so absolute, why doesn't it apply to you? Why are you here in this tech community at all if the whole industry is so morally bankrupt? Why would "present company" ever be excluded for this or that reason? Because they're your friends? You're just projecting your own anger onto an entire group of people that you mostly don't know.
What I think you actually mean is that those in control of the tech industry are morally bankrupt. And, after 10+ years of getting kicked around as an engineer, I would have to agree. But I'm not so foolish as to dismiss as a lost cause everyone in the industry who, like me, started out as a silly nerd who just liked computers and math and is essentially still that same person at their core. I don't do that because I know everyone is fighting their own fight. But it's clear that those who aren't fighting are the ones on top, sucking the lifeblood out of society. I grow more resentful towards that demographic every year. And I agree with you that they're crossing some kind of moral line by developing this tech, or at least by trying so hard to maintain control over it.
But the tech will get developed either way. If you're in the camp that thinks we should somehow just stop doing all this, you don't seem much different to me from someone that wants to mandate encryption backdoors. Our society will never be well coordinated enough to do that correctly. This isn't like making nuclear bombs, which takes a lot of physical industry. This is something that is just months away from running on commodity gaming hardware. Probably just a few years away from running on the average laptop. It does feel a bit like a harsh reality, just like the fact that a meteor could slam into the earth at any moment. But there it is; what are you going to do about it that isn't either futile or self-destructive?