I made some perhaps overly hasty generalizations. HN is a community heavily skewed toward belief in transhumanism (especially the belief that AGI is coming very soon), and moreover toward the view that high rates of technological unemployment are upon us, necessitating a UBI or stronger reforms so that the general-purpose labor saving of AGI doesn't crowd humans out of the labor market and cause social unrest.
I can't imagine what else the GP could have been referring to other than mass automated labor across most sectors -- hence either AGI or lots of narrow AI.
I think you accurately characterize a fringe of HN, but support for UBI, etc., isn't all (or even mostly) about an imminent AI singularity, though certainly some of it is based in that belief. Much of it is a belief that the Band-Aid over traditional capitalist economic relations provided by the modern mixed economy is fraying. That system, particularly as implemented in the US, remains heavily based on the assumption of steady wage labor with one primary employer (itself almost a pre-capitalist, feudal/manorial model) as the normal means for people to earn income. Given present levels of automation and growing preferences for ad hoc interactions, this fraying necessitates a more efficient model of support to cushion the blows of long-term and short-term dislocation in an increasingly automated and increasingly fluid labor market -- a need which will only get more acute with increasing automation (even fairly mundane automation that falls well short of AGI).
> I can't imagine what else the GP could have been referring to other than mass automated labor across most sectors -- hence either AGI or lots of narrow AI.
* Renewables generating the majority of power consumed world wide
The second is a real possibility in the next 20-30 years; the first and third are significantly more doubtful.
Even assuming all three come to pass soon, I don't understand how they'll fix public choice. If anything, autonomous electric vehicles at least will be a boon to private enterprise, which I don't think is what you had in mind.
I have made a comment or two recently about how unlikely AGI/superintelligence is in the next 50+ years, and those comments received a net positive response, so I don't think HN is as biased toward AGI as you might think. Then again, automating labor tasks doesn't require AGI -- just specialized programs that work well. Advanced automation is very likely to have an impact in the next 5-10 years.
Advancing automation has been having a progressive effect on labor for several decades, and by all evidence will keep doing so for the foreseeable future.
That's a good point. I should have said "a more significant impact". It could be argued that the impact of the last few decades was offset by an increase in support and tech jobs. I think 5-10 years from now is when a lot of the current tech and support jobs will also be filled by automation -- or when we see an "automation of creativity", where you don't have a broad general intelligence, but you do have programs that present a few good options that could be considered creative depending on the domain.