> The goal, the company said, was to avoid a race toward building dangerous AI systems fueled by competition and instead prioritize the safety of humanity.
> “You want to be there first and you want to be setting the norms,” he said. “That’s part of the reason why speed is a moral and ethical thing here.”
Clearly they have either not learned or have ignored the lesson from Black Mirror and 1984: others will copy and emulate the progress.
The fact is that capitalism is no safe place to develop advanced capabilities. We have the capability for advanced socialism, just not the wisdom or political will.
(I’ll answer the anonymous downvote: Altman has advocated giving equity as a UBI solution. It’s a well-meaning technocratic idea to distribute ownership, but it ignores human psychology, and it has already been attempted in practice in 1990s Russia, with obviously unfavourable outcomes.)
>Then following up with whatever your answer is: Why are you picking and choosing which fictions are reasonable?
This is arguing in bad faith. You don't care what their answer will be, you have decided that they are absolutely picking and choosing, and will still accuse them of as much even if their answer to your first question is, "Yes".
You're right that I don't care, because it has already been decided that Orwell is representing the future if things go "The Wrong Way (tm)", buy 1984 at Amazon for $24.99, world's best selling book. Or more succinctly to OP, "The Capitalist Way (tm)".
It's okay to decide that something isn't worth arguing against, and to spend your time in a way you find more productive.
Having articulated an argument (which you absolutely did), it's not okay to try to retcon that you were just trolling and everyone else is the fool for having taken you seriously.
"The only thing stupider than thinking something will happen because it is depicted in science fiction is thinking something will not happen because it is depicted in science fiction."
> Maybe because they are more digestible than reality. Reality is much much worse.
That makes it infinitely worse, because ANY work of fiction will inevitably fail to cover every minute detail that reality mandates be covered, even the extremely rare and bizarre. And it is those one-off rare events and coincidences that lead to significant global change. (See the assassin buying a sandwich and the consequent assassination of Archduke Ferdinand.)
Fiction allows for ideas to exist in a vacuum without any challenges from the outside. It allows for the perfect execution of said ideas without diving into the technical details for said implementations. It allows for the assumption of zero external AND internal resistance, & zero internal schisms. It treats irrational events as impossible to manifest, and coincidences as oddities instead of common occurrences.
In short: ideas from fiction should be treated like the simplified universal laws of physics that are commonly shown to the mainstream - idealistic, only tangentially related to the actual observed/calculated models, and abstracting over the complicated implementations underneath them.
Do we have the capability for advanced socialism? Because I recall all the smartest economists circa 2021 saying inflation wasn't a concern: it was transient, it was only COVID-affected supply chains. In reality we are in a broad, sticky inflation crisis not seen since the 70s, which may be turning into a regional banking crisis.
It's difficult to believe we have reached advanced socialism capabilities, and all of the forecasting that would require, when we don't even understand the basics of forecasting inflation 1-2 years out.
The ambiguity of “advanced socialism” is problematic for any meaningful debate, so I apologise for that.
I was meaning something closer to “we have the resources and technology (in this advanced era), just not the wisdom or political will”. The actual nature of what could be provided is up for debate, but if we’re looking at mass unemployment in 2 decades’ time, perhaps it’s a conversation worth having again.
I agree it’s worth looking at the history, and not repeating its mistakes, though at the same time this is a new situation, and it will continue to be new into the future, so sticking to heuristics may not serve humanity as well as being open-minded on the policy front.