I won't elaborate on the Stevie Wonder quote. I think it's perfect the way it is.
--
I can, however, elaborate on the subject separately from that quote.
The video talks about the more extreme cases of AI cultism. This behavior follows the same formula as previous cults (some of which are mentioned in the video).
Around 2018, I noticed the rise of flat-earth narratives (bear with me for a bit; this will connect back to the subject).
The scariest thing, though, was _the non-flat-earthers_: people who insisted the earth was round but couldn't explain why. Some of them tried, but held all sorts of misconceptions about how satellites work, about the history of science, and about much else. When confronted, very few people _actually_ understood what it takes to prove the earth is round. They were just as clueless as the flat-earthers, just with a different opinion.
I believe something similar is happening with AI. There are extreme cases of cult behavior that are obvious (as obvious as flat-earthers), and there are subtle cases of cluelessness similar to what I saw in both flat-earthers and "clueless round-earthers" back in 2018. These, especially the clueless supporters, are very dangerous.
By dangerous, I mean "as dangerous as people who believe the earth is round but can't explain why". I recognize most people don't see this as a problem. What is the issue with people repeating a narrative that is correct? Well, the issue is that they don't understand why the narrative they are parroting is correct.
Having a large mass of "reasonable but clueless supporters" can quickly devolve into a mass of ignorance. Something similar happened when people were swayed to support certain narratives out of political alignment. Flat-earthism and anti-vaccine pseudoscientific nonsense are tightly connected to that. Those people were "reasonable" just a few years prior, then became a problem when certain ideas got into their heads.
I'm not perfect, and I probably carry plenty of biases too: narratives I support without fully understanding why, probably without even noticing. But I'm damn focused on understanding them and on making that understanding the central point of the issue.