I disagree with the notion that anyone anywhere can do the job. These moderators don't just review the rants your grandma posts about shopping malls. There's a documentary about what content moderators actually do, and they end up seeing horrific material: (sexual) child abuse, gore, and all the worst humanity has to offer. Many of these people end up depressed or in therapy.
I don't know whether Facebook has its act together here, but I think it's unethical to have people review random user-uploaded content without close access to a mental health specialist.
You can tier content by degrees of intensity a reviewer might see (so you don't expose every reviewer to the very worst on a regular basis), but no algorithm can reliably identify the nastiest of the nasty content. The algorithm sees "government pedo club" and flags it as fake news, but who knows what the shared content actually contains? It could be a conspiracy nut; it could just as well be actual child porn. The probability is low, but you still need someone standing by, in my opinion.
Isn't it a bit silly that these Big Brother companies (e.g. Facebook, Twitter) are so afraid of global communication being taken over by small yet persuasive groups that they set about taking over global communication with their… small yet persuasive company?
…then it's not content moderation, which AFAIK companies like Facebook and Google require to take place on-site, in a controlled environment with no electronic devices, because of the data security issues involved.
Facebook's moderation team may simply be overwhelmed with Covid-related misinformation. A lot of content moderation is outsourced to countries like India, where productivity and availability may have degraded due to the Covid wave that ravaged the country.
Many firms still have backlogs from the Covid disruption.
If you have worked at, or started, any half-decent-sized company, you'd have known