> The only power that the content moderator has, is to delete a post.
This surprises me, but it is consistent with Facebook's viewpoint, where the user is the product. You can fire a customer, but you can't fire your product.
So maybe it is to be expected that Facebook implemented moderation as a QA process. It would probably be against their interests to police their community in a way that makes sense, with warnings, temporary bans, permanent bans, and so on.
> "You can fire a customer, but you can't fire your product."
On some level, money is morally neutral, but it's not as if Facebook wouldn't have enough "products" to sell if they dealt more decisively with those who violate their TOS.
That also contradicts an earlier statement in the article:
> The moderator not only has to decide whether reported posts should be removed or kept on the platform but also has to navigate a highly complex hierarchy of actions.