If it makes you feel any better, I think this exec agrees with you that FB should not necessarily moderate. That said, judging by the blowback being received, it looks as though this exec's position will likely not win the day in the long run.
I'm not sure what this comment is trying to get at. It's a blanket call for Facebook to take "accountability", but accountability for what specific actions? Private corporations will always have control over what goes on with their platforms; if the US government ran a version of Facebook, it would be a different story.
Like, there's a difference between removing material that's actually illegal and removing material that they just don't agree with as a company. I think the difference is important.
Right, and I think the problem is that we only use one word for both, while at least in the "western" world I imagine a lot of people are OK with the former but not the latter. Saying something like "you can only censor material which is explicitly illegal" doesn't help either, because, as you said, that's still censorship.
You've implied two separate false equivalences here:
• That "censorship" necessarily and only means "removing obvious child pornography", AND
• that in order to "take accountability", a company must "hire a ton of people".
You haven't established why either of those equivalences is a reasonable interpretation of the parent statement. To this casual observer, it looks as though you are deliberately conflating extremes in a rather disingenuous fashion.
No, the poster is not implying that "'censorship' necessarily and only means 'removing obvious child pornography'". They're implying that removing child pornography is one form that censorship might take, and that entirely removing the ability to censor would remove the ability to censor child pornography. You might have valid objections to this, but it's not the same as what you've written.
As to the second point: either an organization has the manpower to defend the content it is "accountable" for, or it will be unable to defend itself when necessary. It seems pretty clear to me that only organizations with decent resources would be able to moderate content under an "if you censor, you're accountable" regime.