Facebook has absolute control to limit the spread of a story or news on their platform.
Facebook has absolute control to outright block specific content based on an algorithm they control.
Facebook can and will editorialize content posted on their platform.
Facebook has absolute control over the content that users see.
Can someone, anyone, please explain to me why Facebook is not a publisher? Why should Facebook continue to receive Section 230 protections? They are gatekeeper, publisher, and editor.
It also makes users responsible for their own words, which is legitimate. Without 230, Facebook would become liable for defamation lawsuits unless it basically censored everything but the most bland content.
You could "hack" the law by saying it only applies to companies with a market cap below $100 billion. This is how some laws have been written in the past to specifically target particular companies.
If you change a law like this to specifically attack one business, you're basically dumping the essence of the rule of law. Generally speaking, people are angry at Facebook's moderation decisions for political reasons. While I think they should abandon the platform, users would prefer that Facebook just enable their tribe to win. It's a silly situation. However, making Facebook liable for defamation as a publisher will just kill the platform entirely and that's not what these users actually want.
> Without 230, Facebook would become liable for defamation lawsuits unless it basically censored everything but the most bland content.
> making Facebook liable for defamation as a publisher will just kill the platform entirely and that's not what these users actually want.
What the law should be changed to say is something like "if you censor any legal content based on its message, you become liable for everything". The point is that since being liable for everything would kill them, they'd stop censoring instead.
It sounds like you have never been a moderator for an online community of at least moderate size.
If you've never been in that role, it's easy to underestimate not only the sheer amount of garbage you need to remove so regular users won't be turned off and leave, but also the dedication and finesse of some trolls, who love nothing more than asymptotically approaching that line without ever fully crossing it.
What you propose is effectively giving these guys a doomsday device (i.e., "if you censor me erroneously, your platform will be dead"). In that scenario, there is a 0% chance that a place like Facebook would not turn into an absolute cesspool.
How about if instead, we nationalize Facebook. The public square should not be owned by a private corporation. If that's not possible, yes: Facebook should be destroyed.
I'm from Brazil, where we have had more than 3,000 state-owned companies, handling everything from oil, mining, and energy to banks and, in the past, even telecommunications. It was a disaster: you had to wait about a year to have a landline installed, it cost almost as much as a car, and you had to declare it on your taxes!
Social media should be a protocol, in that way it will become naturally decentralized and way harder to censor or control, like email.
It is apparently already like that with Facebook, but the government has a layer of plausible deniability. If the FBI can show up and tell Mark Zuckerberg to censor a story, that means Zuckerberg is effectively a state actor.
We could do a Bell Labs and force each nation's Facebook to be spun off into its own entity, with a federated interop protocol to connect them together. I have no delusions that such a thing would ever be allowed to happen, though.
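To make the "federated interop protocol" idea concrete, here is a toy sketch in Python (all class and domain names are hypothetical, not any real protocol's API): each server hosts its own users, and posts are routed across servers on the domain part of an address, the way mail is relayed between independent email providers. No single operator controls the whole network.

```python
# Toy federation sketch: independent servers, addressed like email.
# All names here are made up for illustration.

class Server:
    """One independently operated instance hosting its own users."""

    def __init__(self, domain):
        self.domain = domain
        self.inboxes = {}  # user -> list of received posts

    def register(self, user):
        self.inboxes[user] = []

    def receive(self, user, post):
        self.inboxes[user].append(post)


class Network:
    """Directory of servers, analogous to DNS/MX lookup for email."""

    def __init__(self):
        self.servers = {}  # domain -> Server

    def add(self, server):
        self.servers[server.domain] = server

    def deliver(self, post, followers):
        # 'alice@a.example'-style addresses: route on the domain part,
        # so each server only handles delivery for its own users.
        for addr in followers:
            user, domain = addr.split("@")
            self.servers[domain].receive(user, post)


net = Network()
a, b = Server("a.example"), Server("b.example")
net.add(a)
net.add(b)
a.register("alice")
b.register("bob")
net.deliver("hello fediverse", ["alice@a.example", "bob@b.example"])
```

The point of the design is that censorship requires every receiving server to cooperate; any one server can only refuse delivery to its own users, just as one email provider can't delete a message another provider has already accepted.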
Not really. As the owner of a site, the government controls its own speech and is not under any obligation to show “both sides” of an argument. Sure, it can’t only show your comment vs mine, but it can definitely choose which propaganda to spread.
Exactly. As long as Facebook is a private corporation, the First Amendment protects their right to arbitrarily censor the speech on their platform however they wish.
The way state-controlled media censorship has turned out in other countries is well-known, but having our public spaces controlled by a patsy of the government, who is not subject to the same restrictions as the government even in theory, is not a better outcome.
Nationalizing social media is a solution... It would never happen, even though many problems with the current system would be solved: freedom of speech, Fourth Amendment rights, and it wouldn't necessarily have to be profitable, so it defeats the current ad-driven hate machine.
Third party doctrine would no longer apply and so on.
There is precedent for it with the USPS. The national postal service was created with an awareness of how much room for corruption there would be if a service like that were in private hands. Benjamin Franklin knew this firsthand, because he often read mail being passed around for intelligence purposes.
Benjamin Franklin was like an honest Mark Zuckerberg in that respect.
I find the replies to this fascinating. I don’t know if nationalization would work with FB, but it is interesting what a taboo it appears to be to even bring it up.
Personally, I think 30 years ago the USPS should have made a nationalized email service at the very least.
Now we just need to convince folks that Facebook is a town, not say... a digital publication. Then it can be nationalized as was originally suggested. Although maybe the government should try using eminent domain before outright nationalization?
Still, the whole idea seems like a stretch and a crazy one at that.
"The more an owner, for his advantage, opens up his property for use by the public in general, the more do his rights become circumscribed by the statutory and constitutional rights of those who use it. Cf. Republic Aviation Corp. v. N.L.R.B., 324 U.S. 793, 65 S.Ct. 982, 985, 987, note 8, 157 A.L.R. 1081. Thus, the owners of privately held bridges, ferries, turnpikes and railroads may not operate them as freely as a farmer does his farm. Since these facilities are built and operated primarily to benefit the public and since their operation is essentially a public function, it is subject to state regulation."
The argument here would be that Facebook is built and operated primarily as a benefit to the public (i.e. they draw income by being a benefit to the public), they are essentially a public function, and are subject to state regulation as such.
Facebook would remain privately-held, and could either remain open to the public and subject to constitutional scrutiny, or they could close off and become some kind of a private club like Clubhouse.
I haven't thought through the ramifications of doing so, but I think it could be legally supported.
It's probably more like a utility at this point and perhaps should be regulated as such. Utilities generally don't interject themselves into people's communications, and they can't refuse people service except for carefully regulated reasons.
The only publishing-like activity social media sites engage in is the curated news feed. It's not clear that the service hosting accounts and people's communications as a utility has to be the same as the one providing curated feeds. That's one way to approach this, anyway.
Saying that it's a communications utility may be the best argument for regulation. However, I don't think Facebook provides anything essential and so it's much closer to entertainment/media in my mind. I'll continue to listen but they seem to be in the clear for now.
> However, I don't think Facebook provides anything essential
I'm speaking to social media in general. Is it not undeniable that if there's a public outcry on social media, politicians pay attention? Does it not then follow that access to social media is fairly important to having a voice in our politics? Is it not also true that something like 50% of Americans get their news from Facebook? These are only surface-level facts; there are deeper issues, like the ability to organize political campaigns for a cause being severely hampered without access to social media.
I don't know what your threshold for "essential" is, but access to these platforms sounds pretty important to me, and I expect these trends will only increase.
The court pointed out that the more an owner opens his property up to the public in general, the more his rights are circumscribed by the statutory and constitutional rights of those who are invited in.
Yes, but which country gets to nationalize it then? I'm assuming you mean the USA, but where does that leave the users of all of the other countries that also use Facebook?
The same problems with moderation would still exist, and as the previous commenter I think correctly pointed out, the various tribes will all still want the 'other' side's 'misinformation' censored and their own 'truths' put out for all to see. Zero moderation is not a workable solution, because things would quickly escalate on the site (see what happens to AI chatbots when exposed to an unfiltered internet). They have to do some moderation, and the more they do, the madder people get; the less they do, the worse the site becomes. They therefore reach a balanced state that has BOTH angry users who feel their side is being censored AND widespread posts with dis/misinformation. It's the shittiest equilibrium.
Indeed. The word 'platform' doesn't appear in the entire section. The only time the word 'publisher' appears is:
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
And that's the trillion dollar line. Its justification is that, as the owner of a digital platform, you can't get sued by Joe for defamation because somebody said something defamatory about him on your platform.
Publishers can, and do, get sued for defamation, because they're responsible for what they publish. Now let's imagine a hypothetical extreme where OnlineSite censors any post related to Joe except those that defame him. It's quite clear that the site itself is now actively defaming Joe, but it can hide behind that line in Section 230 and remain legally immune.
In short, the more censorship a site engages in, the more it is effectively publishing its own views and those it de facto endorses.
The question is when is “information provided by another information content provider?”
Suppose Facebook (corporate) generates content, sells it to an independent company, then leases it back and publishes it?
Suppose Facebook uses millions of user-submitted pictures to train an AI, then publishes the resulting images? Are those “information provided by another information content provider?”
Suppose Facebook copies and pastes user content, but claims to be the author and puts “written by Facebook and representative of the company’s official views.”
Suppose they have samples that say A and B, and they choose to show A and not B?
Suppose they like the gist of A, but it would be more compelling if they edited the image, changed the wording, and cut out other parts?
Suppose they like A, but hires staffers to completely rewrite the story, like Disney redoing Cinderella?
The legalese defines this: "The term “information content provider” means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service." In all your examples except the 4th Facebook would clearly be the publisher/provider, and responsible. The 4th is where things get tricky largely because of yet another part of the law:
"No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected"
Wrangling over the term "good faith" is going to be a lot trickier, especially when plausible deniability enters into the picture.
To my knowledge Hacker News does not publish misinformation that labels facts as misinformation. This is what Facebook, Twitter and YouTube do repeatedly. They’re publishers full stop.
HN moderation will [dead] the content. Different mechanics, but same effect - basically kills its distribution.
Of course, there are valid questions about prevalence of that, and there’s likely a point where moderation becomes publishing. But if that point was mere existence of moderation, every single platform is a publisher.
You seem to be thinking of author, not publisher. Lots of publishers publish material they didn’t write themselves, they’re still publishers when they do that.
Just as an offline platform can make all sorts of bureaucratic rules designed to censor those it doesn't like, this problem is far worse on a software-driven platform.
Imagine facebook wants to censor anti-Biden, or anti-Trump, posts.
They needn't design an algorithm that does explicitly that - that could be scrutinised under review.
Instead: design an 'anti-spam' algorithm.
Within this anti-spam algorithm, characteristic Y is flagged.
Characteristic Y just happens to be heavily correlated with anti-Biden/anti-Trump posts. A bit of filtering later (for 'anti-spam measures', of course), and you have achieved your censorship without being too overt.
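The laundering-through-a-proxy pattern above can be sketched in a few lines of Python. Everything here is hypothetical (the signal names, the threshold, the marker `hashtag_xyz`); it is not anyone's actual code, only an illustration of how a nominally neutral "spam" rule can encode a viewpoint via a correlated feature.

```python
# Toy "anti-spam" filter whose flagged characteristic happens to
# correlate with a political viewpoint. All names are made up.

SPAM_SIGNALS = {
    "buy now",      # genuine spam marker
    "free money",   # genuine spam marker
    "hashtag_xyz",  # "characteristic Y": a proxy that happens to
                    # track one political tribe's posts
}


def spam_score(post: str) -> float:
    """Fraction of known signals present in the post."""
    text = post.lower()
    hits = sum(signal in text for signal in SPAM_SIGNALS)
    return hits / len(SPAM_SIGNALS)


def is_suppressed(post: str, threshold: float = 0.1) -> bool:
    # The public rationale is "anti-spam"; the effect falls
    # disproportionately on posts carrying the proxy marker.
    return spam_score(post) >= threshold


posts = [
    "Lovely weather today",
    "Thoughts on the debate? hashtag_xyz",
]
flagged = [p for p in posts if is_suppressed(p)]
```

Under review, each individual rule looks content-neutral; the bias lives entirely in which proxy feature was chosen and how the threshold was tuned, which is exactly what makes it hard to scrutinize.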
See Section 230 of the Communications Decency Act: that law provided some outs for internet firms, letting them avoid being legally treated as publishers if they volunteered to do specific things.