Are you trying to use this question as a rhetorical device, or are you genuinely asking what Facebook has done wrong?
If you're being genuine, the answer is:
* They allowed advertising campaigns, built with their own targeting tools, that were flat-out illegal (e.g., real-estate ads targeted by race).
* They have failed to meaningfully control illegal uses of their platform. (See: the ongoing Rohingya crisis.)
* They were far too liberal with access to user data, in a way that is currently driving a debate around the legality of doing so. (Cambridge Analytica, etc.)
* They have built tools that are ideal for doing things like illegally swaying elections. ("Russia doing stuff".)
I can only assume there are many other illegal things Facebook has made possible that have gone completely unnoticed. It's hard to build any sort of exhaustive list of unknown unknowns, though, so I'll leave it at those four prominent ones.
-
If you're using rhetoric, the thing you'll probably say next is "but that's not Facebook being bad, that's people using Facebook! You can't blame a tool for how it's used!", but the problem with that is that Facebook exerted editorial control over the platform in numerous ways over the time period in which these things happened, both in advertising and in news-feed sorts of ways.
Any person doing any of these things would be punished.
Any person aiding and abetting any person doing any of these things would be punished.
Why not Facebook?
It was established long ago that the phone company isn’t guilty, legally or morally, if two people use telephones to organize a crime. Stopping crimes was the police’s job; they could wiretap phones with a court order if necessary.
Current public opinion is divided on whether internet communication services should take it upon themselves to detect and prevent crime. It may be the best solution, but there are pitfalls and we really have to decide as a society how to change the system.
So your point of view may be the best, but the way you present it as self-evident is misplaced.
> It was established long ago that the phone company isn’t guilty, legally or morally, if two people use telephones to organize a crime.
In this case, Facebook is not a neutral communications provider. They are selling and displaying the ad. They are a party to the transaction.
> Current public opinion is divided on whether internet communication services should take it upon themselves to detect and prevent crime.
Yes, there is debate as to how involved Facebook should be if two independent parties are, say, coordinating crime on Facebook messenger. That's a totally different case than what we're discussing.
And they are also curating whose phone calls get through. The phone company never edits or bans content on its lines (generally). Facebook as a utility is a false idea: they are a publisher whether they say so or not. A neutral platform doesn’t have nebulous “community standards” nor algorithms to determine the visibility of content.
It's the same word games as the "sharing economy".
Yeah, Uber drivers are "independent contractors" rather than employees because they have to buy their own cars and are totally free to do what they want (as long as they strictly follow the terms and conditions while having their performance closely supervised).
Yeah, Facebook is "just a platform" because users provide the content and Facebook just decides what content to show, whom to show it to and charges users to show it more prominently or to show it to very specific groups of people.
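In code terms, "just deciding what content to show" amounts to something like the following toy sketch (names and weights invented for illustration; this is not Facebook's actual system):

    from dataclasses import dataclass

    @dataclass
    class Post:
        predicted_engagement: float  # a model's guess at how much this viewer will react
        viewer_affinity: float       # how closely the viewer is tied to the author
        is_promoted: bool = False    # did someone pay to boost this?
        ad_bid: float = 0.0

    def visibility_score(post: Post) -> float:
        # Invented weights; the point is that a ranking policy exists at all.
        score = 2.0 * post.predicted_engagement + 1.5 * post.viewer_affinity
        if post.is_promoted:
            score += post.ad_bid  # paid placement buys prominence
        return score

    def build_feed(posts: list[Post], limit: int = 25) -> list[Post]:
        # Choosing what surfaces and what gets buried, per viewer, is an
        # editorial decision, however automated.
        return sorted(posts, key=visibility_score, reverse=True)[:limit]

Someone picks those weights. Once they exist, "neutral platform" is branding, not a description of the system.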
We can discuss what the law should be all week long; by that time the barn door has been open for ages and the horse is miles away. This isn’t solved by regulation alone.
And the "phone company" is now a highly regulated utility that was broken up by the government, to include those surveillance capabilities. FB doesn't get to be Ma Bell and I doubt they would welcome the comparison.
The question here isn't even if FB should be regulated or broken up like the phone company. People are simply realizing the consequences of something like FB even existing. The market seems to be taking corrective actions on its own.
There is a moral difference between doing and not doing that you seem to be glossing over. Facebook didn't do enough to stop all sorts of things. But the question is what moral responsibility do they have for not recognizing issues soon enough, or not committing enough resources to responding, etc. It's not obvious that they have the backwards-looking responsibility that your argument requires.
The role of Facebook and other social media in swaying politics is almost presented as if it began in 2016.
In reality what the U.S. saw was more like version 3 of a society manipulation tool.
Looking at the recent events in France, the 2016 election, Brexit, the Arab Spring, and to some degree the 2008 election, we can see traces of targeted information inflaming a populace for political goals going back a decade.
This is not about Facebook specifically, it's more an observation on the kinds of social media manipulation that can happen with these tools. The question becomes, how much of this was an intended consequence of building these tools? A cynical view could make a strong case that it was intentional.
The role of social media in the Arab Spring was spun as a positive thing at the time though, as was the way it was used by Obama in the 2008 and 2012 elections.
Sure, but that proves Facebook was aware of its influence as early as that and had enough time to consider the implications. If you find that politicians are using your product to sway public opinions, it's your moral responsibility to think about what that means and how it can be abused. You can't just shrug your shoulders ten years later and pretend you're surprised.
> There is a moral difference between doing and not doing that you seem to be glossing over.
Sure, I'm glossing over it, because it's not really a useful distinction for me to make in this case.
The difference between doing and not doing is that one is a deliberate criminal action, the other is negligence. Both of these things are, in various ways and in various situations, considered immoral and illegal.
The harshness of the punishment may vary, but the fact of punishment does not.
By way of analogy: if a company operates a nature preserve, invites people to walk through it, and then sells licenses for others to come shoot at the visitors, complete with lessons on how to shoot people, that company will probably not be in business much longer.
But negligence implies an obligation to act. Did Facebook have an obligation to recognize the breadth and depth of the Russian operation against the U.S.? I don't see how.
These are the questions that folks who want to see Facebook suffer consequences of some kind need to address. Your argument is not made simply by pointing out that bad things happened on Facebook.
You're implying the problem is Russia or that the "attack" was somehow particularly sophisticated.
It was entirely possible, using the tools Facebook provides, the way those tools are intended to be used, to do extremely unethical and illegal things without Facebook intervening.
If I run an unprotected server and allow people to execute arbitrary code on it, it should be pretty clear that I was being reckless if it turns out that, for ten years, script kiddies have been running elaborate botnets on it. Especially if the code examples I provide take users 90% of the way there.
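To spell the analogy out, the "unprotected server" would be something like this (a deliberately reckless toy sketch, invented for illustration; don't actually run it):

    # The server from the analogy: it executes whatever a client POSTs,
    # with no authentication, no sandboxing, and no auditing.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class RunAnything(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            code = self.rfile.read(length).decode()
            exec(code)  # arbitrary code execution, by design
            self.send_response(200)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8000), RunAnything).serve_forever()

Whoever operates that for a decade doesn't get to call the resulting botnets a surprise; that's negligence, not bad luck.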
I was genuinely asking, and none of your answers really put that much blame on Facebook. I mean, come on: "They have built tools that are ideal for doing things like illegally swaying elections. ("Russia doing stuff".)"
They built tools that can be used this way? Are you serious? This is the big crime? (And the very concept of "illegally swaying elections by telling people things" bothers me - how can it be illegal to say something like that? But that's a different discussion.)
Or: "They have failed to meaningfully control over illegal uses of their platform in any way. (See: The current Rohingya issue)"
This is the job of Facebook? Isn't that a police/government job?
> They allowed advertising campaigns, built with their own targeting tools, that were flat-out illegal
Knowingly allowed after being informed, or didn't realize they could be used this way?
> the thing you'll probably say next
I wasn't using rhetoric, but now that you answered me, we can fast forward the discussion.
And yes, that's exactly what I'll say next.
> is that Facebook exerted editorial control over the platform in numerous ways over the time period in which these things happened, both in advertising and in news-feed sorts of ways.
Are you saying that when you bought an ad, someone at Facebook reviewed it? If so, then yeah, I agree, bad on Facebook.
But why do I have a feeling they did not actually review ads?
Edit:
"a group of Russian agents utilized not only Facebook and Twitter, but also sites including Instagram, Tumblr, and YouTube"
i.e. they used everything, yet somehow it's Facebook's fault.
So let's say I build a nuclear plant. I make a lot of money from this nuclear plant, and I'm providing a useful and arguably essential service to my community.
Now, it turns out my nuclear plant is emitting toxic radiation and poisoning the community. Not only that, but the problem was blatantly apparent for years, everyone with the slightest bit of foresight tried to bring it to my attention, and to the attention of the community leaders.
The problem was identified, but I did absolutely nothing to resolve it even after being warned. When the problem shifted from a hypothetical to a forensically provable reality, I tried to cover it up and minimize it, while my gross negligence exerted immeasurable harm to the community.
That is the story of Facebook. When you build things - especially giant pieces of infrastructure which communities rely on - you have a RESPONSIBILITY to do your very best to ensure it does not cause harm. Facebook is not responsible for everything which occurs on their platform, but they sure as hell have a responsibility to at least apply due diligence in reducing the harm it causes.
Facebook did not apply that due diligence. They were negligent at the very least, if not openly malicious in their intent.
"Now, it turns out my nuclear plant is emitting toxic radiation and poisoning the community."
Wrong analogy.
The plant isn't emitting anything. It is people outside the plant, to whom the plant merely provides power, who use that power to poison others.
The plant just provides the energy; the people outside it use it to do the poisoning.
Facebook's not providing a fungible supply of some commodity. They are taking in text and using their own methods to choose which of it to feed back, often being paid to do so.
If the nuclear plant lets people send in fuel, it needs to inspect it before use.