"...following revelations of how its platform was used to manipulate black voters in the 2016 presidential election."
This seems somewhat misguided... I know Facebook is the whipping boy du jour, but it seems evident that Facebook is the biggest victim in all of this.
All of the bad press FB is getting lately, literally years after the events occurred and after they've been working to rectify the issues, is curious.
After the bad press that had been hammering Tesla for an extended period quickly dried up (once Elon Musk showed just how far he was willing to go to shake the short-sellers), Facebook suddenly took its place.
I'm not one for conspiracy theories, but it really feels like someone with a lot of power has organized and weaponized information for the sake of manipulating stock prices.
That doesn't have much to do with the NAACP itself, but these information campaigns have a side-effect of modifying public opinion and sentiment.
I really kind of feel ashamed of posting this -- it seems like such a low-value conspiracy theory, but it just seems strange to me how all this is playing out.
a) historically it's done a lot of shady stuff and gotten a pass on that,
b) it really screwed up around election time, with major / unprecedented geopolitical and economic consequence,
c) it doesn't seem to have fully understood let alone fixed the problems,
d) its business model is interfering with fixing these problems, and
e) it's managed this situation from a PR perspective very poorly.
All of those are facts on the ground that would suggest FB is operationally not as solid as people previously assumed, and that it has many (possibly inherent) vulnerabilities that are not easily fixed that add risk to the business - risk of people abandoning it, risk of regulation, risk of being manipulated for political purposes.
Certainly FB was used by those looking to manipulate public perceptions during the election (and since), and that makes it in a certain light a 'victim'. But they were also the creators and popularizers of this new way of being manipulated, and thus an excuse of being ignorant of the implications is a bit weak, especially because of how they've handled things since.
Just practically, it's extremely unlikely that independent news & research orgs are all conspiring to manipulate FB's stock price. At some point people stop giving companies or individuals benefit of the doubt, which is what's now happened to FB.
This pattern of bad press you're seeing with FB is going to happen a lot in the future with our 24/7 news cycle and rumor mill. Something will come out about a company or individual doing something bad that gets a lot of visibility. In the process, a lot of their past transgressions will come out, or be re-evaluated in a new light, and it will happen through a slow drip of new discoveries that keeps them in the press for a while. This will break into mainstream discourse, and it may permanently damage their reputation. This is a pattern inherent in the mix of people/companies royally screwing up, the existence of deep digital info trails, social media giving every person a voice to share juicy info, and the instantaneous social media news cycle.
Oops, sorry, didn’t mean to run into you with my Mercedes. It wasn’t on purpose and I swear I want the best for you! I was just a little tipsy...I am so sorry! I promise not to drink quite so much the next time I get on the road.
If you are making billions off of people, your mistakes are judged much more sharply, and people expect you to make up for them somehow. You don't have to agree with this, but there is nothing that makes Facebook special, and I know this as someone who grew up watching the internet from the get-go. Sites like MySpace, Yahoo Messenger, Nexopia, and several others I can think of all died off. They seemed like world-changers at the time, and they all died off. It can happen to anyone, and when you are making billions off of people you get to make very few mistakes.
> saying they did shady things in the past...neither legislators nor people seemed to think so
Your entire comment is ridiculous but I'll address this point... How many tech leaders have been summoned in front of world governments to justify their behaviour?
It isn't about left wing versus right wing. It is about weaponizing PII plus relationship data for profit and the inevitable consequences that come from such a revenue model.
This is exactly why I deleted my Facebook account in 2010. The writing was on the wall back then so I don't feel sorry for people who fear their privacy is violated now and certainly don't feel sorry for Facebook.
Can you be more specific? I don’t know what you are referring to.
Youtube and Google’s G+ social network were (like Facebook, Instagram, Twitter, etc.) used by Russian agents in the past few years to target Americans with disinformation with the intention of affecting US domestic politics. Youtube in particular is a powerful vector for fringe conspiracy theories etc., and IMO Google should take more responsibility for the poor quality of their recommendation engine and the negative social consequences thereof.
Soros is a wealthy US citizen philanthropist who spends his money on promoting government transparency, civics education, investigative journalism, free and fair elections, cultural exchange between different parts of the world, public health, and so on.
He also spends money on political campaigns and advertising in a way comparable to any number of billionaires with varying agendas. Personally I think it is indefensible that US campaign finance laws have been gutted and billionaires can spend unlimited money on political advertising etc., but Soros is not essentially different from numerous other billionaires (Adelson, Simons, Bloomberg, Koch, Uihlein, Mercer, ...) involved in the same type of political spending.
Disclaimer: when I was in high school 15 years ago, Soros’s Open Society Institute funded a US State Dept. sponsored program which sent my high school debate team on an exchange trip to Azerbaijan, and brought a group of Azeri students to California. It was a great experience for me.
Can we please stop dropping conspiracy-fueled asides like that? It's entirely tangential if taken at face value and my ears are bleeding from the dogwhistling.
From what I've seen in earlier reports, a decent chunk of the Russian work was just them reposting things that actual Americans had come up with and managing to reach a smaller audience than the original Americans. This really is about Facebook letting people skip US corporate media and about ideas they don't approve of reaching a broad audience.
Reposting things is likely the most effective idea, there's no risk of them accidentally sounding Russian, and it doesn't matter if they reach a smaller audience if that audience is significantly more targeted.
Let us not forget that Facebook has enabled, in a way, a form of direct democracy.
Subversive globalist players without any moral compass (such as George Soros and Hillary Clinton, whose foundation connections are being interrogated as we type) are just angry that they lost for the first time in a long time.
George Soros frequently propagates propaganda via his Open Society Foundations about the dangers of Facebook. Because he lost.
> George Soros frequently propagates propaganda via his Open Society Foundations
I am not agreeing or disagreeing with that, but so what if he does? He is a private person and you don't have to consume his politics just as I don't consume his political opinions or consume any trash on Facebook. It is the nature of liberty that people are allowed to express opinions you find offensive.
It becomes problematic if it is used to lobby for political change with private funding as a magnifier. Once you start to pump so much money into political movements, you can co-opt and change them. You get to drown some currents and magnify others by simply providing infrastructure and funding. It can be as simple as providing a speaker system to a protest and deciding who gets to use it.
It's a question common to campaign contributions and whether super PACs are to be considered free speech.
> It can be as simple as providing a speaker system to a protest and deciding who gets to use it.
That isn't a problem. Private individuals are not required to abide by a fairness standard when it comes to speech. Taken a step further, talk radio stations, which are often extremely right wing, are not required to fairly convey an alternate position. I am fine with that because as a consumer I am not forced to consume it.
Multiple radio stations exist in parallel, and you don't have to listen to any of them. Once you put up one speaker system at a location, like a central park that is the focus of a protest, you drown out everyone else. You are in control of the communication infrastructure and have bought yourself an authority position.
The question of who is providing and controlling infrastructure is a real threat to protest movements, as it is easily possible to co-opt a political movement through funds this way.
We have clear limits on what lobbying is allowed to do and what kind of interference in a democratic system through funds is prohibited. Take vote buying as an extreme example, or simply the ordinary regulations over campaign financing. We don't have such clear-cut rules for what is acceptable in the case of non-parliamentary political movements. Calling the targeted funding and co-opting of political movements problematic is appropriate in my opinion.
René Girard has some interesting things to say about scapegoating.
> "Everywhere and always, when human beings either cannot or dare not take their anger out on the thing that has caused it, they unconsciously search for substitutes, and more often than not they find them."
Seeing the overwhelming vitriol towards Facebook (justified to a certain extent, but certainly over-indexed), I can't help but join you in this low-value conspiracy indulgence...
If Girard's diagnosis applies here, what really is the media so rampantly enraged about that it keeps beating the downed (arguably dead) horse that is Facebook's PR image?
* Is it related to an increasing coastal elite / tech / income-inequality distrust with the rest of America?
* Is it a trendy common-enemy for young intellectuals to frame as the next trope for "misused, oppressive power"?
* Is it anger and bargaining from the 2016 election fallout, where the unhappy masses are working through the stages of grief and searching for an object on which to fixate the blame so they can move on?
FB is a direct competitor to media brands -- eyeball time that would have gone to CNN or WashPo or MTV now goes to FB. FB then resells those eyeballs to said media brands, and it's big enough to command market power--it can control prices and terms, and it's so huge you can't easily take your budget to a competing service. (It has in fact done this: before the wave of "FB is evil" stories, there was a wave of "FB is raising the price of impressions" stories.) So many media brands perceive it as an existential threat.
FB did some fairly dirty things to obtain its huge market share, and it's too powerful. So I don't really see a problem with all its competitors piling on it, but that's definitely what's going on.
If you commit a crime, and only get caught years later after reforming yourself (off the profits of your crime?), can you truly be a victim? Now, I understand that the analogy does not map precisely, but at what point in this equation did fb face consequences for its initial misdeeds?
I am not agreeing or disagreeing with you, but this is a low value comment in furthering healthy discussion. Consider tacking on some further qualifiers to support your claim.
* In what actionable ways do you think they are missing the mark on reformation?
* What in your opinion is the bar for reformation for a social media corporation?
Otherwise, Twitter-esque call-to-panic statements like these diminish the signal to noise quality of discussion here.
I don't have time (or the thumbs) to make a comprehensive list, but Facebook has been repeatedly caught doing the absolute bare minimum for PR reasons with little real effect (which they'd be aware of).
For example, earlier this year Facebook announced after public pressure that it would require labeling of all political ads [1]. Right before the midterms, Vice publishes the result of their investigation: they posed as all 100 senators and every single ad got through, labeled as an ad by a sitting senator [2]. You'd imagine a system for identity verification is the first thing they'd have set up before announcing new rules for identifying the source of political advertising.
This is a pattern with Facebook and the other social media companies. When they're put under a microscope, they pay lip service to legislators and the public, then continue business as usual.
That Vice headline is simply not true. They note at the very bottom of their article that no ads "got through" because they never bought any in the first place.
The only thing I see that didn't get through was "There was one “Paid for” disclosure that Facebook didn’t approve in our latest test. They denied, just a couple minutes after we submitted it: Mark Zuckerberg."
Oh, I guess it was the last paragraph of the snippet and not the full article.
> What’s more, all of these approvals were granted to be shared from pages for fake political groups such as “Cookies for Political Transparency” and “Ninja Turtles PAC.” VICE News did not buy any Facebook ads as part of the test; rather, we received approval to include "Paid for by" disclosures for potential ads.
>I am not agreeing or disagreeing with you, but this is a low value comment in furthering healthy discussion.
I disagree. When someone poses an analogy, it's a valid and contributing comment to point out a key flaw in the analogy, even if you didn't bring citations.
There are contexts where a comment like that is low value, for example if it's just contradicting another comment. The standards for making a statement are higher in that case.
But in context this comment is fine. I have no idea what you're even on about calling it a call to panic. Your comment is far more problematic than what you reply to.
Are you trying to use this question as a rhetorical device, or are you genuinely asking what Facebook has done wrong?
If you're being genuine, the answer is:
* They allowed for advertising campaigns using their tools which were flat out illegal (e.g.: Real-estate advertising targeted based on race.)
* They have failed to meaningfully control illegal uses of their platform in any way. (See: The current Rohingya issue)
* They were far too liberal with access to user data, in a way that is currently driving a debate around the legality of doing so. (Cambridge Analytica, etc.)
* They have built tools that are ideal for doing things like illegally swaying elections. ("Russia doing stuff".)
I can only assume there are many other things that Facebook has made possible which are illegal and completely unnoticed. It's hard to build any sort of exhaustive list of unknown unknowns, though, so I'll leave it at a prominent four.
-
If you're using rhetoric, the thing you'll probably say next is "but that's not Facebook being bad, that's people using Facebook! You can't blame a tool for how it's used!", but the problem with that is that Facebook exerted editorial control over the platform in numerous ways over the time period in which these things happened, both in advertising and in news-feed sorts of ways.
Any person doing any of these things would be punished.
Any person aiding and abetting any person doing any of these things would be punished.
It was established long ago that the phone company isn’t guilty, legally or morally, if two people use telephones to organize a crime. Stopping crimes was the police’s job, who could wiretap phones with a court order if necessary.
Current public opinion is divided on whether internet communication services should take it upon themselves to detect and prevent crime. It may be the best solution, but there are pitfalls and we really have to decide as a society how to change the system.
So your point of view may be the best, but the way you present it as self-evident is misplaced.
> It was established long ago that the phone company isn’t guilty, legally or morally, if two people use telephones to organize a crime.
In this case, Facebook is not a neutral communications provider. They are selling and displaying the ad. They are a party to the transaction.
> Current public opinion is divided on whether internet communication services should take it upon themselves to detect and prevent crime.
Yes, there is debate as to how involved Facebook should be if two independent parties are, say, coordinating crime on Facebook messenger. That's a totally different case than what we're discussing.
And they are also curating whose phone calls get through. The phone company never edits or bans content on their lines (generally). Facebook as a utility is a false idea: they are a publisher whether they say so or not. A neutral platform doesn't have nebulous "community standards" nor algorithms to determine the visibility of content.
It's the same word games as the "sharing economy".
Yeah, Uber drivers are "independent contractors" rather than employees because they have to buy their own cars and are totally free to do what they want (as long as they strictly follow the terms and conditions while being under close supervision of their performance).
Yeah, Facebook is "just a platform" because users provide the content and Facebook just decides what content to show, whom to show it to and charges users to show it more prominently or to show it to very specific groups of people.
We can discuss what the law should be all week long. By that time the barn door has been opened and the horse is miles away. This isn't solved by regulation alone.
And the "phone company" is now a highly regulated utility that was broken up by the government, to include those surveillance capabilities. FB doesn't get to be Ma Bell and I doubt they would welcome the comparison.
The question here isn't even if FB should be regulated or broken up like the phone company. People are simply realizing the consequences of something like FB even existing. The market seems to be taking corrective actions on its own.
There is a moral difference between doing and not doing that you seem to be glossing over. Facebook didn't do enough to stop all sorts of things. But the question is what moral responsibility do they have for not recognizing issues soon enough, or not committing enough resources to responding, etc. It's not obvious that they have the backwards-looking responsibility that your argument requires.
The role of Facebook and other social media in swaying politics is almost presented as if it began in 2016.
In reality what the U.S. saw was more like version 3 of a society manipulation tool.
Looking at the events in France recently, the 2016 election, Brexit, the Arab Spring, and to some degree the 2008 election, we can see traces of targeted information inflaming a populace for political goals going back a decade.
This is not about Facebook specifically, it's more an observation on the kinds of social media manipulation that can happen with these tools. The question becomes, how much of this was an intended consequence of building these tools? A cynical view could make a strong case that it was intentional.
The role of social media in the Arab Spring was spun as a positive thing at the time though, as was the way it was used by Obama in the 2008 and 2012 elections.
Sure, but that proves Facebook was aware of its influence as early as that and had enough time to consider the implications. If you find that politicians are using your product to sway public opinions, it's your moral responsibility to think about what that means and how it can be abused. You can't just shrug your shoulders ten years later and pretend you're surprised.
> There is a moral difference between doing and not doing that you seem to be glossing over.
Sure, I'm glossing over it, because it's not really a useful distinction for me to make in this case.
The difference between doing and not doing is that one is a deliberate criminal action, the other is negligence. Both of these things are, in various ways and in various situations, considered immoral and illegal.
The harshness of the punishment may vary, but the fact of punishment does not.
By way of analogy, if a company operates a nature preserve and invites people to walk through it and then sells licenses for other people to come and shoot at people on the preserve, and gives them lessons on how to shoot people, then that company will probably not be in business much longer.
But negligence implies an obligation to act. Did Facebook have an obligation to recognize the breadth and depth of the Russian operation against the U.S.? I don't see how.
These are the questions that folks who want to see Facebook suffer consequences of some kind need to address. Your argument is not made simply by pointing out that bad things happened on Facebook.
You're implying the problem is Russia or that the "attack" was somehow particularly sophisticated.
It was entirely possible, using the tools Facebook provides, the way those tools are intended to be used, to do extremely unethical and illegal things without Facebook intervening.
If I run an unprotected server and allow people to execute arbitrary code on it it should be pretty clear that I was being reckless if it turns out after ten years that all this time script kiddies have been running elaborate botnets with it. Especially if the code examples I provide take users 90% of the way to doing that.
I was genuinely asking, and none of your answers really put that much blame on Facebook. I mean, come on: "They have built tools that are ideal for doing things like illegally swaying elections. ("Russia doing stuff".)"
They built tools that can be used this way? Are you serious? This is the big crime? (And the very concept of "illegally swaying elections by telling people things" bothers me - how can it be illegal to say something like that? But that's a different discussion.)
Or: "They have failed to meaningfully control illegal uses of their platform in any way. (See: The current Rohingya issue)"
This is the job of Facebook? Isn't that a police/government job?
> They allowed for advertising campaigns using their tools which were flat out illegal
Knowingly allowed after being informed, or didn't realize they could be used this way?
> the thing you'll probably say next
I wasn't using rhetoric, but now that you answered me, we can fast forward the discussion.
And yes, that's exactly what I'll say next.
> is that Facebook exerted editorial control over the platform in numerous ways over the time period in which these things happened, both in advertising and in news-feed sorts of ways.
Are you saying that when you bought an ad, someone at Facebook reviewed it? If so, then yeah, I agree, bad on Facebook.
But why do I have a feeling they did not actually review ads?
Edit:
"a group of Russian agents utilized not only Facebook and Twitter, but also sites including Instagram, Tumblr, and YouTube"
i.e. they used everything, yet somehow it's Facebook's fault.
So let's say I build a nuclear plant. I make a lot of money from this nuclear plant, and I'm providing a useful and arguably essential service to my community.
Now, it turns out my nuclear plant is emitting toxic radiation and poisoning the community. Not only that, but the problem was blatantly apparent for years, everyone with the slightest bit of foresight tried to bring it to my attention, and to the attention of the community leaders.
The problem was identified, but I did absolutely nothing to resolve it even after being warned. When the problem shifted from a hypothetical to a forensically provable reality, I tried to cover it up and minimize it, while my gross negligence exerted immeasurable harm to the community.
That is the story of Facebook. When you build things - especially giant pieces of infrastructure which communities rely on - you have a RESPONSIBILITY to do your very best to ensure it does not cause harm. Facebook is not responsible for everything which occurs on their platform, but they sure as hell have a responsibility to at least apply due diligence in reducing the harm it causes.
Facebook did not apply that due diligence. They were negligent at the very least, if not openly malicious in their intent.
"Now, it turns out my nuclear plant is emitting toxic radiation and poisoning the community."
Wrong analogy.
The plant isn't emitting anything. It is external people, outside of the plant, that the plant is providing power to, power that is then used by those outside forces to poison people.
The plant is just providing the energy. The people outside the plant are using it to poison others.
Facebook's not providing some fungible supply of something. They are taking in text and using their own methods to choose which to feed back, often being paid for it.
If the nuclear plant lets people send in fuel, it needs to inspect it before use.
"Move fast and break stuff" can be reworded to "do whatever and deal with the consequences later".
All the dirt now isn't people grouping up and coordinating. It's literally just people doing their jobs, and having an easy time of it. Because Facebook has decided never to deal with the consequences of its actions, every time a reporter/regulator needs to look at them and verify facts, they of course find something new.
>it seems evident that Facebook is the biggest victim in all of this.
Looking at all the fallout from this, I'd love to know how you reached that conclusion. Even if Facebook was an entirely innocent victim of this (seemingly highly successful) attempt to sabotage and seed mistrust in US elections, then any effect on Facebook would be minor collateral damage at most.
Seriously, the psychology behind that is so human: doing a thing you think is bad, being ashamed, then calling it out in others. Like the stories of gay senators being the most hardline anti-gay people, or Coloureds being the most viciously racist ethnic group in South Africa (being a bit black themselves).
> I'm not one for conspiracy theories, but it really feels like someone with a lot of power has organized and weaponized information for the sake of manipulating stock prices.
Being able to recognize the narratives is a very valuable level of awareness. It doesn't mean all news/opinion gets dismissed, but it's important to remain grounded in reality about real harms vs. perceived harms.
Depending on who you ask these days, FB is inarguably a bad thing and terribly harmful to everyone. It's easy for most to think so one-sidedly when the narrative is as well. I'm searching for how it helps people connect, makes people happy, gives them an outlet, etc. at [0]. Of course I won't find it: 100% bad, 0% good.
I think it's actually just a simpler thing - people like to hate popular things and doing so sets them apart or makes them feel special (and can signal this specialness to others).
Since everyone knows what Facebook is, it's a good candidate.
I don't think there's much depth to the thought behind it.
I think you are wrong. Often popular things become popular through deceit. Take tobacco as an example. Its (past?) popularity does not make hating on it a fad. People were told it was good. You can make friends over a smoke. It does have some beneficial effects, physical and social. Then it is likely to kill you after ten, twenty, or thirty years. It will destroy your family bonds and make your children ill. It is bad for society and people on so many levels. Facebook is different only in form. People who have been hating on it forever aren't doing this to be special, despite how convenient an explanation that might be. They get it. Facebook is social cancer.
> This seems somewhat misguided... I know Facebook is the whipping-boy de jour, but it seems evident that Facebook is the biggest victim in all of this. All of the bad press FB is getting lately... literally years after events occurred and after they've been working to rectify the issues is curious.
They helped throw an election. This is not the sort of thing you forgive and forget after a couple of years and some apologies.
"Throw an election" implies illegitimacy, and that's a dangerous idea in a democracy where legitimacy is derived from the people. It is just as wrong as if Trump were to not accept the results because of "election fraud" or what have you. Can we accept that that's what people wanted? And if we want to change that, then talk to voters instead of blaming the news: people share articles they are inclined to agree with anyway, and confirmation bias means nobody will fact-check their own opinions.
This always rubbed me the wrong way. To say that the election was hacked implies that people were manipulated into doing the "wrong thing" instead of the "right thing". To debate on that level seems extremely anti-democratic, as it inadvertently leads either to the assertion that people are too stupid and irresponsible to have a say in their future, or to having access to information restricted, to make sure people don't respond to the information by doing the "wrong thing". In both cases you have a higher authority decide what is best for people, as they are too irresponsible to decide for themselves.
Facebook is an "attractive nuisance" that they have failed to sufficiently secure and monitor, the same as if they left their hot tub uncovered and unlocked, and a neighbor kid climbed the fence and drowned.
I watched Apple stock for years, and it definitely went through periods where it couldn't get a lucky break to save its life despite being the most profitable company.