Dissent at Facebook over hands-off stance on political ads (nytimes.com)
309 points by mindgam3 on Oct 28, 2019 | 393 comments



If Facebook decides which political ads are truthful, then either they reject all political ads (because when was the last time someone ran a political ad that was wholly truthful?), or else they have to decide where to draw the line. Wherever they draw the line, they're going to reject someone's ad, and be exposed to screams of how they're biased for the other side. That's not going to end well for them.

I mean, just running them all is also getting Zuckerberg raked over the coals[1], but I think picking and choosing which ads are "truthful enough" is going to be worse.

And that's if Facebook is actually completely unbiased. If this becomes a vehicle for the biases of those at FB charged with judging the ads, that's even worse.

[1] Zuck was getting raked over the coals in the name of truth, but I suspect it was at least partly because those who did so thought they would benefit politically if FB censored their opponents' ads.


It's really a question of whether or not Facebook should submit to FEC regulation. Other media outlets that run political ads need to submit to FEC regulation. Why does Facebook get an exemption? Newspapers literally have to run political ads by their editorial board and TV has to do something similar. It's one of the reasons why political ads are so bland and content-free: they don't want to make any claims that could be evaluated as true or false.


Yes, it's not like this is a problem that hasn't been dealt with in other media.

I think one of the things that makes it more difficult in the case of facebook is that the ad targeting makes it possible to serve specific ads only to very specific groups.

This means that if the ads say untrue or damaging things, most people will never know, as opposed to newspaper or TV ads. In my mind this argues for at least as strict controls, if not more stringent.


This was dealt with in other media the same way that Facebook decided to deal with it.

https://www.politifact.com/truth-o-meter/statements/2019/oct...

TV networks are required to air even untruthful ads so as not to run afoul of regulations governing political ads. It's also likely that campaign laws requiring all political ads to be true would be unconstitutional.

Either way, the very fact that the Elizabeth Warren campaign is promoting a fairly blatant falsehood (and many people are happy to parrot a false talking point in the name of truth) is a symptom of the same problem that would force any reasonable entity to make the same decision Facebook just did. Truth isn't very popular, especially in politics, and people (on every side of most political issues) would lose their minds if falsehoods were strictly banned.


I think it’s important to mention that Elizabeth Warren is displaying a blatantly false ad to illustrate this exact point: she thinks that it’s ridiculous that Facebook is airing obviously false ads and has no accountability.


The blatantly false part I'm talking about isn't "Mark Zuckerberg is endorsing Trump" which everyone understands to be the obviously false statement, but the subsequent argument that is still being repeated here that "If Trump tries to lie in a TV ad, most networks will refuse to air it."

A whole bunch of comments here are along the lines of: how come Facebook isn't required to abide by the rules that other media companies have to follow? But this is the exact opposite of how things are - small media companies can adopt whatever standards they want, but the broadcast networks have to air even false political advertising, because these types of regulations are more about protecting fairness and free political speech than about the truth. This is an example of Facebook subjecting themselves to the highest possible standards, instead of making arbitrary editorial decisions that smaller actors can get away with.


It's not the same high standard when they allow targeting of specific people and groups highly vulnerable to bullshit with no one else included who might identify it as such.


How would this be any different from advertising on certain shows or talk radio that peddle huge amounts of BS and are almost exclusively consumed by gullible people?

Again, the point is that the standards followed by broadcast TV are higher than the standards followed by random media companies that aren't regulated. And what Facebook is doing here aligns closely with how public institutions that are held to the highest standards operate. The general principle regarding the freedom of speech is that the closer you are to being a public institution, the more open you ought to be in terms of the types of speech you accept.

So the criticism here isn't merely wrong, it's exactly backwards.


I can freely choose what shows, radio stations and other media I see, and see the same ads as their audience. In the case of targeted ads, I can't move from demographic to demographic as I please, so some campaigns are inaccessible to me.

To me, that opaqueness separating one filter bubble from the next is an important quality. It may raise the effectiveness of propaganda campaigns significantly.


> It's also likely that campaign laws requiring all political ads to be true would be unconstitutional.

That can't possibly be true. The cornerstone of democracy is informed voters.


That can't possibly be not true. The state does not and cannot have any role in determining what is "true", if freedom of speech is to have any value. As soon as the state gets to decide what is "true" and ban speech on that basis, freedom of speech is destroyed. In the USSR, there was a section in the criminal code banning "knowingly false statements defaming the state and the social order". Obviously, people who were punished under this section were ones that said anything that the communist regime didn't like, and their statements always were "knowingly false", because the same communist regime decided what is false and what is not. I don't think the US should follow this example. That's why the Constitution exists and why the First Amendment is the first one. Because it's that important. Yes, the price of it is somebody can say something you don't like. You'll survive it.


Seems like a straightforward first amendment violation to me, exactly because nobody agrees on what's "true".


Defamation, for example, is not protected by the First Amendment, and deciding whether some information is true or not is crucial to determining whether it constitutes defamation.


This is irrelevant - there are all kinds of speech that are generally accepted to be outside the scope of free speech. Political speech that includes false claims does not fall under that and never has. If taking down "false speech" were something the US government could easily enforce, we'd make China look like a shining beacon of freedom. And the proper recourse to such speech is the court, not poorly paid and overworked social media moderators.

Also, it's important to keep in mind that the Trump ad in question does not make claims that are easily proven to be false - they are merely BS claims that are entirely unsubstantiated. I don't think they'd be taken down even if a rule against false claims existed and was properly followed. There's no evidence given for such claims, but not all claims without evidence are false.


Part of the regulation could be that Facebook can only serve stateless political ads. If a politician wants to advertise, they can't pick and choose any parameters.


It would be better if advertisers had their own pages where all of their advertisements would be shown as posts with the parameters they used for targeting.

Basically Facebook needs to make the whole process maximally transparent, and claim it is just a telephone line, and only people telling the lies are responsible for the lies.


That sounds like Facebook's Ad Library feature[1]. It doesn't show you the targeting parameters used, but it does show the age/gender demographics the ad was actually shown to.

[1] E.g. https://www.facebook.com/ads/library/?active_status=all&ad_t...
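
For anyone who wants the raw data rather than the UI, the same archive is also exposed programmatically. A rough sketch of pulling spend and demographic breakdowns via the Graph API's ads_archive endpoint (parameter and field names here are from memory of the 2019-era Ad Library API docs and may not be exact; you need an approved access token):

    import requests

    # Placeholder: the Ad Library API requires identity verification + a token.
    ACCESS_TOKEN = "YOUR_TOKEN_HERE"

    resp = requests.get(
        "https://graph.facebook.com/v5.0/ads_archive",
        params={
            "access_token": ACCESS_TOKEN,
            "ad_type": "POLITICAL_AND_ISSUE_ADS",
            "ad_reached_countries": "US",
            "search_terms": "election",
            "fields": "page_name,funding_entity,spend,impressions,"
                      "demographic_distribution",
            "limit": 25,
        },
    )
    resp.raise_for_status()

    for ad in resp.json().get("data", []):
        print(ad.get("page_name"), ad.get("funding_entity"), ad.get("spend"))
        # demographic_distribution is a list of {age, gender, percentage} buckets
        for bucket in ad.get("demographic_distribution", []):
            print("   ", bucket)

That still won't tell you the targeting parameters the advertiser chose, only the audience the ad ended up reaching.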


What if we just banned targeting users and groups of users and instead only allowed ad targeting based on the content of the web page?

Basically a smart version of a banner ad.


I'd support this but moneyed interests would not. Advertisers wouldn't have an excuse to build the digital panopticon anymore, and a lot of the effort to build it would be wasted. That'd be too much justice for this world.


Surely there are some parameters that would make sense? I don't want to see political ads from a local election across the country. There's also no reason for me to see an ad for the Republican primaries if I am voting in the Democratic primaries. Etc


"Surely there are some parameters that would make sense?"

For sure...

"I don't want to see political ads from a local election across the country."

They should be targeting eligible voters. If a local election is taking place across the country, then, of course, you shouldn't be seeing the ads.

"There's also no reason for me to see an ad for the Republican primaries if I am voting in the Democratic primaries. Etc"

No, this defeats the purpose. We should be trying to expose people to more viewpoints - not creating more echo-chambers.


FEC rules don't really cover who sees what; they cover what anyone can see.

Now we can expand on those rules to dictate who can see what, but no matter who views the ad, it would likely be bland, and confer no new viewpoint at all, lest it fall afoul of FEC rules.

What we're really talking about with FEC rules, is a return to the middle, because extremist views can be shut down by pointing out misleading content in the ads and subsequently rejecting them. Whereas the bland, meaningless stuff will always be free of any misleading content, and be subsequently approved.


> Whereas the bland, meaningless stuff will always be free of any misleading content, and be subsequently approved.

I'm not sure what media you consume, but TV and radio ads always seem filled with misleading content about political rivals.


Your 4th paragraph contradicts your 6th.


>> Part of the regulation could be that Facebook can only serve stateless political ads. If a politician wants to advertise, they can't pick and choose any parameters.

> Surely there are some parameters that would make sense? I don't want to see political ads from a local election across the country. There's also no reason for me to see an ad for the Republican primaries if I am voting in the Democratic primaries. Etc

I think the answer is somewhere in the middle. A couple factors need to be balanced: the ads need to have the minimum targeting to not be wasteful and pointless, but they can't be so narrowly targeted that oversight by the media and voting public becomes difficult.

I think the right balance is something along the lines of: 1) coarse-grained geo-targeting allowed down to an MSA at the narrowest and 2) no targeting based on interests, demographics, or ideology.
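
As a purely hypothetical sketch (none of this is Facebook's actual ad API; the names are invented for illustration), a rule like that is simple enough to express as a validation check on a targeting spec:

    # Hypothetical policy check: geo targeting no finer than an MSA,
    # and no interest/demographic/ideology targeting at all.
    ALLOWED_GEO_LEVELS = {"country", "state", "msa"}
    BANNED_KEYS = {"interests", "demographics", "ideology"}

    def political_targeting_ok(spec):
        """spec is a dict like {"geo_level": "msa", "geo_value": "Denver"}."""
        if BANNED_KEYS & spec.keys():
            return False
        return spec.get("geo_level", "country") in ALLOWED_GEO_LEVELS

    print(political_targeting_ok({"geo_level": "msa"}))        # True
    print(political_targeting_ok({"geo_level": "zip"}))        # False
    print(political_targeting_ok({"geo_level": "state",
                                  "interests": ["hunting"]}))  # False

The hard part, of course, is the policy decision itself, not the check.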


I was honestly thinking of national elections when I wrote this.

There are some parameters that make sense, for example targeting congressional districts.

What I'm mostly against is zip code, ideology/interest, racial/sexual/etc factors being taken into consideration.


Why would you buy such an ad given that almost every election in the US is geography bound?


Even TV & newspaper ads are regionally bound.


Well you kind of have to allow targeting based on location/home location.


I like this solution but they wouldn't. It brings them back to the problem of trying to tell which ads are political in nature or not and policing normal ads for such content.


Some parameters will still be picked for them, right? It would be a bit silly to show ads in Dutch to Danish people that can't speak Dutch.


Television advertising isn't stuck in the dark ages. Political candidates run and create ads targeted at specific counties or districts.


Next to Facebook targeting, that is the dark ages.


True, but that level of targeting isn't completely unprecedented. Direct mail was an extremely targeted form of political advertising and faced zero content regulations. (Much of the targeting was achieved by buying magazine subscription data from magazine publishers; you could target households pretty effectively based on which magazines they subscribed to.)


Agreed. The fact that it's harder to do on the web for whatever reason shouldn't mean they get to ignore the rules.

I see this sort of paradoxical mindset all over the place: X breaks the rules but X is just so successful, or so full of potential, so shouldn't we just bend or break the rules for it? But the fact is that X is probably successful or full of potential in large part because it's been breaking the rules, and the rules exist for a reason.


If there were rules governing political ads in social media, they would look very much like Facebook's policy. At least TV ads (which are more tightly regulated due to the use of radio spectrum, a public good) are regulated so as to force TV stations to air even political ads they deem untruthful:

https://www.politifact.com/truth-o-meter/statements/2019/oct...

From that perspective, Facebook deciding to censor untruthful political ads would be much more akin to ignoring the rules.


What this ignores is that there is a finite number of ads one can run.

It would be easy enough for Facebook to classify ads as political/non-political and for political ads to have more scrutiny over what they decide to air. And have strict penalties against those who choose to mis-classify their ads.

Or make political ads visible publicly for anyone to search for details as to who created them, how much they paid, how many people were targeted.

All these are valid solutions to the problem, but Facebook doesn't want to do it, because if they can get away with what they're doing right now, they can make a ton of ad money.


> It would be easy enough for Facebook to classify ads as political/non-political and for political ads to have more scrutiny over what they decide to air.

I doubt it is as easy as you think because not all political ads look like "vote for foo!"

They will evolve to look like governor foo paid company xyz which sells home security products to make an ad about how unsafe the country is...while governor foo runs on a platform all about reducing crime rates...etc

Look at some of the propaganda during the 2016 election era for instance


Facebook already classifies ads about "social issues, elections or politics" and those ads are subject to extra scrutiny[1].

[1] https://www.facebook.com/help/180607332665293


> Or make political ads visible publicly for anyone to search for details as to who created them, how much they paid, how many people were targeted.

Isn't that already the case? You can already search all political ads and see who ran them, how much was paid, and the number of people who saw the ad.


Is it? I can’t find any documentation on how to query all the political campaigns that are currently active, who is paying for them and who they’re targeting. All of this information is critically important to be made public to ensure the integrity of our elections.


Did you look at https://www.facebook.com/ads/library/ ? It should show all active (and inactive) ads, information about the page that ran the ad, and who paid for the ad. It doesn't show targeting, but it does show amount spent and the gender/age/geography of who saw the ads.


This used to be more of a problem, but IIRC they have forced political ads to be publicly visible to anyone on the page that's running the ads.


>> Yes, it's not like this is a problem that hasn't been dealt with in other media.

And it was dealt with poorly. My whole life, minorities (persons of color esp) have been scapegoated in elections in ads on TV, etc. So, yes, political ad censorship has been dealt with by media, and the "deal" was that victims would be minorities.

In a sense, Facebook just makes it fair. Now anyone can criticize anyone, everyone gets to suffer equally. Those who don't like it should have spoken up decades earlier.

I'm posting anon for obvious reasons. My hope: Let's let Facebook win this round, so we as a society come up with decency rules that help all of us, rather than just defending protected groups.


> minorities (persons of color esp) have been scapegoated in elections in ads on TV, etc.

Do you have any sources for this?


IIRC, Willie Horton.


My whole life, minorities (people who earn more money than average esp) have been scapegoated in elections...

What if you said this? What happens when Bernie Sanders runs a dozen ads saying the “rich” haven’t been “paying their fair share”, and what standard would allow this to be considered truthful, or will the ad need to define words like rich and fair (good luck on that 2nd one)?


Yes, talking about corporations avoiding taxes is the same as talking about Mexicans and refugees coming to destroy your country. You really need a dose of nuance.


This was pointed out elsewhere but needs to be repeated:

https://www.politifact.com/truth-o-meter/statements/2019/oct...

The regulations exist in the exact opposite direction as you claim - TV networks are required to air even untruthful ads so as not to run afoul of regulations governing political ads, which means Facebook's policy is closely aligned with the actual regulations as they exist.

Also, most newspapers, at least online, are outsourcing part of their ad placement to third-party ad tech companies, so I'd be shocked if there was any editorial control.


> It's one of the reasons why political ads are so bland and content-free: they don't want to make any claims that could be evaluated as true or false.

I keep getting mailings in Ohio indicating that if I sign a petition to put a certain issue on the ballot The Chinese Will Know Where I Live And Come Get Me. It's a completely idiotic campaign, and of course, it's all blatant lies. (similar ads appear on television)

This place you're living sounds relatively pleasant though.


Direct mail is entirely different. The campaign doesn't need anyone's prior permission to do it, and it might not even be sent by the campaign.


Then, is Facebook the newspaper, or is it the mail (in terms of how it should have to treat political ads)? j-c-hewitt thinks it's the newspaper. But why that and not the mail?

Note well: I'm not saying that Facebook should be treated like the mail rather than the newspaper. But, given that they can target ads to individuals, that aspect seems more like the mail, at least to me.


Zuck's best way to 'wash his hands' of it is to adhere closer to the newspaper standard. Though, if he does this, it may mean less money.


Facebook advertisements are a public forum. Moreover, there is an intermediary party that can exercise review.

Direct mail is not a public forum. Moreover, who would accept or reject a direct mailing campaign?


How are Facebook ads a public forum? They send an ad to the viewer of a page. They can, and do, control viewer-by-viewer who sees which ads. That's like sending an ad to a mailbox.

But I wish I could reject direct mail campaigns...


Many of them are promoted posts, with discussion, which are re-shared by individuals to their network.


OK, but that's like you got something interesting in the mail (or even email), and forwarded it to someone. That's still not a public forum.


We are going to wreck FB within 5 years with something far less centrally controlled, and you want to waste government resources on this? The company is 15 years old.

Math doesn't work that way or bitcoin would be toast.


Honestly if a lie that blatant even needs to be censored, then I think we can just say democracy is a failure. What concerns me are the lies / truths that depend very much on values and biases that would get caught out because of Facebook's own biases or people gaming the system.


I don't buy defeatist statements like this anymore. Let's be real, if it turns out lies actually do "need" to be censored for whatever basically measurable definition of need you prefer, are you literally going to give up on democracy? What are you going to replace it with? Unless you just give up on morality entirely, you're still going to try to find a solution.

That means that the only thing "...then we can just say democracy is a failure" accomplishes today is give you a clever-sounding way to avoid thinking about a hard problem. That's not a good thing.


You could give up on a particular implementation of democracy, without giving up on the whole thing.

Council democracy does not require politicians to try to appeal in absentia to hundreds of thousands of voters they ostensibly "represent", for example.


> You could give up on a particular implementation of democracy, without giving up on the whole thing.

Yes, that's pretty much my point.


>> are you literally going to give up on democracy

American democracy? Oh yeah I've absolutely given up. Candidates I like never even come close to winning. Often they don't win ballot access. I'll still vote, but I'm done contributing to campaigns or being optimistic about the outcome.

I would expect most of the ads in question here fall into one of two categories. One of them is considered false by one side, but an argument can also be made by the other, because each side is probably cherry-picking some data, looking at different contexts with different values, etc. I honestly don't think FB can do this at scale without bias, arbitrary judgment calls that don't seem fair and take too long to resolve, etc. So let's say they just let all those cases slide.

The other category is things that are what we on HN would call blatantly false, where there is no supporting evidence, tons of evidence to the contrary, and it just violates common sense. Are we saying that people who fall for the latter, and do no independent research before basing votes on things they see in FB ads, are people we should still trust to judge national economic plans and foreign policy in the debates? I'm sorry if this is defeatist, but I just don't think so. Those aren't people I even want to be on a jury of my peers. I'd rather just be judged by the man with the law degree and years of experience.

Maybe this is a step in the right direction, but I just don't even think it's a drop in the bucket. With so many people who lack basic knowledge of civics but have the right to vote, I think we're just past the point of no return to idiocracy until there's some major collapse and we start from the ground up.

But the mere fact that we're having this argument is beyond me. After every election, whichever side loses goes grasping at straws. With Obama it was the birthers. With Trump it's social media ads. With the last 2 Republican presidents it's been the electoral college. I mean I'm sitting here watching the 49% (concentrated in a few dense areas) claim they have more of a democratic mandate than the 48% (spread out in very culturally distinct areas), when most people don't even vote.

I see both parties as being inappropriately quick to judge and legislate matters that they're completely unfamiliar with. I'm pretty individualist, but US democracy seems to just be a constant race to a tyranny of the majority, with the legislature having abdicated many of their responsibilities to the executive. Violations of the law by the government just seem to get ignored because that's the way it is now. So if I did get to pick what we replace the current US democratic system with, I'd far rather go for at least a parliamentary system where instead of me thinking I have a representative for my geographic area, my party gets a share of the power and has to figure out a way to share power with the other parties. It's less direct, but if I don't get to be as self-governing as I want to be, I don't want to be governed by a bunch of people who fall for Facebook spam.


I agree. I feel like creating the illusion of a disinformation-free platform will only weaken people's ability to discern it. Taiwan has a hands-off strategy for dealing with it for that reason [1].

1: https://cpj.org/blog/2019/05/qa-taiwans-digital-minister-on-...


> Honestly if a lie that blatant even needs to be censored, then I think we can just say democracy is a failure.

The problem with what you are saying is, if we accept that there can be any curbs on free speech at all, "how could anyone be stupid enough to believe that?" is not a useful guiding principle for setting the threshold at all.

The classic example of speech that is not covered by the First Amendment is crying "fire!" in a crowded theater. Of course it's rather crazy for everyone to trample each other trying to escape a fire that does not exist when they do not even smell smoke. That is completely beside the point. We govern the speech based on the damage it does, the intent to do damage, and the utter lack of any redeeming qualities to the speech.


FYI, the example of shouting "fire!" in a crowded theater comes from a case against someone who was claiming that drafting people into the military was involuntary servitude, and was illegal and worth resisting. I think they should have been allowed to say that.


Oh, I should have chosen a different example! I certainly think the argument that we cannot censor based on whether or not someone would be a bit foolish to believe the lies being spread still stands, but I'll leave it for now.

(Although I believe I share the OP's sentiment that the whole thing is not a sign that democracy is in great shape...)


It's surprising to me that they'd run it by the editorial board. How do you know?

I mean, sure, newspapers have policies for acceptable ads, but don't they try to keep editorial and advertising separate?

Example:

"The Minnesota Daily at the University of Minnesota, Minneapolis, houses its news staff and advertising staff separately and does not hold meetings between the groups."

https://splc.org/2018/10/student-media-guide-to-publishing-p...


Leaving aside the question of whether or not the FEC should regulate social media ads (and personally I think they should), as a practical matter, they can't for the time being. As of a few weeks ago with the resignation of a commissioner, the FEC lacks quorum, and the administration apparently isn't planning to appoint any replacements for the time being, which means they can't issue rules governing Facebook or anyone else, nor can they initiate enforcement actions or investigations, or issue advisory opinions about what campaigns or companies should and shouldn't do to comply with existing rules.


Not saying that the current administration is trying to abuse this, but having the executive branch able to paralyze the body governing elections seems like a massive loophole.

Is this something that congress could (technically, not politically) fix themselves, or do they really depend on a nomination from the executive branch?


> Is this something that congress could (technically, not politically) fix themselves, or do they really depend on a nomination from the executive branch?

You could have any vacancy for more than X days be filled by the minority party. (I mean even nominating an acting member -- you don't need to get approval.)


Proposal assumes two parties and would codify this. Two parties is one of the ways Americans keep getting themselves screwed. No need to codify it. Maybe one day we shall fix it?


Other parts of norms and conventions in USG assume two parties as well.


They can't appoint people, but they could change the quorum rules to allow the currently sitting commissioners to regulate. It would require Presidential signature, though, or enough votes to override a veto.


This is a red herring - there's no law against untruthful political advertising. Such a law, broadly enforced, would likely be unconstitutional, and how do you suppose you'd enforce it? Also, if that was the law, it wouldn't be up to corporations to determine what is true and the responsibility would fall between the politicians and the regulatory body that decides what the truth is. That sounds to me like a totalitarian nightmare but I suppose YMMV.


I don't know why it would be any different constitutionally from false advertising in any other context. It's not a crime per se, but false advertisers can be held civilly liable for misleading consumers, and individual consumers can sue false advertisers, as can the FTC on the public's behalf (presumably it would be the FEC instead, but work similarly). The court system (so, presumably a jury) decides whether or not it's false.


Civil liability is an entirely different matter, but what's problematic about false advertising isn't that the claim was false, but more that you're breaking a promise. This isn't about free speech, but more about an implicit contract being violated. Political speech, on the other hand, is explicitly what the freedom of speech as a right is about.

Either way, I don't believe anyone would claim that, in the case of false advertisements, the responsibility falls on the publisher of the ads rather than the advertiser. It's not generally possible to evaluate the claims made in advertisements - if I were to say something I'm selling will work for 10 years and I will offer replacements if it breaks in the next 10 years, how would the ad publisher know if I would honor the claim several years later?


> one of the reasons why political ads are so bland and content-free

I'm not sure I would describe them as bland: https://youtu.be/LcjEe9guEIA?t=435. I bet the bland ones are by choice.

> a question of whether or not Facebook should submit to FEC regulation or not

There are official political ads by campaigns, and then there are non-official political-influencing ads/articles/etc. A new challenge will be defining how "political" something has to be for it to be considered a "political ad".


I honestly didn't know there was an existing system for this, though it makes sense. And given that it does exist, it's an absolute no-brainer that the world's largest media platform should be subject to mass-media regulation.


> Why does Facebook get an exemption?

Snail mail and email also get exemptions.

Let's not pretend like there was some overarching architecture for all of society's communication.


> Why does Facebook get an exemption?

Because it's not apples to apples.

Facebook is like NBC but it's also like Comcast. Comcast doesn't have an editorial board screening all the content that its product indirectly facilitates.

Newspapers basically are content creators and curators at the core. If newspapers stopped doing that, they basically wouldn't exist. If Facebook eased up on ranking, filtering, and promoting, it would just be the old thefacebook.com.


Agree it's not apples to apples, but disagree w/ comparison to Comcast. The ad platform is literally Facebook's (and Google's) core business.


This is not accurate. See the discussion here: https://www.politifact.com/truth-o-meter/statements/2019/oct...

> But overall, it’s inaccurate to say that "most networks" will refuse to air an ad by Trump with a lie in it. We could find no evidence that most networks reject false candidate ads.


Not sure which TV ads you see, but the TV ads I see suck. Not all of them, but many of them talk about how someone voted for something and completely misconstrue the intent. So there has never been truth in political advertising, only a thin veneer of pretending that this is ok.

So I’m not sure that the boards have done their job.

I’d certainly call bs on them, and that’s on ads from my own party!


> It's really a question of whether or not Facebook should submit to FEC regulation or not. Other media outlets that run political ads need to submit to FEC regulation.

Only for political advertising related to federal elections. That's a subset of all political advertising.


I can understand why FB shouldn't be allowed to just exempt itself from the FEC rules that other media follow. But I don't think their policy on "no filtering for truth" is something the FEC bans in those other media, so I don't see the privilege.


If social media companies should be held accountable for the so-called integrity of our elections, shouldn't they presumably also be responsible for not censoring legal speech?


It's a fallacy to believe that the ability to determine the truthfulness of the content of an ad is binary: 100% or 0%. It's a spectrum. There are obvious truths, shades of gray, blatant misinformation, and everything in between. At the very least, they should clearly disallow verifiably false information. An advertisement aimed at Black voters with the wrong day for voting? That's clearly disinformation aimed at voter suppression. How close Facebook gets to the gray areas is by no means an easy problem, but it's patently false to pretend they can do nothing.


Facebook do disallow clear voter suppression ads like that one, though. They have a specific rule banning voter suppression which applies to all advertisers including politicians, and they definitely enforce it. One of the articles going after their hands-off approach to political ads (I think in the Guardian) even specifically attacked them for banning this whilst leaving other lies alone.


Disinformation about voting dates or choices seems out of bounds. But any negative ad is at a basic level voter suppression.


> It's a spectrum

The parent comment agrees very strongly with that.

> verifiably false information

Look at Snopes for an example. Relatively few assertions are "pants on fire" false.


Empowering Snopes to effectively censor Facebook by deciding what is true and false seems incredibly dangerous to me.


> (because when was the last time someone ran a political ad that was wholly truthful?)

Perhaps this is a good time then, to revisit this norm itself? Why is this allowed? Sure, politicians lie, it is known, but we've reached a point where the lies on one side (and it is one side, the Republican side) have diverged so far from reality that the base of that side can no longer engage in any meaningful debate on any issue. It's not even a question of bias; most Republican talking points and positions literally do not make logical sense. They scream and rant about problems that don't exist, and refuse to acknowledge the ones that do exist but are inconvenient. So much of the right is based on such hard misinformation as to be ridiculous.

And of course Democrats fudge the truth a little here and there too, and that should stop as well. To be frank I think the benefits of "Free Speech" are vastly overvalued. You shouldn't have the right to say whatever you want, at least not in the context of politics. You should be held to account for factual inaccuracies, perhaps not penalized but it should be noted, and you should be penalized for what can be demonstrably proven as outright lies and misinformation.


> To be frank I think the benefits of "Free Speech" are vastly overvalued.

Free Speech really only applies to the government, not private companies, so FB can impose whatever rules they want (though that might depend on if the FEC decides to try and enforce some regulation). Also, even Free Speech has its limits, for example, yelling "fire!" into a crowd to cause a mass panic when there is no fire is not protected speech.

Other examples include slander, this wiki page has a nice discussion on Supreme Court decisions as it relates to the First Amendment: https://en.wikipedia.org/wiki/United_States_free_speech_exce...


The Schenck case, which created the "yelling fire" line, was overturned over 40 years ago:

"In 1969, the Supreme Court's decision in Brandenburg v. Ohio effectively overturned Schenck and any authority the case still carried. There, the Court held that inflammatory speech--and even speech advocating violence by members of the Ku Klux Klan--is protected under the First Amendment, unless the speech "is directed to inciting or producing imminent lawless action and is likely to incite or produce such action""

According to case law and Supreme Court decisions, "Free Speech" is much more absolute than most imagine.

Ref: https://www.theatlantic.com/national/archive/2012/11/its-tim...


It's also worth noting what, exactly, was the offending speech in that court case, and several related ones (such as https://en.wikipedia.org/wiki/Debs_v._United_States). It was anti-war propaganda, and specifically, calls to resist the draft. The defendants were high-ranking members of the Socialist Party of America.

Principled defense of free speech is often accused of being a slippery slope fallacy, usually by the same people who give "fire in a crowded theater" as a counterpoint. I wonder how many of them are aware that the phrase itself is a historical example of a very real slippery slope.


> so FB can impose whatever rules they want

Except, as evident, they don't. And they don't really have a financial motive to police speech, in fact they have financial motives to turn a blind eye to it.

And moreover, Facebook not making more money does not affect society at large, but the continued distribution of misinformation has direct repercussions for society at large.


Your political opponents think exactly the same about you, though. Your belief that "truth" would be objectively measured and enforced is pretty naive. Especially if you count "refusing to acknowledge problems that are inconvenient" as untruth.

Look at it this way. Huge numbers of Americans and especially Democrats believed for the longest time, and many still do, that Trump was a Russian spy. The entire "Russiagate" phenomenon was investigated to ludicrous depth and found no evidence such claims were true; so does that mean every journalist and Democrat politician that fueled those flames now goes to prison, or can't speak in public anymore? Think about what you're arguing for here! That was an absolutely massive factual inaccuracy so how would it be held to account?

And let's not even get into cases where someone genuinely believes they're repeating something true, which turns out not to be. If that were the case, given the replication crisis in science virtually all politicians would end up being unable to speak publicly as they'd have repeated the claims of studies that later didn't replicate.


Orwell had a great essay on the subject of freedom of speech and of the press.

https://www.orwellfoundation.com/the-orwell-foundation/orwel...

What makes it particularly interesting is that it was written by a socialist who volunteered to go and literally fight fascists (with bullets, not fists), and it was written in the immediate aftermath of WW2. You'd think that a person like that, at a time like that, would be very sympathetic to the kinds of arguments that are being paraded on the left today - that, basically, once politics gets sufficiently extreme, immediate concerns about that should trump any abstract freedoms. And yet, Orwell criticizes the treatment of Oswald Mosley (of the British Union of Fascists) during the war...

The irony is that the essay was effectively censored itself - it was meant to be a preface to the Animal Farm, but no publisher was willing to publish it as such. It was eventually published on its own, 27 years after it was written.


> but we've reached a point where the lies on one side (and it is one side, the Republican side) have diverged so far from reality than the base of that side can no longer engage in any meaningful debate on any issue

As someone who has no love for the Republican party, I have to say that your statement shows a sort of tunnel vision. Just take the latest Ukraine scandal: it's true that Trump pressured the Ukrainian government to investigate his political rival, but at the same time, it's obvious that Hunter Biden only had his job because he was the US Vice President's son. That's corruption by any reasonable definition. Yet plenty of Democrats are insisting that there was nothing improper with Hunter Biden taking the job, and that this is all fake news. But if it were Trump's son on the board of a Ukrainian gas company, guess how Democrats would react (and rightly so)?

> To be frank I think the benefits of "Free Speech" are vastly overvalued. You shouldn't have the right to say whatever you want, at least not in the context of politics.

I prefer the First Amendment to the truth police. Of course, if I get to pick the truth police and they always agree with me, that's great for me. But it's not so great for anyone who disagrees with me, and it ultimately means the end of democracy.


Oh, there's nothing "sort of" about it- it's the kind of tunnel vision and mentality that make up the extremists on every side (the ones screaming in everyone's face and giving their respective parties a bad name). It's a little sad that the post you responded to laments the loss of meaningful debate in the very same breath as a sweeping umbrella vilification of the entire party and every one of its values.


The first step of a debate is to have a set of understood and shared facts, the reality of whatever is currently going on. On some issues, yes, this is still possible. On the vast majority, and unfortunately the ones most pressing, it is not.

If you talk to the average Republican voter about, for example, climate change, you'll hear about how China is faking the whole thing to make American industry less competitive, or about how, since we got some snow last night, that must mean it's a hoax. If you talk to them about job growth, they'll tell you we need to resurrect the COAL INDUSTRY, which was already in rapid decline, a decline that has only accelerated as coal continues to be less and less appealing as an energy source, because, you know, it's coal, and it's not very good. Jobs have been slipping in that industry since the 1940's, and automation has only worsened it. Yet Trump campaigns about revitalizing it, and the rest of the Republican party enables that, no matter how unrealistic it is.

It's just non-stop conspiracy theories, disinformation, denial of fact, on and on. Watch Fox News some time. It's unbelievable how utterly insane the coverage is. I'm not talking about a "different point of view"; I'm talking about how they are literally destroying any sense of truth to push their agenda, which shouldn't be surprising to anyone paying attention because that's basically what Roger Ailes started the company to fucking do: appeal to conservative viewpoints that felt left behind as progressive stances became more and more mainstreamed.

This is not a conspiracy theory: Fox News, along with smaller outfits like InfoWars, various talk radio stations, all the way to more recent ones like Breitbart, has been running this grift for decades, enabling the intellectual rot that has destroyed the GOP from its core and annihilated everything it ever stood for, all of it murdered on the altar of making money.

I say all of this as someone who is ashamed to acknowledge being a conservative, who has slowly watched the party he grew up believing in become a sad parody of its former self, able to stand for NOTHING, NOT A THING other than squeezing out the last few dimes before the planet bursts into flames. What the GOP is today is a disgusting husk, a cancer-laden zombie kept moving by Koch brothers money and the sheer refusal of huge parts of the country lacking in proper education to admit they were fucking conned.


> Watch Fox News some time. It's unbelievable how utterly insane the coverage is.

MSNBC is every bit as unhinged as Fox News. It wasn't always that way, but it's become worse and worse over time. Rachel Maddow's ranting and raving about the Russians, panel discussions about Tulsi Gabbard being a Russian asset - it's bad. I feel sorry for anyone who gets their news from either channel.


Except that MSNBC doesn't dominate the left like Fox does the right. There are many networks that liberals watch: MSNBC, CNN, the BBC, etc. etc. and that's not even going into the fact that liberals are generally smarter consumers of news as a whole, reading multiple papers, watching multiple news networks, not to mention independent sources too. Conservatives rarely if ever leave the bubble.


Most liberal-leaning media in the US went absolutely insane over Russiagate. You wouldn't do much better by watching CNN. Every day, the walls were closing in, the big breakthrough was right around the corner. I don't doubt that liberals are typically more "sophisticated" than conservatives in the US, but that doesn't translate into political sense. The greatest condemnation of their sophistication is that so few of them predicted Trump's election, and that so many still have no idea why he won. That's why the Russiagate narrative is so popular among American liberals. It gives them an easy way to explain a phenomenon that they fundamentally do not understand.


I have not seen a single leftist defending Hunter Biden. Hell, the majority I talk to are hoping that's what sinks Biden's campaign, since as a candidate, he's perhaps the only option weaker against Trump than Hillary Clinton was. The only reason he's even IN the primary is because he was attached to Obama; if it weren't for him being the VP, nobody would care who the hell he was. He's run for the presidency and failed, what, 3 times prior?

I don't care about Joe Biden. He's as crooked as any of the other big business Democrats. He'd do the country a tremendous service in just fucking off back to Pennsylvania and never going on TV again.

> I prefer the First Amendment to the truth police.

I welcome the truth police with open arms, because the alternative is what we have now: a free-for-all in which well funded interests, both political and corporate, can spew whatever well produced and intelligent-sounding bullshit they please onto our mass media and an unsuspecting populace. This is not sustainable. I would've really hoped that people would know better and be able to properly vet sources, but it's been proven, again and again, that they cannot, or will not.


We're talking about Democrats, not "leftists." Last debate, Joe Biden denied any wrongdoing by his son, and Cory Booker then chastised the moderators for even asking about Hunter. Elizabeth Warren has also been reluctant to condemn what looks like obvious corruption, which is notable, given that that's one of her signature issues.

> I welcome the truth police with open arms, because the alternative is what we have now: a free-for-all in which well funded interests, both political and corporate, can spew whatever well produced and intelligent-sounding bullshit they please onto our mass media and an unsuspecting populace.

This always sounds like a great idea if you assume that you're the one who will get to decide what's true and what's false. But what makes you think the truth police themselves won't be corrupt? Once they exist, and once they come for your views, it's too late. By throwing away the First Amendment, you're condemning your own views to eventual suppression.


> Joe Biden denied any wrongdoing by his son

I do not care about Joe Biden. He's a disheveled artifact of pretty much everything wrong with the Democrat party. You can criticize him all you want and I'll agree with you but please stop presenting this as though it means anything about the vast majority of the Democrat party, which again, is not supporting him.

> Elizabeth Warren has also been reluctant to condemn what looks like obvious corruption, which is notable, given that that's one of her signature issues.

What "looks like corruption" that she hasn't condemned?

> But what makes you think the truth police themselves won't be corrupt?

You say this as though corruption isn't fought off regularly in basically all functions of Government? And that's not to say we're totally free of corruption, because, well, yeah. But in general the country gets along okay. It's not stellar and could be improved quite a lot, but in the grand scheme, roads are maintained, government workers are paid, government contractors get paid, our sewage and water systems (in most places) are fine, our electrical grid is pretty standard.

You know it's easy to get disillusioned and say our Government as a whole is a shitheap, but honestly, we're doing okay. Certainly well enough to say that just because something the Government does might be influenced by corruption isn't reason enough, by itself, to not even try.

And this idea that policing mass media for something resembling accuracy and honesty means the First Amendment is dead is just a ridiculous false dichotomy. We police speech CONSTANTLY. Spreading information you know to be false about a person is already covered under the law as libel, and applying that law to someone's speech requires THOROUGH, WELL DOCUMENTED evidence; it's not just something that gets thrown around, other than by people who don't understand what it really is. We police speech that incites violence, again requiring thorough proof as we do for any crime. In my mind, there's no difference between that, nor any requirement for the "Truth Police," and going to a place like a talk radio station and saying "This host on this date espoused a thoroughly debunked viewpoint that has been proven false, demonstrably, with data, multiple studies, and testimony from experts all over the country. Because of this we're assessing your station a $50,000 fine, and $5,000 per day additionally that this host is on the air and has not yet fully retracted this statement, and accepted that they have been spreading misinformation."


> What "looks like corruption" that she hasn't condemned?

Hunter Biden working for Burisma. She was asked about it, and the contorted answer she gave to avoid condemning it is rather hilarious.

> You say this as though corruption isn't fought off regularly in basically all functions of Government?

And you're proposing giving a corruptible institution vast new powers to control public dialogue.

> And this idea that mass media should be policed for something resembling accuracy and honesty means the first amendment is dead is just ridiculous false dichotomy. We police speech CONSTANTLY.

Libel and incitement are very narrowly defined, and proving either is incredibly difficult. Those laws could be used as a political tool to suppress speech, which is why they're so narrowly construed.

I can predict with 99% confidence what will happen if the truth police are instituted. The truth police will be heavily influenced by the same industries and financial interests that dominate government today. Many views that you hold will be declared false. Black Lives Matter will be declared a form of incitement. Socialism will be declared a false ideology.

I'm just very surprised by how many Americans have decided after Trump's election that the First Amendment has to go. It's very short-sighted.


> And you're proposing giving a corruptible institution vast new powers to control public dialogue.

And you're continuing to argue as though the vast already thoroughly corrupted institution already doing that is somehow better.

> Libel and incitement are very narrowly defined, and proving either is incredibly difficult.

No shit. I'm saying exactly that laws to police speech can be done right. It should be a sizable burden of proof needed, it should be difficult specifically so that the Government can't simply say "that's bad, stop saying it." That's my whole damn point.

> I can predict with 99% confidence

Is this prediction based on something besides your own biases?

> I'm just very surprised by how many Americans have decided after Trump's election that the First Amendment has to go. It's very short-sighted.

I said no such thing. Please read my posts if you intend to respond to them.


> And you're continuing to argue as though the vast already thoroughly corrupted institution already doing that is somehow better.

No, I'm arguing that that institution should not be given vast new powers to determine what can and cannot be said, based on what it deems to be true and false.

> I'm saying exactly that laws to police speech can be done right.

The right way to do it is to not give the government the power to censor "untrue" speech in the first place. What you're proposing is incredibly dangerous, and would be a clear break with the 1st Amendment - the bedrock of American democracy.

> Is this prediction based on something besides your own biases?

It's based on a simple analysis of what would serve the interests of those who wield great influence in government. Since November 2016, I've heard lots of liberals argue for censorship of "untrue" or far-right speech. They seem to imagine that they can create a system in which they will determine what is true or acceptable, and that those definitions will be imposed on the "bad guys." I think that's incredibly naive. Lots of conservatives call Black Lives Matter "hate speech" and the like. If you give them a tool to suppress speech, they'll turn it against BLM and other causes they dislike.

> I said no such thing. Please read my posts if you intend to respond to them.

You're arguing against the 1st Amendment. You want to impose an official government standard of what is true and false, which can be used to muzzle speech. I get that you want this standard to be objective, but the obvious direction that such laws will go is towards wide-ranging political censorship. The 1st Amendment is central to American democracy, and one shouldn't play around with it. Leave it alone. You can't ban people you think are wrong from speaking. Try arguing against them, convincing them.


Not all ads necessarily contain statements that would force Facebook to take a side on any issue or become an arbiter of truth. They could simply restrict political ads to essentially just "Vote for [candidate]", or something similar.


You mean ads paid for by (American?) registered campaigns or anyone who buys an advertisement deemed 'political'?

It's easy to define an ideal ad or ideal bad-guy scenario, the problem is in the grey areas of categorizing such a chaotic, broad, and unpredictable group of content.

That's always what ends up making this sort of thing a bad idea. They're always at massive risk for false positives, selective enforcement, moral panics, etc. It's also a giant distraction from real problems in the world and provides little long-term ROI compared to all the work that will go into it.

It should be either all ads are okay with some obvious-tier moderation (ie, no ads promoting violence) OR no political ads at all. Let's not give FB et al even more power to choose what ads are okay, please.


What about ads that promote war?

That obviously seems to be promoting violence to me, but unfortunately, that doesn't seem to be a popular opinion.


Promote it in the abstract, or promote a very specific war?

The first seems inside the line to me (stuff like 'support our troops' or 'Join the army, take the fight to our enemy'). The second is past the line (stuff like 'we need boots on the ground in Syria, write your senator!').

I guess legally there is a difference between 'promoting violence between citizens' and 'promoting violence in war'. Since the first is usually illegal, and the second is not.


In general, government has a monopoly on legal violence. As such ads that promote war are promoting legal violence. Rules against promoting violence are (implicitly) targeting illegal violence.

At the same time, if FB reduced political ads (including any war/military related ones), I would be supportive.


How much money would Facebook actually lose by banning all political ads?

Seems like the amount of revenue they're getting from political ads might not be worth the headaches and distraction they are causing the company.


You might be right, but I think banning political ads would probably lead to people doing even more massive political astroturfing campaigns within facebook using the funds they would've spent on facebook ads.

So facebook goes from having a political ad problem where at least they get paid, to having a political manipulation/content problem (well a bigger problem than they already have) that they still have to solve but don't get paid for.


I think it's less about revenue and more about common carrier status. Though FB has recently put out confusing signals [1], it seems that if they start to editorialize what content is delivered, even just a fraction, they are liable to editorialize it all - at HUGE cost.

[1] https://www.theguardian.com/technology/2018/jul/02/facebook-...


They currently do all sorts of blocking on what content is delivered. Try to run an ad with nudity. The sorts of nudity that are routine in a European advertising context are blocked on Facebook, because Facebook chooses not to allow them. They could easily do the same for political ads, but they don't, because they choose not to.


This article erroneously claims that if Facebook is a publisher it loses section 230 protections. This is not the case. See https://www.eff.org/issues/bloggers/legal/liability/230 for more details.


Facebook is not a common carrier, and has never been one.


I guess the problem is that what is political and what is not is not always so obvious?


Looks like it's $81M in total: https://www.cnet.com/news/trump-clinton-election-facebook-ca...

Although I've read elsewhere that Trump's campaign alone spent ~$100M on FB.

Given the level of pressure on Zuck, I'd say he's not doing this to make more money. He's doing this to avoid establishing a precedent, and avoid negative repercussions later on. For him it pays (in the long term) to be as neutral as possible.


> That's not going to end well for them.

Define what you mean? Like users will stop using it or advertisers stop advertising?

Facebook IS already banning some categories of ads that could normally upset you - for example, you cannot advertise a gun store - even though firearm ownership is deeply rooted in American identity. But I don't see Americans leaving in hordes for some other venue. Same with sex ads, Viagra, etc. Facebook has its own reasons to block them, and it only "ends for them" with a higher stock price.


It's going to end badly for them in the form of more Congressional oversight, not less; more politicians yelling at them, not less; and more people angry at them, not less.


I’d be okay with facebook just not running political ads, tbh.


I don't get it, how come there's no outrage over political ads shown on TV? If it's a legal ad, why should Facebook police it?


Political ads on TV are regulated.



how so? I didn't see any mention of the content of TV or radio being regulated (other than the disclaimer of who paid for it) in my quick search.

This article appears to say that the content is unregulated. https://www.thebalancecareers.com/should-tv-stations-ban-fal...


But the ads in question were aired on TV; this letter is asking FB to go above and beyond what most TV networks are doing (CNN being the only one that did not air the Trump ad, objecting because it called them "fake news").


Because lying TV ads are old news. FB is an easy target, by all the parties.


They should just categorize all political ads as "likely untruthful and misleading" and label them as such. If you're a politician and you're advertising on Facebook, your ad should be labeled "This ad is political and is likely untruthful or misleading. Viewer discretion advised."


Imagine if Facebook required all Ads to be truthful and anyone found to have posted a false Ad would have all their Ads taken down for 2 weeks.

It's not like the Ads are going to go away.


Ads can be both truthful and misleading.


The problem is who decides the truth?

Children in cages

Abortion is murder

Both phrases are true or false depending on who you ask and the end result will be no ads for any politician because they all use questionable language that could be considered false.


They should have a 'truth' button to accompany the 'like' button, and so people can see who thinks something is 'true', and if they identify with that group. Then there is no censorship, and people also get some sort of crowdsourced 'truth' value that allows them to see the source of bias.


Sorry, that just sounds naive. It’s all well and good until the echo chamber you live in meets a different echo chamber, which has a different idea of what’s true. The resulting conflict would instantly devolve into furious tribal warfare with people using the ‘like’ and ‘truth’ buttons as synonyms. Except with the worse feature that a popular post would seem to be additionally endorsed by Facebook as true.


Good point. 'Like' and 'true' are not really different. Maybe allowing people to 'like' political ads (or ads in general) would have the same effect.


Bots and troll farms would be popping champagne the day it was introduced.


How do TV stations do it? How does the radio do it? Use those same rules.

I feel like ads should have equal time provisions too.


There was an ad that ran on TV showing Paul Ryan pushing a grandmother off a cliff. So Facebook is free to run that, right? Swiftboating, the GWB draft-dodge charges (later rebuffed after CBS reported them), the famous "Daisy" ad run against Goldwater, etc. I'd love to see where people draw the line.


Giving two wolves equal time to discuss their bid for county shepherd is, in itself, not going to produce better governance. (And it will happen to entrench the false dichotomy of 'there are two, and only two, valid choices.')


In this analogy, if the people are truly sheep, there is nothing to be done. Democracy is finished. The entire premise of Democracy is that voters can be trusted on average to make correct decisions on who should lead. If that premise is false, the wolves win in any scenario no matter what.


If 60 voters want X offered by two candidates but 40 want not-X from one candidate, not-X wins 40-30-30. It's not a false dichotomy, it's the flawed outcome of plurality voting.
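
A toy illustration of that split under plurality rules, using the hypothetical 60/40 numbers above (a sketch, not data from any real election):

    from collections import Counter

    # 60 voters prefer X but split between two X candidates; 40 prefer not-X.
    ballots = ["X1"] * 30 + ["X2"] * 30 + ["notX"] * 40

    winner, votes = Counter(ballots).most_common(1)[0]
    print(winner, votes)  # notX 40 -- not-X wins even though 60% wanted X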


There is regulation for political ads in almost any medium. I don't see why Facebook is exempt from that.

The anger towards Facebook is easy to understand as they categorize Breitbart News as "trusted" in their News section.


Agreed. I think the only right answer here is to just ban them all, especially if it's not a significant source of revenue.

Ruining the brand for pennies on the dollar is not smart.


Political ad spend is expected to be 6 billion USD. It is by no means a small market. Not to mention the influence that gives in lobbying.


Would anybody besides Facebook's shareholders be upset if they just deemed political advertisements verboten on the platform?


From what I've read on the subject, it seems like limitations like this generally serve the incumbent (who benefits from default name recognition that challengers lack).

This is perhaps a reasonable approach (many people in this thread are suggesting it!), but it seems like this is a thorny problem because there is no "neutral" position.


"Put them all up against the wall" is pretty neutral. Just saying.


They could ignore truth all together and simply focus on factual accuracy.


It’s this type of subtle, rational-sounding pacifism that paves the way for evil to erode societies. Regulatory institutions have to actively and vigorously prohibit brainwashing of the masses. Even if it’s unpalatable, it’s the only real way to preserve democracy.


"I plan to do X, vote for me" truthful.

"Don't vote for Y, they want to kill babies" not truthful.

It's pretty clear to me which ads are in bad faith. I really just wish Americans were bright enough to see through the bullshit themselves but alas, here we are.


> Wherever they draw the line, they're going to reject someone's ad, and be exposed to screams of how they're biased for the other side

Today it's not about being "unbiased" and objective; it's about receiving equal amounts of bias allegations from both sides. That's what the NYTimes has done: liberals are saying it's not harsh enough on Trump.


In the recent Canadian election, the conservatives used Facebook to target Asian Canadians with ads falsely claiming that the liberals wanted to legalize hard drugs.

It shouldn't be Facebook's job to police that, though; we should have legal recourse when that happens.


There are some things that are just provably false ("Donald Trump voted for the green new deal") - these ads should not be run.


Yet, that one is right next door to a very frequently deployed political lie: $POLITICIAN voted against $VOTER_DESIRED_POLICY, when in fact what they did was vote against some massive bill primarily about some other topic entirely that contained some head nod towards $VOTER_DESIRED_POLICY that may or may not even have done anything, or voted against a bill that did indeed have $VOTER_DESIRED_POLICY but had so much other crap in it they couldn't vote for it for some unrelated reason.

Teasing that sort of thing out is above the pay grade of anyone Facebook is going to hire to resolve it.


FB is either a service provider or a content moderator.

One brings liability, the other provides an exemption.

I know what I would choose. Let the people the ads lie about sue the ad purchaser.


They are clearly a content moderator. You can tell that by trying to post a nude picture.


Content moderation does not reduce or remove the liability protection of section 230 of the Communications Decency Act.


Not real sure how you can put political campaign ads in the same bucket as child pornography and still claim "good faith" ?


Where did I say anything about child pornography?!

Are you perhaps confused because I mentioned the Communications Decency Act? It's true that one of the purposes of the CDA was to regulate pornography on the internet (although I don't think it was about child pornography--the pornography concern was about children seeing pornography on the net), but those parts of the CDA were struck by the Supreme Court in Reno v. American Civil Liberties Union, 521 U.S. 844 (1997).

The CDA also, in section 230, said that providers and users of interactive computer services are not to be treated as the publisher or speaker of any information provided by another information content provider. Section 230 was not struck down in Reno v. ACLU, and remains in force.


Intent. The law always uses intent. To compare political ads to the list below is facetious at best; I was calling out one example of content that could be moderated with protection and comparing it to a political ad. You seemed bothered by the comparison, as most rational folks would be. Not sure why this is difficult to understand.

(2) Civil liability: No provider or user of an interactive computer service shall be held liable on account of—

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or


Actually, law does not always use intent. Intent is the primary basis for one of the philosophies of jurisprudence, not every judicial body/judge follows that. See textualism for an example where intent is irrelevant.


Found the lawyer


Hah. I'm flattered, but I have never been remotely associated with law either through education or through work. :)


Would you agree "good faith" implies intent is measured ?


Good faith implies intent of the entity is measured, not necessarily of the way the law is written. Textualists like Gorsuch would actually not read into the intent of the law writers and interpret "good faith" how he pleases cause that's as specific as the written law is. Don't you agree?


Not sure I know the supreme court well enough to judge their judging

I just try to interpret the laws as they are written with my limited legal experience :)

That said I think I can agree


One thing to note: this only applies to politicians. It seems normal people and political groups can't post false political ads. https://www.washingtonpost.com/news/powerpost/paloma/the-tec...


Very much this. The top comment on the thread is just ignorant of the situation.


This is what makes Facebook's behavior really inexcusable. Facebook has granted these people a special power on the platform, but are unwilling to police it.


Once again you are damned if you do and damned if you don't. If you like, imagine a totally random hypothetical set of Facebook rules (for example, the exact reverse of what the current rules are). You will find that it is easy, really completely trivial, to come up with reasons that this is "inexcusable" as well.

The whole discourse around this is poisoned. I have literally watched people completely flip their principles in mere minutes, just to keep condemning Facebook.


You're not damned if you don't. Stop hosting political ads. I guess then you're "damned" because you make less money, but I don't think that's what you meant.


And what is a political ad?

Does an ad for an abortion clinic count? How about an ad asking for donations for Palestine? Or Hong Kong? Or wall construction? How about an ad that only points out that stocks have gone up, along with a picture of Donald Trump and a flag in the background?

Obviously this falls prey to the classic problem, which is that every partisan sees messages that benefit the other side, even incredibly indirectly, as insidious manipulation, while dismissing outright propaganda from their own side as “just getting the truth out”.


The FEC already has rules on what defines a political ad, including mentioning or displaying a candidate within a certain window before an election. There are additional distinctions based on how the ad was coordinated, the type of organization behind the ad, etc. Those advertising politically in a federal election are required to report their activity to the FEC.

So if an independent org ran those three ads, only the third could be considered political for election purposes.


What about all the political ads that are not connected to federal elections? People in this thread keep throwing around "political ad" as if it is a synonym for "election ad". It isn't.


I think tying it to the source paying for the ad would be an easy delineation.

Is the purchaser a PAC, political campaign, or someone personally running for office? Political ad.

Is the purchaser a private individual, or company? Nonpolitical.

I'm not an expert in this area, but I'm fairly certain the FEC and IRS already have designations on what counts as political; one could just use those.
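
A minimal sketch of that purchaser-based delineation, assuming hypothetical purchaser-type labels rather than the actual FEC/IRS designations:

    # Hypothetical purchaser categories; the real FEC/IRS designations differ.
    POLITICAL_PURCHASER_TYPES = {"pac", "political_campaign", "candidate"}

    def is_political_ad(purchaser_type: str) -> bool:
        """Classify an ad as political based only on who is paying for it."""
        return purchaser_type.lower() in POLITICAL_PURCHASER_TYPES

    is_political_ad("PAC")                 # True
    is_political_ad("private_individual")  # False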


Why not err on the side of denial if there's any doubt? The people generating such content don't need to be forbidden from generating it; but there's a difference between sharing it in communities or groups or with friends, and paying to have it inserted into the News Feeds of people who didn't ask for it.


>> Does an ad for an abortion clinic count? How about an ad asking for donations for Palestine? Or Hong Kong? Or wall construction? How about an ad that only points out that stocks have gone up, along with a picture of Donald Trump and a flag in the background?

FB has armies of content moderators and content policies to police non-political speech already. Look at the controversies surrounding "no nudity/sexual imagery" > "banning/unbanning breastfeeding pics" > "banning/unbanning pics of the African tribe who nurse sick goat calves back to health with human breast milk".

It's all complicated, every bit of it, and there is no reason to suggest that political content moderation is any more difficult than anything else.

FB _can_ do it; they are already doing very complex content moderation. There is nothing special about "political" speech.


> Look at the controversies surrounding "no nudity/sexual imagery" > "banning/unbanning breastfeeding pics" > "banning/unbanning pics of the African tribe who nurse sick goat calves back to health with human breast milk".

That is entirely my point. This is actually a famous story -- something as objective and straightforward as banning nudity turned into a nightmare with tons of exceptions and exceptions to the exceptions. Politics is the most delicate, messy, complicated subject there is. Facebook obviously can come up with some policy, just like I could right now, but whatever it is, a majority of people will be pissed off.


> there is no reason to suggest that political content moderation is any more difficult than anything else.

It pretty surely is. Everything can be politicized, or be perceived as politicized by the other side. Even a breastfeeding image can become political in an election context. Will FB take down posts that laypeople make about candidates? Even the ban itself would become an extremely contentious political issue. That's a very messy situation that directly infringes on a bunch of rights, from freedom of expression to association to basic voting rights.


It might be worth considering the political fallout of the platform Congresscritters use to get elected refusing to deal with them.


250 signatures on a letter at a company with 35,000 employees is an eruption of dissent?

I'm reminded of a quote about urban warfare, something like "In a city of 10 million, if 1% of the population opposes you, you have 100,000 adversaries." That seems to apply here.

0.7% of the company is writing to complain? Okay - what amount do you expect to complain? How many would complain about the opposite direction?


Publicly stating your opposition to a contentious issue your company is facing isn't exactly an easy thing to do. Workplace marginalization is a real thing.


It's pretty common at Facebook. Andrew Bosworth repeatedly writes about the importance of malcontents, and there are many employees who repeatedly push back on leadership. This takes place almost entirely on Workplace.


I'm not sure why this is getting downvoted. This is correct.

I've never worked somewhere where challenging leadership is so encouraged.


Don't you think workplace marginalization would cut the other way? If you have 250 people angry enough about this to leak to the press, imagine what they'd do to their fellow employees who oppose them.


This piece is a repeat of a strategic smear, for the obvious reason that more Republicans and moderates can be reached through Facebook than Democrats, so if they pressure Facebook into not running political ads, they win a bit, and if they use the process abusively, they win also.

You can tell by the way this issue was raised when Zucc was at Congress the other day: AOC just sat there spouting incoherent nonsense that could be cut into soundbites, and was 100% uninterested in Mark's clear answers to her non-questions. This one is entirely about framing.


The media went after Facebook when they started losing all that sweet money to them. The current political situation was just icing on the cake. As much as I despise Zuck and FB (I find their business model to be vile), it's more than obvious the media has a strong agenda and really is after them, no matter what Zuck or FB does.


I don't want a privately owned platform that today plays a huge role in hosting public discourse to be the one regulating speech.

Of course, and it pains me to have to state the obvious for fear of strawmanning, save for the applicable boundaries like distribution of child porn, terrorist groups' recruiting pages, and so on.

Take Twitter as an example: it does an abhorrent job at that; it has no standard other than liked/disliked people.

- They have no transparency; people you follow will get banned and you won't be notified.
- People will get banned for tweeting exactly what "liked" people tweeted.
- They made the utterly stupid decision of changing the interpretation of their Verified mark from "this is the real person", which is perfect, to "we kind of support whatever this verified person says", which makes no freaking sense. Can you imagine a Bell PR guy saying at a press conference, "We are sorry for what one of our landline customers said on the phone, we turned their line off"?

It is clear to me that any effort to make platforms like Facebook, Twitter, and so on take on the role of speech regulators isn't coming from regular people; it comes at the detriment of common folk like you and me.


> I don't want a private owned platform that as far as today plays a huge role in being a place for the public discourse to regulate speech.

They aren't any more influential than news broadcasters for television were two generations ago. And yet, we managed to create satisfactory regulations for broadcast networks that did nothing to impede free speech.

I think we have a choice: regulate them such that a bare minimum of due diligence is required to post political ads; or let the problem persist until someone else "solves" the problem for us. Enemies of the USA are using Facebook as a weapon to undermine our country. There's no longer a doubt in my mind that, left unchecked, a persistent, well-orchestrated propaganda campaign (through Facebook) could have Americans embracing a truly authoritarian regime. At which point, we won't need to worry about free speech anymore.

Maybe it's selection bias, but I know a whole lot of people who were radicalized by FB.


> I don't want a private owned platform that as far as today plays a huge role in being a place for the public discourse to regulate speech.

You also have to remember that part of free speech is the fact that I can refuse to use a platform that deliberately enables dis/mis-information.

There's also a clear case to be made against allowing explicitly false things to be run as ads.

If something is provably false I have no problem with them blocking it. There are no free speech arguments to be made here. There's no benefit to allowing it.


I downvoted you because you are basically saying that if we don't agree with you, we're not regular people or common folk.


> I downvoted you

I initially frowned upon your comment, interpreting it as self-involved, ego-stroking grandstanding. Upon further reflection, this is a forum, and one that aims specifically for intellectual discourse. If we are not simply to degenerate into some kind of karma farm, I think opening the conversation in this manner is a good idea.


No, the comment was about the source of the driving force of the effort. Even if there are regular people who share the same sentiment, the fact is that the effort is being coopted by those special interests. Have you heard of the term "astroturf"?


Exactly, thank you. It is clear some regular people approve, support, and demand that too. But I think the bulk of the effort comes from big companies, NGOs, and pressure groups.

This is at least true for my country; to put it brutally short, European influences are guiding legislative efforts regarding fake news and media control.


It would give Facebook way too much power if they could decide what is true and what is not. Better to allow lies than to block free speech.

I wonder if this open letter by employees advocating for more control over content, combined with Mark Zuckerberg's 'hands-off stance on political ads', is just a coordinated act of 'good cop, bad cop' designed to manipulate the public. Also, my cynical side thinks that maybe some of these government authorities are in on this charade.

It seems like a show to make people think that the good employees of Facebook are on the public's side. Whatever the big mean Zuckerberg wants must be bad for everyone.

Facebook must have a PR team the size of a small country working for them by now. Of course everything they do is orchestrated. We have to be really cynical to see through the BS.

The government is completely under the thumb of these big corporations. Many of the regulations that are coming out of Washington are carefully crafted by corporate lobbyists to superficially look like they're bad for corporations and good for the public, but in reality they're intended to give corporations more power and to create a moat around their monopolies. The government and corporations are on the same team; their common objective is to fool the public into slowly accepting the erosion of their most basic rights so that corporations can have more money and governments can have more power for themselves.


> Facebook should allow politicians to lie in ads

There are more options than "Facebook regulates its ads" and "Facebook permits everything."

I am in favor of having a permanent, public repository of political ads, where they have to sit for a "cooling-down" period prior to going out. This lets journalists and the public fact-check them before they're blasted to a targeted group.

Another option is FEC, or some other independent body, overseeing political ads. There are a lot of options between open season and censorship, and it's only to Facebook's benefit that the debate is constrained to this axis.


We don’t need journalists to have even more of a strangle-hold over political discourse. Because you know that any such institution would just let pass false information that just happens to comport with liberal beliefs, while giving enhanced scrutiny to assertions from other viewpoints.


I was just going to upvote and keep scrolling, but I saw the three replies to your post critique something you said that I strongly agreed with.

> We don’t need journalists to have even more of a strangle-hold over political discourse

This is so true in this day and age. The quality of journalism has collectively nosedived, but the journalists keep producing and continue to guide the “conversation”. I’m not quite sure what those who disagree with the statement actually do believe is going well and why journalists need to have even more control over the conversation.

> ...would just let pass false information that just happens to comport with liberal beliefs

This is the natural course for a technology company in modern society. The biases of tech company employees are clearly more left-leaning. If Facebook employees or contractors were deciding what’s true and false, the information displayed on Facebook would (does?) have a leftist slant. This is a bad thing to any fair-minded person and why Facebook shouldn’t be regulating free speech in political ads.


> We don’t need journalists to have even more of a strangle-hold over political discourse.

This statement doesn't make sense to me. Nothing to do with conservative vs. liberal biases; just that the majority of current political discourse does not go through even nominally journalistic outlets, let alone rigorously journalistic.


> We don’t need journalists to have even more of a strangle-hold over political discourse

How does "a permanent, public repository of political ads" give journalists any advantage? I just mentioned them as one potential user of a public database.


If anything, I think journalists need more influence on political discourse. Currently the discourse is controlled by media companies, not journalists - and there is a big difference there.


> This lets journalists and the public fact-check them before they're blasted to a targeted group.

I believe this proposal to be worse than the current state of affairs. I would not delegate censorship of what I may read or watch to a non-elected political body, and doubly so to one which skews >90% to one political party.

> Another option is FEC, or some other independent body, overseeing political ads.

Once again, I do not understand this rationale! I assume you do not like how one party in particular is acting, but you would delegate to that very same party the power to oversee the censorship of all online electoral ads?


> I would not delegate censorship of what I may read or watch to a non-elected political body

Who said anything about censorship?

What I proposed has two parts. One, when a political ad is bought, it must wait a certain amount of time (I propose a week) before being displayed. Two, immediately on being bought, it must be made publicly accessible in a permanent database.

The latter lets anyone fact check the ad. They DON'T get to take down the ad. But the public gets time to evaluate it before it's blasted to users. This deters micro-targeting groups with misinformation. It also creates a public record of what a campaign has said to whom, for contemporaneous and later reference by the public.

A lot of people got triggered by the term "journalist." I'm using that term broadly, to include anyone from a hobbyist with a blog to the mass media. I fundamentally disagree with censoring political speech, even lies. But the current system is unworkable. We need heightened and broadened disclosures for micro-targeted ads.
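
A rough sketch of how that two-part mechanism could be modeled; the field names and the one-week delay are just the assumptions from this comment, not any real system:

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    COOLING_OFF = timedelta(days=7)  # the proposed one-week waiting period

    @dataclass
    class PoliticalAd:
        buyer: str
        content: str
        purchased_at: datetime

        def can_be_displayed(self, now: datetime) -> bool:
            # Targeted delivery only after the cooling-off period elapses.
            return now >= self.purchased_at + COOLING_OFF

    public_repository: list[PoliticalAd] = []  # permanent, append-only record

    def buy_ad(buyer: str, content: str, now: datetime) -> PoliticalAd:
        ad = PoliticalAd(buyer=buyer, content=content, purchased_at=now)
        public_repository.append(ad)  # visible to fact-checkers immediately
        return ad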


> But the public gets time to evaluate it before it's blasted to users.

I believe I understand the implied difference between sequestering a video somewhere where it can be evaluated by the public and simply "blasting it to users" alongside content they are consuming, but I wonder if sequestering videos in this way would have the effect you think it will have. It's a system that would reward highly provocative videos with the potential to immediately go viral regardless of their intrinsic merit or truthfulness.

> A lot of people got triggered by the term "journalist."

Right. The issue with the system you're describing is that entirely too many people will react to an online story from the NY Times pointing out a dozen untruthful things about the latest, newly introduced and sequestered angry political ad from whoever by delighting in the distress caused by the lies in the ad, joking about how the libs have been owned, passing links around to the video, and generally spreading falsehoods about the critiques of the ad and the people doing the critiquing. If the whole trolling effort is successful then it will become a meta-story about stupidity and falsehood among the people who aren't dumb enough to believe whatever is being asserted in the ad, but the one thing that will not happen is everyone joining together to condemn the falsehoods in the ad.

> I'm using that term broadly, to include anyone from a hobbyist with a blog to the mass media.

Right, and as nice and democratizing as that can be, the practical effect is often to legitimize bullshitters and put them on the same level as serious people. I am afraid in the example above that is part of how these ads would go viral.

Or maybe it's not all that bad. I guess I hope not. I do think your idea is interesting.


So long as it deters misleading ads, does the mechanism really matter? It doesn't have to be strict consensus.


I am not at all sure it would deter misleading ads, since the truth or falsehood of a given claim or set of claims does not necessarily matter much to a chunk of the population. I am pretty sure the mechanism the OP described might reward especially outrageous ads designed to break out of the cage and go viral, in the manner I described.

I'm not saying it's not worth a try.


Option 2 will create status quo bias. The system in power will marginally favor ads that keep the current system in power.

Option 1 can't be effective unless you also know the targeting criteria of the ads. Otherwise, interested parties cannot create 'counter propaganda' at the same cost/benefit ratio as the original propaganda campaign. And all it is really doing is decreasing the time to counter-propaganda, not reducing the volume of it.

good ideas tho


And yet Facebook is a privately owned company and thus should be free to allow or deny any content including particular political ads. Isn’t that the whole freedom of speech thing?


Facebook, like any corporation, has a charter granting it extraordinary powers by the society in which it operates. Implicit in that grant is the idea that the corporation betters society by its existence. If, by reaching a certain scale, that society by necessity has to say "hey, you need to do X because you're not keeping up your end of the bargain", that's completely fine.


> Implicit in that grant is the idea that the corporation betters society by its existence.

Really? I don't think that is a thing, but happy to learn otherwise with a reference to some legal doctrine or legislation that spells this out.

That isn't to say that there can't be additional legislation that regulates an industry beyond the legal infrastructure that enables the formation of a corporation.


You're asking for "legal doctrine". I am pointing out to you telos.

I mean, forget stakeholder theory and other minimally responsible theories of operation, because that stuff is kind of new--just look at judicial dissolution. It exists on the books and it's been done in practice. That ability would have no reason to exist if it were to not protect the United States and its citizens from the malfeasance of a corporate entity to which a charter has been granted.

But this argument is wholly backwards. Why should corporate entities be granted any rights at all? We deign to grant them the status of existence. We, the people, not the fictitious ones but those pesky "natural" ones. Why should a corporation exist for any other reason than to better our society? Why should a society empower random individuals to create liability-shielding immortal pseudopeople if that society doesn't think they're supposed to make that society better by existing?

Hell, go even further. Why should a society do anything if it doesn't make the society in aggregate better for the doing?


> Why should a corporation exist for any other reason than to better our society?

Because it represents liberty. The freedom for people (natural ones) to assemble and organize and pursue their own interests as they see fit.

Our social contract is that the government protects the rights of individuals not that individuals serve the abstract needs of "society".

Of course that doesn't mean that a corporation (or any other group) is outside the law. I'm just pushing back on your notion that there is some abstract litmus test that says a group (corporate or otherwise) must "better our society". I don't even know what that really means and it sounds to me very much like the tyranny of the majority.


> Because it represents liberty. The freedom for people (natural ones) to assemble and organize and pursue their own interests as they see fit.

Liberty requires limited liability now, does it? Strikes me as a very feudal idea of liberty--the liberty to do unto others but to escape them doing unto you.

> I don't even know what that really means and it sounds to me very much like the tyranny of the majority.

It means that a corporation is trusted to not be an asshole to the people giving it its literal-not-figurative superpowers and when that corporation breaches that trust its fictional life can and should be forfeit.

What you call "the tyranny of the majority," I call "a community and a polity having the right to not be mooked by the first con man to come down the pike." Because people, actual people, matter more.


I feel like you are moving the goalposts. I explicitly said that no group is above the law, so I'm not exempting fraudulent behavior by any means. Nor did I say that liberty required limited liability. I've tried to suggest that my thoughts are about all sorts of groups, not just limited liability corporations.

What I was pushing back at was your suggestion that a corporation must "benefit society". That particular phrasing is completely open-ended in my mind, suggesting way too much power to be wielded if you aren't "benefiting society". Perhaps that isn't what you intended, but IMHO appeals to "improving society", "helping the children", or "the greater good" in the abstract are either power grabs or empty phrases. The specifics matter.


"All sorts of groups"? Only a very few sort of groups need special legal imprimatur to exist. And if you want to be granted special and extraordinary treatment by a community, why would that community not expect you to not be an asshole about it? If you don't want to uphold your end of that bargain there's always the general partnership. It confers no special legal privileges, it's purely contractual, but it's actually a "group of people" rather than A Fictitious Person Created To Avoid Liability. In my experience conflating these is never, never, done for good reasons, so I'd certainly like to hear yours. Maybe it'll be the exception, so lay it on me.

Here's a concrete example of what's actually a power grab: the Dole company literally-not-figuratively knocking over the Hawaiian monarchy and the subsequent creation of the "Republic of Hawaii" as a Dole-owned, Dole-operated state. Why is that not a "power grab" about which you get up in arms? Why is it that it's a "power grab" to say "hey you have a duty to not do that kind of shit"?

What you are categorizing as a "power grab" is a refusal to accept the status quo as legitimate just because it happens to be where we're at right now. Why is it that that advocacy, the use of the state to control its creations, is the "power grab" here--and not the accumulation of that power in the first place?


The silence speaks volumes.


Or maybe I just got tired of trying to explain myself.

You keep changing the topic and I keep trying to come back to my observation that requiring a corporation to "benefit society" is not actually a thing in law and would be a bad idea in any case.

If your definition of "benefit society" just means "doesn't break laws". Then I would agree with you but there really isn't any need for such indirection and expansive language about "benefiting society" in that case.

So I gathered that you were suggesting that there is some administrative hoop that a corporation must jump through to convince you that they are "benefiting society". But that language is hopelessly broad and just begging to be abused by anyone in power who doesn't like what a disfavored group is doing (even when they aren't breaking any laws).

I bet there are a lot of people who think that Fox News isn't "benefiting society" and I'll bet there are also a lot of people who think that CNN or MSNBC aren't benefiting society. Are you telling me that the majority opinion should hold here and organizations disfavored by the majority can be legislated out of existence because they don't "benefit society"?


Agreed. The ability to assemble and organize and pursue interests is distinct from the specific mode of organization of the limited-liability company, where stockholders (the owners) are protected from liability for the actions of the company. Other modes of organization (e.g. a partnership) are possible, and could be argued to have stronger social outcomes due to the lack of insulation from liability.


This is such a tired argument, but I'll throw this out anyway: the parent isn't claiming that Facebook is legally prohibited from selecting which content they allow through their platform. They are legally allowed to choose to block certain content, but this does not make them exempt from public criticism.


They do seem to have successfully shifted the debate to whether or not they should be deciding about the ad instead of why they are somehow exempt from FEC regulations.


I'm pretty sure that the government is in on it. Every elected member of government is probably just thinking to themselves "I better be really nice to Facebook if I want to get myself re-elected".


I have a lot of issues with the framing of this article (it's hard to imagine any major strategic decision of Facebook that you couldn't find 250 employees to sign their name opposing it, and nearly everything that happens at Facebook has a corresponding, public, Workplace post).

Moving past that, the ideas mentioned in the final paragraphs did have one interesting suggestion: change the visual display for political ads. Zuck has consistently made the good point that it's very difficult to set a clear boundary for what constitutes a political issue, but it is not difficult to determine whether or not an ad is being run by or in service of a given politician. Changing the visual display (even something as draconian as a persistent disclaimer stating that this is an advertisement with claims made by a politician and that everyone should do their own research) would at least remind people of the policy.


Why should Facebook now be responsible for fact-checking political ads? Watching the testimony where a bunch of politicians basically berated Mark Zuckerberg about how it’s now Facebook’s job to police politicians and keep them honest (because, you know, ALL politicians lie and cannot be trusted) is very telling about the state of our government and democracy. Our leaders cannot police themselves or their peers so they are looking to an outside entity to do it, and moreover casting blame for their own failures.

Political attack ads have always been on cable TV, spouting bald-faced lies and half-truths for as long as I can remember. It now seems that politicians have found a new medium. And they want that service to bear the brunt of their operational status quo. Why not address the real problem in politics that leads to the symptoms of the disease at hand instead of shifting the work and burden of honesty to someone like Facebook? Has it even been proven they are equipped and capable of the task?


Forget about "equipped and capable". Who decides what is "true"? Is it the US government's official version of "truth"? Is it "truth" that has the most objectively provable information? How are vague and/or ambiguous claims, about anything, supposed to be regarded in relation to their "truth"? Only someone who is incredibly stupid and/or trying to advance their own agenda would suggest that there is some sort of indisputable "truth" that can be discerned, let alone discerned by Facebook.

The easy answer to this problem is, and has always been, to teach everyone how to think critically. Teach the Socratic Method to every child from birth, and reinforce that mode of thinking continually through their entire educational journey. But there lies the rub. The powers that be aren't interested in "truth" or people who are able to discern truth. They want to be able to disseminate their message, and have it uncritically absorbed by the masses. That isn't possible if the people are intelligent and equipped with the tools to think critically and recognize logical inconsistencies.

For decades they had it both ways. The government had a population who largely lacked critical thinking skills and uncritically absorbed what they were told as "truth". The government also largely controlled the message disseminated by the corporate media outlets, which were very few before the explosion of the internet. They controlled the narrative, and had conditioned a population that uncritically accepted this narrative. Once the internet exploded on the scene, they lost control of the narrative. All of a sudden you had people who weren't equipped with critical thinking skills, who were extremely vulnerable to whatever narrative was being pushed, and you had a wide variety of people pushing a whole range of messages on the internet and over social media.

Now the genie is out of the bottle. Information and (dis)information spread like a virus across the internet and social media. How do you deal with this? There are two ways. Either make a concerted effort to teach people how to think, hash out information, and employ logical consistency in their thinking (which the powers that be don't want to do, because it means the permanent loss of control), or try to put the genie back in the bottle and silence all competing narratives. It's clear that the powers that be have chosen to attempt the latter, and it won't end well for anyone who believes in freedom of speech, freedom of action, or freedom of thought.


> For the past two weeks, the text of the letter has been publicly visible on Facebook Workplace, a software program that the Silicon Valley company uses to communicate internally.

For those who have never used Workplace, this literally just means "someone posted it to Workplace." It's not an abnormal or unique thing. It also wouldn't surprise me if "250 people signed it" means "250 people commented in agreement". I wish the reporting gave more details on who posted the petitions and what it means to "sign the petition". I understand protecting sources, but unless Workplace has added new features, anything posted has to come from someone with a profile.

That said, it's still (arguably, at least) news to cover internal divisions over a policy, but unfortunately the authors don't seem to realize how common it is at Facebook for employees to openly push back on leadership decisions while concurrently working as hard as they can to deliver impact downstream of them (it may sound odd, but it's entirely possible to disagree with a strategy and vocally advocate for your preferred course but also trust that your leadership may be better equipped to set said strategy and work to implement a strategy that is not what you would have chosen).


Is less than 1% of an employee population expressing disagreement worthy of a discussion on HN?

NYT has completely dropped their mask of objectivity. This is clearly agenda pushing by an extremist minority.

WaPo called 'al big-daddy' an "austere scholar" yesterday. Beyond ridiculous.


Personally, I agree with you in this example. As I mentioned elsewhere, given the culture at Facebook and its size, it's hard for me to imagine any major strategic decision that would not generate 250 employees willing to sign their name opposing it, down to things like eliminating single-use plastic.

In my original comment, I was simply conceding that covering internal disagreement about a major policy, in general, is at least arguably news, while still trying to make the rest of my point: that this article is either written in bad faith or wildly unaware of how employees communicate internally at Facebook.


It seemed to me you had insider knowledge of FB and were adding value to the discussion. Fair point that I missed your key point, which is based on knowing internal politics/procedure. I look at the big picture of the U.S. Constitution, and distributed power.

I believe in the Fediverse, and would rather take your employer out in the marketplace, than using the federal government's monopoly on violence. You might want to reconsider your alliances.

Find me a hedge fund that will short FB's stock and invest in Mastodon-based service companies (no ads or tracking), please.


I'm... not sure what you mean about reconsidering my alliances?


Facebook has no business saying which ads are wrong/lies. If they do this it opens up such a can of worms. Their stance is the only logical stance. I imagine if AOC's ads were blocked on Facebook she would suddenly want answers and claim she was censored.


> Facebook has no business saying which ads are wrong/lies

I mean, it can and it does. The problem for Facebook is who Facebook should pander to: politicians that want to break up Facebook, Google, and co, or their opponents? There is an obvious conflict of interest, but many businesses have no problem promoting this or that political camp or politician.

What is interesting in Facebook case is the internal struggle between the top and some political activist employees who disagree on who Facebook should give a platform to.


The people that want to break up Facebook are the same people that want Facebook policing political speech. I don't see the conflict of interest on Facebook's part in this regard.


You're making an awful lot of assumptions. I want to break up Facebook; but so long as it exists in its present form, I don't want it to be policing any speech, much less political speech.


Yes, but the dissident employees also want Facebook policing political speech.


This has come up in my conversations with friends who work at Facebook, and they always seem to use some internal talking points about creating a "Ministry of Truth" type of situation. They argue that Facebook cannot (or should not) be the arbiter of truth. My answer to them is very simple: if you want to be a (social) media company, then you have to take some (social) responsibility and not amplify falsehoods in an already charged environment. Corporate profits at the cost of ruining society by spreading falsehoods should not be an acceptable norm.


Facebook already bans ads that call for violence or try to intimidate voters. That's taking some element of social responsibility.

Gauging what is and isn't true is not a trivial task. Say a party runs an ad opposing nuclear power, calling it dangerous and unsafe. The opposition phones Facebook and complains about a factually incorrect ad, pointing to the fact that nuclear energy has inflicted much less harm per megawatt hour than most if not every other energy source. Should Facebook pull the ad for being false? Swap out nuclear for pretty much any other controversial opinion, like the effectiveness of busing or the effects of taxation on the economy, and we can see that determining truth is a nigh impossible task.

Making Facebook an arbiter of truth seems like a remedy that is worse than the illness.


We're acting like this is a problem that affects both parties equally when it's clearly not. Who are the Russian trollbots helping? Who denies climate change and climate science? Who routinely lies? It's primarily one political party. So Facebook saying they don't want to get involved is just to appease Republicans, and they even do things like add Breitbart to the "quality news" tab... they're clearly not doing all this to be impartial or neutral -- it's purely to appease Republicans who have to lie and deceive voters or else they'd lose.


> Who are the Russian trollbots helping?

Russian interference was both for and against both parties - most estimates put them at 50-60% in favor of the right and 40-50% in favor of the left. They're not helping anyone; they're trying to sow discord wherever they can.

> Who denies climate change and climate science?

I regularly see people claiming that if no action is taken in the next 12 years humanity will go extinct. This runs contrary to mainstream climate science.

> Who routinely lies?

I think I've demonstrated why this is problematic for Facebook to determine.


Bullcrap false equivalency without any citation for those stats you pulled out of thin air. Russian trolls aren't helping both sides... and when they help Democrats or progressives it's a stooge like Jill Stein or Tulsi Gabbard, to siphon off votes and help Republicans. Russian trolls are primarily helping Republicans because Republicans are weak on foreign policy toward Russia and a lot of Russian oligarchs have been giving money to Republicans. It's clearly happening more on one side than the other, so again the false equivalence and whataboutism to try and paint both sides as the same doesn't work. Putin is mad as hell about the Magnitsky Act -- passed by Obama -- in contrast to Moscow Mitch McConnell and Trump, who have repeatedly weakened sanctions on Russia and gone out of their way to aid Russian foreign interests -- no comparable enablement on the left...

Denying climate science and being hyperbolic is also false equivalence. The only climate deniers in developed countries are Republicans... it's clearly an issue specific to them. One side is arguing the problem doesn't exist and you're acting like a few extremists on Twitter speak for the left. Again, not an intellectually honest or logically supported argument.

You really have only demonstrated your own biased perspective and false equivalence and whataboutism that’s used to pretend like both sides are the same when one is clearly worse than the other.

Go look at how Trump and McConnell reduced sanctions on Russia, and now a Russian oligarch connected to Putin is building a steel plant in Kentucky. The quid pro quo is so obvious. Look at how Trump handed Syria to Putin on a platter... It's so obvious Russia is helping Republicans and not Democrats; I can't even comprehend how someone would think they're helping both sides unless you're just not getting beyond the conservative media bubble.


And I present exhibit A of why tasking Facebook with categorizing truth and falsehoods is never going to work. It will inevitably lead to charges of bias from people who think Facebook is being partisan with its categorizations, and prompt responses like these left and right.


And precisely who do you think should be appointed to head your “ministry of truth”?

For example, I believe Facebook employees currently skew 90% Democrat. Should the committee seek out and add more Republicans in order to have a 50/50 split reflective of the nation, or is a 90/10 split okay? If so, how do you justify this to the other side?


> And precisely who do you think should be appointed to head your “ministry of truth”?

The FEC. If an ad can't be printed in a newspaper or shown on TV, it shouldn't be on FB.


Then you may be surprised to find out that most ads being discussed are allowed on TV: https://www.politifact.com/truth-o-meter/statements/2019/oct...


Advertisements that are false, in many cases, are legally required to be aired on TV or in newspapers, for precisely the reason that there is a public interest in avoiding media outlets becoming arbiters of truth and restricting access to political speech. Those media outlets are free to fact check and run articles or programs challenging the content of the ads.


The ads that FB are showing were aired on TV. In fact those laws would say that FB legally cannot apply fact checking.

Maybe take it up with the FEC?


We've been through this same conversation with the FEC decades ago. They ended up with precisely the same compromise Facebook is implementing now.


> For example, I believe Facebook employees currently skew 90% Democrat.

What is your belief based on?


Numbers.


I don't understand why Zuckerberg doesn't just cut his losses and remove political ads. It does not seem worth it financially or non-financially.


That's unfortunately naive -- as we've heard, these days "the personal is political". To a lot of people, if you disallow politics you are also restricting personal expression. Banning politics would lead to yet another doubling of the avalanche of one-sided hit pieces out there.


Refusing to allow that content as paid ads doesn't make the content go away. It just makes people game the system to get views instead of paying for views. Then the moderation problem would be even harder.


Exactly. I work in digital advertising and we just don't take political or issue based campaigns. The spend isn't worth it. The hassle isn't worth it. The optics aren't worth it.


How do you decide if an ad is political or not?


You don't take money from clients who are representing/selling anything other than goods or services. You evaluate creative & landing pages for each and every campaign. It's extremely simple.

The problem is that FB wants to eschew all of the responsibility for content running through their platform and run everything 100% programmatically for scale. They expect the advertising user to know what's kosher vs what's not, and they expect the advertised-to user to moderate content. They refuse to put in place even the most basic of business controls. That strategy uh... doesn't work. As evidenced by their many FTC consent orders and massive fines.


Because he will need it when he himself is running for office.


[flagged]


I've got comments trashing FB and Zuckerberg on HN from 2009. These people have managed to make Zuckerberg a sympathetic character to me.

I wish you would tone down your rhetoric, however. I've been guilty of it myself and understand your frustrations, but it makes things worse.


You're right, the point gets lost, I hear what you're saying. I guess it reflects that I'm tired of trying to have normal conversations about it; I feel like the people who blame everything on Facebook are completely unhinged.


They are unhinged. I've written worse things than you have, because of frustrations that I won't get into. We are in complete agreement, it seems, except on certain tactics. I will probably be more agreeable to your approach after 7pm, but I try to stay off HN after drinks.


I toasted my old fashioned to you tonight.


There may be some truth to what you're saying, but your first message is what most people would call unhinged. "Fuck people who believe in X" without then substantiating it (or substantiating with "it's so obvious I don't even need to") will only find an audience who already agrees with you and does nothing constructive.


You're being downvoted because you're grossly mischaracterizing the concerns about Russian interference. It was far more than "some Russian troll spending $300,000 in ads".

> fuck the politicians (and whoever else) who thinks people are too stupid to make up their own minds regardless of a couple of advertisements.

Well...many people are too stupid to make up their own minds. How many people believed in pizzagate?


Any googling results in reports that Russians spent, even at the highest estimates, literally immaterial amounts on ads relative to the candidates and PACs. And people on the left still insist it mattered! Meanwhile, Mike Bloomberg said he wants to spend $500 million (WTF!!!!!!) on anti-Trump ads in 2020. Reminder: he isn't even a candidate! So is this interference???

It's conversations like these that make me lose my temper.


Money spent on ads directly doesn't reveal the actual budget of the campaign, as most of it was likely concealed in "news" in one shape or another.

>It's conversations like these that make me lose my temper.

Everybody thinks the other side is brainwashed. However, judging from your statements, the people you support are currently overwhelmingly in power so maybe you can find some consolation in that.


Have you ever considered that Russian interference was more than just paid ads?


Non-financially is a big one... they can really determine the outcome of elections here; that is a power they don't want to lose.


> that is a power they don't want to lose

What makes you think they want that power in the first place?


I'm not really sure how to answer that question without just saying how extremely obvious it is that a massive national (and international) company would benefit massively from having influence over USA national politics.


It is equally obvious that there would be an extreme backlash by politicians and regulators if facebook ever decides to (openly) use that power. Also, it would alienate at least half their user base.

In short, why would they even want to be seen as possibly having that power?


Oh, to be clear, I suspect they definitely don't want to be seen as having (or using) that power.


Because they get to sell that power (advertising space) to others.


Power doesn't just fall into people's laps, and normally those who don't want power falling into their laps take the first opportunity to reject it.


Really? It's obvious if you look at their track record over the last few years... I mean, the CEO is having private dinners with both wings.


It is obvious that the CEO of one of the biggest and most influential companies in the world would have dinner with world leaders.

I'm not sure I see anything that supports your conclusion. Have you considered what evidence would make you change your mind?


IIRC, Zuck has debated running in the future. Having Facebook with that power would probably benefit him.


With the fit everyone threw over Trump having (relatively speaking) minor business connections, I'd hate to see what people would say over Zuckerberg running for any sort of office.


What's a political ad? Better yet, what's a NON-political ad?

If McDonald's advertises how good and healthy their Big Mac is, is it political? What if someone complains that it contradicts some pending legislation? What if someone complains about the racial makeup of the cast in the ad? What if the ad mentions a word that later someone complains about?

We live in a nation where we can't agree on what a man or woman is, and what that means. In fact, just talking about it is now political.

So, who's going to judge whether something is political or not?


There are clear, well-defined rules around what constitutes a political advertisement, published by the FEC. Some states have guidelines that go further. A smart place to start, instead of feigned ignorance or poorly constructed strawman arguments, might be there.

https://www.fec.gov/help-candidates-and-committees/making-di...


The rules are not clear or well defined, and often result in litigation. Often, what becomes determinative is the context around the ad or the situation of the entity placing it, not the ad itself. So, to do this Facebook would also have to track entities, etc.


The boundary may be fuzzy, but that doesn't mean there aren't clear things like "if you're promoting or smearing a particular candidate for an elected political position, that's a political ad".


The clear outliers aren't the thing the OP was referring to. It is precisely the "fuzzy" ones in the middle that can be interpreted both ways depending on context and political leaning, etc.


And I'm saying, don't pass judgement on the fuzzy ones. Draw the line where it's indisputable; you'll miss some but it's better than missing everything.


Honestly, I’ll believe that Facebook employees are sincerely concerned when I see them walking out or quitting in large numbers. “Open letters” will do zilch in a company known for lies, dishonesty and deception. Only if the earnings take a big hit will Mark Zuckerberg or Sheryl Sandberg do anything.

Edit: Where were these employees when fake news and misinformation resulted in the killing of thousands of people in other countries?


Does anyone know of a good group for tech workers to join to counter these issues? As an individual I don’t feel like I can do anything about this.


> As an individual I don’t feel like I can do anything about this.

Welcome to living in a liberal democracy where we don't get to force someone to stop saying something because it makes us mad.


FTA:

> Facebook’s position on political advertising is “a threat to what FB stands for,”

2 points I'd like to make:

(1) Facebook is a tool for mid-30s people like me to share pictures of their kids and vacations with friends and family. We are on it because we're platform locked with logins across various services, and because there's too much critical mass with the generation before us to migrate away.

(2) It shouldn't "stand" for anything, nor should any other communications medium. We did not elect a sect of silicon valley developers to decide what is proper communication and what is not.


But we can because it causes measurable harm, such as influencing the electorate to make decisions based on falsehoods.

We already have special rules for political advertisements on TV, radio, public posting, and print. Facebook is claiming that the rules everybody else follows don't apply to them.


Those special rules are more permissive than what these employees are asking for. The Trump ads aired on most TV networks except for CNN, which objected because the ad called them "fake news".


Nobody is asking Facebook to ban Trump ads. They are asking Facebook to ban ads with explicit falsehoods.

Trump pretty clearly uses "fake news" as an insult, not a claim of fact.

Contrast with ads with false claims that Facebook has approved, like "Pope endorses Trump" or "Lindsey Graham voted for Green New Deal". No TV network would run those.


Really? I'm under the impression that TV networks _have to_ run ads even if they're not truthful:

https://www.politifact.com/truth-o-meter/statements/2019/oct...

> Broadcasters are bound by [Section 315 of the Federal Communications Act of 1934] and therefore can’t reject a presidential candidate’s ad, even if it contains false information. (The candidates do have to abide by disclosure rules to make it clear who paid for the ad.)


Ummm Mark Zuckerberg has a billion times more power than I do.


But does not his vote count as +1, the exact same as yours does?


It does not, as well anyone without an axe to grind should know. He can influence other people's +1; even if only fractionally, that adds up.

There's a reason Murdoch is feared by UK politicians, why Berlusconi had such leverage even in the face of scandal. Media controls perception.


Zuck's social status and financial position enables him to influence way more people than most individuals can. As the parent of this thread stated, he has way more power.


So how do you all propose to quantify, define and draw a line on "power" and where too much must be quashed, i.e. "way more power"? If you have more friends and acquaintances who respect your opinion (and so can be influenced) than I do, then do you have more 'power' than me, since your vote now counts for more? How can/should I counter that so we are 'equal'? Oh, that wonderful term 'equal', and how some of us wish and long for all of us to be perfectly equal.


It's not the individual vote so much as the difference in effective power. Who cares how many votes he gets if he can spend billions of dollars to get what he wants? If he had the same-sized speech as any average individual, then who would care what he has to say about anything?


Voting is not about 'getting what you want'. It's about getting a legal system that works. Getting what you want (within the framework of that legal system) is completely, 100% your responsibility.


Checkout the Tech Workers Coalition!

Edit: https://techworkerscoalition.org/


Thanks!


He should not have allowed political ads in the US election. I don't remember what his excuse was for allowing them, but it sounded like a bad decision. There's just no winning in that game. He allows himself to be used as a scapegoat.

Of course then they'll go on and ask for Facebook to censor all user posts, but that will probably hit free speech protections.


Remember Zuck wanted to run for president not that long ago. He wants to sit at the table and to do that he absolutely has to pick a side. How to do that and still pretend it’s in the interest of shareholders is what we’re kinda witnessing now.


> Remember Zuck wanted to run for president not that long ago.

This is absolutely false. There has never been any evidence for this whatsoever, and it is a good example of a falsehood becoming true in the media by constant repetition.


If by this day, 40 years from now, Zuck hasn't at least launched an exploratory committee for a presidential run, I would be extremely surprised. If this were reddit, I'd promise to do something like eat an insect or a shoe or something.


This is the same man who tried to suck up to the PRC by asking Xi Jinping to name his daughter. If he actually tried to run for POTUS, it would be a shitshow.

https://www.telegraph.co.uk/news/worldnews/asia/china/119106...


He has a seat at that table anyway, a front-row one. And even if he bans paid advertising, it is going to happen nevertheless through normal user posts. And if what you say is true, he should do the bidding of Democrats and ban them.


So, Facebook is operating like TV did forever?


I don't see how they have a choice. If they decide to exercise editorial control, they might be considered to be in the content creation business and forfeit immunity under Section 230 of the Communications Decency Act. If Facebook had to be liable for what was published on their system, they would be facing the possibility of liability judgements many times their market cap.

Link: https://technology.findlaw.com/modern-law-practice/understan...


That is the exact opposite of what Section 230 does.

Section 230 allows platforms to make moderation decisions while retaining legal immunity for the user created content they choose not to moderate.

However, this is currently being threatened by republicans in the senate and Facebook is trying to avoid them taking steps to reduce the scope of 230 (which was already weakened recently by inclusion of a sex trafficking exemption).


One of the biggest reasons Section 230 is being threatened by Republicans in the senate is because they believe Facebook has a bias against conservative content and viewpoints on their platform. If it's up to Facebook to moderate content as they see fit, I find it very unlikely they would find a way to do so without appearing to be biased against someone or something.


Of course Facebook has a bias against conservative viewpoints. You think the Facebook employees quoted in the article are referring to “misinformation” such as false assertions that our schools are “underfunded?”

That is not to detract from the crazy things the right has said. But it’s impossible to read the NYT or HuffPo or the like without cringing over misleading assertions. And it’s not just those organizations. As a card carrying ACLU member, I can’t tell you how many times I’ve worked up a froth reading some email to members describing a case, only to be distraught when the representation turned out to be gravely misleading when I researched the case. I’ve started to notice that on HN, even EFF articles get some comments these days asking “wait, is that characterization accurate?”


“Schools are underfunded” is not a factual claim, so I don’t see how it could be “false”.

Trump’s ad claimed that Biden promised Ukraine $1 billion to fire a prosecutor looking into “his son’s company”. This is an outright lie on several levels.


In the abstract you’re correct. But as a practical matter, “underfunded” can be a fact or an opinion. If you say “schools are underfunded, and here is why” then that’s an opinion. If you say, “schools are underfunded, and that’s why we need to tax rich people more,” that’s much closer to using it as a factual predicate. That implies that school funding has been measured against some standard (such as what other countries spend) and found deficient.

As to the Biden thing, according to fact-check.org the claim came from “a witness statement” filed in Austrian legal proceedings: https://www.factcheck.org/2019/10/fact-trump-tv-ad-misleads-.... Assertions in legal proceedings are routinely cited as “facts” in the US media.


The “witness statement” comes from disgraced former Ukrainian prosecutor Viktor Shokin, and was made directly to Dmitry Firtash’s legal team. Firtash is a Ukrainian oligarch linked to the Russian mafia. It’s not entirely clear to me yet what Shokin is getting out of it.

Firtash is trading his willingness to manufacture fake dirt on Biden in return for the Trump administration dropping his extradition to the US to stand trial for corruption. He has been stranded in Vienna for years, and wants to go back to running his mob-tied business empire.

https://talkingpointsmemo.com/muckraker/the-debunked-biden-a...

https://talkingpointsmemo.com/muckraker/watch-this-closely-n...

https://foreignpolicy.com/2019/10/03/giuliani-claims-ukraine...

https://www.msnbc.com/rachel-maddow/watch/oligarch-used-giul...

etc.

Shokin’s affidavit is full of holes.

https://www.bloomberg.com/news/articles/2019-05-07/timeline-...


I strongly suspect you are correct. But notice that you had to go to witness credibility. In an American court of law, that means the assertion would survive a motion to dismiss. A jury would have to decide whether the witness was telling the truth, or lying based on ulterior motivations.

Do you think Facebook should be making calls on things that would be “jury questions” in a legal proceeding? And if so, do you think Facebook employees are an unbiased jury on that front?


When the truth is something like “A Russian-mob-tied Ukrainian oligarch and his disgraced pet prosecutor are trying to make up dirt about Biden, who was sent as the representative of the US/NATO countries to demand Ukraine fire that corrupt prosecutor standing in the way of corruption investigations, because the oligarch thinks he can trade manufactured dirt for political favors with the US president.”

Then restating that as “Biden promised Ukraine $1 billion to stop investigating his son’s crimes” is pretty much slander.

* * *

Personally I think that Facebook should not run political ads, period.


Your “truth” is an inference that you are drawing based on circumstantial evidence that contradicts the witness’s story. Even if I agree with you that conclusion is probably correct, in US law we would treat that as a “disputed fact” that would require a jury to resolve.

I think Facebook shouldn’t moderate political content, full stop, but if it did, surely the limit is things that are provably false without making judgment calls or evaluating credibility. E.g. “Hillary Clinton was indicted for her emails but Obama pardoned her.” It’s shocking to me that anyone would espouse Facebook making editorial decisions on political ads based on inferences from the evidence that, in a court of law, a judge wouldn’t be empowered to make.

And if Facebook moderators should be able to make inferences and weigh credibility in deciding “truth” doesn’t that circle back to my point about education spending? The US spends more on education than all but 1-2 other OECD countries. Can’t a moderator infer from that the assertion that schools are underfunded is false?


So biased that they team up with major conservative news outlets and regularly consult with high profile conservative politicians and lobbyists, you mean?

On an ongoing basis, conservative news sites tend to rank highly on FB's highest-traffic posts.

If you want to live in a fantasy world that's fine, but please don't spread nonsense. FB has teamed up with Breitbart - a news outlet explicitly described as a platform for the alt-right, with a Black Crime vertical - for their new Journalism work, and Zuckerberg sits down with conservatives on a constant basis: https://sanfrancisco.cbslocal.com/2019/10/14/deletefacebook-...


The fact that Zuckerberg is getting in trouble for having meetings in “his home over the summer with conservative figures, including talk show hosts, journalists and at least one Republican” proves the biased context within which Facebook and Zuckerberg must operate. Zuckerberg is surrounded by people who are so partisan that the fact that he had dinner with “at least one Republican” is casus belli.

Note that your own article discredits your premise that it is “nonsense” to suggest that Zuckerberg is in league with conservatives. Quoting a “normal“ Facebook user, it states:

> “I think that it would be leveling the playing field if he reaches out to Conservatives because he surely reached out to Liberals,” she said.


Fun how he "surely reached out" when news outlets tried to find a record of this and couldn't find anyone willing to confirm that he met with them, and his representatives refused to provide any examples.

I mean yes, in a sensible world, he would be reaching out to both groups - that'd be fine! He's not.

Also: if you want to give conservatives ideological representation, you can make better choices than Breitbart and Tucker Carlson.


Are you really describing letting a website be shown as “teaming up”? Breitbart isn’t even one of the 200 paid news websites, and you're acting like they’re best buds with Zuckerberg. It’s this kind of foaming-at-the-mouth hyperbole that makes all of this so dangerous.


We're not talking about banning Breitbart, Facebook explicitly partnered with them in a new effort to promote "trustworthy" journalism: https://www.niemanlab.org/2019/10/facebook-launches-its-test... In my original post I clearly stated this, so I'm not sure why you're confused.

They can simply not partner with them and they will retain their existing access to the service. Partnering is an endorsement and adjacent to moderation.

Worth pointing out that there are other conservative outlets on their list, ones with a good reputation (at least compared to Breitbart).


I’m not confused at all. They partnered with hundreds of sites, and excluded Breitbart from getting paid. Some partnership.


But TV stations curate their advertising.

Many types of political ads are not able to be shown.


I didn’t know TV ads offered targeting and tuning of ads to highly specific groups of people, so why didn’t Cambridge Analytica use TV ads instead of FB? The issue here is the micro-targeting based on the personality of the users, stringing them along without them knowing. It is known that CA used personality tests to craft specific messages. How can it be the same?


Political TV ads should be banned too. Would make it consistent and understandable.


I can't blame Zuck for working so hard and trying to execute the balancing act to get that political ad money, because it's targeting what is now Facebook's core demographic.

Young people aren't using Facebook anymore. This doesn't mean young people don't have an account, but I suspect no one under 35-40 is really engaging with the platform meaningfully. Facebook is the new TV and is going to go out like TV - in a slow, overly long drawn out whimper chock-full of pharmaceutical, lawyer, and mesothelioma ads aimed at the aging demographic.

Facebook has a stranglehold over older people but younger people are not falling into the trap. Facebook's ability to give Zuckerberg power is going to fade over time.


...you're aware that Facebook owns Instagram, yes?


> pharmaceutical, lawyer, and mesothelioma ads aimed at the aging demographic

I created a new fake account a while back and for a while many of my ads were about vape pen litigation, similar to TV lawyer ads but for a younger crowd.


Silicon Valley's propensity to introduce externalities into the world yet never want to deal with the negative ones because "you guys have no idea how hard this is" will never cease to amaze me. But hey, I guess this is why that book is named "Chaos Monkeys".

You know, if it's too hard to run a political ads business that doesn't enable mass scale targeted disinformation and wreaking havoc on democracies, then maybe the responsible thing to say isn't "sorry our platform has enabled 2 major election fuck-ups in the Western world in 2016, but it's not our role to be an arbiter of truth so we'll do nothing" but rather : "ok, we haven't yet found a way to operate this that's not harmful to society, so we've decided not to run political ads until we do " ?

Because at the end of the day, if you don't take this into your own hands and instead you make it look like it's a choice between preserving a 15-year-old private company's bottom line and keeping centuries-old democracies functioning, that's gonna be a really easy one to make for lawmakers around the world.

The hands off stance is a recipe for being regulated into oblivion eventually, which isn't good for shareholders either.


Do you want to get CDA 230 repealed? Because this is how you get CDA 230 repealed.


The problem to me can be summarized pretty simply: unfortunately, the USA doesn't have any law on the books requiring political advertisements to be truthful (contrary to normal advertisements, where truthfulness is enforced aggressively).

Considering how effective Facebook is at targeting individuals, you can do a lot of damage spreading lies on the platform. The question is moral: even if there's no law forbidding Facebook from spreading lies, should the company hold itself to a higher standard?

IMHO Facebook should do that, because it risks creating a lifelong enemy in the political side that's likely to win the next elections and as the Romans would say, Vae Victis.

https://www.factcheck.org/2004/06/false-ads-there-oughta-be-...


Facts and truth are two different things. A set of facts can be chosen to say something untruthful.

And there can be different 'truths' depending on the values people bring to the analysis of facts.

Having Facebook, or their designates, arbitrate 'truth' will only create a privatized ministry of truth.


Title:

> Dissent Erupts at Facebook Over Hands-Off Stance on Political Ads

From the article:

> More than 250 employees have signed the message

Facebook has >35,000 employees. 250 signatories is roughly 0.7% of the workforce (250 / 35,000). Hardly seems like an "eruption" of dissent.

The article does acknowledge this:

> While the number of signatures on the letter was a fraction of Facebook’s 35,000-plus work force...

So why use such a misleading title? "A tiny fraction of company employees does not like company policies" is a statement you can make about every sizable company.


Maybe it is just me, as I didn't see it in the comments. But why on earth should Facebook have to run political ads at all?

This should be regulated. Provide the same exposure to all the candidates. No targeted ads (how did targeted + political ever seem like a good idea?). Only link to their program if there's a need at all.

But I bet there's plenty of people in queue for ads on FB's platform, so I don't think that not running political ads would hurt them much.


I don’t know why Zuckerberg has so colossally failed to convince the world that Facebook, Inc. should not be an arbiter of what is true and what is false.


The Correct Answer is to restore the Fairness Doctrine, updated to include cable, social, etc.

Media companies rejoiced when Reagan sabotaged political discourse. Political ads are huge money and are almost pure profit.

Why would Facebook, Twitter, etc. behave any differently?

https://wikipedia.org/wiki/FCC_fairness_doctrine


Facebook makes way more than the couple hundred $M spent on political ads in the last elections. And the cost of being broken up post-election is far higher, so it definitely doesn't make economic sense for them.


What would Twitter be worth today without politics? CNN?

Perhaps the same is true of Facebook. Can't think why it wouldn't be, but don't care enough to find out. Because it's Facebook.


The responsibility of handling and interpreting misinformation needs to be shifted to the consumer. People will lie to you almost every day, and you must figure out how to deal with it.

The validity of information should be vetted by those consuming it, not an entity who is in any kind of power. If enough people think someone is lying or untruthful, with enough evidence, then the content should be flagged, labeled, or potentially taken down, because every consumer had the opportunity to contribute their perspective leading up to handling said content.

We need to move away from the idea that certain authorities in our lives (governments, companies, organizations, or any entity with significant power) can determine what's true or not, because it's highly likely to be biased in either direction.

It's incredibly easy for a collective body to double cross their word—to say one thing and intend another at the expense of those who aren't in power.

The problem is, when an organization makes the decision to censor content, it is usually a very small few who make that biased decision on behalf of the—seemingly big—company. Effectively, it is a small team, or even one or two people, unless it's done by a dedicated team of moderators driven by policies, procedures—or worse: bribery—that may or may not be something those individuals believe in.

When it's left to the people interacting with that content, it's their choice in how to deal with it individually or collectively. That is maximum freedom. To enforce censorship, as a government or organization, is to assume that consumers are idiots, and that's not an assumption they should be making.


Exactly. The answer to false speech or hate speech is not censorship. The answer is more speech deconstructing the facts and logic used to perpetrate such falsehoods. Relying on "benevolent authorities" to arbitrate reality is a dangerous game. They are asking for a Ministry of Truth.

1984 was not a manual, folks.


What you say has a kernel of truth -- we all need to be critical of everything we see and hear -- but I think you take it too far.

> The validity of information should be vetted by those consuming it, not an entity who is in any kind of power.

This would not lead to a place that would be good for anybody. Most people don't have the time or skills needed to do this, and telling everyone that they have to fact-check everything for themselves can only result in some combination of two bad outcomes:

1) People will simply accept everything they read that confirms their own preexisting beliefs.

2) People will simply reject everything they read that goes against their own preexisting beliefs.

Both things encourage the continued decline of public debate as well as the continued increase of overall balkanization and the demonization of our neighbors.

Also, both will lead to a dramatically increased amount of lying.


So, Twitter?


Yes, people do lie, but there are plenty of regulations for commercial speech. There are laws against fraud, and so on. Acceptable ads policies may be implemented by private companies, but few people would say they aren't needed or aren't in the public interest.

Furthermore, you've bought into some libertarian assumptions about how markets work that aren't actually true. Some trust is necessary to make buying convenient, and the organizations running marketplaces have incentive to regulate who can sell there.


I think the point I'm trying to drive more is: the opportunity and ability to moderate content, speech, etc. needs to be in the hands of the people, not just the entity.

On Facebook, I have the ability to report or flag a post, and that's a fair amount of power I have, especially since I have the opportunity to fact-check said post if I want to, and contribute to a sincere and meaningful debate around whether that content should be available to others or not.

I don't think an organization should be able to blanket-determine what's acceptable or not, lacking transparency into the exact vetting process used to show or hide a piece of content, because their values may be at odds with enough of the people using it (i.e. a large enough group of people hold a political view that Facebook may deem "dangerous").

I am confident we can agree that fraud is universally something that most people do not want, except for our lovely fraudsters.

When it comes to politics, that is an entirely different ballgame. We are dealing with equally valid views on both ends of the spectrum, in an environment where the majority of those in power are condemning and attempting to control speech coming from the one side.

I say this because of the sheer number of anecdotes (including my own) that have noticed practically unanimous support from most major media institutions for one side, while also condemning the other side. I'm not going to specify which sides, because I believe it's obvious.


When I joined HN, people were advocating for the creation of organizations that get people to voluntarily support them with their attention or $. Now people openly advocate for government force to be used to achieve admirable outcomes (even though the claimed benefits are unachievable).

Your comment and mine are being pummeled. I'm at -3 for nominating Warren for Minister of Truth (attempt at levity). You?


Indeed they are! Heh, I got a good laugh out of your comment about Warren :)


Ended-up at -2. How did you fare?

I moved off the Navajo reservation in 80s. My cousin (he just got electricity to his house this year; he's had solar for 20 years) still lives there. I can give tours, if you have interest. I'm there 2 weeks in Oct or Nov. Great trout fishing on the San Juan River at that time.


I'm sure lots of stuff gets flagged on Facebook, and in theory they could let people debate it sometimes, but then they still have to make a decision, right? On what basis do they do it?

We have lots of experience with online communities now and it's been pretty well proven that "the people" don't spontaneously organize into well-moderated communities. Software architecture can help make that more likely, but even places like Hacker News don't run themselves - the moderators have to keep tweaking things.

Having some good choices between communities that are moderated differently is about the best we can do.


Why are the only two options to let ads through or reject them? How about fact checking them, visibly marking them as potentially false, and adding a link to more details? This should make both sides happy: Zuck, who believes the public should decide for themselves, and the rest.


They did try something like this and it backfired. People were more likely to click headlines marked as "fake news", not less.

It's pretty easy (and a bit depressing) to see why: imagine seeing a headline that agrees with you politically. The candidate that you support did something good. Or the other side did something bad. It confirms, yet again, what you already know. But now, there is a little label that says that this may be fake. It's very easy to rationalize that the moderator is biased or doesn't know what they are talking about, or that this is simply what they want you to believe, isn't it? The bubble is real.


Again, you’re damned if you do and damned if you don’t. In fact in this case we’ve already been through this. Instead of “why does Facebook allow possibly false ads” the discourse instantly becomes “why did Facebook allow this possibly false ad to go without a marker / why did Facebook put a marker on this ad that’s true in spirit”. For every action there is an equal and opposite hit piece.


I've seen you repeat these arguments over and over in this thread, without suggesting any solution. I am guessing you are content with the status quo of Facebook.


I agree there is a problem, I simply disagree that there is an easy solution. All "solutions" people suggest either propose to make Facebook nakedly partisan, or would just cause the very same criticisms to reignite in essentially the same form the next day.


Facebook is already doing moderation of political content posted by non-politicians. See the latest news about content removed about Lindsey Graham. Facebook is choosing not to stay on the sidelines themselves already. They are just doing so for politicians.

The status quo of Facebook is worse than them not touching any of the political content on their site.


This is yet another example of 'damned if you do and damned if you don't'. If Facebook did literally no moderation whatsoever, then there would be (and indeed was, in the past) a furor of complaint over their callous indifference to society. The second Facebook censors anything, they are immediately hit with a furor of complaint that, under those standards, they are obligated to censor some slightly less objectionable thing. They resist for a while, then cave in, and the cycle repeats. For the past 10 years this has been a reliable mine of outrage porn, but not a cause of real progress.


This is yet another example of you just dodging the problem. FB is arbitrarily demarcating a line of their choosing with no consistency. Politicians are no different from people, and should not be treated to a special "free speech pass" on FB. Free speech for all, or free speech for none. There's no decent reasoning behind this midway solution. The true reason for this is that the politicians have regulatory leverage over FB.


I addressed exactly this. This is what you are doing:

> The second Facebook censors anything, they are immediately hit with a furor of complaint that, under those standards, they are obligated to censor some slightly less objectionable thing.

This is like trench warfare. Facebook never drew an arbitrary line: it just kept being pushed back by public and media pressure here and there, retreating in bits and pieces. Obviously if you just ignore that history, it looks like an arbitrary line now, but it was created by complaints almost identical to the ones you're making.


Retreating in bits and pieces is a choice made by Facebook. No one forced them to do this. It's their choice as a company trying maximize their visibility/profits. I don't think people would have left FB if FB just decided to not moderate political content at all. Just like right now, there is no exodus of people from FB in spite of the outrage.

You are attributing very little agency to a company that makes its decisions unilaterally (sometimes even ignoring laws). This is a gross misrepresentation of FB's position. FB is not a victim here.


It would not make the GOP politicians happy who wield their administrative power over FB, and have a lot of political capital invested in blatantly lying to the public.


Maybe it would be preferable to provide an immutable log of political ads that have been run, who ran them, and all targeting information.

This would be open and transparent and allow politicians to police the turf instead of Facebook.
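
To sketch what such a log could look like (purely illustrative, not anything Facebook has said it would build; the field names are assumptions), each ad buy could be appended to a hash-chained record, so entries can't be quietly altered or removed after the fact:

    import hashlib, json, time

    def append_entry(log, advertiser, creative_url, spend_usd, targeting):
        # Hypothetical record for one political ad buy; field names are illustrative.
        prev_hash = log[-1]["hash"] if log else "0" * 64
        entry = {
            "timestamp": time.time(),
            "advertiser": advertiser,
            "creative_url": creative_url,
            "spend_usd": spend_usd,
            "targeting": targeting,  # e.g. {"age": "35-55", "region": "WI"}
            "prev_hash": prev_hash,
        }
        # Chaining each entry to the previous one makes retroactive edits detectable.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        log.append(entry)
        return entry

Anyone holding a copy of the log could then recompute the hashes to verify nothing was removed or edited after publication.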


This is not a rhetorical question:

If it's ok to lie in a political ad, if the entire responsibility for determining its truthfulness lies on the shoulders of the people viewing the ad, is it also ok for an administration to lie to citizens?


Based on all of the evidence thus far, yes, there are no consequences whatsoever for the federal executive to lie about all manner of topics literally every day, up to and including matters of national security.


These people are going to be constructively terminated.

Constructive termination is where they want to fire you for 'x' but can't legally, so they construct 'y' as the stated reason for firing you.


Why would people be willing to give up their fundamental rights so easily? Isn't free speech mainly about invalidating what is false or immoral through discourse?


Why doesn’t Facebook just reject political ads and keep itself out of trouble? It seems like it could be the less costly alternative.


Am I the only one that wishes social media would just ban political advertising altogether?


Why are we going after the platform and not the party posting the deceitful ads?


The mainstream media has lost control of the narrative because of places like FB. Everything that covers politics is a form of political ad and EVERYONE has an agenda. So how will you control that?

What NYT, WaPo and others offered was a brand and certain Network Effects (subscription). They cannot compete with the Network Effects of FB and have been trying to rein in FB.

These entities are desperate to regain control of the narrative or they'll lose their value.

The reality is, NYT or WaPo can run false news or "political ads" under the name of op-eds. On their own platforms they can highlight these op-eds on their homepage, or they can just boost them on FB. If NYT is fine with treating op-eds that talk about anything political as "political ads", then they have standing here.

It no longer has to be an op-ed. Even their news coverage is turning into political propaganda. You know how bad NYT's own editorial practice is? Just watch this recent re-writing of history [0].

And let's not forget it wasn't the political ads that gave us Donald Trump, but the $5 billion in free advertising that Trump got from the mainstream media [1]; watch Bannon talk about how Trump got his initial boost in the polls [2].

[0] https://www.youtube.com/watch?v=78CE8eiWItY

[1] https://www.thestreet.com/story/13896916/1/donald-trump-rode...

[2] https://www.youtube.com/watch?v=CKuPYArH0Gs (this is an interesting interview and Bannon talks about how Trump got his boost in the polls by mainstream media)


"hands-off stance" That phrase is doing a lot of work



I posted this in the other thread on the topic,

"The Facebook workers called for specific changes including holding political ads to the same standards as other advertising, stronger design measures to better distinguish political ads from other content, and restricting targeting for political ads. The employees also recommended imposing a silence period ahead of elections and imposing spend caps for politicians."

In the U.S., political speech is often afforded the highest level of protection from govt. censorship (cf. the debate over whether FB is a private platform or a publisher). One of the reasons articulated by some First Amendment commentators is that political speech is important to self-government in a democratic society. To quote Brandeis, "Political discussion is a political duty." Further, "Implied here is the notion of civic virtue - the duty to participate in politics, the importance of deliberation, and the notion that the end of the state is not neutrality but active assistance in providing conditions of freedom . . . ." [1]

Public political speech should not be censored based on perceived truth or falsehood. In fact, political speech that promulgates false or misleading messages should be exposed to criticism. Again quoting Brandeis, "Sunlight is said to be the best of disinfectants . . . ."

However, political speech is regulated to an extent by the F.E.C., e.g. requiring disclosure notices, etc. That said, the political speech issues presented on FB can be more complex than those of traditional 20th century print and broadcast media. For example, micro-targeting political speech to certain demographics may cross the line from public political speech to private speech, and perhaps should be afforded less protection. See Alexander Meiklejohn [2].

Also, content based prohibitions of speech tend to be more troubling than content neutral restrictions, such as time, place or manner restrictions on political ads or spending caps as mentioned in the employee statement above.

[1] Lahav, Holmes and Brandeis: Libertarian and Republican Justifications for Free Speech, 4 J.L. & Pol. 451 (1987).

[2] Meiklejohn, Free Speech and Its Relation to Self-Government (1948).


Just put a heavy crawling international distress orange outline around political ads with a watermark saying "not fact checked".


Imagine a law mandating a sign be put on every toilet stall door with the message "wipe your ass". Rather than overregulating, we should allow ourselves time to adapt to this new world; this might include - block your kids' ears - committing quite a few mistakes and learning from them.


We do have restaurant washroom signs - "Handwashing Laws For All 50 States" - https://www.signs.com/blog/handwashing-laws-for-all-50-state...


Not so clear when there’s a professional psychologist hired to convince you your ass will be clean whispering through the glory hole.


The number of people making weak "both sides" arguments (nobody runs political ads without lies in them) in this thread is alarming. Facebook is easily capable of fact checking every ad on their platform, and if they can't, they shouldn't run them at all. We should be prepared to demand that all political advertising be free of outright falsehoods.


I agree that claims like "the earth is flat" and "vaccinations cause autism" should probably be removed - but the Trump ad in question, while likely to be false, does not constitute an outright falsehood. If you ask anyone on the right they'll say it's probably true, and at the very least just as truthful as any leftist political ad.

Furthermore, in the 1400s an ad claiming that the world is round would have been deemed outright false, and in the early 1900s an ad claiming equality between all races would have been deemed outright false, and in 2001 an ad claiming that Iraq had no WMDs would have been deemed outright false -- yet all are now known to be true.


The problem here is not so much Facebook (a company doing what companies do) as it is the regulatory system it fits in. This situation is unprecedented, as no single company has ever concentrated the media power Facebook has. Our legislators are barely starting to understand the problem; the ball is really in their court. In the meantime, Facebook will sit at the intersection of what's best for the company and what the law allows.


Facebook probably has considerably less media power than TV stations of old. But that was ok, because they got regulated.


The problem with Facebook is that it's too big. Different online communities have different standards of what sort of behaviour is acceptable. Facebook is effectively splintered; there is no one community, and so there is disagreement on the community standards, to a degree that I don't think can realistically be resolved. Splintering may very well be the result.

If social media were more decentralized, the responsibility would also be decentralized. Standards would be set by the communities. And as for overall standards, those would be dealt with by the legislature and courts, which would be a huge improvement, as those are way more transparent and fair than Facebook et al.

Abuse of power by Facebook (or advertisers pressuring them) would be much less of a problem if people could move more easily between social media platforms.

I think a more decentralized model of social media would be good all around. Add some interoperability so you can still communicate when you're not on the same platform, this should alleviate some of the tendencies for these platforms to become so big and centralized.


Buying a political ad is kind of like buying a new car, or a firearm.

If you leave the dealership with your new vehicle, and decide to go run over 10 people, the dealer is not on the hook for your actions.

Same with gun stores not being held liable for gun owners.

There may be background checks in place to ensure they aren't selling a car to someone that can't drive (Driver's license) or to make sure someone can own a gun (Background check), but once you pass the initial screening you are on your own for liability.

Political ads should be the same: basic KYC to verify the person buying the ad is who they say they are or is allowed to represent an entity, but beyond that, anything they want to say, let them say it, let the public scrutinize it, and let their ideas be debated.

I could see a world of hurt if this was completely unregulated, as in anyone could pretend to be anyone and buy an ad any which way without verification. This would lead to an insane amount of slander/mudslinging.

Just my 2 cents, probably not worth a penny.


>If you leave the dealership with your new vehicle, and decide to go run over 10 people, the dealer is not on the hook for your actions. Same with gun stores not being held liable for gun owners.

That is not true of the law. If a gun store sells someone a gun, and the store owners knew or should have known you intend to kill someone with it... the gun store is most certainly liable.

Even when the gun store wasn't aware the buyer intended to shoot someone, the gun store can be liable if it was negligent in the sale, like the case where KMart sold a gun to a drunk person who shot a woman; KMart was found negligent and liable for $12M.


>That is not true of the law. If a gun store sells someone a gun, and the store owners knew or should have known you intend to kill someone with it... the gun store is most certainly liable.

This is why I literally address this in the next line, right after the KYC/background check. Obviously if a guy sells a gun to someone without doing the proper checks they will be liable. Same if a dealer sells a car to someone without a license.

>Same with gun stores not being held liable for gun owners.

>There may be background checks in place to ensure they aren't selling a car to someone that can't drive (Driver's license) or to make sure someone can own a gun (Background check), but once you pass the initial screening you are on your own for liability.

>Political ads should be the same: basic KYC to verify the person buying the ad is who they say they are or is allowed to represent an entity, but beyond that, anything they want to say, let them say it, let the public scrutinize it, and let their ideas be debated.


>Obviously if a guy sells a gun to someone without doing the proper checks they will be liable.

It has nothing to do with "proper checks"; the background checks were all completed properly and the buyer passed them, but there were additional facts, having nothing to do with background checks, that created liability.


Hosting political ads is a political act, no matter the perspective. You want in the political game, take responsibility. You want out then don't host political content.


It's more like having someone buy a firearm that has a bumpstock and then also giving them a free stay at a hotel that looks out on a popular concert.

We're in the information age, where information is starting to be used as a weapon, and we need to have ways to protect individual people's right not to be unduly influenced by corporations or people in power.

The parties that are in control of the media need to take responsibility to protect the public's rights and if they wont, the government should regulate them.


>The parties that are in control of the media need to take responsibility to protect the public's rights and if they wont, the government should regulate them.

Disagree. It is your responsibility to be informed and call bullshit on things that are lies.

If you think the same politicians who get elected by lies are going to force the corporations that line their pockets to tell the truth, I have a bridge I will sell you. A nice one, right there in San Francisco.


There has always been asymmetry and weaponization of information though and it probably used to be worse.

At least now there are options to verify and correct for those who care to do so.


When you buy a car, you buy it once. The dealer does not continuously give you the ability to run people over. When you post political ads, facebook runs them until your ad spend is done.

When you run someone over, there's no question about whether you did something wrong. But when you run a fake news political campaign, people may land on both sides of the fence.

When you drive a car, nobody but you can turn the steering wheel. When you buy a political ad, Facebook is the one that deploys it.

These are very different subjects.

These are in fact, so different, that you can look at the valuation differences between a SaaS company, and a traditional manufacturing company.

And if the actual products are different, the delivery method is different, the ROI is different.... Let's not make this comparison again. No fault to you, of course.

And back to your point: a background check makes sense if the deliverable is done once, like a car. A content filter/fact check/etc that happens before each post makes more sense in the context of a continuously-delivered model, such as paying for multiple ads.



