Hey, Facebook VP of Integrity here (I work on this stuff).
This WSJ story cites old research and falsely suggests we aren’t invested in fighting polarization. The reality is we didn’t adopt some of the product suggestions cited because we pursued alternatives we believed would be more effective. What’s undeniable is we’ve made significant changes to the way FB works to improve the integrity of our products, such as fundamentally changing News Feed ranking to favor content from friends and family over public content (even if this meant people would use our products less). We reduce distribution of posts that use divisive and polarizing tactics like clickbait and engagement bait, and we’ve become more restrictive when it comes to the types of Groups we recommend to people.
We come to these decisions through rigorous debate where we look at all angles of how our decisions will affect people in different parts of the world - from those with millions of followers to regular people who might not otherwise have a place to be heard. There’s a baseline expectation of the amount of rigor and diligence we apply to new products and it should be expected that we’d regularly evaluate to ensure that our products are as effective as they can be.
We get criticism from all sides of any decision, and it motivates us to look at research, our own and external, and to analyze and pressure-test our principles about where we do and don't draw lines on speech. We continue to do and fund research on misinformation and polarization to better understand the impact of our products; in February we announced an additional $2M in funding for independent research on this topic (e.g. https://research.fb.com/blog/2020/02/facebook-misinformation...).
Criticism and scrutiny are always welcome, but using cherry-picked examples to try and negatively portray our intentions is unfortunate.
Just to cherry-pick from your reply here, if $2M is the biggest ticket item you have to show for independent research on this topic - then you're woefully short given Facebook's revenues and size.
10 short years ago, nobody could have imagined that huge swathes of the population could have been swayed to accept non-scientific statements as fact because of social media. Now we're struggling to deal with existential threats like climate change because a lot of people get their worldview from Facebook. Algorithms have decided that they fall on one side of the polarization divide and should receive a powerful dose of fake science and denialism ... all because of clicks and engagement.
10 short years ago, huge swaths of the population were swayed to accept non-scientific statements like eating fat and cholesterol were unhealthy. I don't think Facebook is the problem here.
How much exactly do you think Facebook ought to donate to independent researchers? Most tech companies donate ~$0 to such efforts.
> 10 short years ago, huge swaths of the population were swayed to accept non-scientific statements like eating fat and cholesterol were unhealthy. I don't think Facebook is the problem here.
That's a horrible example, I don't think that is even remotely comparable to swaying the population into burning down dozens (hundreds?) of cell towers or accusing Bill Gates of starting the coronavirus pandemic.
So, pretty much every new medium in history has been accused of fomenting conspiracies at one point or another.
The core issue here is that the internet allows many, many voices to flourish, and some of these voices speculate attractively but incorrectly (from my viewpoint, at least).
Blaming the platform which allows the voices to spread seems like a bad move, given that the core issue is the people who choose to go along with it.
The only way in which I can assume that FB is responsible for all the alternative theories on their platform is by refusing to accept any agency on the part of FB's users, which I think is probably the wrong idea.
As an example, right now you are promoting a narrative on an internet site holding FB responsible for the behaviour of others. Do your readers have so little agency that they will mindlessly act on your words without reflection?
If so, what differentiates your post from a similar post on FB?
If not, what makes FB different?
These are genuine questions by the way, I'm actually interested in your answers.
I don't idealize the past at all. Too many bad things to write.
However, the internet amped up the overall craziness to an unprecedented level. For the first twenty years, it was much like the regular world, but geekier. But then all these crazy ideas started to spread, in an incredibly short period of time.
I have always been a student of the paranormal and conspiracy theories - as a skeptic. Suddenly random people were spouting all the obscure classics, and brand-new ones appeared every day.
Last year, before COVID, I decided that this was akin to an infectious disease - suddenly people were thrown in with hundreds of thousands of anonymous people, many with bad but infectious ideas.
Before the internet, you acquired most of your delusional ideas at birth from your parents under the guise of religion etc.
Now you could pick up delusional ideas one at a time - more, the presentation gets to evolve, because the creators get second-to-second feedback. Call these "vemes", for virulent memes, perhaps?
Some once-friends of mine clearly have very poor immune systems as they picked up many vemes.
> The only way in which I can assume that FB is responsible for all the alternative theories on their platform is by refusing to accept any agency on the part of FB's users, which I think is probably the wrong idea.
1. Infectious diseases don't work that way!
2. Also, a lot of this is caused by a small number of actual psychopaths who literally just want to cause grief. Allowing a tiny group of people to damage the whole is wrong.
---
Trying to assign agency to the crowd is madness and not backed up by observations of humanity en masse or reading a history book. In such a mob scene, crazies, aggressive people and criminals will always win.
If it's not immediately apparent, pretty much no other medium in history (new or old) had/has the real reach Facebook has now. So taking it out of context and simply defining Facebook as a "new medium" is disingenuous. A firecracker and a bomb only differ in scale, hence the massive difference in how they're seen and treated.
With great power comes great legal and moral responsibility, and greater scrutiny. You can't shrug it off just because others have also done badly at this, especially when nobody else did it badly at such a massive scale.
I wonder what the cell phone generation would think of our pulp paper fake Necronomicons from the 1970s? There was also a book of spells at my local library in the very-white very-evangelical suburbs of Little Rock, AR when I was a kid, too. I liked the one about how to become a werewolf a lot, but didn't really get to the point of going and murdering a dog and painting myself with its blood and boiled fat.
The pearl-clutching about social media is because power has been taken away. Media executives can no longer fancy themselves in control of people's minds, because they no longer have a monopoly on eyeballs via print and TV information.
Worse yet (for them), the user-customizable nature of social media feeds means that they can't even know what people really see.
If people use the internet to make themselves worse that's a failure of people not the internet.
But back in Little Rock, AR, when you were a kid: are you sure there was someone who sat in on every private get-together in every house, stationed in a corner listening for keywords, so that when someone said something mildly racist this cornerman could gently tell your uncle, "These racist inclinations you seem to have - did you know that the local bookstore White Pages™ has a book called Mein Kampf that confirms your suspicions? You should read that one. If you like, of course, not telling you, just a friendly suggestion. It's actually on sale now."
Does this mean you consider scale to be irrelevant to this situation? Because that's pretty much what the difference boils down to. You mention "media" as a singular entity, but in reality the media of the past was a mosaic of thousands of individual outlets: newspapers, radio stations, TV stations. Today FB is a singular entity with the combined power of most of those together. Do you agree that changes the game?
FB has almost every eyeball now. Can you say the same about your fake Necronomicon? Book of spells? The media that controlled your mind?
The last statement is basically one against any form of control or regulation: if people use X to make themselves worse, that's a failure of people, not X. Sometimes people need help.
Saying that scale is irrelevant makes me think you are intentionally obtuse about this. FB picks which newspaper's article to show you just like newspapers choose which journalist's article to show you. And just like newspapers give a tremendous amount of power to one individual (the journalist) compared to another (you), FB can do the same for individual articles and outlets. Except FB has the kind of reach no newspaper ever had or will. They give a podium for others to climb on, they decide who gets the front seats, and they monetize it.
For all intents and purposes they should be responsible for everything that happens on the platform, regardless of where the content was picked up from. They should have that responsibility, but it conflicts with their goal to drive engagement and profits.
Their algorithm curates what people see, shows specific links, and influences opinions. And when you can influence opinions on such a large scale and monetize it we have a conflict of interest and even a weapon. FB and others have proven this in the past. Why do you think the media is regulated so tightly around election time? FB can take even a blog post and push it so aggressively right before elections that they manage to sway opinions but still claim they were just a platform.
The counter-argument is that it's easier than ever for an individual to find critical analyses of that information. If you were suspicious of the information on TV in 1983, there was no one to tell you any different unless you put effort into finding it.
Let's put it in terms other than politics...
In the early 1980s a full third of all households in the US watched the TV show "Dallas" every weekend. Was "Dallas" the greatest performance ever made or just the best option out of 3 choices?
Honestly, far more violent and highly accepted behaviors are pushed through media outlets as sane opinions every day, like military intervention or straight-up misinformation campaigns (see: the OAS, which for some reason never evaluates the election health of rich North American countries, and which should logically have released some statement about Facebook right now, let alone all the primary voting discrepancies). Hell, the NYTimes had enough fuckups in the '00s alone that it should be on the shortlist for fact-checking suspicion.
If Facebook is wading into being a truth-teller, they're going to run straight into government-media-academia social circles a la Pinker, Chomsky, all the punditry on TV, all the punditry in opinion columns (claims still need fact-checking even if the result is an opinion).
Then, how do you deal with framing the presentation of verifiable facts with extremely ominous and sinister tone/hinting? You can do an enormous amount of damage just making untestable, unverifiable implications.
IMHO Facebook is gonna get squeezed till they pop over this, either by blatantly having political double standards or by becoming a misinformation-based hellhole. Getting at the truth is much harder than people realize, and I don't even think it's fair to offload this responsibility onto Facebook. People are just strangely OK with believing bullshit.
Yes, the misleading dietary advice was objectively worse. How many needless lives were lost to the likes of diabetes and heart disease? The ramifications of the replication crisis (of which social media censorship also runs afoul) run much deeper.
I think you are knowingly missing the point. Nutritional experts disagreed among themselves, and probably still do. Is there even a single 5G engineer that will take up the cause of 5G causing coronavirus? How about an immunologist that will claim Bill Gates is trying to inject people with microchips delivered in vaccines?
Which is why it's a non-sequitur. Any well-adjusted adult with a modicum of common sense can see through such claims. Cults and conspiracies flourished long before FB's existence, as they will after. Removing such content just validates it in their minds — it must be true, that's why "the corporations" insist on covering it up!
Most of the west has learnt that we should not penalise drug addicts. We treat the underlying problems that brought about their habitual use. The need to self-medicate disappears.
Like the war on drugs failed, so too shall the war on misinformation. The solution is tending for the human condition that leads one to not only believe, but want to believe, in these fantasies.
> Cults and conspiracies flourished long before FB's existence, as they will after
The problem is that Facebook's algorithms are directly helping those cults grow by polarising people, as stated in the linked article:
> Worse was Facebook’s realization that its algorithms were responsible for their growth. The 2016 presentation states that “64% of all extremist group joins are due to our recommendation tools” and that most of the activity came from the platform’s “Groups You Should Join” and “Discover” algorithms: “Our recommendation systems grow the problem.”
I'm sure we can all agree that you can't end all cults, but surely actively encouraging their growth is a bad idea.
Hundreds of thousands of people are still in jail for drugs in the United States! Neither presidential candidate is for legalization of cannabis alone, and there is no talk of legalizing anything else.
I hope one day the war on drugs will end - but sadly it is still alive and well in the United States and all over the world.
As for the war on misinformation, we have lost continuously on this one for years in a row, and there's no evidence at all we're succeeding.
Would literal witch hunts be a better example? Or hunting "communists"? Or Young Earth creationism? The "idea" that your skin tone and level of intelligence are "biologically" linked? Or that jet fuel can't melt steel beams and 9/11 was an inside job? And don't even get me started about that fake moon landing... Oh no... I can hear an airplane, here come the chemtrails.
If I were a higher-up at FB, I'd consider the risks to the business from these issues of polarization, and I'd spend accordingly on advice and evaluation (a lot).
For example, if FB comes to be seen as a kind of mind control platform, that could be devastating as national govts decide to step in and put a stop to things. Imagine even a mid-sized country regulating that FB was responsible for tagging any posts that contained not just Covid-19 misinformation but all sorts of misinformation. That sort of thing could be extremely dangerous to FB's business model.
These sorts of risks would, in my mind, be very high indeed, and I would devote a lot of resources to at the very least understanding them. $2M is a drop in the bucket for a company with revenues of $70B to address such risk.
(Actually I checked and a bucket contains approx 10K drops, so this is actually surprisingly close to being a drop in the bucket).
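(If you want to check that comparison yourself, here's the arithmetic as a minimal Python sketch; every figure is the rough number from this thread, not an audited one.)

    # Rough comparison: is $2M out of ~$70B literally a "drop in the bucket"?
    research_funding = 2_000_000         # $2M pledged for independent research
    annual_revenue = 70_000_000_000      # ~$70B in yearly revenue
    drops_per_bucket = 10_000            # the back-of-the-envelope estimate above

    funding_share = research_funding / annual_revenue  # ~2.9e-5, i.e. 1/35,000
    one_drop = 1 / drops_per_bucket                    # 1e-4, i.e. 1/10,000
    print(funding_share / one_drop)                    # ~0.29: about a third of a drop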
> Worse was Facebook’s realization that its algorithms were responsible for [extremist groups'] growth. The 2016 presentation states that “64% of all extremist group joins are due to our recommendation tools” and that most of the activity came from the platform’s “Groups You Should Join” and “Discover” algorithms: “Our recommendation systems grow the problem.”
Everything social that exists today already existed when humans were still living in trees. That doesn't mean that social media like Facebook can't serve as an amplifier that turns problems from annoyances to existential threats to our society.
This is classic whataboutism. What does people being led to believe that fat was healthy have to do with Facebook’s deliberate refusal to make their content less divisive?
You missed the point. Too many people are credulous, and lack a rigorous education in epistemology and the scientific method. This has been a problem forever and isn't something Facebook can fix.
I don’t expect them to come out with a perfect solution, but if they can’t fix it and no one else can, yet they still contribute to it, I don’t see why they should be above criticism and reproach.
Also, the article mentions that even though they had the option to reduce it, they actively chose not to. How is this not something that I should be critical of?
Why is growth above everything else important? What if they disabled the group discovery part? How is that impossible?
We are talking about existential threats to humanity. It might be hard to believe, but humans’ ability to bring about bad outcomes is growing at a very fast pace.
Key players like Facebook throwing in the towel because “we don’t see an easy way to stop contributing to problems that are growing in size and can potentially destabilize democracies, world ecosystems, or, a few decades from now, human existence” is not an option.
The problems we have today are growing exponentially in seriousness. Human beings need to learn to get along in ways they never needed to before, due both to resource stress and to technological powers we never previously had.
Facebook’s amplification of many human weaknesses is only one of many risk factors. But I don’t think many young people realize how easily humans have fallen into disasters in the past, disasters which, amplified by progress, could easily become existential today.
The whole thing is, what does what other tech companies donate have to do with anything?
Btw, some tech companies donate a fuckload to open source and other aligned initiatives. It's annoying to see this ignored for the sake of a weak argument.
You inverted the statement in the act of repeating it back - people were misled into believing eating fat was unhealthy, not that it was healthy.
And this is a part of the problem, right? It's not whataboutism, it's a lesson from history. Your belief in the incorrect expert advice of yesteryear is so strong that even trying to type out the opposite is hard.
At the moment a whole lot of people, of the type who read and post to HN, have decided that disagreeing with "experts" is divisive. The problem is these "experts" aren't really experts by normal definitions, like someone who has a strong track record of correctly understanding and predicting a complex topic. The word is instead being used to mean something more like, "people employed by the government who claim special knowledge". Nutrition is the example chosen here for non-expert experts, because nutrition has been hit hard by the replication crisis. But it's hardly the only field with this problem - basically every medical authority has discredited itself during COVID.
Disagreement with authority is a classic justification for free speech. It is inherently perceived by the ruling classes as "divisive" because that's exactly what it does - it divides people into those disputing their authority and those who don't. You thus can't combat "divisiveness" without simply shutting down all disagreement with the government in all ways.
A counter-point to this is that studies show polarization has also fallen in some countries over the past years - including ones where social media (Facebook or otherwise) is popular. Studies also show some of the most polarized segments in the US to be the older population, which uses social media less. We definitely have work to do, but this suggests there are many factors at play.
You have an interest in the outcome of the research, why should we trust you to conduct or fund it properly? Your track record is terrible. The fact you are throwing around dubious research as facts is crazy.
I'm sorry, but really, there's no reason to believe a word you say.
You get paid by Facebook; Facebook has lied again and again; how on earth do you expect people to take you seriously?
Your job and title exist to give FB the optics of caring about integrity, and were invented as part of FB PR in the aftermath of the Christchurch massacre.
>why should we trust you to conduct or fund it properly?
You are welcome to conduct research or fund it. You can even help current research by criticizing its substance, or engage yourself politically (at least by voting) so we can have more and better research about the topic.
Outside of actual weapons research, research can't be "weaponised" and academia is rife with incredibly strong political biases. Political bias in academia is so severe that there is actually an entire foundation devoted to trying to combat it (the Heterodox Academy).
Your posts sound like you believe corporations shouldn't ever do research and worse, that academics don't have any interest in the outcomes of their own work. But that's nonsense, of course they do. They want to publish papers, they want their research findings to be novel and widely cited, they want to build a reputation. They have all kinds of self-interested incentives that act against producing accurate research findings; hence the replication crisis!
How do you think Facebook should deal with this? If Facebook-funded research is suspect and nobody else is able or willing to fund it (why should someone else fund something for Facebook's potential benefit when Facebook is rich, and if someone else is funding research to Facebook's detriment, then I would say it's equally suspect), how should such research be funded?
How can you show results of the research in a way that you wouldn’t consider as “weaponized”?
I’m not a fan of Facebook and am quite suspicious of them, but I’m also not sure what they can do in this particular situation that we would find satisfactory.
That is interesting. My take would be that the older population may spend less time on social media (who can compete with a 20-year-old with a phone welded to their hand anyway), but that a disproportionate number of these seniors are babes in the wood where technology is concerned, and are more amazed by, believing of, and susceptible to the influence of social media than youngsters who have grown up alongside those platforms.
Would be very interesting to learn about these countries where polarization has fallen if you have links to hand.
Except now the people that use Facebook the most are grandparents (i.e. the demographic you mentioned that is most polarized). Facebook is no longer cool as it was 15 years ago.
The NBER paper that you point to in a subsequent comment doesn't have any detail on the popularity of social media; we explicitly can't make any determination on whether social media is having an effect there because the data is missing. If, say, Facebook were to provide external researchers with data about the growth in Facebook use - users, median time spent on site, average time spent, SD of time spent - for a number of years, that would help to identify whether social media has a role in such polarisation.
Although it's trivial to say "I see a lot of polarisation on social media, therefore it's worse than it was", a satellite-view paper like that NBER one gives zero insight into the role of social media, partly because the data isn't provided, but also because it doesn't examine what effects there might be on smaller groups within the population who are, say, heavy social media users.
I think the most useful thing Facebook could do would be to make more information available to researchers, rather than pointing to research which hasn't been able to use data and claim that helps exonerate Facebook.
Are you really trying to take credit for a flimsy correlation between Facebook usage and polarization (in some countries, though you didn't say which ones, or how many compared to other heavy-use countries)?
I think it's the reverse. They're trying to distance themselves from the notion that Facebook is the primary factor in what's polarizing (certain) people.
> Now we're struggling to deal with existential threats like climate change because a lot of people get their worldview from Facebook
You state this as a matter of fact. How do you know this?
Even if it were true that people were more polarized in their climate worldview by Facebook, and more so toward the wrong side than the right side, we all know that climate change is the result of our behaviour over the last centuries and that counter-efforts have been resisted for the last 50 years.
Something like 85% of the planet believes in non-science. There are 2.3 billion Christians, 1.9 billion Muslims, 1.1 billion Hindus, and probably a billion of "other" religions. The fact that people believe non-science has got nothing to do with Facebook.
If you're not being given the benefit of the doubt, it's because your employer has 16 years of lying about this and related issues. Zuck long ago torched whatever shred of trust ever existed, so no, we are not going to be impressed by an extra 0.003% of annual revenue thrown to problems you've created.
On top of that, it seems their efforts to prioritize friends and family may not take into consideration that this is where the divisiveness begins. How many of us have friends and family who share news articles, worldview opinions, and memes that fit into divisiveness, fake news, and/or borderline racism?
You can reshuffle the deck but the same cards are still inside.
The fact that FB has not banned political ads is pretty shocking and absolutely related to this topic.
Twitter managed to do it, but FB continues to allow political parties to spread misinformation via the algorithm, and FB profits from it.
So essentially, VP of Integrity, your salary is paid in part by the spread of misinformation. Until you at least ban political ads, your integrity is non-existent.
But of course! Every company should have a VP of Integrity, especially Facebook. Creating the impression of trustworthiness while profiting from lies is important at any company. At Facebook it’s the core business.
Facebook's record to date is such that it has all but no credibility on this account. This gives it virtually no room for effective action.
Partners have walked away from billions of dollars in shares. Its former conscience, Alex Stamos, quit after being repeatedly blocked, stymied, subverted, undermined, or backstabbed. And the very label "VP of Integrity" reads as so perfectly Orwellian and ironic that the position negates itself.
Without the ability to rein in Zuck directly, it is every bit as toothless as it sounds.
Being receptive and responsive to critics is a function of marketing. And that's what this is -- marketing.
Furthermore, just because the role exists doesn't mean it matters. My org has all sorts of diversity and sustainability managers, and their relevance ends the moment business decisions come into play.
Integrity should be woven into the fabric of a company's culture. It can't possibly be a role. Making it a role looks like window-dressing and effectively an empty gesture.
Hi there. Have you ever considered making the decision making open? I mean, you say you welcome criticism and scrutiny. So here is an idea for you: invite journalists from major media outlets to your decision making. Then you can avoid these "unfortunate" cherry-pickings, as you put it.
Why I am saying this: it seems you sit backwards on your high horse, criticising those people who for all intents and purposes have very limited insight into the decision making.
My close friends and I are fed up with Facebook and how obviously it is trying to polarize everyone in this world.
No sympathy for you on my side, and I can assure you I speak on behalf of my friends too.
Yep we actually do this! (invite journalists to decision-making meetings). One of our regular meetings is about the content policies, we publish the minutes here - https://about.fb.com/news/2018/11/content-standards-forum-mi... - and have also hosted journalists and outside academics from time to time.
Thanks for the reply. But I am still not convinced. It seems these are watered down versions of outlines of the actual decision making. Which is different from what I am suggesting.
Here is an actual example:
> "• Question: How much will we communicate about this? We usually don't want people to
game the Feed but this change might hit their pocketbook so my instinct is to be open
about this.
> • Answer: Comms is aware and we will be proactive about communicating this. Inside
Feed is one channel we can use to do this. The only way this can happen at the domain
level is with full transparency and product is aware we can't do this unless we have that."
So the committee passed the buck, and all the transparency is gone. Also, there are no names or responsible persons assigned. And by the way, the words "feed", "news feed", or "ranking" appear only once throughout the span of 4 months. I very much doubt there were no decisions in any form regarding the news feed throughout those 4 months.
So overall this looks to me like pixie dust rather than a true representation or involvement in decision making.
We have entered an era in which non-state actors like Facebook have power that was once the exclusive domain of governments [1]. Facebook understands this, and justifiably views itself as a quasi-government [2].
I would really like to understand Facebook’s theory of governance. If I want to understand my own government, I can read the Federalist papers. These documents articulate an understanding of history and a positive view of the appropriate role of government in society. I can use these documents to help myself evaluate a particular government action in light of the purpose of government and the risks inherent in concentrated power.
Has Facebook published something like this? I struggle to understand Facebook’s internal view of its role in society and its concept of what “doing the right thing” means. Without some clear statement of governing principles, people will naturally gravitate to the view that Facebook is a cynical and sometimes petty [3] profit maximizer.
Without some statement of purpose and principles, it is hard to level criticism in a way that Facebook will find helpful or actionable. We are left to speculate about Facebook's intentions, instead of arguing that a certain outcome is inconsistent with its stated purpose.
This may come off as condescending, but I'm honestly just curious.
From the outside looking in, it seems as though you are paid to drink Kool-Aid and paint FB in a positive light. How does one get to be in your position? What are the qualifications for your job?
Remember that VPN app that Apple pulled from the App Store, the one Facebook was using to spy on users' internet usage to gain intel about potential competitors? When Facebook acquired the VPN app, this guy came with the purchase. VP of Integrity. Oh, the irony.
Facebook's "VP of Integrity" Guy Rosen (guy_ro) co-founded Onavo, a spyware company that Facebook acquired in 2013. Onavo's flagship app was Onavo Protect, a VPN service that Facebook used to monitor the activity of its competitors, including Snapchat. Facebook acquired WhatsApp and copied features from Houseparty based on the data it harvested from Onavo users.
Onavo Protect was removed from the App Store in 2018 for privacy violations, and from Google Play in 2019. Onavo then rebranded to Facebook Research and marketed itself through targeted ads to teenagers on Instagram and Snapchat. Apple revoked Facebook's developer certificate because Onavo was using it to bypass the App Store review process. Three U.S. senators (Richard Blumenthal, Ed Markey, and Mark Warner) criticized Facebook Research for harvesting data from children. Facebook Research was discontinued later that year.
Guy Rosen moved on to be Facebook's VP of Product Management. In 2019, he briefly adopted the "VP of Integrity" title when performing damage control for Facebook's live stream of the Christchurch mosque shootings, and then reverted back to VP of Product Management after he was called out on it.
> The next time Facebook needs a public relations injection, it should consider using someone other than the co-founder of Onavo.
It seems like they knew exactly what they were doing. They want money, and I don't believe for a second that the most ethical road is where the money is; you have to be able to say no to money, and Facebook clearly shows us time and time again that money has the highest priority.
The highest value seems to be in companies that brand themselves as trustworthy but really aren't. No matter if it is Goldman Sachs, Facebook, Google, or Nestle.
Unbelievable. Just when I think I've seen it all in tech. How can someone like this take their title seriously, having peddled their spyware to the highest bidder? Hope the cognitive dissonance keeps these execs with a broken moral compass up at night, if not for what they've done for themselves, then for contributing to making the world a worse place for their children and beyond. Thanks for sharing; wish there was a database full of these snakes and the slimy legacy they've left behind.
Fair enough, but I also get to see first-hand how decisions are made and how rigorous debates take place, so I have more faith in the process. We've got lots to do to improve transparency of how this stuff happens, because I know people care about it. One way we started a while ago is publishing minutes to one of our meetings where decisions on content policies get made. https://about.fb.com/news/2018/11/content-standards-forum-mi... -- lots more to do!
Can you qualify what you mean by “rigorous” ? I think this is an important point of contention because Facebook is so accustomed to using data to justify decisions. Yet these ethical issues often either have no data and/or the consequences of Facebook’s actions impact users who are not on Facebook. Moreover, the data is not available to the public (despite the public generating the data) so the public can’t actually reproduce the warrant used internally at Facebook. This lack of reproducibility is why I think there’s so much friction around Facebook claiming their decisions are made with “rigor.” Thanks.
His work history is on LinkedIn. VP Integrity is essentially an executive product management role so qualifications would be along those lines.
Edit: LinkedIn says he was an engineering manager, then took a series of roles culminating in a data startup that got him into Facebook in the current role.
'A data startup'? How very generous of you. He co-founded a spyware company that was bought by Facebook to discover what other companies/apps to buy or clone.
You do not get to the position he has at Facebook by having either ethics or morals. Having seen this up close at Facebook HQ I can assure you that everyone at VP level or above there knows what they are doing, knows the long-term consequences, and simply does not care because the paycheck is far too large for simple ethics to enter into the discussion.
Morality and strong ethics do not place a glass ceiling on anyone’s achievement. Arguably, consistently good ethics lead to more opportunities and better outcomes.
The main issue here is that people have different ideas of what is ethical. Disagreements arise when countries, and increasingly companies, exert power over people that was given to them by those same people.
In my mind, the same problems existed before Facebook, and expecting Facebook to solve them for everyone is ridiculous. Even more ridiculous is expecting everyone to accept Facebook’s solutions.
Potentially useful context: the parent here was the Co-Founder and CEO of Onavo which he sold to Facebook for $120 million. If the name "Onavo" doesn't trigger any bells: https://en.wikipedia.org/wiki/Onavo . It was ostensibly a VPN but tracked its users' behavior and Facebook used the data from Onavo to judge how much traffic various startups had when deciding whether to acquire them.
I think that the fact that the founder/CEO of Onavo is now Facebook's VP of Integrity is entirely consistent with everything else we've read about Facebook over the years.
> The Ministry of Peace concerns itself with war, the Ministry of Truth with lies, the Ministry of Love with torture and the Ministry of Plenty with starvation. These contradictions are not accidental, nor do they result from ordinary hypocrisy: they are deliberate exercises in doublethink.
-- 1984
His title VP of Integrity is not accidental; it can't be.
Anyone want to hazard a guess at the total comp of a VP @Facebook? Curious
I hope there are more people at the top with the integrity to stand up for injustice like Tim Bray. I have all the respect in the world for someone who puts their neck on the line for what they believe in. Thank you Tim @tbray
> We reduce distribution of posts that use divisive and polarizing tactics like clickbait and engagement bait, and we’ve become more restrictive when it comes to the types of Groups we recommend to people.
This seems inherently a political evaluation. What are the criteria, and is this driven manually or automatically?
Would you care to post the same stats updated for 2020, then? Also, I see that this statement is copy-pasted on your Twitter, so I have a hard time believing you actually read this feedback and didn’t just post this comment as damage control.
If we're talking metrics, the 64% stat cited in the article turns out not to be a good way to measure impact of recommendations on an extremist group. We internally think about things like prevalence of bad recommendations. More on prevalence here - https://about.fb.com/news/2019/05/measuring-prevalence/. I don't have a metric I can share on recommendations specifically but you can see the areas we've shared it for so far here: https://transparency.facebook.com/community-standards-enforc...
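(To make the distinction concrete, here's a minimal sketch of what a prevalence-style metric measures; the names and the sampling scheme are illustrative assumptions, not Facebook's actual pipeline. Prevalence asks what fraction of all content views were of violating content, estimated by labeling a random sample of views, whereas the 64% figure counts what fraction of group joins came via recommendations.)

    import random

    def estimate_prevalence(view_events, is_violating, sample_size=1000):
        """Estimate the fraction of content views that were of violating
        content, by labeling a uniform random sample of view events."""
        if not view_events:
            return 0.0
        sample = random.sample(view_events, min(sample_size, len(view_events)))
        flagged = sum(1 for view in sample if is_violating(view))
        return flagged / len(sample)

    # Hypothetical usage: 'views' is a log of view events, 'label_fn' a human
    # or ML labeler that flags policy-violating content.
    # prevalence = estimate_prevalence(views, label_fn)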
You deny the statistic in the article. Then you use corporate speak like "we internally think about things..." Then you link to a blog post titled "Measuring Prevalence", which doesn't have a single data point with a number in it. Not to mention that it's another mess of CorpSpeak that sounds like a PR committee wrote it, not a human being.
The problem is not your intentions. It's clear that you actually want to engage people, given that you're replying actively on this board, even to impolite comments. The problem is the style of communication. It sounds constructed, artificial, developed through a process, like you and a bunch of people are sitting there drafting replies and tweaking until you have something that won't come back later and bite you in the butt. The problem with that strategy is that you end up saying very little of meaning at all.
So maybe to make things more concrete - is there a statistic on measuring the impact of recommendations that you do agree with that you can share?
Thanks for the message. Does your group have a centralized place where your team’s research, recommendations, and roadmap for changes are visible to the public?
Your MO over the years seems to be "we're working on it", from your Frontline interview in 2018 regarding Myanmar to a Business Insider article from last year regarding the New Zealand shooting.
$2M for independent research? Haha, that's really generous of Facebook. You all probably made that in a day's worth of misleading political ads you've sold.
One question I have for you, Mr. VP of INTEGRITY: I haven't used FB in over 8 years, but I would like the full set of every data point that you have on me, my wife, and my kids. Where can I get that? And if I can't, please explain to me why.
I guess you've been drinking so much Moneysaurus Rex Kool-Aid that you don't seem to understand that people just don't believe anything you all say anymore. Your boss can't even give straight answers in front of the government.
Maybe you all should change your tagline to:
Facebook, we're working on it.
Maybe that's how Facebook builds their organization: VP of Integrity, VP of Honesty, VP of Sincerity. So that Zuckerberg doesn't have to take on any of those roles.
Yes - but in this case the CEO has none so he needs to outsource his integrity to someone else, and when you are a vacuum of integrity, everyone else looks like a saint. Therefore, top comment.
Might be a long list. We've got >35,000 people working on safety and security. I expected some challenges recruiting people with all the bad press we got over the last few years, but instead I've seen that talented people eager for a really hard challenge are even more excited to join and work in this area.
So where's that list? I would imagine it is not hard for a company the size of FB to reliably host a static file of 35,000 redacted names. :-)
And my snarky response is inspired largely by what I would call your diversionary response. To me it reads like you are not acknowledging the real answer: no, as FB VP of Integrity, I can't and won't provide the list.
"Might be a long list. We've got >35,000 people working on safety and security. I expected some challenges recruiting people with all the bad press we got over the last few years, but instead I've seen that talented people eager for a really hard challenge are even more excited to join and wok on this area."
I didn't read the article, because paywalled.
So you're seriously implying that 35,000 people are actively involved in such decisions? Really?
Let me wager a guess:
A Google search reveals 44,942 Facebook employees as of December 31, 2019.
You're implying here that 78% of all your employees are directly involved in such decisions?
Here's my take: 99.998% (and yes, the number is pulled out of thin air) of those 35,000 people are lowly paid contractors sifting through the horrible crap that some of your users post.
Sure, they're working on "safety and security" in the broadest sense, but they are sure as shit far away from any such decisions.
Then why did you offer? You aren't confronting anything directly, everything you are saying here is indistinguishable from any other PR agent replying but dodging questions to seem like they are transparent.
Respectfully, to a top-brass executive at Facebook: I respect the hard work and innovation that have gone into building Facebook. It's allowed billions of people to connect worldwide in ways that were never possible before. It is truly a billion-dollar platform.
The problem is that your entire executive leadership believes it is a five hundred billion dollar platform. They have to, because the investors demand it.
Why are you asking third parties to conduct this research?
Why isn’t this an initiative driven by an internal team? Were applications for this program advertised outside of Facebook?
Is $2M realistic for this research? I know I wouldn’t be enthused considering my total compensation. Do you expect top quality researchers to apply?
The platform is constantly evolving. At any point millions of individuals could be part of an A/B test that alters their experience. How can a third party navigate these conditions without corrupting their findings?
Well, when your management has such a record regarding their own integrity, why on earth would you expect us "dumb fucks", as Zuckerberg put it, to trust you?
Facebook has lied again and again, taken every shady approach to get data, mishandled that data, and practically endorsed a genocide. What integrity are you talking about?
You may be serious about your job, but you work for people who have proved they have no moral compass. I'd quit if I were you and actually believed in what you're trying to accomplish.
Edit: just realized you are the founder of Onavo, spyware bought by Facebook. Were you also heading Project Atlas? Were you responsible for inviting teens to install spyware so you could collect all their data?
The mere fact that YOU were chosen as VP of Integrity says it all. When the head of integrity is a spyware peddler... Well....
Yeah, I guess it's a cynical move I should've expected of Facebook.
> What’s undeniable is we’ve made significant changes to the way FB works to improve the integrity of our products, such as fundamentally changing News Feed ranking to favor content from friends and family over public content *(even if this meant people would use our products less)*.
Emphasis added.
What is that supposed to mean? Would you like a sticker for doing the right thing?
Meta question. How is this post from a user created 6 hours ago, with 66 Karma at the top of this comment section? This is particularly interesting since just this week, TripleByte’s CEO’s top level comments were buried. Is there some kind of change or is this really the naturally occurring top comment?
Thanks for chiming in here! Honestly the response to your comment reminds me of the youtube comment section. I'm not sure why people aren't capable of civil discourse, would have expected better from HN.
“We come to these decisions through rigorous debate where we look at all angles of how our decisions will affect people in different parts of the world”
If you had any integrity, you would delegate this governance back to the captured provinces.