YouTube moderators forced to sign statement acknowledging job can give them PTSD (theverge.com)
152 points by untog on Jan 24, 2020 | 206 comments



"Forced" is a weirdly charged word - like "garbage disposal workers forced to sign statement acknowledging that their work will involve handling garbage". What's the alternative here?

Yeah, it's awful, dirty work, and we should give them as much support as possible (definitely more support than they are getting right now) - but at the end of the day, "dealing with things most people don't want to touch" is literally the reason that the job exists...

TBH I'd be more concerned if YouTube was signing up moderators WITHOUT telling them that they're in for a life of emotional torture :/


Signing a nonnegotiable disclaimer, without full knowledge of impacts, under economic duress, under an extreme power and negotiation disadvantage, as a third-party contract employee, without collective bargaining, is not free and fully-informed consent.

https://news.ycombinator.com/item?id=22139444


What do you mean without full knowledge? This article is about adults being informed of the risks of PTSD up front.

Obviously everyone who seeks a job is under some form of "economic duress" for if they had no need for income they'd either not work, or work pro bono.

What's the alternative here? A YT moderators union isn't going to negotiate away the intrinsic shittiness of YT moderation.


There's informed and then there's informed. A few words in a disclaimer presented to you at the last moment is not anything like having full knowledge of the consequences. Long ago I did sysadmin work for an early anything-goes community site, bianca.com, which at the time involved a bunch of anti-abuse work. 20+ years later, there's still shit I can't unsee. Right now I can close my eyes and see my CRT, the room behind it, and the horrible things abusers were posting to make other people feel awful.

And what I had to deal with was a small fraction of the horror now flowing through the ugly parts of social media sites. I don't think written words alone can convey what people doing that work have to deal with, or the possible long-term impacts. At a minimum, for "fully informed" I'd suggest a one-hour documentary that mixes the kind of things they see with interviews from people who ended up with long-term trauma from the work.

And the alternative is what we've done with all sorts of other dangerous work: making it reasonably safe. Off the top of my head, I'd want to see: 1) tools that minimize exposure to harmful imagery (e.g., ML-driven auto-summarization and by-default pixelization), 2) weekly paid sessions with a therapist, 3) a limited amount of time per week dealing with the especially bad stuff, 4) significant, mandatory paid vacation, and 5) mandatory paid sabbaticals for anybody verging on burnout.
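
A minimal sketch of what point 1's by-default pixelization might look like, assuming a Pillow-based review tool (the function name and block size are illustrative assumptions, not any real YouTube tooling):

  # Sketch: images load pixelated by default; the moderator must opt in
  # (reveal=True) to see full detail. Assumes Pillow; parameters illustrative.
  from PIL import Image

  def load_for_review(path, reveal=False, block=16):
      img = Image.open(path)
      if not reveal:
          # Downscale, then upscale with NEAREST to produce coarse blocks.
          small = img.resize((max(1, img.width // block),
                              max(1, img.height // block)))
          img = small.resize((img.width, img.height), Image.NEAREST)
      return img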

Occupationally, mental health isn't fundamentally different than physical health. Nobody reasonable today would deal with radiation or chemical exposure by just having people check the "I might get cancer and that's cool" box when they apply for a job; we took those problems seriously. We need to do the same with toxic digital media.


Informed up front, you mean, except for those employees already working.

The more nefarious part of it is that at no point does Accenture suggest that they have any responsibility or will assist with said PTSD. Hell, they even imply that they'll most likely fire you if they find you do have it (one, because the document requires you to tell them, and two, because their supervisors repeatedly pressure therapists at the in-house WeCare to disclose that information).


All the more reason to find work elsewhere, which will make those available to do the work more scarce, which will cause the compensation for the work to go up. For those under economic duress, as it were, there's time. This isn't an imminent danger.


Nope. We should not sacrifice poor people on the altar of quarterly profit goals. Doubly so for companies like Google and Facebook, which are hugely profitable.

There is a sufficient supply of desperate and/or naive people that the market equilibrium is significant and lasting harm to humans. If companies won't solve that problem on their own, then the alternative is regulation. Given the history of physically dangerous jobs over the last century or so, I expect regulation is the likely outcome.


But these are not poor people. At $37k annually they are in the middle third of US income and substantially above the poverty level.

The article linked below profiles a man that "worries that he will not be able to find another job that pays as well as this one does". This indicates to me that all else being equal, they are already offering a premium.

Nor does it sound like he is either desperate or naive. He is well aware of market compensation and he is struggling to decide on the tradeoffs.


Oh? What's that relative to your income? Because my point isn't about absolute dollars. It's about America's long-running tendency toward exploitation of people with less money.


And two years later, they develop PTSD, and look for help with that, at which point Accenture waves this document saying they have absolutely zero responsibility to assist with it.


As someone with personal experience with PTSD, there's no such thing as being informed of the risks up front. No matter how bad it's described to be, it will be worse.


Addressed in linked comment, also by me, also in this thread.


This is one reason I'm pro-UBI; while it might be reasonable to tightly couple subsistence to full-time labor while below a certain threshold of GDP, if our societal project is to maximize human dignity and autonomy (and why the hell not), truly consensual social relationships cannot exist at extremes of negotiation asymmetry, such as when survival / Maslow needs are reliant on employers and private capital.

Across some invisible line of total societal / planetary wealth, it becomes feasible to provide every human a baseline of economic and biological well-being; moreover, I would argue that such an economic ecosystem would be on net more generative than the "work to keep the wolves from the door" model:

- Economic anxiety decreases executive functioning and ability to plan for the future.

- Authoritarian labor models carry an "enforcement tax"; we can get more done if there's no need for a middle manager caste to enforce compliance and productivity (or the corresponding incentive for workers to "look busy").

- A great deal of real-world wealth creation isn't metered and carries no bargaining leverage, so goes unpaid / underpaid: raising children, community work, charities, FOSS

- As we continue to pluck low-hanging innovation fruit, we benefit from higher degrees of risk-taking. The more people who can afford to gamble on education and entrepreneurship, the more "hits" of unrealized value we'll discover (to say nothing of cultural and creative works).


"under economic duress"

So basically if someone needs money, they cannot give consent because their judgment is clouded by their desperation?



Correct. You're not allowed to sell yourself into slavery.


In what way is working an unpleasant job that you could choose not to do slavery? Sounds to me like the person prefers the probability of PTSD to the guarantee of a life of poverty. How is that not informed consent? Because we wouldn't choose the same when living in a life of not-poverty? Are we really reducing all people without a certain level of income to children who can no longer make decisions for themselves or enter into contractual arrangements?


So college students should not be allowed to donate plasma for money because they aren't able to give consent due to being under economic duress?


That's actually one of the main arguments against paid blood/plasma donation, and the major reason we don't pay for organ donation: it results in a society where the poor literally sell their own health for basic survival. Plasma is sufficiently replenishable, and donations are safe enough, that the net gain in lives saved is considered worth the societal risks.


Plasma, or other readily replenished fluids (there's another popular option, at least for men, though it carries other pregnant long-term implications), are probably reasonably fair. There's little long-term harm or consequence.

A kidney, lung, or liver lobe, not so much.

Psychological research or DNA submission enters a pretty freighted grey area.


Pretty much. If your choice is either "Do X or live in poverty," what do you choose? Keep in mind that poverty is often a trap that is very difficult to get out of.


You just explained like 90% of working class jobs.


I'm aware of that.

At least now. Less so in my younger years.


It's so great you changed your mind and became both more realistic and more empathetic.


I'd like to think the empathy was present.

The first-hand and theoretic knowledge was not.

Much of the theory I'd been supplied (mainstream economics, from a professor in my intro course who, unbeknownst to me at the time, was a hardline Libertarian: https://news.ycombinator.com/item?id=22133958) was false.

Undoing the damage of false priors and models, especially under ongoing objection and denial based on those, is difficult and expensive in time, effort, and willpower.


If you go with a notion of exploitation rather than lack of consent, it describes all waged labour.


Marx was right.


.. about nothing. Most people in his day knew this right away but even today some people still deny this.


And that's why we need strong unions run by smart people who have the interests of the workers in mind.


Yes, we need to put smart people in charge. Those stupid workers can't figure anything out on their own.


"Smart people" and "workers" are not mutually exclusive sets.


>is not free and fully-informed consent.

The standard of consent that we apply in other areas would make it impossible for an individual, worker or customer, to ever enter into a contract with a larger corporation (exactly how large can be debated, but Google size with their Google size legal team is definitely beyond that point). I think it would be a good thing to standardize the law in this regard.


Alternatively, use it as continuous motivation to improve the relative position of the worker and consumer so they are closer to having the ability to give free and fully-informed consent.


They can inform themselves, weigh their options, and make a decision. Like adults. Why this constant infantilization of people?


The problem is, they cannot. The only way to be “informed” is to actually be exposed to the material.


I'm pretty sure YouTube moderators know how to read and use the Internet.


What a bizarre thing to say. If reading about diets was enough then no one would ever be fat. But it turns out in the real world that reading about it and actually doing are very different. The same applies here.


I would put "economic duress" and "hate speech" in the same category of hand wavy rhetorical devices.


You don't believe people sometimes do things because they would otherwise be forced to live in poverty? Because that's literally what "economic duress" means.


Maybe, but not in this case. YouTube reportedly pays moderators an annual salary of $37k [1], and Glassdoor reports the average content moderator salary is $45k. The US Department of Health and Human Services poverty guidelines put those salaries at between 3 and 4 times the guideline for a single person.

This poverty guideline describes the bottom 12% of the US population. The Youtube salary describes the middle 33%.
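
(For concreteness, assuming the 2019 HHS single-person guideline of $12,490: $37,000 / $12,490 ≈ 3.0 and $45,000 / $12,490 ≈ 3.6.)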

In the referenced Youtube article, the man "worries that he will not be able to find another job that pays as well as this one does". Not that he worries he will be "forced to live in poverty".

[1] https://www.theverge.com/2019/12/16/21021005/


Interesting. How does it work with prostitution and/or working in the adult movie industry?


Garbage workers have used labor organization to stand up to unfair treatment.

Do YouTube Moderators?


Handling garbage as a garbage collector is not unfair treatment, it’s the actual job.

Handling garbage as a YouTube moderator is not unfair treatment, it’s the actual job.


It seems to me that a YouTube moderator has to moderate things ranging from a simple boob on screen to the most atrocious acts you can imagine.

It's like telling the garbage collector who picks up the trash in your neighborhood to go into a melted-down reactor to pick up a bag because, hey, his/her job is to collect garbage. Would that be unfair treatment?


... but yet no one would be surprised at all if someone found a baby, a mutilated animal, a dead homeless person, toxic waste, etc. in a dumpster


But, due to the labor organization, they have the healthcare guarantee to have their traumas addressed.

Do the YouTube moderators have the same?


What are you talking about? I would be horrified (and yes, also: surprised) to find anything like that in a dumpster. You live in a scary place.


But aren't there ways to mitigate some of the issues of both? Might those ways incur costs to the company? Isn't the role of organized labor to advocate for workers in areas where the pure drive for profit would give them lower priority?


The role of organized labor is to advocate for workers who want to organize. I don't think it's reasonable to assume that something must be wrong with a company simply because it doesn't have a union.


I didn't intend to say that something is wrong with a company if its employees are not unionized.

I did intend to say that a union can be an effective tool in advocating for employees to an organization that is driven to maximize profit.

For profit companies are often rewarded for minimizing costs. Certainly one can envision that FB might view moderation teams as an expense rather than strategic investment.

Organized labor can be a good counter balance to that.


I’m sorry but either you don’t know what you’re talking about here or have consumed some serious late-stage capitalism koolaid. This reporter has previously covered the experiences content moderators face in these types of jobs, and I suggest you read those articles.

Every day these moderators need to consume a rapid stream of horrifying videos depicting extreme forms of physical and sexual violence, in some cases involving children. In the FB moderator article they mentioned having to do so as fast as possible and with very few breaks. The employers provide pitiful mental health support, and unsurprisingly a lot of these workers develop serious PTSD and in some cases have committed suicide.

This is treatment that would normally be considered psychological torture, but because these people get paid (rather poorly at that) and signed some waiver then it’s “fair”? A lot of these people take these jobs because they’re desperate or don’t fully understand the consequences. I know that HN community is pro business, but I’m shocked at the callousness displayed by some of the comments here. A society that allows companies to pay desperate people to destroy their mental health is not a healthy one.


So just no one should do the job then? What’s your solution? Yeah they see horrible shit, that’s the job. There’s no way to do the job and NOT see horrible shit.

Should they get mental health care? Yes. Should they be treated better? Sure. Should they unionize? I’m all for it. But none of that stuff keeps them from handling garbage day in and day out. Because that’s the job.


Explain how this is unfair treatment.


For one, this is something existing employees are being told to sign. So they've been working at a job liable to give them PTSD for quite some time, and only now is the company acknowledging it... via a legal disclaimer they're forcing employees to sign.

> It also seeks to make employees responsible for monitoring changes in their mental health and orders them to disclose negative changes to their supervisor or HR representative.

Absolute abdication of responsibility. Doesn't exactly scream "fair treatment" to me.


It's a responsibility that should be abdicated. Would it be better if Google could force people to attend therapy because the company decided they're unwell?


If employment at Google were the direct cause of mental health issues, then, yes, Google should take responsibility. That responsibility should include providing mental health care, not forcing it upon anyone. Don't strawman the issue.


It would be better if every employee had a regular check in with a mental health professional. Google don’t need to decide anything. “Forced” makes it sound very authoritarian, but yes, requiring such checkups as a condition of employment would be a good thing.


I respect the consistency of your viewpoint, but mandating that all employees undergo a specific regimen of prophylactic mental health treatment would be incredibly unpopular and probably illegal.


How so? I believe that police officers need to do this after particularly bad encounters, for example. (In fact if you search for "police psychological interview" you'll find hundreds of pages and even YouTube videos showing "how to pass your police psychological exam.") So it's definitely not illegal, at least not across the board.


I admit I had not thought about that. It does go against my case quite a bit.


Unpopular with who? Also more unpopular than offering no help to the moderators?


It is very much the nature of mental health that by the time you notice the deterioration, you may already have been permanently harmed. What exactly will happen to someone who reports themselves as such to HR?

Let me repeat again for anyone who hasn’t figured it out yet: HR is not your friend.


Having been around the corporate block a few times, I agree and say as much to my younger coworkers.

You have a problem. You think, "I'll ask HR for help."

Now you have 2 problems.


Exactly why these employees should all be provided regular therapy sessions as part of their employment, regardless of whether they've noticed any issues.


> Employment law experts contacted by The Verge said Accenture’s requirement that employees tell their supervisor about negative changes to their mental health could be viewed as an illegal requirement to disclose a disability or medical condition to an employer.

> The Verge’s investigation last month into Accenture’s Austin site described hundreds of low-paid immigrants toiling in what the company calls its violent extremism queue, removing videos flagged for extreme violence and terrorist content. Working for $18.50 an hour, or about $37,000 a year, employees said they struggled to afford rent and were dealing with severe mental health struggles. The moment they quit Accenture or get fired, they lose access to all mental health services. One former moderator for Google said she was still experiencing symptoms of PTSD two years after leaving.


Well garbage workers unions have ensured that the job is one of the best paying you can get without a degree or credential.


Sanitation worker is also one of the most dangerous jobs you can do. Being a sanitation worker is more dangerous than being a cop: https://www.usatoday.com/story/money/2019/01/08/most-dangero...


from https://www.bls.gov/emp/documentation/nem-definitions.htm:

No formal educational credential. This category signifies that a formal credential issued by an educational institution, such as a high school diploma or postsecondary certificate, is not typically needed for entry into the occupation. Examples of occupations in this category include janitors and cleaners, cashiers, and agricultural equipment operators.

I didn't look to see if garbage workers specifically get higher salaries, but it could be. Even if that's the case, it doesn't seem like a very high salary; so, assuming unions are not playing a zero-sum game and didn't need to lower other salaries, this sounds like a benefit of unions. Or is it a drawback? I wasn't sure how your statement was meant.


Dunno why the downvotes. Here, at least, they make more than, say, a nurse.


The comment seems to imply an unjust degree of compensation, or at the least, fails to acknowledge its merits.

Given the charged and rather wanton nature of other comments in this thread, it may be being judged unfairly, though that's not my view.


You can't give Youtube moderators reasonable working conditions. The job itself is unreasonable.


It's more like: hey, you signed up for this little-known and fancy "Moderator" position, no prior experience required; before you start, let us explain what it's actually about.


Agreed. "Required" would be a more accurate, less emotionally charged word.

Signing the acknowledgment may be required if you want the job, but you're free to leave. You're not literally being forced to work at YouTube.

Yes, you might really need the job. That's still not force.


Some other comments mentioned that existing employees are required to sign the waiver to continue employment. To me that better justifies the word "forced", because of the threat of losing your job.

And how do you force someone to do something? Generally by threatening them with some consequence they dread or can't afford. Losing your job sounds like it qualifies to me.

Another thing I want to bring to the debate is that opportunity does matter. It may not be YouTube that is forcing them to work under such conditions, but our society at large. We set standards which create the alternative choices. If we offered better means of making a living to people, they wouldn't agree to such conditions. So this social aspect is really important. We need to all uphold standards of respect and dignity for ourselves and others. This is really important, because those standards define the selection of choices that you are free to choose from.


> What's the alternative here?

Not a popular one, but another alternative is to fundamentally change the service. But that involves possibly losing YouTube.

Assuming Alphabet/Google (and most others) won't even consider that option...yeah, you're pretty on point.


"When you invent the plane, you invent the plane crash."


Automate it away? Or 99.5% of it?


As if they haven't tried that anyway


A disclaimer that says they could be fired for disclosing disabilities (illegal), and that says nothing about Accenture assisting with their PTSD (beyond WeCare, which has already been flagged for many problems, including inability to access at work, and Accenture/Facebook supervisors pressuring counselors for visit summaries or patient care notes), and obligates the employee to find their _own_ mental health care because of work issues, at their own cost.

And being pressured to sign the voluntary form or risk being fired.


That sounds like you believe YouTube hosting appalling, extreme content is just a fact of life and there's nothing that can be done about it. That's a sad testament about Google engineers if it's true.

If Google could do something about this problem and are choosing not to then a better analogy would be garbage collectors signing something to accept they have to handle anthrax, and you saying "well, they don't have to be garbage collectors!"


I’m personally more interested in societal impacts to it.

YouTube is an impactful medium, and it is moderated by the personal judgements of low-paid, unqualified "moderators", who may or may not have PTSD warping their judgement, or political or religious inclinations.

I remember a news story about a Facebook moderator whose PTSD caused them to flag a legitimate family photo as child pornography. I don't think there is any interpretation other than that repeated exposure to flaggable images will, sooner or later, push a moderator's internal standards toward infinity, beyond logic.

This is a slow fire burning the library of Alexandria that is the Internet, because everything WILL eventually be labeled "hateful offensive violent pornography" or something, past which point only static noise remains.

This needs to stop.


Giant extrapolation alert. So what's the alternative? Just leave the ISIS beheading video at #1 trending?


Garbage collectors have always existed. That's not true for shocking videos. Youtube and other social networks need to be held accountable for creating the incentive to produce such content in the first place. The impact on their own employees is only the first sliver of harm they are enabling.


The "incentive" to produce most shocking content is mostly due to the video sharing websites being free and open; even if there were no recommendations or subscriber feeds on YouTube, they would need this content moderation job since the same people would be uploading the same stuff.


>What's the alternative here?

The alternative was not warning workers of the risks involved, and being sued for the injuries sustained on the job.

It's only fairly recently that mainstream, non-psych experts have become aware that people other than soldiers can experience PTSD.


When you are already working and the company wants you to sign something to keep working, it is different from when you are looking for work and the company says, before you can work here, you have to sign this.


It is kind of weird that people freak out when Google requires that employees acknowledge something factual as a prerequisite for employment.


Garbage disposal workers handling garbage is a lot more obvious and more importantly, it's just unpleasant but doesn't cause long-term damage.


> Garbage disposal work [...] doesn't cause long-term damage

Fun fact: it's actually the 6th most deadly job in America

Raw data source: https://web.archive.org/web/20150208215521/https://www.bls.g...

User-friendly graphs: https://qz.com/410585/garbage-collectors-are-more-likely-to-...


Do garbage collectors in the US still empty the cans manually instead of using robotic arms for that? Or is there something else particularly harmful about the job I don't know about?


... and 99% male, funnily enough


I don't think any garbage disposal worker in the country would tell you their job can't cause long-term damage.


I should have expected that answer. Yes it does, but that's because it's physically demanding not because garbage is icky.


All that heavy lifting takes a toll


And those unionized workers usually have pretty great benefits, pay, and pensions that they've bargained for because of that.

If their back goes out, the insurance and/or company will have to foot the bill. If the company fails to provide adequate training and safety precautions [within reason] they're highly liable.

This sounds a little like YT is attempting to abdicate responsibility, rather than coming to an agreement.

All of that said, I have no insight into the onboarding practices for a job like this one. Are they trained not only in how to moderate, but how to recognize signs of work-related stress or mental issues ahead of time and given reasonable precautions against causing long-term damage? Are they thoughtfully prepared for some of the content they will have to view? I guess I could look this up myself, but I'm also wondering if anyone here has that kind of insight.


> Yeah, it's awful, dirty work, and we should give them as much support as possible

Have you ever moderated anything? I don't see how one-sided content moderation can evoke much emotion; even talking to others on HN is more of an "emotional torture" than such moderation. The problem with moderation is probably more about overloading people, forcing them to stay focused all day long.


> I don't see how one-sided content moderation can evoke much emotion

What?! These people are watching videos of extreme violence and child pornography. You don't think that can "evoke much emotion"?


I don't think it can, that's not how emotions work.


If you can watch a video of a person being murdered and not feel any emotional response I seriously suggest you see a mental health professional.


I think it's about detachment. I consume a lot of dissection, postmortem, surgical, etc content.

As long as I can suppress thinking it was a real person, I can watch it without any problem. After many videos, it becomes automatic. Desensitisation is a real problem when watching things on a screen.

But the key factor is control: I know what I am going to watch and I can control whether I want to. The people on moderation teams may not be in that situation.

OP grossly misunderstands the long-term cost, though. Short-term reactions differ from person to person. Some people can quickly connect a video to real things while others don't. It's akin to how people can become mean and insult others online without feeling bad about it.

But things still float around in your head even if you didn't have an immediate reaction. You would remember it some time later or have dreams about it. It will be horrible.


Meh, western entertainment is already full of violence, torture, and murder. People get pretty good at stomaching that, and I don't think it's very different if a video is real. As long as it's all happening away from you, to people you don't know, in events that don't affect you, it might as well be fiction.


You might want to check with a psychologist.


Here's what I think is happening: megacorps want to overwork people into the ground, but pretend that it's the disturbing content that's causing it, not the fact that they force people to stay focused and assess content all day long.

I moderated millions of images myself, plenty of disturbing images, but for my own project and this is why it sounds like bullshit to me.


Here's a thought. Maybe it's both, first off. The shit hours and disturbing content.

Also, your sample of experience, n=1, is not great. And genuinely, if you can look at child porn, murder, and other awful, genuinely terrible things all day, every day, and not be impacted, you 100% need to speak to a therapist. That is not the normal human response to trauma.


> And genuinely, if you can look at child porn, murder, and other awful, genuinely terrible things all day, every day, and not be impacted

Well, that's not what moderation is. Most of it is not actually disturbing content. And when you are moderating, you don't know what kind of content you are looking at until the point that you do, but that's the point where you have to block it and move on to the next thing. There is not much room for emotion. But it is very mentally exhausting work.


Or it is both...

Honestly speaking, whether it be videos of extreme content or fluffy rabbits, I will feel absolutely terrible after a few hours of having to watch them. I can't imagine doing that for months.


They have to watch the rape, beheading, and murder videos people submit to YouTube. There were multiple articles about how it messes with them.


There are risks that people can understand and appreciate up front.

There are risks that simply are not fully consciously appreciated until they've been experienced directly.

There are risks whose cumulative effects only build with time, often taking years, even decades, to fully manifest.

And there are risks which fundamentally change the capacity of the affected to even recognise or admit their existence or severity.

PTSD is all of these.

Self-monitoring, encouragement to seek help, and provision of a nonclinical "wellness coach" is grossly insufficient.

The companies (and in this case the contractor, Accenture) providing such services, as well as outside oversight entities such as government regulators, physical and mental health services, unions, and insurers, know these risks (or damned well should), and can monitor them and impose regulations, limits, risk premiums, compensation and settlement systems, and collective bargaining powers for such work.

Business is a risk-externalising engine. Some risks absolutely need to be fully internalised. This is a prime case.

NB: I've worked at various times as a moderator, on spam-detecting and reporting systems which entail seeing some unpleasantness. And on a large social media network where I was tasked with removing flagged images from the storage network. The flagging process hadn't been explained in any detail, and as I would be deleting millions of items of user content, I spot-checked a small handful to confirm the flagging was accurate.

After no more than a half dozen (and probably fewer) I simply didn't want to see any more -- the image, from over a decade ago, of a young girl still lodges in my mind, though I've never wiped, nuked, and shredded files on disk harder.

I'm satisfied with the fact that the scripts I created wiped all that content from our and our CDN's storage not in the weeks or months of the initial estimate (our CDN vendor apparently had never considered widespread deletions, or experienced them), but less than two days, with verification. Never heard any reports of unwarranted deletion either.
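
For the curious, a minimal sketch of that purge-and-verify pattern (delete_object and object_exists are hypothetical stand-ins; the actual storage/CDN APIs aren't named here):

  # Sketch: issue deletions without viewing content, then verify removal.
  # delete_object()/object_exists() are hypothetical stand-ins for the
  # actual storage/CDN APIs.
  def purge_flagged(urls, delete_object, object_exists):
      for url in urls:
          delete_object(url)
      # Verification pass: anything still retrievable gets retried/escalated.
      return [url for url in urls if object_exists(url)]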

Somewhere else within the organisation, unknown to me, others had seen and verified those images. I think of who they might have been and the impacts on them.


I really don't understand how images or even videos can cause PTSD. Maybe it's because I grew up on 4chan in the age of goatse, but shocking content is a tiny fraction of how disturbing real life can be.

I understand that looking at disturbing content is going to have some impact on people, and that they may not be able to evaluate that risk properly, but it's nowhere near what every nurse, doctor, firefighter, police officer, soldier, and even teacher will live through over the course of their career.

I've been diagnosed with PTSD after seeing some deeply disturbing medical shit. I'm definitely impacted for life, but there were doctors and nurses there too, and they're impacted too.

Maybe I've been on the internet too long, but there's a massive delta, at least in my view, between seeing a disturbing video and living through real-life trauma. I'd take the video moderation job over being an ER nurse any day.


Forgive the bluntness, and speaking as someone who knows 4chan all too well, and is a paramedic:

goatse is one thing. We're more talking high definition videos of people being beheaded by Mexican drug cartels, people being held down while dogs eat their genitals. Videos of toddlers being raped.

You can't really compare that to a fairly mundane, if explicit, naked man showing his anus.

Also, as a paramedic, and speaking for many that I know (though not all, of course): physical trauma is rarely (or a lot less often) PTSD-inducing. Gruesome, gory, sure, but in the end it's all blood and tissue. What gets to most of the people I know is the emotional violence: being called to child sexual assault cases, accidental deaths, things like that. That's what takes the toll.


Thank you for your service to the community as a paramedic. I have no idea how you people do it.

As someone who has exposure to that world, how do you deal with it? Is there some kind of training or protocol or therapy built into your job that's different from that of the moderators? Does any of the mitigation even work?

For every one of those beheading videos, someone has to actually go collect the head. I would think that must be orders of magnitude more traumatizing than skipping through the video of it happening enough to flag it.


It's a very good question, because for the longest time, paramedics/EMTs/firefighters -were- expected to just "suck it up".

Now, with increases in the protection of our PPE (bunker gear, etc.) and other knowledge, there are fewer line-of-duty deaths due to accident or illness (typically cancer, though that's still a big one) - now the biggest cause is suicide, mostly as a result of PTSD.

There's a documentary, funded in part by Denis Leary, called "Burn: One Year on the Front Lines of the Battle to Save Detroit", about fire departments there. One of the veterans says "I wish my mind could forget what my eyes have seen".

Around here, the PNW, at least, there's a huge movement toward handling it proactively, access to counseling, therapy, hotlines, and as importantly as anything else, active efforts to remove the stigma associated with things.

We used to do CISDs (critical incident stress debriefings), which are now largely discredited - essentially "put everyone in a room and 'make' them talk about how they feel after a bad call, whether they want to or not", but now, more and more departments are hiring full-time mental health professionals. One near me has someone who specializes in PTSD, and another who works with sleep regulation (all those alarms in the middle of the night), and alcohol/drug use.


You don’t have to understand it for it to be real. The shit they see is way way worse than goatse. For example, a helpless screaming crying child (who may look like the moderator’s own child) getting raped by someone they trust. That and worse gets uploaded, and it’s their job to delete it.

Obviously being in the room with the victim is probably worse, but that doesn’t mean seeing this kind of thing regularly isn’t still enough to do some damage to some people.


Is there some expectation that they sit and watch the whole thing end-to-end with headphones on?

I'd think skipping around and finding a few frames would be more than enough to be like "nope, that's a bad one", flag it, and move on.

I would think the police and prosecutors that follow up would have to see and document that stuff in way more detail, and nobody's talking about their PTSD.

I'm not saying it's not bad, but I think doing the initial moderation flag has to be one of the least bad roles among all the people in the pipeline who will have to deal with those cases.


The moving on is part of the threat.

Dismissing one atrocity faster only means you get to the next one sooner.

You have no agency over the inflow, and there's little evidence that what you're doing has an effect -- the stream never stops. Metrics for performance and benefit of the effort as a whole are hard to construct and communicate.

As a moderator, it's just shitty shit often being done by shitty people, posted by other shitty people, again and again and again and again and again and again and again.

There's no diseased root to hack off. And if you do happen to find one, another grows in its place -- a veritable Hydra.


I hadn't thought of it that way. I see your point now.


Thank you, appreciated.


> I would think the police and prosecutors that follow up would have to see and document that stuff in way more detail, and nobody's talking about their PTSD.

Uhm, it's actually super well known within law enforcement that this work is emotionally exhausting, so much so that specific units are separated out and the officers rotated to avoid burnout. (You may know them as Special Victims Units.)


> I would think the police and prosecutors that follow up would have to see and document that stuff in way more detail, and nobody's talking about their PTSD.

Sure they are.

https://www.surrey.ac.uk/news/police-officers-risk-ptsd-when...

https://medicalxpress.com/news/2018-06-police-officers-ptsd-...


There’s no technical guarantee of such a threshold, though.


PTSD has long been associated with combat stress, and has gone by various names over time -- "shell shock", "combat fatigue"; even "nostalgia" originally referred to a mental disability.

Burnout, or occupational stress, now being recognised as a disorder by the WHO, seems to me strongly similar. And the first documented description of it comes not from those directly exposed to stress, but from those trying to address it -- staff and volunteers at a free clinic for drug addicts, described in 1974 by Herbert Freudenberger.

https://spssi.onlinelibrary.wiley.com/doi/abs/10.1111/j.1540...

For both PTSD and burnout, the failure of those not experiencing it to grasp and understand the severity, complexity, causes, and sheer insidiousness of the condition is a large part of the frustration of experiencing it.

The idea that burnout is a form of PTSD has occurred to me. This being the Internet, there are others who argue similarly, though I don't know that we're correct:

https://academic.oup.com/occmed/article-abstract/66/1/32/275...

https://a-new-way-to-work.com/2017/08/09/is-burnout-a-form-o...

With Internet gore sites, among other elements:

- You have agency over whether or not you're going to be exposed.

- You can choose to stop viewing, without consequences (generally).

- You can block the sites entirely, in many cases, through various screening or filtering tools (PiHole, dansguardian, Squid Proxy, various parental control tools, etc.)

Content moderators are looking at what they're faced with, all day, every day, day in and day out, and their jobs, livelihood, rent/mortgage, food, and family depend on this.

That's a wholly different environment than occasionally seeing a gaped anus.


Typically people in emotionally traumatic jobs aren't going to be spending all day, every day, doing nothing but witnessing trauma. There's downtime - a police officer could go weeks to months at a time dealing with only relatively routine things.

The people doing moderation are spending all of their time, barring a few breaks, looking at stuff that other people have already flagged. That's going to take an emotional toll on you.


> Maybe it's because I grew up on 4chan in the age of goatse, but shocking content is a tiny fraction of how disturbing real life can be.

I'm surprised you can say that. I also frequent 4chan, and there have been tons of times I've had to close a thread within the first few posts because I did not want to see the content. There have also been times when something I saw in a thread lingered in my head for hours afterward, making me uncomfortable and definitely impacting my mood.

> it's nowhere near what every nurse, doctor, firefighter, police officer, soldier, and even teacher will live through over the course of their career.

Blood, gore, and dead people are only some things that could trip people up. You have to understand that many of these flagged posts are actually engineered to cause outrage or to trigger the primitive, emotional parts of your brain. Even pure audio alone can scar you - if you're really interested, and I do not recommend this, and I'm going to put a large TRIGGER WARNING here so you can look away before I describe this, look up the dashcam audio of a family traveling down a highway when a 2x4 from a nearby pickup truck falls off the bed and flies through the window, killing the mother instantly. Listen to the screams of both children and the father driving as they realize their family has just been destroyed. It will haunt you more than any "simple death" can because it interacts directly with the emotional core that makes you human.

Honestly, I believe only sociopaths can do a job like this and make it out unscathed.


This feels like an externality that needs to be internalized by these platforms.

It shouldn't be legal to destroy people's minds, then drop them back off into society with no help.

If you want to run a massive platform where anyone can upload anything, then you should have to pay for it, at least insofar as financially supporting the potentially life-long therapy someone is going to need after doing a moderation job.

One of many, many, externalities these platforms create, I realize.


>If you want to run a massive platform where anyone can upload anything, then you should have to pay for it, at least insofar as financially supporting the potentially life-long therapy someone is going to need after doing a moderation job.

But this is tricky, isn't it? Anyone can run a media sharing platform. Hacker News is a media sharing platform. Mandating this sort of law...makes the Internet less free.

The same sort of horrifying content that is moderated out today could just as easily have been uploaded to the BBSes and phpBBs of the early '90s-2000s. Does it make sense to make Facebook pay for the lifetime care of moderators? Sure - they make billions of dollars - but what about a regional car club forum, or a subreddit about cat pictures? Should Reddit be paying anyone who makes and runs a subreddit because they will inevitably have to moderate something grotesque?

It's not an easy problem.


I'm thinking of this in the context of being an employer. As in, if you're going to hire people to be moderators, you can't destroy them as human beings.

If platform companies are responsible for the content on them, and they are hiring (even indirectly) people to moderate the content, and that content harms the moderators, then the company should be responsible for their care. Like, workplace disability, basically.


Isn't this the same risk anyone enlisting in the military also runs?

Still, I agree - it would be a lot easier to avoid these awful posts if people had to pay to be on the platform. It wouldn't be perfect of course, but it at least provides a barrier to entry and a slightly harsher consequence if you forfeit your membership fee when banned for inappropriate content.


Isn't this the same risk anyone enlisting in the military also runs?

Yes, and we've been slowly forcing the military to acknowledge and own up to that. When PTSD was first recognized (as shell shock in WWI), the initial response of the (UK) military was to declare these people physically healthy and return them to the front, or execute them for desertion. I'd say the armed forces have come a long way since.


I'd say that the military places a massive externality on society as well, so, yes.

I guess at least enlisting gets you better benefits than sitting in a mod-farm in Phoenix will.


Companies like Google, Facebook etc should force their executives, from VP to CEO, to do a week of content moderation every quarter.

Then we would see how sympathetic they would be to the plight of these workers and how quickly the problem would be solved.


Toronto's current mayor did this early in his tenure, after a massive number of complaints had piled up about trains with malfunctioning cars on Line 2 during a heat wave. I remember riding those. It was torturous some days, especially when they'd get stuck in a tunnel.

He rode one, from end-to-end on the line, during the heatwave, in rush hour. He came out the other end drenched in sweat, jacket hanging in his hands, looking exhausted and quickly set about resolving that problem. (Nuanced details left aside, so don't hang me fellow Torontonians)

https://www.thestar.com/news/gta/transportation/2016/09/07/j...


Counterpoint: I used to work at a place where every quarter you were supposed to do a support rotation answering customer support requests for a day or so.

Doing something for a very limited amount of time, that you know has no bearing on your job performance evaluation, is extremely different from doing that thing full-time.

In some ways I think it reduced empathy in some people, because it causes some to come away with a very warped view of that job and think "Hey I did this for a week and it was fine, what're they complaining about?"


It isn't because of a lack of sympathy but of capability. Even cheap moderator warehouses are expensive compared to algorithmic approaches - except the algorithms aren't there yet, and may never be, if they can't define what they want properly.


It's basically reinventing https://en.wikipedia.org/wiki/Worker_cooperative and the rotation of responsibilities.

But yeah. That would be good.


To me, a more fair comparison would be how Amazon requires all office workers to shadow a customer support representative for one week out of each year.

I don't think the OP's suggestion does anything close to reinventing the concept of a worker cooperative.


Amazon really is not a good example of successful employee management. Their attrition is horrible, they have the worst work-life-balance culture of the FAANGs, their reputation is poor, and working there is widely disliked. They cast a wide net, but lose employees fast.


Have you worked there? I have quite a different experience than this.

I found Amazon's hiring practices and employee management to be very effective compared to other places I've worked. In general, I found their whole dogma really worked in practice, which surprised me at first.


I have not worked there. It's what I hear from other people's experiences. Of course it will vary between teams, but the reputation is still low and the attrition rate very high.


That is definitely not a thing that Amazon does.


I worked at Amazon for six years and did this four times - once for each promotion I received at the company.


What makes them seem unsympathetic? Are they somehow making the job worse than it has to be?


For one, they're farming the job out to a third party: Accenture, hoping to put themselves at arm's length from any problems that arise. This is standard "late stage capitalism" shit, but I doubt Google would be willing to bring their moderation team in-house and let them interact with their precious engineers - the people who could actually help alleviate some of the stresses the moderators are facing.

Conway’s Law[1] applies. In terms of product priorities, the design needs of “contracted moderators” are somewhere below IPv6 support in the product roadmap.

[1] https://en.m.wikipedia.org/wiki/Conway's_law


Talking about "force", you are implying that people are slaves to these companies.

What about other things like cleaning, plumbing, etc.? There are other terrible jobs besides moderation. Should we rotate on those also?


I’ve never heard of plumbing or janitorial work causing PTSD. If it did then I would want to do the same thing.


Instead of having a few willing people exposed to the danger of PTSD, we should force all employees to face it? What would that achieve? Some odd sense of justice?


How can executives be expected to fix problems if they don't fundamentally understand what the problems are? It's the fastest way to get the problem fixed: making those with power have skin in the game.


How would you solve the problem?


I don’t have enough information or enough power to solve the problem. I would start by doing at least 1 week of content moderation to understand what the workers are going through and then talk with other people with power who have done 1 week of content moderation to figure out how they could help solve these problems.


Making repetitive decisions over a long period of time can make your mind get weird. This holds even for the "normal" development of training sets, but it gets much worse when the detailed semantics are challenging and hairs must be split carefully.

For instance I was filtering images on my tablet while lying in bed in a dark room and by the time I got to the 2000th image my visual system started getting weird. (e.g. I would see an image that was photocomposited and feel strong cognitive dissonance... I would see the photocompositing and not see the image I was supposed to see)

When you put a group of people in a room and have them label things that require careful discriminations, some of which are critically important, others of which are necessary for the process but not necessary for the product, you will see people get emotionally disturbed. I haven't seen punches flying but I think I could make it happen if the group was compressed enough in space and the stakes were high.

I also had a toxic data set derived from Wikipedia where, if you started clicking on things from the first record, you would quickly come to various phrases involving the "F-word". Just about everyone who looked at this data set would soon be sitting in a daze in front of the computer repeating the "F-word". I'd tell them that they had the "F---s" and should get a grip on themselves.


The PTSD form describes various support services available to moderators who are suffering, including a “wellness coach,” a hotline, and the human resources department. (“The wellness coach is not a medical doctor and cannot diagnose or treat mental health disorders,” the document adds.)

They don't even provide a licensed therapist? Most tech companies have an employee assistance program where employees can talk to a licensed mental health professional 24/7. It seems like Accenture didn't even provide that here.


I wonder what a good approach would be for jobs which need to be done, but are harmful to the people doing them?

They talk about "the industry-leading wellness program and comprehensive support services we provide" - but should there be a basic standard of support set which they have to meet?

I'm not sure of what a good answer would look like.


I'm not sure either, but things like this:

> The PTSD form describes various support services available to moderators who are suffering, including a “wellness coach,” a hotline, and the human resources department. (“The wellness coach is not a medical doctor and cannot diagnose or treat mental health disorders,” the document adds.)

do not inspire confidence. They sound very much like cut-price answers to very complicated questions. I'm sure a company like Facebook could afford to give its moderators access to fully qualified mental health professionals. But hey, they could also make those moderators full employees rather than third party contract workers, and show no signs of doing that either.

(not to mention, suggesting you talk to HR? The second you do that, liability alarms are going to start ringing and be attached to your name. Terrible idea)


> They talk about "the industry-leading wellness program and comprehensive support services we provide" - but should there be a basic standard of support set which they have to meet?

> I'm not sure of what a good answer would look like

IMO the bare minimum is credit to cover any and all inpatient/outpatient therapy with LCSW/PsyD/Clinical Psych PhD/MD health professionals, as well as a generous allowance for prescribed paid leave (from an LCSW/PsyD/Clinical Psych PhD/MD) due to distress.


>jobs which need to be done, but are harmful to the people doing them?

Does this job need to be done? Perhaps we have learned that opening a platform for anyone to upload anything in an anonymous state is not good for society.

Maybe it is time to force anyone uploading to YouTube to be approved prior.

All the hand-wringing about online services, and the belief that they should exist since everyone seems to like them, is something society has to deal with. No one is looking practically at the harm these services are doing to the world.

Why should anyone get PTSD to ensure YouTube can continue to offer videos? Wow.


Approved... by content moderators, perhaps?

Rearranging the deck chairs doesn't solve the fundamental dilemma.


Rearranging the deck chairs doesn't solve the fundamental dilemma.

Charging $10 or $100 per upload would kill this material stone dead. And ensure that everything could be traced to the person responsible if it did slip through.


Prescreening would massively dampen speech, making it essentially cable TV, while not even solving the issue of PTSD, since there would still be screeners. That isn't a solution - it is misguided spite.

Second, it is the price of a remotely free society: not having to ask permission to speak. Letting the government decide what is "unacceptable for society" in speech is a foregone conclusion of abuse. Besides, even if it were miraculously representative, society doesn't even know what is good for society! The Romans thought that their gladiatorial games were opposing decadence and that wearing pants was barbaric and worse for society than putting lead in wine.


Raising the bar on people uploading wouldn't eliminate the need for this work; at best, it would reduce it.

Someone still needs to watch whatever gets uploaded, even if it's before that account is allowed to post publicly.


The existing standard in old industry was disability and workman's comp, but the lines are a bit fuzzier than "the x-ray says you have black lung" or "you don't have a right hand anymore".

The ergonomics of what could actually prevent PTSD from developing, other than massive churn, would be an interesting area of study, period.


This take is a non-starter for these companies, but I'll say it: these problems come from the fact that these online services have massive reach and scale. These systems are designed to facilitate monetizing off of low-cost worldwide content creation and distribution. You get the good and the bad with that. Without changing that core goal, I feel any effort at "content moderation" is just PR lip service not actually done in good faith.


> I wonder what a good approach would be for jobs which need to be done, but are harmful to the people doing them?

So, like any other job, only people who are suited to it should apply.

Not everyone is harmed by gore videos; that's why you select those kinds of people for the job.


This is just to protect the corporation from potential legal action. It's like the disclaimer on a police officer's job that you might get shot in the line of duty. The problem is when you don't care enough to take care of the ones who are affected by the stress.


No amount of technology will be able to stop all people from being exposed, at some point, to content that may trigger PTSD symptoms. Social media shitholes use blacklists when they "should be" crowdsource-whitelisting content and not moderating anything at all. Users could then decide for themselves whether to turn off the whitelist filter and expose themselves to content that may one day trigger PTSD symptoms.

Prepare your child for the path, not the path for your child.
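Mechanically, the opt-in model described above is simple: nothing is shown by default until the crowd has vouched for it, and turning the filter off is a per-user switch. A minimal sketch in Python, with all names and thresholds illustrative rather than any real platform's API:

    from dataclasses import dataclass

    @dataclass
    class Video:
        id: str
        whitelist_votes: int  # crowd-sourced "safe" votes

    WHITELIST_MIN_VOTES = 10  # assumed threshold for "crowd-approved"

    def visible(video: Video, filter_enabled: bool) -> bool:
        """Show everything to opted-out users; otherwise only whitelisted."""
        if not filter_enabled:
            return True  # the user has chosen to see unfiltered content
        return video.whitelist_votes >= WHITELIST_MIN_VOTES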


Mr Burns is currently getting Homer to sign something saying that his job may give him inoperable cancer and a third arm.


Google reps should be forced to say it out loud to new employees. Some random sentence in a contract is far too casual.


I get your point, but it doesn't seem to be a random sentence in a contract. For starters, "The PTSD statement comes at the end of the two-page acknowledgment form, and it is surrounded by a thick black border to signify its importance."


Using AI to replace human moderators doing this stressful work must be morally good, right?


There are mountains of money to be made if you can fulfill this political promise that Google and Facebook have so far failed to make good on. Go for it.


I have PTSD. Believe me, you don't want it.


How sad is it that moderators are even needed. Why can't people just be nice?


Because there aren't sufficient repercussions. You can see this everywhere: lack of social accountability leads to more openly uncivil behaviour. This isn't just online; offline it works the same way: if the cost of being uncivil is low enough, there is a lot to gain from it. See for example: backroom deals, mob behaviour, road rage.

A few years ago someone pointed me to https://ncase.me/trust/ (I think I learned about it here), which captures the problem very well for me: if social cohesion goes down (i.e. fewer interactions, but with more different people), the penalty for incivility goes down while the rewards stay the same.
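That dynamic is easy to reproduce with a standard iterated prisoner's dilemma: defection only pays when you are unlikely to meet the same person again. A minimal sketch (textbook payoff values, not anything taken from the linked game):

    # Payoff for always-defecting against a tit-for-tat partner vs.
    # mutual cooperation, as the number of repeat encounters (n) varies.
    # Standard payoffs: T=5 (temptation), R=3 (reward), P=1 (punishment).
    T, R, P = 5, 3, 1

    def defector_payoff(n: int) -> int:
        # Exploit in round one (T); tit-for-tat retaliates thereafter (P).
        return T + (n - 1) * P

    def cooperator_payoff(n: int) -> int:
        return n * R  # mutual cooperation every round

    for n in (1, 2, 3, 10):
        print(n, defector_payoff(n), cooperator_payoff(n))
    # n=1: defection wins (5 vs 3); they tie at n=2, and cooperation
    # pulls ahead from n=3 on. Fewer repeat interactions with the same
    # people is exactly what makes incivility cheap.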


Also: there is really no central, balanced standard for justice, only an illusion of one (probably generated through said cohesion).

I learned this as a non-American with some English literacy on the internet, and I am reminded of it every week.

The perceived scale of justice and/or "being nice" is just an AI-like model of the events around you that one uses to predict certain outcomes, and most of the logic you see in it consists of purely coincidental correlations that aid in footprint reduction.


I would imagine it's more for things like illegal content (i.e. child sexual exploitation) as opposed to people being "mean". But I'm sure abusive behavior is a component.


Max Woolf (@minimaxir on twitter and here) has downvoted you, screenshotted your comment, and posted it on twitter for public shaming.


I do not regret downvoting the comment, since it is an unproductive take in 2020, and there have been more than enough incidents in the past decades to show why content moderation is necessary and inherent human virtue is not sufficient. I've deleted the tweet (the intent wasn't public shaming; it was more highlighting an unusual take).

That said, creating a throwaway for snitch-tagging isn't a moral highground.


> That said, creating a throwaway for snitch-tagging isn't a moral highground.

Yeah, that wasn't me. I don't know who you are or follow you.


I wasn't accusing you. (I'm more-or-less curious who did; it's the first time I've seen snitch-tagging on HN)


I'm not ashamed of my post.

I'm curious: how do you know they downvoted me?


The screenshot had an undown button.

I have deleted the tweet.


If this was up to me, each and every employee of YouTube (and Google, for as long as YouTube is not set free) would be trained as a content moderator and would be required to do that work for a couple of weeks (or as long as is sufficient to complete the task) every year.

Edit: yes. Every single one.


Most of their top engineers would quit and go to work for another company that values their time. Quality of service would fall because the top people have left, and they would have a much harder time finding any engineers to work for them.


“Most of their top engineers would quit and go to work for another company that values their time.”

Maybe. But I doubt it.

First, because people don't quit that fast, and second, because I think something else might happen that is worth the risk of losing a few "top" quitters.

I think top people will realize there is an opportunity here, and after seeing how effective it can be for a team to work toward a common goal, will request other such tasks be added to the list.

Eventually, people at that company can choose where to use their “20% service job” time.

But, then again, don’t worry, it’s all theoretical anyway, it’s obviously not up to me.


Engineers already switch companies every few years on average. All companies have trouble keeping top engineers as it is, with all the competition going on - look at the perks and benefits being offered to them. If perks can sway engineers, then imagine how being forced to do something they are not good at - something that might cause them PTSD, might be a waste of time for their progression in life, and is in general just a far more unpleasant activity - can affect their choice among companies. All of those extra activities, beyond being able to focus on work, will affect productivity and achieving goals.

There is enough work and there are enough goals to achieve. I am not going to choose a week every quarter to do something that is not going to help me towards career goals I have. I do not think I am superior or that others are beneath me, but it is logical for me to choose the company that aligns best with what I want to achieve, and if I have the option to do so, why not?


I am not going to choose a week every quarter to do something that is not going to help me towards career goals I have.

In which case, if it was up to me - and it is not, obviously - I'd open a short discussion about goals, and if we couldn't find common ground, would write down "culture fit" and wish you better happiness somewhere else.


If there are enough engineers who do not want to be forced to do chores unrelated to what they excel at, that they don't see benefiting their lives, and that pose dangers to their mental health, you won't have many left to build your video platform.

Why should an engineer choose this culture over other cultures where they can just build things and provide value at what they are best at?

Knowing other engineers, most would avoid activities where they can't use their skills valuably like the plague.

You would also lose goal- and results-oriented people. The ones who are best at reaching goals are the ones who cannot stand doing anything that does not align with those goals.


you won't have many left to build your video platform.

I’m not building a video platform. It’s already there. I’m just, totally hypothetically, fixing the company that operates it.


"I’m not building a video platform. It’s already there. I’m just, totally hypocritically, fixing the company that operates it."

Are you saying there is no more work left to do?

What about improving the algorithms so they can moderate that PTSD-causing media themselves, so it would not be a problem in the first place?

Are you claiming that YouTube has no need for engineers at all because the product is already available to the public?
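Very roughly, such a pre-filter might score each frame with a classifier and heavily blur anything above a threshold before a human reviewer ever sees it. A sketch only - harm_score() here is a hypothetical stand-in for a real trained model:

    from PIL import Image, ImageFilter

    HARM_THRESHOLD = 0.8  # assumed operating point; tuned on labeled data

    def harm_score(frame: Image.Image) -> float:
        """Hypothetical stand-in for a classifier returning P(harmful)."""
        return 0.0  # stub: a real model would run inference here

    def prepare_for_review(frame: Image.Image) -> Image.Image:
        """Blur likely-harmful frames so reviewers opt in to full detail."""
        if harm_score(frame) >= HARM_THRESHOLD:
            return frame.filter(ImageFilter.GaussianBlur(radius=16))
        return frame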


... falling Quality of Service would make it take longer to upload YT videos, resulting in fewer videos uploaded ...


That would be the market at work, saying that Youtube/Facebook isn't wanted.


But we already have a working market in which people are willing to accept moderation jobs for YouTube.


I wouldn't work for a company that doesn't value my time as much as I do. Working on a platform I wished didn't exist would be very insulting. Your statement sounds like that of a typical out-of-touch high-level manager who didn't think an idea through.


Being forced to do a low-level job for a somewhat extended period that is probably not directly related to your day-to-day is a bit much. However, depending on the situation, company, etc., having employees work support, or otherwise put themselves in the shoes of the people using the products they help make, every now and then shouldn't be something they view as beneath them.


Not beneath me, but I do value using my time as optimally as possible. Maybe a week of customer support once in a lifetime, or every few years, would be great, but why should I choose a company that forces me to spend my time unwisely? It is not about it being beneath me; I just want to solve problems programmatically, not be customer support. And engineers cannot be forced either - they can leave at any time given market demand - so good luck to anyone trying to make their employees do something like that.


I’m honestly not sure of the value of spending time in first-line customer support. I would note, though, that at least in companies doing enterprise sales, engineers are very commonly in the ultimate escalation path, expected to help in sales situations as needed, and to do other things related to retaining customers and gaining new ones. Senior engineers are also often expected to be a face of the company in a variety of ways.


Also, a single week spent in customer support might create biases, because your customers happened to have those specific issues that week. Analyzing the issues and understanding what is actually important requires a broader view and an analysis of all the tickets, etc.


The time is being spent educating higher-tier engineering, business, sales, marketing, and other staff, of the full implications of their work.

If it's not possible to do a thing, safely, effectively, and socially beneficially at scale, maybe it shouldn't be done at all.

Markets externalise all kinds of costs, with long-term and latent risks and power inequalities prime among them.


Are you saying we should not build video platforms at all?

Also, moderation is not the only bad job in the industry, or in life in general. Why should we specifically rotate on moderation? Why not on cleaning up after old people, plumbing, or anything else you can think of that might be gross or mentally dangerous to do?

Moderation is only a minor factor among all the implications of this work.


I'm saying the question should be asked.

I mean, we have people literally having themselves shot dead for YouTube clicks and ads:

https://www.cnn.com/2017/06/29/us/fatal-youtube-stunt/index....

(Something I'd commented on at the time. YouTube are absolutely culpable.)

If the job has to be done -- and I'd put healthcare, garbage collection, police work, and military service among those -- you take all reasonable efforts to minimise risks, especially unnecessary ones, and support those who've become disabled through them. That's the principal argument for veterans' healthcare, and an exceptionally sound one.

There are of course many instances of work under hazardous conditions not being properly compensated. A few off the top of my head:

- Workers in lead-related facilities: mining, smelting, fuel production, paint, and printing. (I'm excluding resulting environmental contamination, that's also an issue, but not workers.)

- "Radium Girls" watchdial painters. Instructed to lick their brushes with their lips/tongues. https://en.wikipedia.org/wiki/Radium_Girls

- Coal miners: https://en.wikipedia.org/wiki/Coalworker%27s_pneumoconiosis

- Asbestos workers -- concrete, shipbuilding, roofing, insulation, and more: https://en.wikipedia.org/wiki/Mesothelioma

- Fire hazards, generally: https://en.wikipedia.org/wiki/Triangle_Shirtwaist_Factory_fi...

Spinning up a platform for cat videos and pratfalls with no larger social conscience is not excusable.

(Yes, there's good content on YouTube, and I rely on it myself. I'm aware of the costs. And I'm aware that many proposed alternatives, including peer-to-peer systems, would or do face similar issues.)


Of course you should minimize the risks, but what we are debating is whether employees should be forced to do that job on rotation. In reality, this is something that should be regulated legally, so that the company would be required to provide an appropriate amount of assistance and support to the moderators.


I'm leaning that way myself.


I agree with this sentiment and would like to see more of this sort of thing, not just for moderators but for every "low skill" job - especially for C-level employees and shareholders. No one should be asking anyone to do something without either A) doing it themselves, so they know what they're asking, or B) treating the workers as experts whose opinions and requests should be taken seriously.


The fact that you and parent are getting downvoted just shows how out of touch with reality some developers have become. They really are exhibiting the same cluelessness they often attribute to managers.


There’s more. I think people are afraid: first of having to do something "unpleasant", and on a deeper level of losing their own status of "importance". It’s a pity indeed.


It's a great idea that should be practised in most industries.


Why the employees rather than the shareholders and executives who profit the most from it? The returns in their portfolios literally come at the expense of lifelong psychological wounding (which the corp is directly admitting in this contract).


You are not wrong.

Why not both, or all?



