
> Milagros Miceli, a sociologist and computer scientist who has been interviewing distributed workers contracted by data annotation companies for years. Miceli has spoken to multiple labelers who have seen similar images, taken from the same low vantage points and sometimes showing people in various stages of undress.

> The data labelers found this work “really uncomfortable,” she adds.

This is an interesting point: the article seems to suggest this was not done out of malice, since the woman's face was pre-obscured.

> Labelers discussed Project IO [another assignment by Scale] in Facebook, Discord, and other groups that they had set up to share advice on handling delayed payments, talk about the best-paying assignments, or request assistance in labeling tricky objects.

It's clearly against policy, yet:

> But such actions are nearly impossible to police on crowdsourcing platforms.

> When I ask Kevin Guo, the CEO of Hive, a Scale competitor that also depends on contract workers, if he is aware of data labelers sharing content on social media, he is blunt. “These are distributed workers,” he says. “You have to assume that people … ask each other for help. The policy always says that you’re not supposed to, but it’s very hard to control.”

I'm honestly not that surprised that something like this happened; similar things happen on MTurk.

1. Society demands this kind of automation
2. Companies reacting to this demand have to hire humans to perform manual labelling
3. Humans that perform labelling don't always follow the rules and policies in place
4. Data leaks occur
5. Articles like this are written
6. Demands for automation don't really change

(repeat)



