
TurKit, JavaScript code for achieving consensus among AMT workers, especially across multiple stages. See the Find-Fix-Verify pattern: http://www.behind-the-enemy-lines.com/2011/04/want-to-improv... By running multiple stages of crowdsourcing, you can get complicated tasks done with high quality, e.g. "Improve the grammar of these reviews." Zappos did this and saw a lift in conversion. (There's a rough sketch of the pattern after this list.)

CrowdFlower, which is good for high-quality one-shot annotation, but not so much for pipelines of annotations.

MobileWorks, which is also good for high-quality one-shot annotation, and whose docs say it supports pipelines. I haven't figured that feature out yet.

CrowdControl, which supposedly solves every problem, but is priced as an enterprise solution.
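
To make Find-Fix-Verify concrete, here's a rough TypeScript sketch of the three stages. None of this is TurKit's actual API; CrowdApi and every function name here are made-up stand-ins for whatever backend posts tasks and collects answers:

    // Hypothetical sketch of Find-Fix-Verify. CrowdApi stands in for whatever
    // backend posts a task to n workers and collects their answers;
    // it is not TurKit's real API.
    type Task = { instructions: string; input: string };
    type CrowdApi = (task: Task, n: number) => Promise<string[]>;

    // Pick the most common answer as the consensus.
    function majority(answers: string[]): string {
      const counts = new Map<string, number>();
      for (const a of answers) counts.set(a, (counts.get(a) ?? 0) + 1);
      return [...counts.entries()].sort((x, y) => y[1] - x[1])[0][0];
    }

    async function findFixVerify(crowd: CrowdApi, review: string): Promise<string> {
      // FIND: several workers flag the sentence that most needs work.
      const flagged = majority(await crowd(
        { instructions: "Copy the sentence with the worst grammar.", input: review }, 5));

      // FIX: a different set of workers propose rewrites of that sentence.
      const rewrites = await crowd(
        { instructions: "Rewrite this sentence with correct grammar.", input: flagged }, 3);

      // VERIFY: workers vote on the best rewrite; majority wins.
      const best = majority(await crowd(
        { instructions: "Pick the best rewrite and copy it exactly.",
          input: rewrites.join("\n") }, 5));

      return review.replace(flagged, best);
    }

    // Mock backend so the sketch runs locally: every "worker" echoes a canned answer.
    const mockCrowd: CrowdApi = async (task, n) => Array(n).fill(`answer: ${task.input}`);
    findFixVerify(mockCrowd, "Their is a typo here. The rest is fine.").then(console.log);

Each stage uses its own workers plus a simple consensus rule, and the output of one stage is the input of the next; that's the whole trick.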

If you want to build something cool, implement pipelines of work, i.e. build a crowd-programming layer that has subroutines. Look at what CrowdControl says they are doing.
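
Roughly what I mean by a layer with subroutines (TypeScript, everything here made up, just a sketch): each crowd step is an async function and a pipeline is ordinary composition, so a pipeline can itself be used as a subroutine inside a bigger pipeline.

    // Sketch of a crowd-programming layer where pipelines are composed of
    // subroutines. All names are made up for illustration.
    type CrowdStep<I, O> = (input: I) => Promise<O>;

    // A pipeline is just composition of crowd steps.
    function pipe<A, B, C>(first: CrowdStep<A, B>, second: CrowdStep<B, C>): CrowdStep<A, C> {
      return async (input: A) => second(await first(input));
    }

    // Example subroutines. In a real system each would post work to the crowd
    // and aggregate answers; here they are placeholders so the sketch runs.
    const transcribe: CrowdStep<string, string> = async (imageUrl) =>
      `transcription of ${imageUrl}`;
    const proofread: CrowdStep<string, string> = async (text) => `${text} (proofread)`;
    const categorize: CrowdStep<string, string[]> = async (text) =>
      [text.includes("receipt") ? "expense" : "other"];

    // Subroutines compose into pipelines, and a pipeline is itself a
    // subroutine, so larger workflows can nest smaller ones.
    const digitizeReceipt = pipe(pipe(transcribe, proofread), categorize);
    digitizeReceipt("http://example.com/receipt.jpg").then(console.log);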

[edit: If you also want to build something cool, implement a reputation system. Don't just assign workers a single number. Figure out what kinds of tasks each worker is good at, and do a per-task reputation system. For bonus points, solve this properly by dynamically gauging the skillset and difficulty for each task, rather than simply grouping tasks into N clusters, where N is low.]
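
To sketch what I mean by "dynamically gauging": one simple way to start would be an Elo-style update, with a skill score per worker per task type and a difficulty per task, both adjusted according to whether the work passed verification. Rough TypeScript, all names and constants made up:

    // Sketch of per-task reputation with dynamic skill and difficulty,
    // using an Elo-style update. Names and constants are illustrative.
    const K = 16; // how fast scores move after each outcome

    interface CrowdWorker { skills: Map<string, number> } // skill per task type
    interface CrowdTask { type: string; difficulty: number }

    // Probability the worker's answer passes verification, as in Elo ratings.
    function expectedSuccess(skill: number, difficulty: number): number {
      return 1 / (1 + Math.pow(10, (difficulty - skill) / 400));
    }

    // Call this after a verify stage decides whether the work was acceptable.
    function recordOutcome(worker: CrowdWorker, task: CrowdTask, passed: boolean): void {
      const skill = worker.skills.get(task.type) ?? 1000;
      const expected = expectedSuccess(skill, task.difficulty);
      const actual = passed ? 1 : 0;
      // The worker's skill at this task type and the task's difficulty move
      // in opposite directions, so both are gauged from the same signal.
      worker.skills.set(task.type, skill + K * (actual - expected));
      task.difficulty -= K * (actual - expected);
    }

    const w: CrowdWorker = { skills: new Map() };
    const t: CrowdTask = { type: "grammar-fix", difficulty: 1000 };
    recordOutcome(w, t, true);
    console.log(w.skills.get("grammar-fix"), t.difficulty);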

Email me if you want to discuss. I've been thinking about this for a while.




We also assume pipelining in http://human.io because the UI at each stage is pretty minimal.



