Hacker News
MobileWorks (YC S11) is a Hands-Off Mechanical Turk (techcrunch.com)
151 points by anandkulkarni on Aug 12, 2011 | 48 comments



The logo makes me nostalgic for my days as a BeOS user:

http://www.iconfinder.com/icondetails/34752/128/beos_people_...

UPDATE: If I remember correctly, the original BeOS icons were all 32x32 pixel art. I didn't find the original in my first quick Google search, but I found the above link instead which appears to be from an open-source stock icon set based on a vectorized version of the BeOS artwork. It also happens to match the image used on the MobileWorks site. Here are some of the original icons from BeOS: http://dsandler.org/entries/images/2007/beman.png and http://media.soundonsound.com/sos/feb00/images/beos_1.l.gif

I didn't mean to imply that MobileWorks was copying anyone (or for my off-hand comment to get voted to the top of their launch announcement discussion). Their web site may be a licensed use of this stock art, which itself is an inexact copy/interpretation of the BeOS art.


It's a pixel-for-pixel copy of the yellow guy from that logo, down to the irregular shadow on the collar.


No offense taken! Thanks for your keen eye. In fact, it is a licensed use of this very image.

Image classification is one of the tasks our crowd specializes in! You should consider joining the MobileWorks crowd. :)


Never seen this one before, though I did play around with Haiku at one point. I will let our co-founder and Chief Design Officer know.


If you're a team of 4, what's the role of a Chief Design Officer?


I would guess it's partially "have a funny title".


I tried using BeOS back in the day, but it was a BeOtch to get the drivers working.

Linux in its many flavours ended up being enough to scratch my geek itch.


We use Grasshopper for our phone service. By default they do a computer transcription of voicemail and send it to you by email. If you don't like the automatic transcription, you can have it done by humans, and five minutes later a more accurate one is in your inbox.

Which makes me think that there are a lot of services that could benefit from human intervention. I can see that you guys are starting out in a particular space, but I think the opportunity is huge!


I wish this existed in the past -- trying to figure out best practices for mturk use is almost more work than just doing the task yourself, but simplifying it for common tasks makes it a lot more useful.


Even if you have already invested time in MTurk, there is no reason to keep spending it dealing with spam and task starvation or optimizing quality instead of focusing on your core competency.


Can they be given a task that requires additional research, training, or creativity?

Example: What if I sent you a dental X-ray and asked you to report the cavity count and locations? This would require minor training but could easily be done by anyone with a trained eye.


In its present form, no. But I do believe that if someone were to build an application with the correct workflow and a focus on helping the crowd (by creating tutorials, etc.), something like this could definitely work.

We would love to talk to developers who want to build innovative applications on our platform. We would also help with setting up the right workflow and improving quality.


Wonderful. I'll be in touch.


I didn't get the price. Care to explain? 100 tasks free, and then?


Sure thing.

While the service is in beta, anyone can try a maximum of 100 tasks for free for any of the products built on MobileWorks: form digitization, web scraping, or tasks that new app developers build on the API. The idea is to get developers started working with the crowd as easily as possible.

We'll announce the formal pricing for form digitization and scraping products soon.

For community-built applications, prices are set depending on the complexity of the task and how long it takes workers to do them. We time the first few workers to do a task to see how long it takes, then return a price based on that.
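As a rough sketch of how timing-based pricing might work (MobileWorks hasn't published its formula, so the wage target, margin, and function below are purely illustrative assumptions):

```python
from statistics import median

# Assumed numbers for illustration only; not MobileWorks' actual figures.
TARGET_HOURLY_WAGE = 2.00  # dollars per hour paid to workers

def price_per_task(trial_seconds, margin=0.25):
    """Estimate a per-task price from the times (in seconds) that the
    first few workers took to complete the task."""
    typical = median(trial_seconds)  # robust against one unusually slow trial
    labor_cost = TARGET_HOURLY_WAGE * typical / 3600
    return round(labor_cost * (1 + margin), 4)

# Three timed trials of a new task, each taking about a minute:
print(price_per_task([60, 60, 60]))
```

The idea is simply that a few timed trials give a typical completion time, which converts an hourly wage target into a fair per-task price.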


"While the service is in beta, anyone can try a maximum of 100 tasks for free for any of the products built on MobileWorks: form digitization, web scraping, or tasks that new app developers build on the API. The idea is to get developers started working with the crowd as easily as possible."

Wait, I don't understand. If I build an application on the API, am I limited to 100 tasks for now? If so, that severely disincentivizes me from building on the platform.

"For community-built applications, prices are set depending on the complexity of the task and how long it takes workers to do them."

And that price is WHAT during the beta period? I would really like to build an application. If I make an application and sell it to other people, what is the price? Am I limited to 100 jobs?


Congratulations, Anand! Didn't know you were in this yc batch.


Thanks! Looking forward to having the research community start using our crowd and giving us their thoughts.

Bonuses: no task starvation, no spammers, a usable API, and highly motivated workers.


Let me also add that you probably know my co-founders Prayag, Philipp, and Dave from their days at Cal. MobileWorks spun out of Berkeley's Department of CS and School of Information, and was built in response to the pain we all felt struggling to get good results out of crowds.


Congrats guys! This team has been studying crowdsourcing for a long time and this startup is a product of everything they have learned.


Regarding your TOS:

"Permission is granted to temporarily download one copy of the materials (information or software) on MobileWorks, Inc.'s web site for personal, non-commercial transitory viewing only."

I see nowhere in the TOS that if I upload data and pay for some data transformation that I own the copyright to the transformed data. According to the current wording, it seems that YOU own all the output. Could you explain?

Also, what does the following mean: "Call us toll free at 800 100 4023 from any India phone."

Lastly, do you have a support email address? I prefer that channel, since I can archive support for future reference.

https://sandbox.mobileworks.com/contact/message/


Thanks for catching this! It's an unintended ambiguity in the wording.

These TOS refer to the website itself, not to the API or platform. When you pay to use the MobileWorks crowd for your own work, the output is yours, not ours.

We'll adapt the terms to better reflect what's intended!


The developer sandbox presently shows a fairly faithful representation of what workers see when they use the application, including the phone number and chat window workers can use to contact us.

If you'd like to reach us directly, you can use info@mobileworks.com.


"Unfortunately things are a little more complicated than that."

When it said that, I thought it was because of http://coinnovative.com/the-mechanical-turk-experiment-how-i...


Having trouble registering here: https://sandbox.mobileworks.com/accounts/register/

It just ends up refreshing the page, with the form still filled in (except password fields).


Let's figure out what's going on with your activation!

Could you send us the username you're trying to activate, as well as your browser and OS stats? info@mobileworks.com. We'll get it resolved.


Congrats Anand!

I've been working in the crowdsourcing field for several years now (our workers are from Vietnam), and can understand that the profit margins can be very thin for some of the more basic tasks.

Drop me a line if you're interested in branching out to Vietnam.


I have been testing this out and must say that the communications from Prayag have been very, very helpful. Thanks so much for the clear line of communication on and about your product.


Pricing page has no pricing.

I can't commit to testing a platform that can't commit to a price of some sort. Are they treating me like a crowdsourced beta tester?


We are in fact in beta right now, so we expect to announce formal pricing in the very near future for the digitizer and excavator products.

Prices for new applications pushed with the API are always different. They're set based on the complexity and difficulty of the task. We have a few workers try out a new task to see how complex it is, then use that to establish a price so that workers can earn a fair wage.


Do the workers speak and write English?


The majority do, but not all. We track accuracy obsessively, and tasks requiring fluent English won't go to workers with the weakest skills.

Our crowd is surprisingly diverse – there are medical students, engineers, doctors and housewives.

For our best-tested category of work to date, optical character recognition, even workers with the weakest English skills do well at recognizing English text.


I hope that your workers aren't solving CAPTCHAs for hire, but that's probably a vain hope.


In fact, I can guarantee they're not solving CAPTCHAs for hire. Our review process for new software applications keeps the quality of our tasks high and keeps spam work out.

CAPTCHA-breaking jobs generally rely on sweatshop labor and don't pay the fair wages we do, so they're not even capable of hiring our workers.

One of our core motivations as a team and a company is to use crowdsourcing as a force for good: happy workers, better technology, no spam. If you're interested in cracking CAPTCHAs, we'll be happy to refer you to our better known, difficult-to-use competitor.

Handwritten and printed OCR tasks, on the other hand, are fantastic for our system.


That is one of the issues with Mechanical Turk; in fact, up to 40% of the tasks hosted on Turk are spam. Our goal is to be entirely spam-free. We train our workers to report all tasks that ask for email addresses or account creation, or that look similar to CAPTCHAs.


Is the skillset ranking system public at all, or just internal?


Right now it's kept internal for most purposes, but there's no reason not to open it up. The idea behind managed crowdsourcing is that end users shouldn't need to deal with reputation and crowd management themselves.

The next version of the API will support the ability to assign work only to individuals with certain skillsets - very powerful! You'll get exactly the crowd you need for the job you have.
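To illustrate the idea of skill-scoped assignment (the actual MobileWorks API and its field names are not public, so the function and worker records below are invented for illustration):

```python
def eligible_workers(workers, required_skills, min_accuracy=0.9):
    """Return workers who hold every required skill and whose tracked
    accuracy meets the threshold -- a toy model of skill-based routing."""
    return [
        w for w in workers
        if required_skills <= w["skills"] and w["accuracy"] >= min_accuracy
    ]

pool = [
    {"name": "A", "skills": {"ocr", "english"}, "accuracy": 0.97},
    {"name": "B", "skills": {"ocr"}, "accuracy": 0.95},
    {"name": "C", "skills": {"ocr", "english"}, "accuracy": 0.80},
]

# Only worker A has both skills and a high enough accuracy record.
print([w["name"] for w in eligible_workers(pool, {"ocr", "english"})])  # ['A']
```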


Bravo, this looks like a really savvy way to handle these kinds of tasks.


Congrats Anand et al! It's gonna be fun competing...


On the contrary! You should be using our crowd instead of Turk's.


Congrats from Houdini as well; MobileWorks looks great! We'd definitely be interested in exploring this option further.


Agreed, friend. Agreed.


Congrats guys! Berkeley I School & CS FTW :)


This sounds awesome.


Thanks! I happen to think so, too :)

The reason is that putting humans inside software lets you do incredibly powerful things.

Here's one of my favorite examples from the academic world, but it took a slew of difficult crowd-control hacks and months of research to get it working on Turk: http://hci.cs.rochester.edu/currentprojects.php?proj=vw

It'd be a weekend project to rebuild it on MobileWorks.


Sounds interesting. Similar to CrowdFlower, but with less management on the user end. CrowdFlower provides "gold tests" that help filter out mistakes, which makes it my go-to source for Turk tasks with a bit too much complexity. Of course, that means I have to manage questions, gold tests, and result review. The value proposition here is thus quite appealing. Suppose we wanted to grab information from historic Form Ds:

http://www.sec.gov/Archives/edgar/vprr/03/9999999997-03-0208...

These older forms lack consistency and would probably require lengthy instructions for data extraction. How do you do quality control?


One of the founders here.

We do a number of things to maintain quality. We do managed crowdsourcing, which means we provide support to our crowd: workers can ask us when they don't understand a particular task. Our crowd is also loosely forming a community, both online and in real life, so people help each other.

On top of that, we have developed algorithms that check the crowd's answers against each other. Our algorithms also make sure that the right task is routed to the right person.

We also believe in paying our workers fairly. Happier, more productive workers produce better-quality work.
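A minimal sketch of the answer-agreement idea, assuming a simple majority vote (the actual algorithms are not described, so the threshold and fallback here are assumptions):

```python
from collections import Counter

def consensus(answers, min_agreement=0.6):
    """Accept the most common answer if enough workers agree on it;
    otherwise return None to flag the task for additional workers."""
    answer, votes = Counter(answers).most_common(1)[0]
    if votes / len(answers) >= min_agreement:
        return answer
    return None  # no consensus: route the task to more workers

print(consensus(["cat", "cat", "dog"]))   # 'cat' (2 of 3 agree)
print(consensus(["cat", "dog", "bird"]))  # None (no majority)
```

Checking redundant answers against each other this way is the standard crowdsourcing trick: agreement is cheap to compute and disagreement is a useful signal for re-routing.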


Awesome, it sounds like you've automated and crowdsourced quality control. I will definitely check it out.



