We use Grasshopper for our phone service. By default it does a computer transcription of voicemail and sends it to you by email. If you don't like the automatic transcription, you can have it redone by humans, and five minutes later a more accurate version is in your inbox.
That makes me think there are a lot of services that could benefit from this kind of human intervention. I can see that you guys are starting out in a particular space, but I think the opportunity is huge!
I wish this had existed sooner -- trying to figure out best practices for mturk use is almost more work than just doing the task yourself, so simplifying it for common tasks makes it a lot more useful.
Even if you have already invested time in MTurk, there is no reason to keep spending it dealing with spam, task starvation, and quality optimization instead of focusing on your core competency.
Can they be given a task that requires additional research, training, or creativity?
Example: What if I sent you a dental X-ray and asked you to report cavity count and location? This would require minor training but could easily be done by anyone with a trained eye.
In its present form, no. But I do believe that if someone were to build an application with the right workflow and a focus on helping the crowd (by creating tutorials, etc.), something like this could definitely work.
We would love to talk to developers who want to build innovative applications on our platform. We would also help with setting up the right workflow and improving quality.
While the service is in beta, anyone can try a maximum of 100 tasks for free for any of the products built on MobileWorks: form digitization, web scraping, or tasks that new app developers build on the API. The idea is to get developers started working with the crowd as easily as possible.
We'll announce the formal pricing for form digitization and scraping products soon.
For community-built applications, prices are set depending on the complexity of the task and how long it takes workers to do them. We time the first few workers to do a task to see how long it takes, then return a price based on that.
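To make the timing-based pricing concrete, here is a rough, back-of-the-envelope sketch in Python; the function name, wage, and margin are made-up assumptions for illustration, not MobileWorks' actual formula.

```python
# Hypothetical illustration of timing-based task pricing (not actual MobileWorks code).
from statistics import median

def estimate_task_price(completion_times_sec, hourly_wage_usd=2.0, margin=0.25):
    """Derive a per-task price from the first few workers' completion times.

    completion_times_sec: observed times in seconds for the calibration workers
    hourly_wage_usd: assumed target fair hourly wage for workers
    margin: assumed platform overhead added on top of the worker payout
    """
    typical_hours = median(completion_times_sec) / 3600.0
    worker_payout = typical_hours * hourly_wage_usd
    return round(worker_payout * (1 + margin), 4)

# Example: five calibration workers took between 42 and 70 seconds each.
print(estimate_task_price([42, 55, 61, 48, 70]))  # ~0.0382 USD per task
```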
"While the service is in beta, anyone can try a maximum of 100 tasks for free for any of the products built on MobileWorks: form digitization, web scraping, or tasks that new app developers build on the API. The idea is to get developers started working with crowd as easily as possible."
Wait, I don't understand. If I build an application on the API, am I limited to 100 tasks for now? If so, that severely disincentivizes me from building on the platform.
"For community-built applications, prices are set depending on the complexity of the task and how long it takes workers to do them."
And that price is WHAT during the beta period? I would really like to build an application. If I make an application and sell it to other people, what is the price? Am I limited to 100 jobs?
Let me also add that you probably know my co-founders Prayag, Philipp, and Dave from their days at Cal. MobileWorks spun out of Berkeley's Department of CS and School of Information, and was built in response to the pain we all felt as we struggled to get good results out of crowds.
"Permission is granted to temporarily download one copy of the materials (information or software) on MobileWorks, Inc.'s web site for personal, non-commercial transitory viewing only."
I see nothing in the TOS saying that if I upload data and pay for a data transformation, I own the copyright to the transformed data. According to the current wording, it seems that YOU own all the output. Could you explain?
Also, what does the following mean:
"Call us toll free at 800 100 4023 from any India phone."
Lastly, do you have a support email address? I prefer that channel, since I can archive support for future reference.
Thanks for catching this! It's an unintended ambiguity in the wording.
These TOS refer to the website itself, not to the API or platform. When you pay to use the MobileWorks crowd for your own work, the output is yours, not ours.
We'll adapt the terms to better reflect what's intended!
The developer sandbox presently shows a fairly faithful representation of what workers see when they use the application, including the phone number and chat window workers can use to contact us.
If you'd like to reach us directly, you can use info@mobileworks.com.
I've been working in the crowdsourcing field for several years now (our workers are from Vietnam), and can understand that the profit margins can be very thin for some of the more basic tasks.
Drop me a line if you're interested in branching out to Vietnam.
I have been testing this out and must say that the communications from Prayag have been very, very helpful. Thanks so much for the clear line of communication about your product.
We are in fact in beta right now, so we expect to announce formal pricing in the very near future for the digitizer and excavator products.
Prices for new applications pushed through the API vary. They're set based on the complexity and difficulty of the task: we have a few workers try out a new task to see how complex it is, then use that to establish a price so that workers can earn a fair wage.
The majority do, but not all. We track accuracy obsessively, and tasks requiring fluent English won't go to workers with the weakest skills.
Our crowd is surprisingly diverse – there are medical students, engineers, doctors and housewives.
For our best-tested category of work to date, optical character recognition, even workers with the weakest English skills do well at recognizing English text.
In fact, I can guarantee they're not solving CAPTCHAs for hire. Our review process for new software applications keeps the quality of our tasks high and keeps spam work out.
CAPTCHA-breaking jobs generally rely on sweatshop labor and don't pay the fair wages we do, so they're not even capable of hiring our workers.
One of our core motivations as a team and a company is to use crowdsourcing as a force for good: happy workers, better technology, no spam. If you're interested in cracking CAPTCHAs, we'll be happy to refer you to our better known, difficult-to-use competitor.
Handwritten and printed OCR tasks, on the other hand, are fantastic for our system.
That is one of the issues with Mechanical Turk; in fact, up to 40% of the tasks hosted there are spam. Our goal is to be entirely spam-free. We train our workers to report any task that asks for email addresses or account creation, or that looks like a CAPTCHA.
Right now it's kept internal for most purposes, but there's no reason not to open it up. The idea behind managed crowdsourcing is that end users shouldn't need to deal with reputation and crowd management themselves.
The next version of the API will support the ability to assign work only to individuals with certain skillsets - very powerful! You'll get exactly the crowd you need for the job you have.
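As a rough sketch of what skill-based assignment could look like under the hood (the data model, field names, and thresholds here are assumptions for illustration, not the actual MobileWorks API):

```python
# Illustrative sketch of routing tasks only to workers with the required skills.
from dataclasses import dataclass, field

@dataclass
class Worker:
    worker_id: str
    skills: set = field(default_factory=set)
    accuracy: float = 0.0  # historical accuracy score in [0, 1]

def eligible_workers(workers, required_skills, min_accuracy=0.9):
    """Return workers holding every required skill who also meet the accuracy bar."""
    return [
        w for w in workers
        if required_skills <= w.skills and w.accuracy >= min_accuracy
    ]

pool = [
    Worker("w1", {"english", "ocr"}, 0.97),
    Worker("w2", {"ocr"}, 0.95),
    Worker("w3", {"english", "medical"}, 0.88),
]
print([w.worker_id for w in eligible_workers(pool, {"english", "ocr"})])  # ['w1']
```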
Sounds interesting. Similar to Crowdflower, but with less management on the user end. Crowdflower provides "gold tests" that help filter out mistakes, which makes it my go-to source for Turk tasks with a bit too much complexity. Of course, that means I have to manage questions, gold tests, and review results. The value proposition here is thus quite appealing. Suppose we wanted to grab information from historic Form Ds:
We do a number of things to maintain quality. We do managed crowdsourcing, which means we provide support to our crowd. They can ask us when they do not understand a particular task. Also, our crowd is loosely forming a community both online and IRL. So, people help each other.
On top of that, we have developed algorithms that check the crowd's answers against each other. Our algorithms also make sure that the right task is routed to the right person.
We also believe in paying our workers fairly. Happy, more productive workers lead to better-quality work.
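To give a concrete sense of what "checking the crowd's answers against each other" can look like in practice, here is a minimal majority-vote sketch; it is a generic illustration of the idea, not necessarily the algorithm described above.

```python
# Generic majority-vote consensus check over redundant worker answers.
from collections import Counter

def consensus_answer(answers, min_agreement=0.6):
    """Return the most common answer if it clears the agreement threshold, else None."""
    if not answers:
        return None
    top_answer, votes = Counter(answers).most_common(1)[0]
    return top_answer if votes / len(answers) >= min_agreement else None

print(consensus_answer(["42", "42", "47"]))  # '42' (2 of 3 workers agree)
print(consensus_answer(["42", "47", "51"]))  # None: no consensus, route to another worker
```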
http://www.iconfinder.com/icondetails/34752/128/beos_people_...
UPDATE: If I remember correctly, the original BeOS icons were all 32x32 pixel art. I didn't find the original in my first quick Google search, but I found the above link instead which appears to be from an open-source stock icon set based on a vectorized version of the BeOS artwork. It also happens to match the image used on the MobileWorks site. Here are some of the original icons from BeOS: http://dsandler.org/entries/images/2007/beman.png and http://media.soundonsound.com/sos/feb00/images/beos_1.l.gif
I didn't mean to imply that MobileWorks was copying anyone (or for my off-hand comment to get voted to the top of their launch announcement discussion). Their web site may be a licensed use of this stock art, which itself is an inexact copy/interpretation of the BeOS art.