Details
Journal: IEEE Transactions on Multimedia
Volume: 16
Issue/Number: 7
Pages: 2075-2079
Date: 01.11.2014
Author(s): Luke Gottlieb, Gerald Friedland, Jaeyoung Choi, Pascal Kelm, Thomas Sikora
Title: Creating Experts From the Crowd: Techniques for Finding Workers for Difficult Tasks
Abstract: Crowdsourcing is currently used for a range of applications, either by exploiting unsolicited user-generated content, such as spontaneously annotated images, or by utilizing explicit crowdsourcing platforms such as Amazon Mechanical Turk to mass-outsource artificial-intelligence-type jobs. However, crowdsourcing is most often seen as the best option for tasks that do not require more of people than their uneducated intuition as a human being. This article describes our methods for identifying workers for crowdsourced tasks that are difficult for both machines and humans. It discusses the challenges we encountered in qualifying annotators and the steps we took to select the individuals most likely to do well at these tasks.
Key words: multimedia analysis, social networking (online), video signal processing, Amazon Mechanical Turk, annotators, artificial-intelligence-type jobs, crowdsourcing, multimodal location estimation, social media video, unsolicited user-generated content, cities and towns, estimation, reliability, tutorials, videos, visualization, annotation, cheat detection, mechanical turk, qualification
DOI: 10.1109/TMM.2014.2347268
URL: http://dx.doi.org/10.1109/TMM.2014.2347268

BibTeX
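
The record above contains all the fields needed to cite the paper; a reconstructed entry follows, built only from that metadata (the entry key Gottlieb2014crowd is chosen here for illustration):

@article{Gottlieb2014crowd,
  author  = {Luke Gottlieb and Gerald Friedland and Jaeyoung Choi and Pascal Kelm and Thomas Sikora},
  title   = {Creating Experts From the Crowd: Techniques for Finding Workers for Difficult Tasks},
  journal = {IEEE Transactions on Multimedia},
  volume  = {16},
  number  = {7},
  pages   = {2075--2079},
  month   = nov,
  year    = {2014},
  doi     = {10.1109/TMM.2014.2347268},
  url     = {http://dx.doi.org/10.1109/TMM.2014.2347268}
}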