Salk, C., Sturn, T., See, L. (ORCID: https://orcid.org/0000-0002-2665-7065), & Fritz, S. (ORCID: https://orcid.org/0000-0003-0420-8549) (2016). Limitations of Majority Agreement in Crowdsourced Image Interpretation. Transactions in GIS, 1-17. DOI: 10.1111/tgis.12194.
Task Difficulty Rerevised CLEAN.pdf - Accepted Version, available under License Creative Commons Attribution Non-commercial.
Abstract
Crowdsourcing can efficiently complete tasks that are difficult to automate, but the quality of crowdsourced data is tricky to evaluate. Algorithms to grade volunteer work often assume that all tasks are similarly difficult, an assumption that is frequently false. We use a cropland identification game with over 2,600 participants and 165,000 unique tasks to investigate how best to evaluate the difficulty of crowdsourced tasks and to what extent this is possible based on volunteer responses alone. Inter-volunteer agreement exceeded 90% for about 80% of the images and was negatively correlated with volunteer-expressed uncertainty about image classification. A total of 343 relatively difficult images were independently classified as cropland, non-cropland or impossible by two experts. The experts disagreed weakly (one said impossible while the other rated as cropland or non-cropland) on 27% of the images, but disagreed strongly (cropland vs. non-cropland) on only 7%. Inter-volunteer disagreement increased significantly with inter-expert disagreement. While volunteers agreed with expert classifications for most images, over 20% would have been mis-categorized if only the volunteers’ majority vote was used. We end with a series of recommendations for managing the challenges posed by heterogeneous tasks in crowdsourcing campaigns.
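The aggregation approach discussed in the abstract (a per-image majority vote, with inter-volunteer agreement serving as a signal of task difficulty) can be sketched as follows. This is a minimal illustration with hypothetical image IDs, responses, and an assumed 90% agreement threshold; it is not the authors' actual pipeline.

```python
from collections import Counter

# Hypothetical volunteer responses per image: image_id -> list of labels.
# The image IDs and votes below are illustrative only.
responses = {
    "img_001": ["cropland", "cropland", "cropland", "non-cropland"],
    "img_002": ["cropland", "non-cropland", "non-cropland", "cropland"],
}

def majority_vote(labels):
    """Return the most common label and its share of all votes."""
    counts = Counter(labels)
    label, n = counts.most_common(1)[0]
    return label, n / len(labels)

for image_id, labels in responses.items():
    label, agreement = majority_vote(labels)
    # Low inter-volunteer agreement flags a potentially difficult image
    # that may warrant expert review rather than trusting the majority label.
    flag = "review" if agreement < 0.9 else "accept"
    print(f"{image_id}: majority={label}, agreement={agreement:.0%} -> {flag}")
```

The key point of the paper is that accepting the majority label without such a difficulty check would mis-categorize a substantial fraction of the harder images.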
| Item Type: | Article |
|---|---|
| Research Programs: | Ecosystems Services and Management (ESM) |
| Depositing User: | Luke Kirwan |
| Date Deposited: | 28 Apr 2016 12:43 |
| Last Modified: | 19 Oct 2022 05:00 |
| URI: | https://pure.iiasa.ac.at/12947 |