Assessing quality of volunteer crowdsourcing contributions: lessons from the Cropland Capture game

Salk, C.F., Sturn, T., See, L. ORCID: https://orcid.org/0000-0002-2665-7065, & Fritz, S. ORCID: https://orcid.org/0000-0003-0420-8549 (2016). Assessing quality of volunteer crowdsourcing contributions: lessons from the Cropland Capture game. International Journal of Digital Earth 9 (4), 410-426. DOI: 10.1080/17538947.2015.1039609.

Text: Assessing quality of volunteer crowdsourcing contributions.pdf (Accepted Version, 592 kB)
Available under License Creative Commons Attribution Non-commercial.
Project: Harnessing the power of crowdsourcing to improve land cover and land-use information (CROWDLAND, FP7 617754), Geo-Wiki

Abstract

Volunteered geographic information (VGI) is the assembly of spatial information based on public input. While VGI has proliferated in recent years, assessing the quality of volunteer-contributed data has proven challenging, leading some to question the efficiency of such programs. In this paper, we compare several quality metrics for individual volunteers' contributions. The data were the product of the 'Cropland Capture' game, in which several thousand volunteers assessed 165,000 images for the presence of cropland over the course of six months. We compared agreement between volunteer ratings and an image's majority classification with volunteer self-agreement on repeated images and with expert evaluations. We also examined the impact of experience and learning on performance. Volunteer self-agreement was nearly always higher than agreement with majority classifications, and much greater than agreement with expert validations, although these metrics were all positively correlated. Volunteer quality showed a broad trend toward improvement with experience, but the highest accuracies were achieved by a handful of moderately active contributors, not the most active volunteers. Our results emphasize the importance of a universal set of expert-validated tasks as a gold standard for evaluating VGI quality.
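The two volunteer-level metrics discussed in the abstract, agreement with an image's majority classification and self-agreement on repeated images, can be sketched in a few lines. The data layout, names, and tie-breaking rule below are illustrative assumptions for exposition, not the paper's actual implementation.

```python
from collections import Counter, defaultdict

# Hypothetical ratings: (volunteer_id, image_id, rating) triples, where a
# rating is True/False for "contains cropland". Repeated (volunteer, image)
# pairs represent the same volunteer seeing the same image more than once.
ratings = [
    ("v1", "img1", True), ("v2", "img1", True), ("v3", "img1", False),
    ("v1", "img2", False), ("v2", "img2", False), ("v3", "img2", False),
    ("v1", "img1", True),   # v1 saw img1 twice and rated it consistently
]

def majority_labels(ratings):
    """Majority classification per image (ties broken arbitrarily)."""
    votes = defaultdict(list)
    for _, img, r in ratings:
        votes[img].append(r)
    return {img: Counter(v).most_common(1)[0][0] for img, v in votes.items()}

def majority_agreement(ratings, volunteer):
    """Share of a volunteer's ratings matching each image's majority label."""
    maj = majority_labels(ratings)
    own = [(img, r) for v, img, r in ratings if v == volunteer]
    return sum(r == maj[img] for img, r in own) / len(own)

def self_agreement(ratings, volunteer):
    """Share of repeated images the volunteer rated consistently,
    or None if the volunteer never saw an image twice."""
    seen = defaultdict(list)
    for v, img, r in ratings:
        if v == volunteer:
            seen[img].append(r)
    repeats = [rs for rs in seen.values() if len(rs) > 1]
    if not repeats:
        return None
    return sum(len(set(rs)) == 1 for rs in repeats) / len(repeats)
```

For example, `majority_agreement(ratings, "v3")` is 0.5 here, since v3 disagrees with the majority on img1 but agrees on img2, while `self_agreement(ratings, "v3")` is None because v3 saw no image twice. A gold-standard accuracy metric would look identical to `majority_agreement`, but with expert-validated labels substituted for the majority vote.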

Item Type: Article
Uncontrolled Keywords: crowdsourcing; volunteered geographic information; cropland; data quality; image classification; Geo-Wiki
Research Programs: Ecosystems Services and Management (ESM)
Postdoctoral Scholars (PDS)
Depositing User: IIASA Import
Date Deposited: 15 Jan 2016 08:52
Last Modified: 04 Jan 2024 14:05
URI: https://pure.iiasa.ac.at/11287
