Crowd science user contribution patterns and their implications
Edited by Adrian E. Raftery, University of Washington, Seattle, WA, and approved December 11, 2014 (received for review May 13, 2014)

Significance
Involving the public in research may provide considerable benefits for the progress of science. However, the sustainability of “crowd science” approaches depends on the degree to which members of the public are interested and provide continued labor inputs. We describe and compare contribution patterns in multiple projects using a range of measures. We show that effort contributions can be significant in magnitude and speed, but we also identify several challenges. In addition, we explore some of the underlying dynamics and mechanisms. As such, we provide quantitative evidence that is useful for scientists who consider adopting crowd science approaches and for scholars studying crowd-based knowledge production. Our results also inform current policy discussions regarding the organization of scientific research.
Abstract
Scientific research performed with the involvement of the broader public (the crowd) attracts increasing attention from scientists and policy makers. A key premise is that project organizers may be able to draw on underused human resources to advance research at relatively low cost. Despite a growing number of examples, systematic research on the effort contributions volunteers are willing to make to crowd science projects is lacking. Analyzing data on seven different projects, we quantify the financial value volunteers can bring by comparing their unpaid contributions with counterfactual costs in traditional or online labor markets. The volume of total contributions is substantial, although some projects are much more successful in attracting effort than others. Moreover, contributions received by projects are very uneven across time—a tendency toward declining activity is interrupted by spikes typically resulting from outreach efforts or media attention. Analyzing user-level data, we find that most contributors participate only once and with little effort, leaving a relatively small share of users who return responsible for most of the work. Although top contributor status is earned primarily through higher levels of effort, top contributors also tend to work faster. This speed advantage develops over multiple sessions, suggesting that it reflects learning rather than inherent differences in skills. Our findings inform recent discussions about potential benefits from crowd science, suggest that involving the crowd may be more effective for some kinds of projects than others, provide guidance for project managers, and raise important questions for future research.
Footnotes
1To whom correspondence should be addressed. Email: henry.sauermann@scheller.gatech.edu.
Author contributions: H.S. and C.F. designed research, performed research, analyzed data, and wrote the paper.
The authors declare no conflict of interest.
This article is a PNAS Direct Submission.
This article contains supporting information online at www.pnas.org/lookup/suppl/doi:10.1073/pnas.1408907112/-/DCSupplemental.