- Anna Noel-Storr, Gordon Dooley, Julian Elliott, Emily Steele, Ian Shemilt, Chris Mavergames, Susanna Wisniewski, Steven McDonald, Melissa Murano, Julie Glanville, Ruth Foxlee, Deirdre Beecher, Jennifer Ware, and James Thomas.
- Radcliffe Department of Medicine, University of Oxford, Level 4, Academic Block, John Radcliffe Hospital, Headington, Oxford OX3 9DU, UK; People Services Department, Cochrane, St Albans House, 57-59 Haymarket, London SW1Y 4QX, UK. Electronic address: anna.noel-storr@rdm.ox.ac.uk.
- J Clin Epidemiol. 2021 May 1; 133: 130-139.
Background and Objectives: Filtering the deluge of new research to facilitate evidence synthesis has proven unmanageable using current paradigms of search and retrieval. Crowdsourcing, a way of harnessing the collective effort of a "crowd" of people, has the potential to support evidence synthesis by addressing the information overload created by the exponential growth in primary research outputs. Cochrane Crowd, Cochrane's citizen science platform, offers a range of tasks aimed at identifying studies related to health care. Accompanying each task are brief, interactive training modules and agreement algorithms that help ensure accurate collective decision-making. The aims of the study were to evaluate the performance of Cochrane Crowd in terms of its accuracy, capacity, and autonomy, and to examine contributor engagement across three tasks aimed at identifying randomized trials.

Study Design and Setting: Crowd accuracy was evaluated by measuring the sensitivity and specificity of crowd screening decisions on a sample of titles and abstracts, compared with "quasi gold-standard" decisions about the same records made using conventional dual screening. Crowd capacity, in the form of output volume, was evaluated by measuring the number of records processed by the crowd, compared with baseline. Crowd autonomy, the capability of the crowd to produce accurate collectively derived decisions without expert resolution, was measured by the proportion of records that needed resolving by an expert.

Results: The Cochrane Crowd community currently has 18,897 contributors from 163 countries. Collectively, the crowd has processed 1,021,227 records, helping to identify 178,437 reports of randomized controlled trials (RCTs) for Cochrane's Central Register of Controlled Trials. Sensitivity was 99.1% for the RCT identification task (RCT ID), 99.7% for the identification of RCTs from ClinicalTrials.gov (CT ID), and 97.7% for the identification of RCTs from the International Clinical Trials Registry Platform (ICTRP ID). Specificity was 99% for RCT ID, 98.6% for CT ID, and 99.1% for ICTRP ID. The capacity of the combined crowd and machine learning workflow has increased fivefold in 6 years, compared with baseline. The proportion of records requiring expert resolution across the tasks ranged from 16.6% to 19.7%.

Conclusion: Cochrane Crowd is sufficiently accurate and scalable to keep pace with the current rate of publication (and registration) of new primary studies. It has also proved to be a popular, efficient, and accurate way for a large number of people to play an important voluntary role in health evidence production. Cochrane Crowd is now an established part of Cochrane's effort to manage the deluge of primary research being produced.

Copyright © 2021 The Authors. Published by Elsevier Inc. All rights reserved.
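The accuracy metrics reported above follow the standard definitions: sensitivity is the proportion of true RCTs the crowd correctly flagged, and specificity is the proportion of non-RCTs it correctly rejected, each measured against the quasi gold-standard dual-screening decisions. A minimal sketch of that comparison, using made-up labels (the data here is hypothetical, not from the study):

```python
def sensitivity_specificity(crowd, gold):
    """Compute screening sensitivity and specificity.

    Both inputs are lists of booleans over the same records:
    True = 'this record is an RCT'. `gold` holds the quasi
    gold-standard (dual-screening) decisions.
    """
    tp = sum(c and g for c, g in zip(crowd, gold))          # true RCTs caught
    tn = sum(not c and not g for c, g in zip(crowd, gold))  # non-RCTs rejected
    fn = sum(not c and g for c, g in zip(crowd, gold))      # true RCTs missed
    fp = sum(c and not g for c, g in zip(crowd, gold))      # non-RCTs flagged
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Toy example: the crowd misclassifies one non-RCT as an RCT.
crowd = [True, True, False, False, True, False]
gold  = [True, True, False, False, False, False]
sens, spec = sensitivity_specificity(crowd, gold)
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
# → sensitivity=1.00 specificity=0.75
```

In the study, the crowd's collective decision per record (after the agreement algorithm, and expert resolution where needed) plays the role of `crowd`, and the conventional dual-screening decisions play the role of `gold`.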