The preregistration revolution

Edited by Richard M. Shiffrin, Indiana University, Bloomington, IN, and approved August 28, 2017 (received for review June 15, 2017)
March 12, 2018
115 (11): 2600–2606
Brian A. Nosek, Charles R. Ebersole [...] David T. Mellor

Abstract

Progress in science relies in part on generating hypotheses with existing observations and testing hypotheses with new observations. This distinction between postdiction and prediction is appreciated conceptually but is not respected in practice. Mistaking the generation of postdictions for the testing of predictions reduces the credibility of research findings, yet ordinary biases in human reasoning, such as hindsight bias, make this mistake hard to avoid. An effective solution is to define the research questions and analysis plan before observing the research outcomes—a process called preregistration. Preregistration distinguishes analyses and outcomes that result from predictions from those that result from postdictions. A variety of practical strategies are available to make the best possible use of preregistration in circumstances that fall short of the ideal application, such as when the data are preexisting. Services are now available for preregistration across all disciplines, facilitating a rapid increase in the practice. Widespread adoption of preregistration will sharpen the distinction between hypothesis generation and hypothesis testing and will improve the credibility of research findings.
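The cost of conflating postdiction with prediction can be made concrete with a toy simulation (not from the article; the sample sizes and number of candidate outcomes below are illustrative assumptions). When no true effect exists, a single analysis fixed in advance yields false positives at roughly the nominal 5% rate, whereas reporting whichever of several candidate outcomes happens to reach significance after seeing the data inflates that rate severalfold:

```python
import math
import random

random.seed(1)

def two_sample_p(n=30):
    """Two-sided p-value for a difference in means of two null N(0,1)
    samples, via a z-test with known unit variance (so there is no
    true effect: any 'significant' result is a false positive)."""
    m1 = sum(random.gauss(0, 1) for _ in range(n)) / n
    m2 = sum(random.gauss(0, 1) for _ in range(n)) / n
    z = (m1 - m2) / math.sqrt(2.0 / n)
    return math.erfc(abs(z) / math.sqrt(2))

SIMS, OUTCOMES, ALPHA = 4000, 5, 0.05
prereg_hits = flexible_hits = 0
for _ in range(SIMS):
    ps = [two_sample_p() for _ in range(OUTCOMES)]
    prereg_hits += ps[0] < ALPHA      # one analysis, specified in advance
    flexible_hits += min(ps) < ALPHA  # pick the "best" outcome post hoc

rate_prereg = prereg_hits / SIMS
rate_flexible = flexible_hits / SIMS
print(f"false-positive rate, preregistered analysis: {rate_prereg:.3f}")
print(f"false-positive rate, flexible analysis:      {rate_flexible:.3f}")
```

The prespecified analysis stays near the nominal α = 0.05, while the flexible strategy approaches 1 − (1 − α)⁵ ≈ 0.23; preregistration makes the difference between the two strategies visible to readers.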


Acknowledgments

This work was supported by grants from the Laura and John Arnold Foundation and the National Institute on Aging.


Information & Authors


Published in

Proceedings of the National Academy of Sciences
Vol. 115 | No. 11
March 13, 2018
PubMed: 29531091


Submission history

Published online: March 12, 2018
Published in issue: March 13, 2018

Keywords

  1. methodology
  2. open science
  3. confirmatory analysis
  4. exploratory analysis
  5. preregistration


Notes

This article is a PNAS Direct Submission.
This paper results from the Arthur M. Sackler Colloquium of the National Academy of Sciences, “Reproducibility of Research: Issues and Proposed Remedies,” held March 8–10, 2017, at the National Academy of Sciences in Washington, DC. The complete program and video recordings of most presentations are available on the NAS website at www.nasonline.org/Reproducibility.

Authors

Affiliations

Center for Open Science, Charlottesville, VA 22903;
Department of Psychology, University of Virginia, Charlottesville, VA 22904

Notes

1. To whom correspondence should be addressed. Email: [email protected].
Author contributions: B.A.N. designed research; B.A.N. performed research; and B.A.N., C.R.E., A.C.D., and D.T.M. wrote the paper.

Competing Interests

Conflict of interest statement: B.A.N., A.C.D., and D.T.M. are employed by the nonprofit Center for Open Science, whose mission is to increase the openness, integrity, and reproducibility of research.
