

Many Labs 5: Testing Pre-Data-Collection Peer Review as an Intervention to Increase Replicability

Mauro Giacomantonio
2020

Abstract

Replication studies in psychological science sometimes fail to reproduce prior findings. If these studies use methods that are unfaithful to the original study or ineffective in eliciting the phenomenon of interest, then a failure to replicate may be a failure of the protocol rather than a challenge to the original finding. Formal pre-data-collection peer review by experts may address shortcomings and increase replicability rates. We selected 10 replication studies from the Reproducibility Project: Psychology (RP:P; Open Science Collaboration, 2015) for which the original authors had expressed concerns about the replication designs before data collection; only one of these studies had yielded a statistically significant effect (p < .05). Commenters suggested that lack of adherence to expert review and low-powered tests were the reasons that most of these RP:P studies failed to replicate the original effects. We revised the replication protocols and received formal peer review prior to conducting new replication studies. We administered the RP:P and revised protocols in multiple laboratories (median number of laboratories per original study = 6.5, range = 3–9; median total sample = 1,279.5, range = 276–3,512) for high-powered tests of each original finding with both protocols. Overall, following the preregistered analysis plan, we found that the revised protocols produced effect sizes similar to those of the RP:P protocols (Δr = .002 or .014, depending on analytic approach). The median effect size for the revised protocols (r = .05) was similar to that of the RP:P protocols (r = .04) and the original RP:P replications (r = .11), and smaller than that of the original studies (r = .37). Analysis of the cumulative evidence across the original studies and the corresponding three replication attempts provided very precise estimates of the 10 tested effects and indicated that their effect sizes (median r = .07, range = .00–.15) were 78% smaller, on average, than the original effect sizes (median r = .37, range = .19–.50).
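As a reading aid for the summary statistics above, the sketch below shows how a median effect size and an average per-study proportional reduction of the kind reported ("78% smaller, on average") are typically computed. The per-study correlations in the sketch are hypothetical placeholders chosen only to match the reported medians and ranges; they are not the actual Many Labs 5 data, and the resulting average reduction therefore differs slightly from the published 78%. Note that the average of per-study reductions is generally not the same as the reduction implied by the medians alone (1 − .07/.37 ≈ 81%).

```python
# Illustrative sketch only: these (original r, replication r) pairs are
# hypothetical placeholders consistent with the medians and ranges reported
# in the abstract, NOT the actual Many Labs 5 data.
from statistics import median, mean

effects = [
    (0.19, 0.00), (0.22, 0.03), (0.28, 0.04), (0.31, 0.05), (0.35, 0.06),
    (0.39, 0.08), (0.42, 0.09), (0.45, 0.11), (0.48, 0.13), (0.50, 0.15),
]

orig = [o for o, _ in effects]
rep = [r for _, r in effects]

print(f"median original r    = {median(orig):.2f}")   # 0.37, as reported
print(f"median replication r = {median(rep):.2f}")    # 0.07, as reported

# Average proportional reduction across studies: mean of 1 - r_rep / r_orig.
# This per-study average generally differs from the reduction implied by
# the medians alone.
avg_reduction = mean(1 - r / o for o, r in effects)
print(f"average per-study reduction = {avg_reduction:.0%}")
print(f"reduction of medians        = {1 - median(rep) / median(orig):.0%}")
```

With these placeholder values the per-study average comes out near 82% while the reduction of medians is about 81%; the published figure of 78% reflects the study's actual per-study effect sizes and meta-analytic weighting, which this sketch does not reproduce.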
Keywords: replication, reproducibility, metascience, peer review, Registered Reports, open data, preregistered
01 Journal publication::01a Journal article
Many Labs 5: Testing Pre-Data-Collection Peer Review as an Intervention to Increase Replicability / Ebersole, Charles R.; Mathur, Maya B.; Baranski, Erica; Bart-Plange, Diane-Jo; Buttrick, Nicholas R.; Chartier, Christopher R.; Corker, Katherine S.; Corley, Martin; Hartshorne, Joshua K.; Ijzerman, Hans; Lazarević, Ljiljana B.; Rabagliati, Hugh; Ropovik, Ivan; Aczel, Balazs; Aeschbach, Lena F.; Andrighetto, Luca; Arnal, Jack D.; Arrow, Holly; Babincak, Peter; Bakos, Bence E.; Baník, Gabriel; Baskin, Ernest; Belopavlović, Radomir; Bernstein, Michael H.; Białek, Michał; Bloxsom, Nicholas G.; Bodroža, Bojana; Bonfiglio, Diane B. V.; Boucher, Leanne; Brühlmann, Florian; Brumbaugh, Claudia C.; Casini, Erica; Chen, Yiling; Chiorri, Carlo; Chopik, William J.; Christ, Oliver; Ciunci, Antonia M.; Claypool, Heather M.; Coary, Sean; Čolić, Marija V.; Collins, W. Matthew; Curran, Paul G.; Day, Chris R.; Dering, Benjamin; Dreber, Anna; Edlund, John E.; Falcão, Filipe; Fedor, Anna; Feinberg, Lily; Ferguson, Ian R.; Ford, Máire; Frank, Michael C.; Fryberger, Emily; Garinther, Alexander; Gawryluk, Katarzyna; Ashbaugh, Kayla; Giacomantonio, Mauro; Giessner, Steffen R.; Grahe, Jon E.; Guadagno, Rosanna E.; Hałasa, Ewa; Hancock, Peter J. B.; Hilliard, Rias A.; Hüffmeier, Joachim; Hughes, Sean; Idzikowska, Katarzyna; Inzlicht, Michael; Jern, Alan; Jiménez-Leal, William; Johannesson, Magnus; Joy-Gaba, Jennifer A.; Kauff, Mathias; Kellier, Danielle J.; Kessinger, Grecia; Kidwell, Mallory C.; Kimbrough, Amanda M.; King, Josiah P. J.; Kolb, Vanessa S.; Kołodziej, Sabina; Kovacs, Marton; Krasuska, Karolina; Kraus, Sue; Krueger, Lacy E.; Kuchno, Katarzyna; Lage, Caio Ambrosio; Langford, Eleanor V.; Levitan, Carmel A.; Lima, Tiago Jessé Souza de; Lin, Hause; Lins, Samuel; Loy, Jia E.; Manfredi, Dylan; Markiewicz, Łukasz; Menon, Madhavi; Mercier, Brett; Metzger, Mitchell; Meyet, Venus; Millen, Ailsa E.; Miller, Jeremy K.; Montealegre, Andres; Moore, Don A.; Muda, Rafał; Nave, Gideon; Nichols, Austin Lee; Novak, Sarah A.; Nunnally, Christian; Orlić, Ana; Palinkas, Anna; Panno, Angelo; Parks, Kimberly P.; Pedović, Ivana; Pękala, Emilian; Penner, Matthew R.; Pessers, Sebastiaan; Petrović, Boban; Pfeiffer, Thomas; Pieńkosz, Damian; Preti, Emanuele; Purić, Danka; Ramos, Tiago; Ravid, Jonathan; Razza, Timothy S.; Rentzsch, Katrin; Richetin, Juliette; Rife, Sean C.; Dalla Rosa, Anna; Rudy, Kaylis Hase; Salamon, Janos; Saunders, Blair; Sawicki, Przemysław; Schmidt, Kathleen; Schuepfer, Kurt; Schultze, Thomas; Schulz-Hardt, Stefan; Schütz, Astrid; Shabazian, Ani N.; Shubella, Rachel L.; Siegel, Adam; Silva, Rúben; Sioma, Barbara; Skorb, Lauren; Souza, Luana Elayne Cunha de; Steegen, Sara; Stein, L. A. R.; Sternglanz, R. Weylin; Stojilović, Darko; Storage, Daniel; Sullivan, Gavin Brent; Szaszi, Barnabas; Szecsi, Peter; Szöke, Orsolya; Szuts, Attila; Thomae, Manuela; Tidwell, Natasha D.; Tocco, Carly; Torka, Ann-Kathrin; Tuerlinckx, Francis; Vanpaemel, Wolf; Vaughn, Leigh Ann; Vianello, Michelangelo; Viganola, Domenico; Vlachou, Maria; Walker, Ryan J.; Weissgerber, Sophia C.; Wichman, Aaron L.; Wiggins, Bradford J.; Wolf, Daniel; Wood, Michael J.; Zealley, David; Žeželj, Iris; Zrubka, Mark; Nosek, Brian A. - In: ADVANCES IN METHODS AND PRACTICES IN PSYCHOLOGICAL SCIENCE. - ISSN 2515-2459. - (2020).
Files attached to this item
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1660637

Citations
  • PMC: ND
  • Scopus: 56
  • Web of Science: 42