Scientific research doesn’t always go according to plan. An experiment might be brilliantly designed, but any number of things can still go wrong – a contaminated sample, a dip in temperature or the wrong chemical can play havoc with the results. Even well-executed research may suffer from bias – for example, because researchers want to produce statistically significant results that confirm their theory, or because journals prefer to publish novel results.
At Elsevier, we’re committed to making reproducibility a reality, and we actively support the proposals in “A manifesto for reproducible science” by Munafò et al. (2017). We can help raise the bar on reproducibility by lowering barriers for researchers to publish replication studies, empowering researchers to share their methods and data, championing rigorous and transparent reporting, and creating outlets for research that upholds reproducibility.
Replication studies – experiments that replicate previous work by reanalyzing existing data or by doing new experiments that recreate it – are a cornerstone of reproducibility. If research findings can be replicated, they are more trustworthy and reliable. This is how science can “self-correct.”
There are perceived barriers to disseminating replication studies, however. The idea that replication studies are only valuable if the results disagree with the original research is one misconception that inhibits submission of work that could be published as a replication study. Another significant barrier is the misconception that editors don’t want to publish replication studies.
To help break down some of these perceived barriers, Elsevier is taking various actions:
- We just published a series of virtual special issues highlighting replication studies.
- We have developed a new article type especially for replication studies.
- We are issuing several calls for papers to encourage submissions.
A reproducibility crisis?
Reproducibility promotes trust within our society – in individual research findings, in researchers, and in science more broadly. Science is made up of incremental steps: if inaccurate research findings become part of the literature, many more studies will attempt to build upon them. Not only is that a waste of research resources; it also slows the development of medical treatments and misguides government policies.
Earlier this year, a Nature survey of 1,576 scientists revealed that 90 percent think there is a “crisis” in reproducibility, with 52 percent calling it a “significant crisis.” Elsevier colleagues, including those at Mendeley, have been working to tackle this crisis from many angles, such as contributing to the verification of important research through partnerships.
Partnerships to support reproducibility
In 2012, Elsevier’s Dr. William Gunn co-founded The Reproducibility Initiative along with Dr. Elizabeth Iorns of Science Exchange to promote replication of original research, improve reporting of experimental methods, and facilitate validation of experimental reagents such as antibodies. This led to the Reproducibility Project: Cancer Biology (RP:CB), a collaboration with the Center for Open Science that aims to replicate 29 landmark cancer biology studies. The RP:CB just published the results of the first five replication attempts of important cancer studies. These studies helped shine a light on the gulf between what is reported in the original research paper and what is actually required to replicate a result. Many early-career researchers have gotten lost in that gulf: emails to the original researchers go ignored, original reagents are no longer available, and sometimes they face suspicion or derision from the often much more senior academics whose research they’re trying to build upon. In practical terms, the project also demonstrated the value of careful curation of data throughout the research lifecycle and of unambiguously identifying reagents, which Elsevier supports via Mendeley Data and our work with the Resource Identification Initiative.
In another partnership, with Humboldt-Universität zu Berlin, we have formed the Humboldt-Elsevier Advanced Data & Text (HEADT) Centre to better serve the needs of the scientific community by improving the efficiency of text and data mining and by investigating research integrity issues.
Rigor and transparency
Scientific rigor forms the foundation of reproducibility. In 2016, Cell Press introduced STAR Methods, which outlines the features of a robust, reproducible method: Structured, Transparent, Accessible Reporting. Several other Elsevier journals also have policies that promote reproducibility, including the Mandatory Replication Policy in Energy Economics and Review of Economic Dynamics, the Scientific Checklist in Biochemical Pharmacology and the Invited Reproducibility Paper in Information Systems.
In medical research, clinical trials must be publicly registered before the trial actually starts. Elsevier increases this transparency even further by linking the published trial results back to the original registration. Non-medical researchers can now also benefit from pre-registration by submitting a Registered Report and having it pre-accepted before the results are even known. Authors get the benefit of peer review and an editorial decision based on their research topic and method before they carry out the research, which improves article quality and relieves the pressure to show a significant positive result.
Once the research is complete, with the data repository Mendeley Data, journal data sharing policies, and the open access journal Data in Brief, researchers can make their data available, discoverable, trackable and verifiable.
Giving researchers the right incentives
However, even when studies can be reproduced, it doesn’t necessarily mean they will be. Scientists are often not motivated or empowered to carry out these studies, as John Oliver explained on an episode of Last Week Tonight:
There is no reward for being the second person to discover something in science. There’s no Nobel Prize for fact checking.
Recently, we worked with 125 researchers in the Innovation Explorers community to understand the changing appetite for publishing replication studies, identify drivers and barriers to publication, and explore ways Elsevier can foster validated, trustworthy science. They pointed to a lack of incentives and resources to put into replication studies. One participant commented:
Unless the original study appears to be flawed in some important way, why dedicate the time and resources to an experimental program which appears to be well executed and well interpreted?
Of course, every researcher who’s ever re-run an experiment to get “publication quality” data knows that the mere appearance of good execution isn’t enough. In fact, it hides all the necessary but messy work and troubleshooting, and sometimes even a bit of “p-hacking.” When replication studies are carried out, it is often for more practical reasons: for example, to test out different variables for a new experiment. This makes replication studies an integral part of the research cycle. Yet like data and methods, they are rarely published and recognized as valuable scientific contributions.
According to the researchers we interviewed recently, replications are rarely published in part because they lack a “breakthrough,” so the work draws little recognition in the form of citations, which tend to go to the original paper. There is also a perception that editors are not interested in replication studies, particularly those that confirm previous results.
As a leading publisher, Elsevier is in a good position to help lower this perceived barrier to publication. We provide homes for research that promotes reproducibility: journals like Contemporary Clinical Trials Communications and Heliyon welcome good quality studies no matter what the result. Heliyon Editor-in-Chief Dr. Claudia Lupp explained:
Heliyon’s scope includes all technically and ethically sound research – including replication studies. Although breakthroughs and new discoveries provide important steps forward in our knowledge, replication studies give credibility to the research and help us identify results that are not strong enough to build on. Replication is a vital part of the scientific process – one that’s important to highlight and disseminate in journals like Heliyon, providing the replications meet quality standards.
While this research is not suited to all journals, there is a strong history of Elsevier publishing replication studies. Working with data mining experts, we have put together a series of virtual special issues: three featuring replication studies and one with articles about the importance of reproducibility. In searching for these articles, we found that with no common format or label, replication studies can be hard to identify. We’re now working on a new article type dedicated to replication studies, which will be available across a range of journals starting in March.
Making sure your work can be reproduced is a major step towards making it trustworthy and showing peers, funders and the public that science can be relied upon. This is what we must do to safeguard science.
Learn more and get involved
Submit your studies
Submit your replication studies to our Energy Economics journal: Call for papers / special issue on replication.
Read the virtual special issues
Elsevier has created four virtual special issues, which are free to read:
- Economics and Finance
- Business and Marketing
- Neuroscience, Neurology, Psychology and Psychiatry
- The importance of replication (below)
The importance of replication
By William Gunn, PhD
In August of 2011, the Head of the Department of Social Psychology at Tilburg University, Prof. Marcel Zeelenberg, was in an increasingly uncomfortable meeting. He was listening to three young researchers describe detailed evidence suggesting that the brilliant and well-liked dean of the department, Prof. Diederik Stapel, had committed fraud.
As an investigation would later discover, at least 55 publications spanning many years were based on fabricated data. As the events unfolded, the question everyone kept asking was: “How had this gone undetected for so long?” The report from the committees that investigated his work indicates that part of the reason was that he was well liked, which made the deception easier to swallow. But the committees go on to comment on the research culture in which it occurred, noting that “not infrequently reviews (of social psychology journal articles) were strongly in favour of telling an interesting, elegant, concise and compelling story, possibly at the expense of the necessary scientific diligence.”
Strikingly, the report notes that on the few occasions when replications were done, they failed but “were never revealed, because this outcome was not publishable.” In fact, they are indeed publishable, and there’s a growing awareness among reviewers and editors of the critical role replication studies play in increasing the confidence of researchers in the reports they read.
With that in mind, we have put together a virtual special issue focusing on replication studies in the fields of Business, Economics, and Information Technology.
This collection shows that replication studies have a long history in the social sciences, even if they remain relatively rare, making up less than 2 percent of publications in the fields addressed. The papers highlighted here discuss best practices for doing replication studies and introduce terminology to help frame the methods and motivations for replication studies. Hopefully, these works will not only encourage more replication studies to be done and submitted, but will also help replication studies fulfill their intended role: to provide more confidence that the magnitude of the effects reported in the original study has been accurately estimated. In addition, there are several studies examining editorial perceptions of replication studies, how those perceptions have changed over time, and practical things editors can do to encourage submission of more replication studies.
We hope these collected works offer interesting and useful insights. Elsevier has made them freely available until May 14, 2017. As we publish more replication studies, we will continue to highlight topical collections of them in future posts.
- Richard W. Easley, Charles S. Madden, Mark G. Dunn: Conducting Marketing Science: The Role of Replication in the Research Process, Journal of Business Research (April 2000)
- Raymond Hubbard and Daniel E. Vetter: An empirical comparison of published replication research in accounting, economics, finance, management, and marketing, Journal of Business Research (February 1996)
- Richard W. Easley, Charles S. Madden, Van Gray: A tale of two cultures: Revisiting journal editors' views of replication research, Journal of Business Research (February 1996)
- Richard W. Easley and Charles S. Madden: Replication revisited: Introduction to the special section on replication in business research, Journal of Business Research (September 2013)
- Mark D. Uncles and Simon Kwok: Designing research with in-built differentiated replication, Journal of Business Research (September 2013)
- Robert Rosenthal: Some issues in the replication of social science research, Labour Economics (June 1997)
- Daniel S. Hamermesh: Some thoughts on replications and reviews, Labour Economics (June 1997)
- Larry V. Hedges: The promise of replication in labour economics, Labour Economics (June 1997)
- Thomas J. Kniesner: Replication? Yes. But how? Labour Economics (June 1997)
- Wiji Arulampalam et al: Replication and re-analysis, Labour Economics (June 1997)
- T.D. Stanley and Ume Tran: Economics students need not be greedy: Fairness and the ultimatum game, Journal of Socio-Economics (1998)
- David M. Levy: Testing the replication hypothesis: When the data set is subject to gross error, Economics Letters (September 1990)
- Natalia Juristo: A process for managing interaction between experimenters to get useful similar replications, Information and Software Technology (February 2013)
- Fabio Q.B. da Silva et al: Team building criteria in software projects: A mix-method replicated study, Information and Software Technology (June 2013)
- Cleyton V.C. de Magalhães et al: Investigations about replication of empirical studies in software engineering: A systematic mapping study, Information and Software Technology (August 2015)