Scientific credibility and reproducibility
The CEO of Cell Press writes about the recent recommendations for maximizing reproducibility and how Cell Press is encouraging credibility and rigor
By Emilie Marcus, PhD | Posted on 20 November 2014
This editorial appears in the new issue of Cell, published online on November 20.
Credibility is everything for science, and it is built over time in both obvious and subtle ways. It is how we interact with colleagues and collaborators. It is how generously and openly we share reagents and how we mentor students and postdocs. It is how we review each other's papers, and it is how we credit others' work. It is the way we educate and inform the public that funds us. It is the way we document and store our data. And it is the rigor, transparency, and attention we invest in designing, conducting, and reporting experiments.
Without credibility, others cannot, or will not, build on our work, and the pace of scientific advance slows as a result. Most importantly, science that lacks credibility is a house with crumbling walls: it engenders little trust and provides minimal value to our global society, present and future.
As CEO of Cell Press, Dr. Emilie Marcus manages Cell Press's portfolio of journals, conferences, and online tools for biomedical researchers. She has also launched creative new initiatives such as Cell Reports, the first "author-pays" open access Cell Press title, and the original Article of the Future online format. As the Editor of Cell, she is charged with crafting an editorial vision for one of the world's leading biomedical research journals. She came to the position in 2003 after five years on the editorial team of Neuron and a successful graduate and postdoctoral research career, first at Yale University, where she received her PhD in biology/neuroscience, and then at the Salk Institute.
But everyone reading this already knows the importance of credibility in science, so why are we discussing it here? Within the last 12 months, the reproducibility of science, a linchpin of credibility, has come under intense scrutiny, both from the NIH (Nature 505, 612–613) and other government funding bodies and in the lay (The Economist, October 19, 2013, 23–28) and scientific press (Nature 483, 531–533; though many of these reports themselves would benefit from greater transparency in reporting and still require robust demonstrations of reproducibility).
Hearing the word "reproducibility," most of us think immediately of fraud or data and image manipulation, but it is much broader than that. Many of the current concerns about reproducibility, particularly the successful scalability of preclinical data into robust drug targets for treating human disease, are focused on the rigor of the experimental design (inclusion of all appropriate controls, blinded experimental conditions, gender balance in experimental populations, a priori determination of n's and statistical power, appropriate statistical analyses, etc.) and on complete transparency in reporting of these parameters and all collected data (for a recent Perspective on this topic, see Neuron 84, 572–581.)
In June of this year, Francis Collins, NIH Director, Marcia McNutt, Editor-in-Chief of Science, and Philip Campbell, Editor-in-Chief of Nature, organized a meeting of journal editors and other contributors to collaborate on approaches to ensuring and improving reproducibility. Maximizing reproducibility clearly is an initiative involving many stakeholders, with scientists front and center and funding bodies, universities, journals, pharmaceutical and biotech companies, patient advocacy groups, and society at large all taking a leading role as well. Out of the discussions at this meeting came a set of recommendations for how journals and journal editors can do their part. The main focus of the guidelines is to ensure rigorous experimental design and transparency in reporting the specifics of how experiments were performed and how data were collected and analyzed. Cell Press participated in the meeting and is a signatory on the recommendations that were recently posted. Cell and its sister journals already follow many of the items in the guidelines, and have for quite some time (providing space for lengthy methods sections in print and unlimited supplemental methods online, requiring the sharing of reagents as a condition of publication, providing a forum for refutation in our Matters Arising format, and requiring authors to clearly state their statistical measures). Other items in the guidelines, like developing a way to facilitate clear reporting in the paper of details about how experiments were designed and performed, will be valuable additions to what we already do.
Journals are encouraged to adopt a checklist of specific reporting criteria as a standard form for authors to complete and editors and/or reviewers to verify. While we at Cell and the other Cell Press journals are not yet sure that an author checklist per se will be the most effective implementation for our authors, reviewers, and readers, we wholly embrace the goals of the guidelines and will be taking steps to adapt our editorial processes and author instructions to ensure consistent standards for appropriate experimental design and transparency in reporting. For example, Developmental Cell has recently introduced supplemental protocols, where authors of a paper with noteworthy, new, or particularly challenging methods are encouraged to provide a detailed protocol in a separate supplemental PDF. We view these steps as an important part of the value that we add through the editorial and peer review process.
Enhanced attention to these elements will also help protect the authors' credibility. With increased clarity about how experiments were performed and how data were collected, editors, authors, and reviewers will all be better able to spot and rectify concerns before the paper is published, hopefully reducing the number of corrections and retractions required post-publication. To this end, Cell and our sister journals are also introducing an image screening process to help ensure adherence to community standards as outlined in our data processing policies. More and more, we are finding that the concerns that arise regarding published data are often the result of avoidable errors: for example, copying and pasting the same image into two different figures, or failing to indicate where lanes of a gel have been spliced together. (Oddly, the most pervasive challenges to published data we see at Cell relate to loading controls. There seems to be some misalignment among scientists regarding the importance and meaning of the actin bands in a standard western blot.)
When potential problems are brought to our attention by a concerned reader, we ask the authors to provide us with the original unprocessed data, together with a detailed explanation of how they conducted the experiment. Most of the time, we can see from the raw data that the problems have been introduced through simple mistakes and can be addressed with an erratum. But a scientific literature peppered with corrections does not build credibility, and worse still is when the avoidable errors are sufficiently extensive that they undermine the reliability of the entire body of work and necessitate a retraction. So, as we at Cell invest in checking figures and working with authors to fix any correctable mistakes before the paper is published, we ask that authors renew their focus on preparing their manuscripts and reviewing the final figures with the same attentive eyes their readers will. By combining enhanced clarity of reporting as recommended by the new guidelines with prepublication image screening, our intent is to ensure that every paper we publish meets not only the highest standards of interest and importance but also of credibility and reproducibility.
With increased vigilance from authors, funders, and journals, and attention to standards for experimental design and accurate, careful reporting, we will collectively increase public trust in and support for research and build a stronger pipeline for converting our understanding of the basic processes and mechanisms of biology into improved diagnostics, treatments, and potentially cures for the myriad global health challenges we face.