Editor’s note: This month, Elsevier Connect is exploring open science — what it means and how to make it happen.
Open science describes a more inclusive, collaborative, transparent world of research. At Elsevier, we’re enabling open science through our approach to open access, open data, research integrity, knowledge exchange, metrics and more.
We do this because we believe open science can benefit research and society and drive research performance. Whether it's freely available data, transparent benchmarking tools, or platforms that drive collaboration, bringing this world of open science to fruition means we need to continue to partner with the research community as we develop tools and platforms that support this vision.
So what's the best way to do that, and what will the future hold? Below, we've outlined some of the initiatives and platforms that support this vision and taken a forward look. Please do share your own thoughts in the comments on what you see as the most important elements of open science, where open science should lead, and how we will get there. We'll also be adding in-depth pieces on many of these topics as the month goes on.
What is open science?
Open science is a term that has gained traction in recent years as the goal for all players in the research system. It's hard to pin down the origins of the term, but the European Commission has been at the forefront of pushing an open science agenda, as have the Netherlands, Japan, the US and others. For example, the European Commission has set out policy priorities around open access and open data and has set up a clear, inclusive mechanism to deliver this through its Open Science Policy Platform and eight working groups.
Open science is often defined by what falls under this term: open access and open data are seen as staples, but other areas such as alternative metrics, research integrity and citizen science often fall under this umbrella too. Open science is also anticipated to achieve some specific outcomes, such as public engagement in research, and other positive outcomes such as efficiencies in the research process, although a concrete way of tracking and measuring these is yet to evolve.
At Elsevier, we work closely with the research community to improve research performance, and we believe that open science has the potential to contribute to that. During Open Access Week at the end of October, we saw that many of you agree. Under the theme "Open in order to…," promoted by OASPA, people from across the research community shared a range of views on how open science can benefit research and society.
We have continued to evolve our business to enable an inclusive experience across the spectrum of research outputs, providing more options for researchers to share more kinds of information and allowing more people to get involved in the scientific process. We provide platforms for preprint sharing, for example, and now that bepress has joined Elsevier, we have a Digital Commons research showcase that covers a broader spectrum of open scholarship, including library-published journals, technical reports, and open textbooks. Through these services, anyone can access an article without charge, irrespective of the underlying business model supporting publication.
The articles we publish and the research data we help researchers manage are increasingly open. We publish more gold open access articles every year – over 25,000 in 2016 – making us the second largest gold OA publisher. We facilitate green OA and are making it work in practice through our partnership with CHORUS and our institutional repository (IR) services. We estimate that by the end of 2017, 60,000 of the subscription articles we published in 2016 will be available green OA through our partnership with CHORUS, our open archive program, our partnership with PMC, and author self-archiving. We look forward to seeing this grow.
Our Research Data policy commits us to helping researchers make their research data freely available, and we have developed services such as Mendeley Data to support researchers in doing this. There are 1,400 published datasets and an additional 32,000 created datasets on Mendeley Data, and more than 5 percent of Elsevier articles include open data. We are working to increase this percentage over time. For both articles and data, we offer the full range of licenses, and researchers choose the level of openness.
A more collaborative world of research – which often flows from the inclusive nature of open science – includes others, be they fellow researchers or members of the public. Mendeley facilitates collaboration through its role as a Scholarly Collaboration Network, enabling the exchange of ideas and articles. And SSRN, with its 2.2 million users and 9.1 million references, provides a platform for researchers to exchange ideas, give feedback on early drafts and comment on the work of others. Programs such as Atlas and AudioSlides bring research closer to society, and tools such as Newsflo help measure this interaction. This kind of collaboration is good for science.
Reproducibility is crucial to transparency in science because it verifies that the claims of experiments are true and therefore useful in further research. Recent surveys have revealed that 52 percent of researchers believe there is a "significant reproducibility crisis" in science, and 38 percent feel there is a slight crisis.
We've helped raise the bar on reproducibility, facilitating and empowering researchers to share their methods and data. In 2016, Cell Press introduced STAR Methods – structured, transparent, accessible reporting – reflecting the importance of methods sections in papers. We're also participating in a Crossref pilot to link registered clinical trials with published articles, enhancing readers' understanding of the full results that came out of a trial and making scientific reporting more transparent. We participate in the Scholix framework to link data to scholarly research. And our open peer review reports pilot brings greater transparency to the peer review process. As some of our openness and transparency initiatives correspond to the Transparency and Openness Promotion (TOP) Guidelines, Elsevier became a TOP signatory in September. You can expect to hear more on transparency in the coming months.
Sometimes transparency means being open about not being open. One example is Data Statements, which enable researchers to explain how and where their data is shared – or where it can’t be and why.
We also believe that metrics should be comprehensive and transparent. CiteScore, developed for Scopus, is designed to give a more comprehensive, transparent and current view of a journal's impact. The work we are doing with PlumX and through services such as Mendeley Stats presents a broader perspective on research impact, increasing transparency around what we mean by "good" or impactful research and helping build a base from which to measure research impact in the future.
Improving research performance
Elsevier's active role in open science reflects our shared belief – ours and that of the research community, governments, funders and others active in the research ecosystem – that there are elements of open science that can improve research performance, and this will be an area we focus on in the future. "Research performance" is itself a broad term that is increasingly being looked at across a range of measures and metrics, in addition to article citation counts or a journal's impact factor. This makes it more challenging but also more exciting and potentially rewarding to look at the role open science might play in article usage, citations, readership, mentions and many other measures.
There is some evidence we can draw on already. For example, we know there is a link between more open publications and more usage/readership, but the impact of openness on citations is less clear: for gold OA, we do not tend to see a citation boost, yet articles that have earlier visibility, for example those available as preprints, tend to be more highly cited later on. Similarly, some studies have shown that higher degrees of collaboration correlate with more impactful (highly cited) research, but some have suggested there may be a “multiple author self-citation” factor here. We want to apply our analytic capabilities and the work we are doing on metrics to test assumptions further. Does data sharing save researchers time and lead to a more efficient research process? Does more open research have a greater societal impact? Do elements of transparency boost trust in science and support a healthier research process in some way?
We don't have all of the answers yet, but we are excited about the open science journey we are all on. We look forward to continuing on this journey with you.