Predatory journals are on the rise, and their proliferation poses a threat to researchers at every level, from early career to senior, as well as to the credibility of published science. STM, the global trade association for academic and professional publishers, called the increase “worrying” and affirmed in a statement posted on its website in August that “high levels of trust are vital to ensuring that the publication and sharing of research results helps to advance research, the global pool of knowledge and the careers of researchers and investigators.”
Predatory journals destroy that trust.
There is no standard definition of “predatory journal,” a term first used by librarian Jeffrey Beall to describe what he called “counterfeit journals.” Elsevier and others use the more process-driven definition by Shamseer et al., who describe predatory journals as those that “actively solicit manuscripts and charge publication fees without providing robust peer review and editorial services.”
In their comparison of potential predatory journals identified in Beall’s list (this is the latest version of the controversial list; the original one was discontinued) with presumably legitimate open-access and subscription-based journals, the research team found spelling errors on predatory journals’ homepages, distorted or possibly unauthorized images, fake impact factors, and editors or editorial board members with unverified affiliations.
Senior researchers at risk, too
Vulnerable early-stage researchers are presumed to be the victims of predatory publishers, yet recent publications have shown that, globally, hundreds of thousands of researchers at all levels have published in predatory journals — many from renowned research institutes and universities, as well as employees of government funding organizations.
“When senior researchers author articles that appear in questionable journals, often it is the result of delegating details related to submissions of ‘lesser’ papers to junior authors, who may unwittingly consign a solid paper to a questionable journal,” says Catriona Fennell, Director of Publishing Services at Elsevier. “If a senior author isn’t paying attention to where the work is submitted, it may well be lost among papers of poor quality.”
Further, she said, “even a senior researcher may be duped by a journal with a name or look close to that of a reputable journal.”
Cross-industry collaboration is vital
Many factors contribute to the rise of predatory publishing, from the pressure on academics to “publish or perish” to the readily available tools that enable anyone to easily set up a website and an electronic payment system, and to email thousands of potential authors en masse.
The ease with which such journals can proliferate has prompted collaborative efforts among all stakeholders — researchers, publishers, academic and government institutions, industry, standards organizations and nonprofits — to stem the tide and reinforce the value of research integrity.
A key collaboration is the cross-industry initiative Think. Check. Submit., led by professional societies as well as individual publishers. Think. Check. Submit. provides simple guidelines that authors can use to assess a journal or publisher before submitting an article and that readers can use to identify trusted sources.
“Importantly,” Fennell notes, “tools such as Think. Check. Submit. can help distinguish between a predatory journal and a new journal that hasn’t yet built up its reputation but is following the principles and standards of ethical scientific publishing.”
Other author education efforts include Elsevier’s Researcher Academy, which has resources to support authors in preparing a submission, writing an article, selecting a journal, navigating peer review and ensuring visibility and social impact.
The Committee on Publication Ethics (COPE) is another cross-industry initiative that aims to “move the culture of publishing towards one where ethical practices become the norm, part of the publishing culture.” To reflect those aims, the organization recently consolidated its code of conduct and best practice guidelines for editors and code of conduct for journal publishers into a single, 10-point “core practices” document.
Improving peer review – and exploring open peer review
“There’s no question that the peer review system, whether single-blind, double-blind or open, isn’t perfect,” says Dr. Bahar Mehmani, Reviewer Experience Lead at Elsevier. However, regardless of the type of peer review, at a minimum, “reputable journals provide authors with information on how the process works, the average time it takes, and the rejection/acceptance rate, which you don’t see with predatory journals.”
The concept of open peer review is gaining traction; however, there are at least 22 different definitions of what that means, according to Dr. Mehmani. A recent survey showed a mostly favorable response to some traits generally associated with open peer review, including open interaction (the wider community can contribute to the process), open reports (reviews published alongside the article) and final-version commenting — although more than half of respondents were against opening reviewer identities to authors.
Right now, fewer than 3 percent of scientific journals allow peer reviews to be published, but despite the challenges, that percentage is predicted to increase. Recently, Wiley announced a new open peer review workflow.
“One transdisciplinary, cross-sector collaborative initiative, New Frontiers of Peer Review (PEERE), will be analyzing peer review in different scientific areas and evaluating the implications of different models of peer review,” Dr. Mehmani says. PEERE is a recent initiative from COST, the longest-running framework supporting transnational cooperation among researchers, engineers and scholars across Europe.
To help PEERE achieve its goal of improving various facets of peer review through collaborative efforts, Elsevier, Springer Nature and Wiley have agreed on a data-sharing protocol that enables PEERE researchers to access peer review data for some of their journals.
Other initiatives are providing additional levels of transparency to peer review, according to Dr. Mehmani. One example is F1000, a platform that shows authors how peer review took place and what reviewers said about their article. Other publishers are publishing the names of reviewers on the article page. Elsevier supports greater transparency in peer review and ran a pilot that indicated many of its journal editors and authors are interested in this approach.
Another collaborative initiative, ORCID, provides an Open Researcher and Contributor ID — a unique, persistent digital identifier — to every researcher. As a founding sponsor and member of ORCID, Elsevier collaborates with this consortium on a wide variety of initiatives, including supporting automated links between manuscript and grant submissions as well as peer review contributions through Mendeley.
Sense About Science, an independent charitable trust, is working to promote an understanding of peer review among journalists, policymakers and the public. Elsevier has partnered with the organization since 2006.
Fostering data sharing
Collaborative initiatives around data sharing are also growing, as funding organizations in the US and Europe increasingly require public access to the data underpinning published research. Elsevier and other publishers support the STM principle that “raw research data should be made freely available to all researchers.” Collaborative organizations working in this direction include Force11, which has set up guidelines to facilitate data citation, so that data can be cited like other forms of evidence; the Research Data Alliance, which has working groups and interest groups focused on building “data bridges” to overcome data fragmentation by disciplines or domains; and Scholix, which is developing a set of interoperability guidelines to improve the exchange of data-literature links among publishers, data centers and domain service providers.
The bottom line on predatory publishers
“Predatory publishers are unlikely to perform rigorous processes, respect scientific publishing standards, or join organizations in which publishers, researchers, and other stakeholders work together to effect meaningful change,” Fennell says. “Ongoing and emerging collaborative initiatives are reinforcing the robust requirements for publication in reputable journals, and at the same time making it easier and faster for authors to do so.”
Free resources for authors and reviewers
The following freely available articles and tools can be used to support research accuracy and the quality of scholarly publications.
- Elsevier Reviewer Hub. Help for reviewers on how to conduct a review, manage it, structure it and receive credit for their work.
- Elsevier Researcher Academy. Compilation of numerous online resources to support researchers in preparing a submission, writing an article, selecting a journal, navigating peer review, and ensuring visibility and social impact.
- Equator Network. Online library for health research reporting, including guidelines and toolkits.
- Reporting checklists for medical researchers. Checklists that help authors report research “clearly and fully.”
- StatCheck. Checks a PDF, DOCX or HTML file for errors in statistical reporting.
- StatReviewer. Automated review of statistical and reporting integrity for scientific manuscripts, with a report that resembles an actual peer review or a checklist, depending on journal-specified guidelines.
- Think. Check. Submit. Provides a simple checklist that researchers can use to assess the credentials of a journal or publisher, and that readers can use to identify trusted sources of information.
What it takes to produce a reputable journal
Elsevier’s white paper Supporting Value: How Rigorous Processes & Collaborations Help Ensure Research Integrity provides an in-depth look at the processes that distinguish reputable from predatory publishers, and an expanded overview of cross-industry and cross-disciplinary collaborations that are helping to improve and safeguard scholarly publications.