A round of applause for reviewers

June 27, 2018

By Bahar Mehmani, PhD

Highlighting some of the Elsevier systems and tools aimed at giving referees due recognition

The lynchpin of academia

Peer review is the cornerstone of the academic publication process. Researchers voluntarily review manuscripts sent to them by journal editors. In doing so, referees give authors feedback and recommendations for improving their research so that, if judged accurate, relevant and valuable, it can be published in the best possible shape. A further, broader form of peer review happens after a manuscript is published, when researchers read and discuss the work in journal clubs, at conferences and, potentially, in follow-up publications.

A thankless task?

Paradoxically, article publication, which is universally recognized as a research output, cannot be properly accomplished without formal peer review. Most researchers and research institutions agree that peer review provides essential control. Despite this, and the fact that researchers spend a considerable amount of time and effort on peer review, there is no universal appreciation for this essential activity. According to a 2015 survey conducted jointly by Elsevier’s Customer Insights team and the Publishing Research Consortium, researchers spend a mean of more than a full working day (8.4 hours) reviewing a single manuscript, while the modal number of manuscripts reviewed per month is one to two. It goes without saying that the least journals can do is properly demonstrate their appreciation for such an invaluable effort.

Acknowledging our reviewers

Elsevier launched “My Elsevier reviews profile” on Elsevier’s Reviewer Recognition Platform (RRP) in April 2014 with the aim of making peer review a measurable research output. On these private profiles researchers can see the list of Elsevier journals for which they have reviewed during the past five years and collect:

  • Review certificates associated with their status, based on the number of submitted reviews per journal

  • A yearly overview of their peer review performance

  • Their signature, listing journals for which they have acted as referees

  • Vouchers for Elsevier books and article publication services

The “My Elsevier reviews” profile is updated automatically each time a researcher submits a referee report via one of Elsevier’s journal submission systems, removing the need to manually log and claim activities. Access to the private profile page is by means of an encrypted hyperlink sent to the reviewer by email.
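
As a technical aside, the sketch below illustrates one common way such emailed links can be protected: signing the reviewer identifier so the link can be neither guessed nor tampered with. This is a hypothetical Python illustration, not Elsevier’s actual (non-public) mechanism; the secret key, URL and parameter names are invented, and the use of an HMAC signature rather than literal encryption is an assumption.

    # Hypothetical sketch: a tamper-evident profile link built by signing the
    # reviewer identifier with an HMAC. All names and URLs are invented; this
    # is not Elsevier's actual implementation.
    import hashlib
    import hmac

    SECRET_KEY = b"server-side secret"  # held by the server, never sent in email

    def make_profile_link(reviewer_id: str) -> str:
        """Build a private-profile URL carrying a signed reviewer token."""
        sig = hmac.new(SECRET_KEY, reviewer_id.encode(), hashlib.sha256).hexdigest()
        return f"https://reviewers.example.org/profile?id={reviewer_id}&sig={sig}"

    def verify_profile_link(reviewer_id: str, sig: str) -> bool:
        """Check the signature before serving the private profile page."""
        expected = hmac.new(SECRET_KEY, reviewer_id.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, sig)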

Currently, some 2,000 Elsevier journals offer “My Elsevier reviews” profiles to their reviewers, and around 800,000 profiles have been generated on the RRP. Another well-appreciated feature of the RRP is that editors can hand-pick their top reviewers based on the quality of their performance, which grants the reviewer in question a new “excellent” status and a certificate of excellence.

Researchers can also register their interest in reviewing for their journal(s) of choice and indicate their preferred subject area(s) via the platform. Editors can then invite these volunteers when suitable manuscripts are submitted. As has been reported previously, this initiative has proved successful and has been adopted by around 300 Elsevier journals.

Enabling public (and universal) recognition

Of course, researchers referee for journals published by many different academic publishers. A complete profile, like the publication history tool provided by Scopus, would pave the way to universal acknowledgement of researchers’ peer review activities. Together with our colleagues at Mendeley, we have embarked on the challenging task of transforming this vision into a reality. To do so, we have been working on adding a new feature to Mendeley profiles that lists both the peer review activities captured by ORCiD’s peer review history tool (which records referees’ activities for journals directly, without reviewers having to manually “claim” them) and those from Elsevier’s submission systems. (In doing so, we have of course taken extra care to respect the confidentiality of the peer review process.)
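
For readers curious what capturing activities “directly” looks like in practice: ORCiD exposes these records through its public API. The minimal Python sketch below fetches a researcher’s public peer-review summary. The v3.0 endpoint is documented by ORCiD, but the response fields used here (the “group” list) are an assumption worth verifying against the current API documentation.

    # Minimal sketch: read a researcher's public peer-review records from the
    # ORCID public API (v3.0). The response layout assumed below ("group" list)
    # should be checked against ORCID's current documentation.
    import json
    import urllib.request

    def fetch_peer_reviews(orcid_id: str) -> dict:
        """Fetch the public peer-review summary for one ORCID iD."""
        url = f"https://pub.orcid.org/v3.0/{orcid_id}/peer-reviews"
        req = urllib.request.Request(url, headers={"Accept": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    data = fetch_peer_reviews("0000-0002-1825-0097")  # ORCID's well-known test iD
    # Each "group" typically bundles a reviewer's activity for one organization.
    print(f"Peer-review groups on record: {len(data.get('group', []))}")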

We are delighted to say that we have solved the first part of the puzzle: Mendeley users can now showcase their ORCiD-based peer review activities on their profiles (see the author’s profile, for example: https://www.mendeley.com/profiles/bahar-mehmani/). At the time of writing, 2,770 Mendeley users have reviewing timelines on their profiles, with a total of 72,135 peer review entries, and we anticipate this will prove an invaluable tool for researchers wanting to showcase their reviewing activities. We are now actively working on the second part of the puzzle, incorporating Elsevier-based reviews, so stay tuned for news on this.

"We are delighted that Mendeley users can now connect their peer review activities in ORCID to their Mendeley records. Helping researchers get recognition for all their contributions, including peer review service, is at the heart of what ORCID does; this is a valuable step toward achieving this goal." - Alice Meadows, director of community engagement and support at ORCiD But our efforts won’t stop there, of course. The next step will be to further connect published peer review reports for journals practicing open peer review to Mendeley profiles. And you can rest assured that we will continue to innovate and develop the tool to ensure reviewers get the recognition they deserve.

Conclusion

Such efforts are merely the first steps toward making peer review a measurable research output, and we clearly have a long way to go until peer review activities are acknowledged in all academic evaluation processes. It is important to note that other organizations are working toward similar aims using different methodologies (Publons by Clarivate Analytics and ReviewerCredits are two examples), which shows this is a universal need. Further collaboration with ORCiD would ultimately enable all such initiatives to align their activities coherently, for the benefit of reviewers and to help provide just recognition for their efforts. Stay tuned to find out more!

Contributor

Bahar Mehmani, PhD

Reviewer Experience Lead

Elsevier