Reviewing the work of others is a significant part of any academic’s job. It’s easy to feel bombarded by review requests, guilty when we decline them, and guiltier still when we are late with our reviews. And we wonder whether we are doing too many reviews or too few.
For editors seeking high-quality reviews, understanding who accepts and completes reviews — and whether they reflect the diversity of scientists whose work they are reviewing — is an important and ongoing job.
In a recent study, we sought to identify just who these valuable reviewers are for the journal Biological Conservation – where they are from, their gender and seniority, and whether any of that affects whether they accept an invitation to review a paper and how they go about their reviews. We examined 11,840 invitations to review articles sent to 6,555 reviewers for our journal, and analyzed in detail a random subsample of 600 of these.
Gender balance among reviewers?
Reviewers accepted, on average, 37% of all invitations to review papers, which seems pretty reasonable. Men and women were equally likely to accept an invitation.
But what about gender balance among reviewers? Many journals have identified a strong gender imbalance within their reviewer community, an issue that is gaining more attention recently (e.g., Lerback and Hanson 2017). We found that Biological Conservation reviewers are also mostly men – 2.2 men were invited to review for each woman. This seems likely to be due in part to a predominance of more senior males publishing in the field, combined with more invitations going to senior researchers; the male reviewers invited were more senior than the female reviewers, with 19 vs. 12 years since their first published paper. Improving gender balance among reviewers — as well as cultural diversity — is still an unfinished job.
Academic productivity – and “pulling our weight”
Academic productivity (number of papers published) and seniority (years since their first paper was published) also did not affect a reviewer’s likelihood of agreeing to do a review. This is interesting because one might expect that those who publish more and are more senior are also busier, more likely to receive more invitations to review, and therefore more inclined to decline invitations. On the other hand, people who publish a lot and are senior in their field should also be those who review a lot. Scientific refereeing relies on reciprocal altruism; for every paper you publish, depending on the number of journals it has been submitted to before acceptance, perhaps two to four reviewers have had to read it and write a review. So for every paper you publish or submit for review, a fair system might expect you to review multiple papers – especially if you have few co-authors, or the co-authors are students or others who are not expected to play as large a role in the reviewing system.
Perhaps within the academic world, we should develop a simple calculation that we each can use to work out just how much reviewing we ought to be doing to “pull our weight” (see Didham et al. 2017 for an interesting discussion of this idea).
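As a back-of-the-envelope sketch of what such a calculation might look like: if each submission consumes two to four reviews, and the obligation is shared among the co-authors expected to participate in refereeing, a rough annual quota falls out directly. The function below and its default values are our own illustrative assumptions, not a formula from the study.

```python
def reviews_owed(papers_submitted, reviews_per_submission=3, reviewing_coauthors=2):
    """Rough estimate of how many reviews a researcher 'owes' per year.

    Assumes each submission consumes `reviews_per_submission` reviews
    (two to four, counting resubmissions to other journals), shared among
    the co-authors expected to take part in refereeing (students and other
    non-reviewing co-authors excluded). All defaults are illustrative.
    """
    return papers_submitted * reviews_per_submission / reviewing_coauthors

# e.g. five submissions a year, three reviews consumed each,
# burden shared with one other senior co-author:
print(reviews_owed(5))  # 7.5 reviews per year
```

Under these assumed numbers, a moderately productive researcher would owe on the order of seven or eight reviews a year – more if most co-authors are students who are not yet part of the reviewing pool.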
What about repeat invites?
We also found that a reviewer who accepted an invitation to review was much more likely to accept a second invitation – 51% of recent past reviewers were willing to review again. This is useful information for editors, letting them know that past reviewers are willing to review more papers for the journal and are not avoiding a second invitation to review. Our study also shows that those who were sent an invitation but did not accept it only accepted a second invitation 21% of the time, so it is not so worthwhile to send a second invitation to someone who recently declined or did not respond to an invitation.
What countries are the reviewers from?
While there are advantages for editors to target prospective reviewers who are more likely to accept, the downside of this strategy is that it risks cultivating a band of core reviewers who might not represent the diversity of researchers in a field. Indeed, the vast majority of reviewers in our sample were from wealthy, English-speaking countries; 60% of reviewers were from just four countries – the UK, Australia, the US and Canada. In contrast, other populous but less affluent countries, such as India, China, Indonesia and Nigeria, contributed less than 1% of reviewers each, despite their great biological diversity, large populations and large numbers of universities. Interestingly, the reviewers most likely to accept invitations were from China (67% of invitations accepted), Portugal (54%) and India (54%), in contrast with the overall 37% acceptance rate. The lack of cultural diversity among reviewers is a widespread issue in reviewing and is similarly reflected in the low percentage of published conservation biology papers from countries such as India and China.
It seems evident that editors need to make a greater effort to invite reviewers from under-represented countries. Editors also need to encourage authors from these countries to submit their papers to international journals, and perhaps provide authors with more editorial support in preparing their papers for publication.
How difficult is it to find reviewers?
Our study also highlighted many positive aspects of the review process. Of the people who responded to an invitation, half of the responses were received in 3.5 days, and 81% within one week. So the initial phase of the review process is fast.
The review process also has a surprisingly high success rate, with 92% of reviews turned in by reviewers following acceptance, and 80% completed within 4 weeks; so once a reviewer accepts an invitation, the reviewer has a very high likelihood of turning in a review on time.
Given these positive features, why do journal editors often worry about the future of the peer review process, and complain about finding enough reviewers? It turns out that on average, editors need to invite 6 to 7 reviewers to get 2 to 3 acceptances, but for about one in eight papers, editors have to invite 10 or more reviewers to get enough acceptances. It may be that editors tend to focus on these problem papers rather than on the more typical papers. Perhaps seeking more reviewers from outside of a few wealthy, English-speaking countries could help improve diversity while reducing the workloads of editors.
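The 6-to-7 figure follows directly from the overall acceptance rate. If each invitation is treated as an independent coin flip with a 37% chance of acceptance (a simplifying assumption of ours, not a model from the study), the expected number of invitations needed to secure k acceptances is just k divided by the acceptance rate:

```python
def expected_invitations(acceptances_needed, acceptance_rate):
    """Expected number of invitations needed to collect the required
    acceptances, treating each invitation as an independent Bernoulli
    trial (the negative-binomial expectation, k / p)."""
    return acceptances_needed / acceptance_rate

# At the journal-wide 37% acceptance rate, securing 2 or 3 reviewers takes:
print(round(expected_invitations(2, 0.37), 1))  # 5.4
print(round(expected_invitations(3, 0.37), 1))  # 8.1
```

So roughly 5 to 8 invitations for 2 to 3 acceptances – consistent with the 6 to 7 that editors report on average, with the one-in-eight problem papers pulling well above that.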
Read the study
Elsevier has made this article freely available for six months, until September 21, 2017:
- Primack RB, Maron M, Campos-Arceiz A: Who are our reviewers and how do they review? The profile and work of Biological Conservation reviewers. Biological Conservation, in press (2017)
- Campos-Arceiz A, Primack RB, Koh LP: Reviewer recommendations and editors' decisions for a conservation journal: Is it just a crapshoot? And do Chinese authors get a fair shot? Biological Conservation 186: 22-27 (2015)
- Didham RK, Leather SR, Basset Y: Don't be a zero-sum reviewer. Insect Conservation and Diversity, 10: 1–4. doi:10.1111/icad.1220 (2017)
- Kearney MH, Baggs JG, Broom ME, Dougherty MC, Freda MC: Experience, time investment, and motivators of nursing journal peer reviewers. J. Nurs. Scholarsh. 40, 395–400 (2008)
- Lerback J, Hanson B: Journals invite too few women to referee. Nature 541: 455–457 (2017)
- Primack RB, Ellwood E, Miller-Rushing AJ, Marrs R, Mulligan A: Do gender, nationality, or academic age affect review decisions? An analysis of submissions to the journal Biological Conservation. Biological Conservation 142, 2415–2418. (2009)
- Souder L: The ethics of scholarly peer review: a review of the literature. Learned Publishing 24 (1), 55–72 (2011)
- Tite L, Schroter S: Why do peer reviewers decline to review? A survey. Journal of Epidemiology & Community Health 61, 9–12 (2007)