Improving the review times on our journal

How small changes can influence reviewer behavior

Raj Chetty, PhD, is the Bloomberg Professor of Economics in the Department of Economics at Harvard University and Editor-in-Chief of the Journal of Public Economics.

Together with two colleagues - Emmanuel Saez, E. Morris Cox Professor of Economics and Director of the Center for Equitable Growth at the University of California Berkeley, and László Sándor, a PhD candidate in Economics at Harvard University - Chetty ran an experiment evaluating the effects of cash incentives, social incentives, and nudges on the behavior of referees at the Journal of Public Economics. The interesting, and sometimes surprising, results of their study will be published this summer in the Journal of Economic Perspectives. Here Chetty talks about the rationale behind the trial and lessons learnt.

Since I took on the role of Editor-in-Chief, I have been interested in how we can better serve authors and improve the review times on our journal. And, as an author myself, I would love to see my papers reviewed more quickly.

To design the trial, we began by reading the literature and thinking about what might prove effective in motivating reviewers to submit high-quality reports more quickly. We formulated three interventions.

First, naturally, as economists, we thought that paying people might prove an effective incentive. But psychologists suggest that payment can crowd out “intrinsic motivation” and actually lead to worse performance. So these contradictory hypotheses seemed very natural to test.

Second, the psychology literature suggests that simple nudges and reminders can affect people’s behavior, so we decided to try changing the deadline by which reports were due.

Third, sociologists have suggested that social incentives – namely, how people are perceived by their peers – may be a key determinant of behavior.

To test these hypotheses, we randomly assigned referees to four groups:

  • Group one was the control and participants had a six-week deadline to submit their reports.
  • Group two was given only four weeks to provide their reports.
  • Group three also had only four weeks and if they met that deadline they received $100.
  • Group four was offered a social incentive, i.e. we informed them that their turnaround times would be publicly posted.

In total, the experiment included 1,500 referees who submitted nearly 2,500 reports from February 2010 to October 2011.
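The random assignment described above can be sketched in a few lines. This is a minimal illustration, not the authors' actual procedure: the arm labels, function name, and balanced round-robin scheme are my own assumptions for the sketch.

```python
import random

def assign_referees(referee_ids, arms, seed=0):
    """Randomly assign referees to experimental arms.

    Shuffles the referee list with a seeded RNG, then deals referees
    to arms round-robin so group sizes differ by at most one.
    """
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    ids = list(referee_ids)
    rng.shuffle(ids)
    return {ref: arms[i % len(arms)] for i, ref in enumerate(ids)}

# Hypothetical arm labels mirroring the four groups in the trial.
arms = ["control-6wk", "4wk-deadline", "4wk-plus-100usd", "social-incentive"]
assignment = assign_referees(range(1500), arms)
```

With 1,500 referees and four arms, this balanced scheme puts exactly 375 referees in each group; a real trial might instead stratify by referee characteristics (e.g., tenure status) before randomizing.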

I should mention that all the interventions we tested have been used by other journals, but until now there has never really been a clear examination of which factors work best.

Key takeaways

First, a change in timeframe is very effective: if you shorten the deadline by two weeks, you receive reviews two weeks earlier on average. In fact, we noticed that whatever timeframe you give, most people submit their review just prior to the deadline. Editors might worry that if you ask reviewers to review more quickly, they will submit lower-quality reviews. However, we found no significant change in the quality of referee reports, as judged, for instance, by the editor’s propensity to follow the advice in the report.

Second, if a journal has the money available, cash incentives also work very well. The $100 payment reduced review times by about 10 days on average. Hence, it is clear that the “crowd-out of intrinsic motivation” that psychologists have warned about is not a serious problem in this context.

Third, the social incentive was less effective but still surprisingly successful in reducing review times, particularly with tenured professors, who were less sensitive to cash and deadlines.  This confirms that people care about how they are perceived and suggests that gentle, personalized reminders from editors could be very effective in improving referee performance.

Overall, my biggest takeaway was that, as editors, we shouldn’t believe that the performance of our journals is something we can’t change.  We can greatly improve the quality of our journals’ review process through simple policy changes and active editorial management.

Personally, I was surprised by how effective the shorter deadline was. There was no consequence for reviewers who didn’t meet it, yet they were still very receptive. The advantage for journals is that this approach is cost-free. I would probably be less responsive to the cash incentive myself, so I was also quite surprised by how successful that proved to be. Then again, if I have to do something anyway, and by doing it today I get $100, then perhaps it’s not so surprising that it has some effect.

Going forward, we would love to see other journals adopt some of these policies. And for reviewers, I would suggest that what is most useful to editors, especially if you are going to recommend rejection, is a short, clear, and on-time report, rather than a more detailed one that takes longer to draft. By focusing on the big picture, you not only save yourself time but also better serve editors and the author community.

Author biography

Chetty's research combines empirical evidence and economic theory to help design more effective government policies. His work on tax policy, unemployment insurance, and education has been widely cited in media outlets and Congressional testimony.

Chetty was recently awarded a MacArthur "Genius" Fellowship and the John Bates Clark medal, given by the American Economic Association to the best American economist under age 40. He received his PhD from Harvard in 2003 at the age of 23 and is one of the youngest tenured professors in the university's history.

Archived comments

Donald Myers says: May 15, 2014 at 7:32 pm
One aspect you did not touch on is the quality of the reviews, both from the perspective of the editor(s) and from the perspective of the author(s). Clearly it is easier and quicker for a reviewer to simply do a quick appraisal and write a review that does not include any information from the reviewer as to why/how they reached their conclusions.

A second point is that I review for journals across multiple disciplines, and the various editors have little idea how many manuscripts I am sent or how often. I have found that when I get a manuscript it seems to attract more, i.e. review requests come in bunches.

Finally, reviewer time often depends on the “season”. I am retired and so free of the usual academic schedule but still affected by holidays, travel and even health factors.

In summary, I agree that reviewers need to be quick in doing their work, but I think your experiment leaves out a lot of unaccounted-for factors that can have a significant effect on how quickly reviews get done.

I also find that since manuscripts are almost exclusively in English, many by authors whose command of English is not strong, the time it takes to complete a review is substantially affected by the quality of the English in the manuscript.
