Scientific review is at the core of both the publishing [of research] and the decision of what work gets funded. And yet in any room of academics, it probably attracts more attention, more excitement and more disagreement than anything I know.
With those words, Dr. Lesley Thompson introduced the Scientific Review panel she moderated at Elsevier’s Research Funders Summit last November. She is well known to both worlds – funding and publishing – having spent 26 years overseeing research at the Engineering and Physical Sciences Research Council (EPSRC) before joining Elsevier four years ago as VP of Academic Relations.
Top issues facing NIH peer review
In her presentation on the top issues facing NIH peer review, Dr. Sally Amero, Review Policy Officer and Extramural Research Integrity Liaison Officer for the National Institutes of Health (NIH), explained a number of ways people have breached review integrity:
Some of the things we’re dealing with now include threats and bribes to or against reviewers … cabals and networks of people across the country who are looking out for each other’s welfare, embellished bio sketches, reciprocal and requested favors. Incomplete conflict of interest certifications are becoming more and more of a concern. Leaks of information before the meeting, inappropriate access to our secure review site, and applications being shared outside of our review meeting.
“The reason this is so critically important,” Dr. Amero said, “is that patients are at stake; so if we make a mistake in peer review and a study gets funded and a patient gets hurt, then that’s really on us.”
Working with their lawyers, she said, they have developed a repertoire of ways to deal with these issues, including interviewing witnesses, notifying institutional officials, withholding or terminating an award, pursuing suspension or debarment and terminating a reviewer’s term of service.
How to find reviewers
Elsevier's Expert Lookup helps funding organizations identify scientific reviewers for grant applications across disciplines and from around the world. The online tool suggests researchers who meet organizations’ funding priorities while identifying potential conflicts of interest around co-authorship and funding streams. It's powered by Elsevier’s Fingerprint Engine, whose algorithms draw on Scopus’s citation database of over 75 million records and 10 discipline-specific taxonomies.
Safeguarding peer review in the UK
Dr. Neil Viner spoke of similar challenges as Director of Programme Delivery for one of the UK’s largest funding organizations: the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation.
Peer review remains the “gold standard,” he said, assuring government, citizens and the research community that public money is being allocated effectively. But it faces major challenges:
- Sustainability, with increasing competition for scarce funds and reviewer fatigue.
- Low success rates, which risk conservatism.
- Funding requirements that are becoming more complex and driven by increasing time pressures.
- An increasing volume of interdisciplinary and collaborative research, making it necessary to find interdisciplinary reviewers.
- Unconscious and systemic bias.
In order to maintain the trust of citizens, government and researchers, he said, we must reduce the burden on applicants and reviewers, leverage technology to make peer review more accountable, transparent and faster, and ensure we fund the highest quality research that benefits society.
He said they were looking into ways to use technology to make the system faster and more efficient – while being mindful of the limitations of technology.
“We have to manage people’s enthusiasms for technology to take away all their problems and manage the unintended consequences of system design,” he said, showing a slide of a recent news item in which Amazon discovered that its new recruiting engine was biased against women.
“If Amazon don’t understand how AI works, what hope have we as mere research funders got to deploy artificial intelligence or any form of data into our work?” he said, drawing laughter from the audience.
Humor aside, the presence of gender bias in automated technology has also been an issue for the UK research councils. Dr. Viner said EPSRC reviews the occurrence of unconscious bias in peer review, sharing the key findings of one study:
We have a keyword-driven system to select reviewers, but then we found that women, for whatever reason, tend to use more precise and fewer keywords to describe their expertise, whereas men tend to be far more expansive, claim a much wider range of knowledge, and use more keywords. And in a keyword-driven system, what will happen then is that male reviewers are being returned more often, with the result that we had a great bunch of female reviewers who weren’t being utilized as much as the men.
So that’s how easy it is to get it wrong and design biases into your own system without even trying.
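The failure mode Dr. Viner describes can be sketched in a few lines of Python. This is a deliberately naive keyword-overlap matcher with hypothetical reviewer profiles, not EPSRC's actual system: a reviewer who lists more keywords overlaps with more proposals and so is returned more often, regardless of the depth of their expertise.

```python
# Hypothetical reviewer profiles: one precise, one expansive (illustrative only).
reviewers = {
    "reviewer_a": {"photonics"},                                 # few, precise keywords
    "reviewer_b": {"photonics", "optics", "lasers", "sensors"},  # many, broad keywords
}

# Hypothetical proposals, each tagged with topic keywords.
proposals = [{"optics"}, {"lasers"}, {"photonics"}, {"sensors"}]

def match(proposal_keywords, reviewer_profiles):
    """Return every reviewer whose keyword set overlaps the proposal's topics."""
    return [name for name, keywords in reviewer_profiles.items()
            if keywords & proposal_keywords]

# Count how often each reviewer is suggested across all proposals.
counts = {name: 0 for name in reviewers}
for topics in proposals:
    for name in match(topics, reviewers):
        counts[name] += 1

print(counts)  # reviewer_b matches every proposal; reviewer_a matches only one
```

The bias here is purely structural: nothing in the matcher considers gender, yet the reviewer with the broader keyword list dominates the results, which is the pattern EPSRC observed.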
Dr. Viner said EPSRC has taken various approaches to safeguard peer review. Changes include:
- Mitigating the contrast effect by avoiding comparative scoring in panels.
- Reducing time pressure in panel meetings and interviews by reducing workload of assessors.
- Improving the quality and clarity of assessment criteria and interview questions.
- Implementing a training programme for EPSRC staff and panel members to raise awareness of, and challenge, bias.
- Including ‘fairness’ as one of EPSRC’s peer review principles.
- Leading a UK Research and Innovation (UKRI) working group involving 8 UK universities to build the evidence base, share ideas and best practice, and develop plans for further interventions.
“Governance of research, by researchers for researchers …”
Dr. Thompson closed this session by sharing an inspiration she received from touring the historic sites of Washington, DC, the day before.
I was really taken at the Lincoln Memorial by his Gettysburg Address and its ‘Government of the people, by the people, for the people.’ I think scientific review should be governance of research, by researchers, for researchers – or for society.