Q&A: What’s on the horizon for research evaluation in Europe?
3 February 2022
By Federica Rosetta, Zoë Genova
Showing the societal impact of research is crucial for funding and policy; Elsevier’s Dr Andrew Plume talks about recommendations from the European Commission
Dr Andrew Plume, VP of Research Evaluation and President of the International Center for the Study of Research (ICSR) at Elsevier, reflects on recommendations in reports by Elsevier–Science|Business and the European Commission. Scientific research is crucial for advancing societal progress, and providing the appropriate funding, infrastructure and people is necessary to form a healthy, innovative research ecosystem. To promote such high-quality research, the question of how impact is measured and assessed must be explored. With Horizon Europe underway, the European Commission and other leading research and innovation funders can focus on the collective impact science and scientists will make on key policy goals over the next decade, and on how to assess that impact. Meanwhile, the European Council has called for a more qualitative approach to understanding “what researchers deliver and how” in the European Research Area, placing more emphasis on talent, diversity, skills and contributions to societal progress.
In October 2021, 23 senior leaders and experts from across the European research and innovation spectrum convened in a virtual roundtable, hosted by Science|Business and Elsevier, to debate the true value of science to society and develop concrete recommendations for action. Speakers explored what models, indicators and metrics mean for researchers and institutions, and the conditions needed to evaluate future scientific work.
The outcome of the roundtable is a set of takeaways and recommendations in a report titled Point of Impact: What is the true value of science to society? The report conveys the complexity of the issue and a variety of perspectives.
The recommendations fit well with the latest developments around multi-stakeholder engagement to advance research assessment, as proposed by the European Commission in November 2021. In its scoping report, Towards a reform of the research assessment system, the Commission proposed a coalition approach by research organizations, funders and national/regional assessment authorities to facilitate and accelerate changes to research assessment. Recently, we interviewed Dr Plume about his reflections on the Commission’s scoping report and the points that emerged during the Elsevier–Science|Business roundtable and in the subsequent report.
Q&A with Dr Andrew Plume
Publishers, represented by the International Association of Scientific, Technical and Medical Publishers (STM), contributed to the consultation meetings that resulted in the Commission’s scoping report. In your view, how can publishers concretely help advance the report’s principles of quality, impact and diversity?
First, we welcome the Commission’s initiative to reform research assessment. As publishers, we support their multi-stakeholder approach to advance discussions and to form a coalition to work toward change in the research assessment system.
Elsevier is experimenting in all three principles of quality, impact and diversity. Homing in on the diversity component as an example, Elsevier is taking action to create a more inclusive research community. However, we can do even more in terms of implementing ‘nudges’ into our solutions and products that promote representation and reduce unconscious bias.
We have been collaborating with Queensland University of Technology to create a fairer and more equitable way of thinking about academic recruitment. This work reveals in real time how the gender diversity of a pool of candidates is affected when applying filters for traditional indicators of career success, such as publications and citations, versus indicators that nurture the next generation of research talent. Such indicators facilitate fair and unbiased recruitment, promotion and rewards that reflect the roles of the modern researcher.
One takeaway of the roundtable relates to the complexity of the impact of research. It requires a nuanced understanding of what goes into creating good research as well as an appreciation of what the outputs of good research look like. With this in mind, how can Elsevier help navigate this complex space?
This was a big topic in Elsevier’s own funder workshop series in late 2021. In this series, we discussed and identified approaches and actions that could be taken in the research ecosystem to improve the ability to demonstrate impact. We plan to take this forward into fresh research in 2022.
Already, ICSR is collaborating with the University of Tasmania to develop and deploy a 5-step approach to thinking about impact. This approach aligns societal needs with university research strengths to identify ways to prioritize and then amplify impact. The model is as follows:
Frame the societal problems to solve.
Calibrate the importance to places and people.
Assess the ability to influence.
Select strategic priorities.
Build initiatives and measure change.
This model provides a chain of logic for amplifying societal impact through a consistent and replicable approach. It also highlights that partnerships with societal stakeholders are key at every stage and ultimately enable the university to demonstrate impact.
Another takeaway is that too much evaluation and limited criteria can discourage risky research and therefore inhibit groundbreaking innovation. How can assessment be modified to both encourage innovative research and cater to the limits of researcher funding and career progression?
Current career incentives remain predicated on the idea of the ‘lone genius’ researcher, while in practice, research is increasingly a “team sport.” ICSR is working with researchers in China and Norway to develop a way of sharing the credit for research publications across co-authors. This would both recognize researchers’ complex and overlapping contributions within a team and highlight that those contributions can produce research outputs that add up to more than the sum of their parts. Prof Lin Zhang of Wuhan University in China and Dr Gunnar Sivertsen of the Nordic Institute for Studies in Innovation, Research and Education in Norway are working with the ICSR research team on a project to validate a fairer indicator of author contributions to publications. By building on the fractionalized approach to counting publications proposed by Prof Zhang, Dr Sivertsen and Prof Ronald Rousseau of the Department of Management, Strategy and Innovation at KU Leuven, we hope to assess the scalability and desirability of a new metric to support the credibility and utility of research evaluation.
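To make the idea of fractionalized counting concrete, here is a minimal sketch in Python. It assumes the simplest common variant, in which each publication carries one unit of credit split equally among its co-authors; the indicator actually under study by Prof Zhang, Dr Sivertsen and Prof Rousseau may weight contributions differently (for example, by author position or stated roles), and the author names below are purely illustrative.

```python
from collections import defaultdict

def fractional_credit(publications):
    """Split one unit of credit per publication equally among co-authors.

    `publications` is a list of author lists, one per paper. This is an
    illustrative equal-split variant of fractional counting, not the
    specific indicator being validated by the ICSR project.
    """
    credit = defaultdict(float)
    for authors in publications:
        share = 1.0 / len(authors)  # each co-author gets an equal share
        for author in authors:
            credit[author] += share
    return dict(credit)

# Hypothetical example: three papers with overlapping author teams
papers = [
    ["A", "B"],        # two co-authors: 0.5 credit each
    ["A", "B", "C"],   # three co-authors: 1/3 credit each
    ["C"],             # sole author: full credit
]
print(fractional_credit(papers))
```

Under this scheme the total credit distributed always equals the number of papers, so a heavily co-authored publication no longer counts as a full paper for every author, which is the distortion that whole counting introduces for team-based research.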
While improving indicators for assessment, this work also opens opportunities for inclusion and diversity in the community by expanding what it means to share credit and contribute to research.