New center aims to create a more transparent approach to research assessment

At Elsevier’s International Center for the Study of Research, experts will examine research using metrics and other qualitative and quantitative methods

Image from the website of the International Center for the Study of Research.

How do you measure the impact of research? It’s not an easy question to answer, but it’s one that academics are increasingly faced with. Whether it’s part of a performance-based funding system that will determine how much money your institution receives – like the UK’s Research Excellence Framework – or part of your career development plan or a research bid, success means being able to show that what you did was worth doing.

To help create a more transparent approach to research assessment, Elsevier has launched the International Center for the Study of Research (ICSR). Its mission is to encourage the examination of research using an array of metrics and a variety of qualitative and quantitative methods.

Dr. Andrew Plume, Senior Director of Research Evaluation at Elsevier and Chair of the ICSR Advisory Board, explained:

There are increasing calls for public understanding of and engagement with research – and to understand the return on investment for research that’s paid for from the public purse. But we cannot measure and understand that which we cannot describe accurately. So the center wants to participate – humbly because many people have tried to tackle this problem, and we don’t pretend that we have all the answers – using Elsevier’s unique perspective as both a publisher and an information analytics provider with vast data sets from which we can share findings.

Dr. Éric Archambault, Head of the ICSR, General Manager of 1science and founder of Science-Metrix, which was acquired by Elsevier in December, added:

The evaluation of research has always been heavily debated. Our aim with this center is to bring coherence to these discussions and provide progressive momentum and a critical examination of evaluation approaches and metrics that hold value both for researchers and for the public more broadly. The creation of the ICSR will advance the study of research information and evaluation, ensure the appropriate use of performance indicators and promote evaluation best practices.

Some of the outputs researchers and research managers can expect from the ICSR will be practical: technical developments in the way research metrics are presented.

“We’re going to have a lot of technical discussions about the appropriate way of counting publications and related indicators in journals and books and other outputs,” Andrew said. “One of the things we’d expect the center to influence directly is the way metrics are displayed and presented in context – in the platforms and solutions Elsevier provides. So we will look at what’s the most useful way to present metrics for someone in their Scopus profile page, or in Mendeley or Pure, for example.

"We’ll also be looking at how those numbers are created, so when – for example – you go to ScienceDirect and see the PlumX metrics around your paper, those numbers will be as accurate and clearly defined as they can possibly be. We’re in this to make positive changes.”

To help steer those positive changes, the center will have an Advisory Board populated by external experts in research, research evaluation, policy and research management, as well as representatives from Elsevier. There will be 16 board members representing all continents, with no more than four Elsevier officials serving at a time.

The ICSR Advisory Board

Currently confirmed members are:

“These experts will address how we most appropriately reflect the enterprise of research and evaluate whether it’s producing the knowledge that’s having an appropriate and positive impact on human society,” Andrew explained. “That’s where everyone would like us to get to.

“Societal impact is talked about a lot in research evaluation circles, but it’s a slippery concept and there’s no universally agreed definition. We may not come up with one, but what we do want to do is operationalize what can be thought of as societal impact, and then approximate that through metrics.

“How do you draw causal links between published research outputs and the ultimate downstream impact? We want to be part of the conversation that is looking for that solution.”

The Advisory Board will be at arm’s length from Elsevier and the ICSR, at what Andrew describes as a respectful distance and with mutually non-binding obligations. The board will have full academic and professional freedoms, meaning that should it disagree with a metric, process, or guideline enacted by the ICSR, it will be free to publish its own recommendations.

“We absolutely don’t want to suppress informed dissent,” Andrew said. “In fact, the point of the board is to bring these different perspectives so that we can have a constructive discussion.”

The Advisory Board has been formed to include a geographically diverse range of voices, reflecting the fact that there is no single “research culture” but rather many cultures around the world. Concepts of research metrics, evaluation, policy, and so forth will be rooted in those cultures and vary accordingly, said board member Holly J. Falk-Krzesinski, VP of Research Intelligence at Elsevier:

We can’t examine these things in isolation – we have to consider them in the context of the prevailing research cultures, both regional and disciplinary. Certainly, across the world, these research cultures are not without challenges – there are questions over career advancement incentives, gender diversity and representation of minorities. There are questions over the appropriate balance between curiosity-led research and research with a practical application. In much of the world, we have phrases like ‘publish or perish’; is that healthy? Are hyper-competitive research cultures holding back advancements to open science because we’re not seeing widespread sharing of research data or early sharing of results through preprints?

These aren’t, strictly speaking, research metrics problems – but they bear directly on them, and understanding the modern enterprise of research is at the core of the center’s work.
