Editor’s note: This month, Elsevier Connect is exploring “the personalization of technology in science and health.” Here, research managers from Queen’s University Belfast, Northern Ireland, talk about how they used SciVal to get personalized insights to improve research performance.
For a research-intensive institution like Queen’s University Belfast, publications are a fundamental part of the work being done. As well as helping scientists disseminate their findings, they attract vital funding and high-quality students and help deliver broad economic, social and environmental impact.
But not all publications are created equal – especially when it comes to the world university ranking exercises, such as those produced by QS and Times Higher Education. The research management team at Queen’s noticed that despite being ahead of their peers on growth in output, they were falling behind in citations – which accounts for a large portion of the scores used in the rankings. What was happening, and how could they turn it around?
During a 3-year collaboration, Queen’s University Belfast worked with Elsevier’s Research Metrics team to undertake a deep-dive analysis of the university’s published research across several schools. They looked at the citation impact and quantity of publications as well as the proportion of international collaborations. The results not only gave them insights into how the university compares to its peers; they also provided the evidence that helped change the culture and conversations around research metrics, supporting the university’s new publication strategy and research ambition.
In the webinar Understanding and Utilizing Publication Metrics to Enhance Research Performance, held on June 19, Scott Rutherford, Director of Research and Enterprise at Queen’s University Belfast, and his colleagues shared the story with more than 1,100 viewers, explaining how they and the Elsevier team worked to produce tailored data and uncover the trends underpinning the university’s research output.
Understanding research rankings
Queen’s knew its publication record was strong but hadn’t seen the progress in citation scores needed to match its significant aspirations. Indeed, despite experiencing faster growth in publication output, Queen’s was lagging behind its peers (a selection of leading universities in the UK’s Russell Group of research-intensive institutions) in citation performance, which was affecting the university’s position in the global university rankings.
Queen’s wanted to understand how it could improve its performance relative to those peers, as Dr. Jonathan Greer, Research Information Manager at Queen’s, explained:
We were below where we wanted to be in the university rankings, and it’s difficult to change that proactively. We knew from other national research assessment exercises in the UK that our publication output was relatively good, but why was it not being reflected in citation measures? We wanted to find out if the university was missing a trick and if there was something we could do to improve.
By 2012, Queen’s was looking for a way to dig into its publication record to address some of these questions, and Elsevier was developing SciVal – a tool that every user can personalize to provide the insights about research performance they are looking for. After initial conversations, the two organizations decided to collaborate to see if both could learn from an in-depth 3-year program.
Rutherford is responsible for strategic oversight and leadership for the research development and knowledge exchange activities of the university. He initiated the project, aiming to answer a number of questions, which he outlined in the webinar:
We came up with a series of questions that were fundamental in this partnership with Elsevier. The first was: why weren’t we seeing an increase in citation performance? The second: is there a large gap in the quality of output at Queen’s in relation to our peer group? The third: what influences citation activity and behaviour? And, most importantly, the fourth: what informed actions might we introduce as an institution to enhance our performance in terms of publications and citation levels?
Personalizing the data
Rutherford and his colleagues worked with Elsevier’s Chief Academic Officer, Dr. Nick Fowler, and the Research Metrics team to develop a program. Then they held a series of workshops to present and discuss the metrics and position of Queen's relative to their peers with selected academics from various schools. This was helpful for identifying who to work with in the next phase of the program, Dr. Greer said:
To begin with, we worked with the academics who were most interested. We then invited them to volunteer for the deep dive; at that stage, it wasn’t a case of us picking who we thought would be interested, but rather it was those who were keen who volunteered. When information started to come out, it was easier to engage more people, especially when other academics started standing up and talking about this – it opened up a pathway to roll it out.
The team created a database of publications from the bottom up, covering their research output between 2008 and 2012. Reflecting the university’s ambition, they chose leading peer institutions from around the world to compare their performance against and collected their data. This information could then be used to compare Queen’s to its peers in a number of dimensions, such as the rate of citations, the extent of international collaborations and the journal publication patterns.
The data could also be more deeply contextualized: the team used SciVal to build clusters of individuals working within a school and asked the academic leads whom they wanted to benchmark these teams against. They then built similar structures for peer organizations, creating a customized research group structure.
This personalization is integral to the project – whoever is using the platform can customize the information they work with, whether it’s at an individual, departmental or university level. Dr. Lisa Colledge, Elsevier’s Director of Research Metrics, who worked on the project, explained:
SciVal has been built so it can be personalized by anyone who’s using it. What an academic needs will be different from what a research manager needs, and they will have different ways of using it. They each have an account to customize, and there’s a whole menu of metrics they can choose from.
Answering questions with data
Some clear pictures emerged that helped the Queen’s team understand their performance and make plans to improve it. Their first finding was that their 5-year average citation count was good, but that figure was masking one particular problem: 16 percent of the papers had not been cited at all. For academic lead Professor Theresa McCormack of the School of Psychology, this has serious implications: “That science is disappearing off the radar,” she said in the webinar.
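This kind of masking effect is easy to reproduce: a respectable average citation count says nothing about the share of papers that were never cited at all. A minimal sketch with invented citation counts (not Queen’s data) makes the point:

```python
# Illustrative sketch with made-up citation counts: a healthy-looking
# average can hide a sizeable share of papers that were never cited.
citations = [0, 0, 14, 9, 0, 22, 11, 0, 7, 18, 0, 5, 10, 0, 13, 8, 0, 6, 12, 0]

average = sum(citations) / len(citations)          # mean citations per paper
uncited_share = citations.count(0) / len(citations)  # fraction with zero citations

print(f"average citations per paper: {average:.1f}")
print(f"share never cited: {uncited_share:.0%}")
```

In this toy set the mean looks healthy while 40 percent of the papers are invisible in citation terms, which is exactly why the Queen’s team looked beyond the 5-year average.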
Queen’s was lagging behind its peers in terms of citation growth, and the analysis showed this may be due to the journals its scientists were publishing in – they had a lower proportion of publications in the highest citation impact journals compared to their peers. Prof. McCormack continued:
Staff at Queen’s tended to be focused on publishing in subject-specific empirical journals – the sort of outputs that are particularly valued in the national research assessment exercise in the UK, known as the Research Excellence Framework (REF). This analysis suggested we have to think bigger than the REF exercise if we want to increase the impact of our work. Researchers should also be aiming to have some of their publications in high profile review and interdisciplinary journals.
They also looked at international co-authorship. While the overall level of international collaboration was relatively high, diving deeper they discovered it was mostly with European authors, and Queen’s had far fewer collaborations with institutions in North America than its peers did.
Changing the culture
By sharing and tailoring this information, the research management team at Queen’s has been able to engage a growing number of researchers, leading to a culture change across the university, as Rutherford described in the webinar:
In many ways, our most fundamental achievement has been to move the conversation on from indifference and scepticism to one where we acknowledge that there is a role for citations and that there are important ways of using data to understand your performance in relation to your peers.
Part of this comes from a better understanding of the metrics available and what they can show. Not only are researchers more aware of citation metrics and how they can inform decisions, but also how they work in combination with other metrics. In particular, the program highlighted the importance of traditional publication metrics, Dr. Colledge said:
This project has proven to us that just because new metrics become available, it doesn’t make traditional metrics obsolete. Citation metrics have been around for a long time, and while new metrics such as alternative metrics are needed to provide the broadest and most complete insights, there is still an important role for citation data. Using new alongside old – not instead of old – delivers the strongest outcome.
Using the results of the program, the Queen’s team developed a publication strategy to help researchers target the right journals, shifting the focus from quantity to quality. The metrics provided the evidence they need to encourage a change in behavior, as Dr. Greer explained:
There is a pressure on staff to publish. Traditionally, if a researcher completed a large project, they might try to publish four or five papers from the study in different journals. On foot of the findings from the SciVal analysis, you can suggest they take a step back and think about quality; now the thinking is to aim high and publish fewer papers in quality journals. This program helped bring these issues to the fore, open discussions and raise awareness.
The result? The university has already seen higher levels of publications in top-tier journals, continued development of international networks and collaborations and a higher field-weighted citation impact (FWCI) compared to many of its peer institutions. (FWCI is a normalized indicator of research impact that accounts for differences in citation levels across fields, article types and publication ages.)
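For readers unfamiliar with the indicator, FWCI can be sketched as a ratio of actual to expected citations. The following is a hedged illustration with invented baseline values; the real SciVal calculation derives its expected-citation baselines from world averages per field, publication type and publication year:

```python
# Simplified sketch of a field-weighted citation impact (FWCI) calculation.
# The "expected" values below are invented for illustration only.

def paper_fwci(citations, expected_citations):
    """FWCI of one paper: actual citations divided by the average
    citations of comparable papers (same field, type and age)."""
    return citations / expected_citations

def group_fwci(papers):
    """FWCI of a group of papers: the mean of their individual FWCIs."""
    return sum(paper_fwci(c, e) for c, e in papers) / len(papers)

# (citations, expected citations for that field/type/year) -- made-up data
papers = [(12, 6.0), (0, 4.0), (9, 6.0)]
print(f"group FWCI: {group_fwci(papers):.2f}")  # values above 1.0 are above world average
```

A group FWCI of exactly 1.0 would mean the papers are cited precisely at the world average for their fields, which is what makes the indicator comparable across disciplines.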
As a result, the university is also narrowing the gap in citation measures with its peers, including the citation scores published in the global university rankings. With increased awareness of performance measures and their use in the overall research strategy, the team has been able to make proactive changes to performance.
Although the project with Elsevier has finished, there is still a lot the Queen’s team plans to do; the data is informing strategy at the institutional level, but the real changes are made by individuals. As they continue to engage more academic staff and encourage people to use SciVal to analyze their own research, the results will continue to shape the development of the university’s research output.
Watch a video about the collaboration
The personalization of technology in science and health
For universities to improve their research performance and rise in the rankings, it helps to have metrics to benchmark their performance against that of peer institutions and discover their strengths and weaknesses. Elsevier’s SciVal gives institutions access to the research performance of 8,500 research institutions and 220 nations worldwide. To further personalize its research assessment program, Queen’s University Belfast worked with Elsevier’s Research Metrics team.
- Find more stories on the personalization of technology in science and health.
- For more stories about people and projects empowered by knowledge, we invite you to visit Empowering Knowledge.