Research performance collaborations

Researchers and research institutions are under increasing pressure to demonstrate both their academic and broader societal impact. Elsevier partners with the scientific community through programs such as Snowball Metrics to develop a basket of metrics that is comprehensive, relevant and usable at all stages of the research workflow.

Whether it’s via citations, teaching assessments, compliance scores or grant success rates, there is a growing mandate to measure every element of a researcher’s work across the entire research workflow. Universities, too, have to address national assessment exercises, rankings and the pressure to show return on investment for research programs.

Collaborative projects with teams at Elsevier are looking more deeply at ways to measure the impact of research and to optimize performance at the institutional level.

Measuring research performance

Assessing research impact in SDG 2 (Zero Hunger) and 17 (Partnerships for the Goals)

A key Sustainable Development Goal (SDG) of the United Nations is SDG 2: Zero Hunger. In an ongoing collaboration with Wageningen University and Research (WUR), the WUR-Elsevier team is looking at how data from Scopus, SciVal, Newsflo, PlumX and LexisNexis NewsDesk can help demonstrate how Wageningen University’s SDG 2 research and researchers are used and taken up by policymakers and the media. Not only will this holistic reporting show the institution’s contribution to Zero Hunger, it will also support SDG 17: Partnerships for the Goals.

Read more about the outcomes of the WUR project and see the partnership in action.

Text mining for links between research and policy

Researchers work to make an impact on the world, but does research influence legislation? The RAISE research lab at North Carolina State University used the LexisNexis HPCC supercomputing platform to mine Scopus and US legislation text to assess the correlation between research and laws.

Social metrics modelling and attribution

Research Associate Waqas Khawaja (left) in the Semantic Web group and PhD student Mohan Timilsina in the Machine Learning & Statistics group at the Insight Centre for Data Analytics, National University of Ireland, Galway.

Citations aren’t the only way to measure research impact. Researchers at the Insight Centre for Data Analytics at the National University of Ireland, Galway, looked at how online news and social media coverage could be attributed to published research, and how that attention could be weighted and modelled. Using data from Scopus and PlumX, the team developed a framework for social metrics modelling and attribution.
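To illustrate the general idea of weighting online attention, the snippet below is a minimal, hypothetical sketch of a weighted social-metrics score. The source names and weights are assumptions for illustration only, not the Insight Centre’s framework or PlumX’s methodology.

```python
# Hypothetical sketch: a weighted "social impact" score for a publication,
# combining mention counts from different online sources. The weights and
# source names are illustrative assumptions, not the Insight Centre's model.
WEIGHTS = {"news": 5.0, "blog": 2.0, "tweet": 0.25}

def social_score(mentions: dict[str, int]) -> float:
    """Weighted sum of online mentions attributed to one publication."""
    return sum(WEIGHTS.get(source, 0.0) * count for source, count in mentions.items())

# Example: mentions attributed to a single Scopus-indexed article.
print(social_score({"news": 3, "tweet": 120, "blog": 1}))  # 47.0
```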


Predicting outcomes

Evidence-based planning for research performance

Queen's University Belfast (photo © istock.com/RobertMayne)

If a university knows where it stands, it can plan its future. Queen’s University Belfast worked with Elsevier’s Research Metrics team to undertake a deep-dive analysis of the university’s published research across several schools. They looked at the citation impact and quantity of publications as well as the proportion of international collaborations. The results not only gave the university insight into how it compares with its peers; they also provided the evidence that helped change the culture and conversations around research metrics, supporting its new publication strategy and research ambitions.
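As a rough illustration of the kinds of indicators mentioned above, the sketch below computes publication counts, citations per publication and the share of internationally co-authored papers from a toy dataset. The record format and metrics are simplifying assumptions, not the methodology used in the Queen’s University Belfast analysis or in SciVal.

```python
# Hypothetical sketch: simple performance indicators from a list of publication
# records. Field names and data are illustrative, not a Scopus/SciVal export.
from dataclasses import dataclass

@dataclass
class Publication:
    year: int
    citations: int
    countries: set[str]  # affiliation countries of the co-authors

def summarize(pubs: list[Publication]) -> dict[str, float]:
    n = len(pubs)
    if n == 0:
        return {"output": 0, "citations_per_pub": 0.0, "intl_collab_share": 0.0}
    total_citations = sum(p.citations for p in pubs)
    intl = sum(1 for p in pubs if len(p.countries) > 1)
    return {
        "output": n,                               # quantity of publications
        "citations_per_pub": total_citations / n,  # crude citation impact
        "intl_collab_share": intl / n,             # proportion of international collaborations
    }

if __name__ == "__main__":
    sample = [
        Publication(2019, 12, {"GB", "DE"}),
        Publication(2020, 3, {"GB"}),
        Publication(2021, 25, {"GB", "US", "NL"}),
    ]
    print(summarize(sample))
```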

Elsevier-UC Davis Data Science Program

Left to Right: Duncan Temple Lang, Director of the Data Science Initiative; MacKenzie Smith, University Librarian; Brad Fenwick, Senior VP of Global Strategic Alliances at Elsevier; and Paul Dodd, Associate Vice Chancellor for Research. Photo credit: Kar

Universities benefit from assessing the competitive positioning of their programs. The joint Elsevier-UC Davis Data Science Program will utilize Elsevier’s support, data and tools to evaluate the effectiveness, competitiveness and impact of UC Davis’ research, teaching and other academic activities using a data-driven approach. Their goals include improving institutional effectiveness and building more effective models for interdisciplinary collaboration among researchers.

Improving the evaluation process

Can reviewers really predict the impact of an article? In collaboration with Cell Press, a team from the Crowd Innovation Lab at Harvard Business School is investigating whether there are detectable elements in a manuscript that can be used to predict the future impact of a research article. The research team will measure how key characteristics of submitted manuscripts (for example, combinatorial novelty, multidisciplinary coverage and topic currency) correlate with editorial and reviewer evaluations and with published article metrics.
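As a loose illustration of that kind of analysis, the sketch below correlates hypothetical manuscript features with reviewer scores and later citations. The feature names, data and use of Pearson correlation are illustrative assumptions, not the study’s actual design.

```python
# Hypothetical sketch: correlating manuscript characteristics with outcomes.
# The feature names and values are illustrative, not data from the study.
import math

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Each row: one submitted manuscript with illustrative scores.
manuscripts = [
    {"novelty": 0.8, "disciplines": 3, "reviewer_score": 4.5, "citations_2yr": 40},
    {"novelty": 0.2, "disciplines": 1, "reviewer_score": 3.0, "citations_2yr": 8},
    {"novelty": 0.6, "disciplines": 2, "reviewer_score": 4.0, "citations_2yr": 22},
    {"novelty": 0.4, "disciplines": 1, "reviewer_score": 3.5, "citations_2yr": 12},
]

for feature in ("novelty", "disciplines"):
    for outcome in ("reviewer_score", "citations_2yr"):
        r = pearson([m[feature] for m in manuscripts], [m[outcome] for m in manuscripts])
        print(f"corr({feature}, {outcome}) = {r:.2f}")
```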


Additional resources