Research Evaluation & Funding

A fresh look at the socio-economic impact of research

How SciVal’s new metrics can show institutions their economic impact – with recorded webinars to watch


Universities, funding bodies, governments and companies all want to know how the research they’re conducting and paying for is contributing to society. Each has its own approach to gathering and analyzing data to determine the socio-economic impact of their work.

This can be challenging. How can research metrics be used alongside qualitative input such as case studies? Which metrics are important? How can they be used effectively to provide valuable input into strategy, and to demonstrate impact to support decisions about future funding and research?


SciVal webinars with guest speakers

Update: For the release of SciVal’s new metrics, SciVal held two webinars featuring customers who were involved in beta testing the new release. Dr. Amberyn Thomas, Director of Scholarly Communication and Digitization Services at the University of Queensland, Australia, and Dr. Michelle Hutnik, Science Analyst for the Office of the Vice President at Penn State University, shared their experiences with the new metrics and their practical applications.

You can watch the webinars here.

A golden rule of using research metrics is that no metric should be used in isolation. Putting undue emphasis on any one metric can give a biased view of the socio-economic impact of research, or any other type of impact. In turn, it draws disproportionate attention to a single aspect of research, which can have a deleterious long-term effect on an institution’s – or country’s – research output.

Metrics beyond traditional citation metrics, like the number of times a publication is downloaded (“usage”) or the media coverage a piece of research has attracted, are increasingly being added to the pool, helping to recognize diverse types of excellence and reduce the risk of bias. Snowball Metrics, an initiative run by universities around the world, aims to align approaches so that institutions can benchmark themselves and get the most out of these metrics.

It can also be tricky to define “impact.” Measuring the socio-economic impact of research involves looking at partnerships with businesses, reputation in the research community and the positive effect on society beyond academia. Snowball Metrics measures partnerships in several ways, such as academic-corporate collaboration, consultancy activities and spin-outs. Altmetrics provide insights into reputation and positive effects on society, with activity in academic networks like Mendeley and social networks like Twitter being popular data sources.

Elsevier is taking a fresh look at socio-economic impact metrics and is adding insightful data sources to SciVal to help institutions navigate metrics more effectively. The new release features citations from patents to articles (patent–Scholarly Output citations). The wider availability of usage metrics throughout SciVal also supports measures of impact beyond traditional citations.

Tracking publication usage

When someone downloads or views a scholarly publication online, this generates usage data. Knowing how many times a publication has been viewed can give researchers and institutions an immediate idea of the interest in the article and the research, in advance of citations being received, providing a valuable piece of the wider impact puzzle.
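To make the idea concrete, here is a minimal sketch of how view and download events might be aggregated per publication to give that early signal of interest. The event records and identifiers below are made-up illustrations, not SciVal’s or Scopus’s actual data model.

```python
from collections import Counter

# Hypothetical usage events: each record is (publication_id, event_type).
# Illustrative sample data only -- not the actual SciVal or Scopus schema.
usage_events = [
    ("pub-001", "view"), ("pub-001", "download"), ("pub-002", "view"),
    ("pub-001", "view"), ("pub-003", "view"), ("pub-002", "download"),
]

# Count views and downloads per publication as an early signal of interest,
# available well before citations start to accumulate.
usage_counts = Counter(pub_id for pub_id, _ in usage_events)

for pub_id, count in usage_counts.most_common():
    print(f"{pub_id}: {count} usage events")
```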

In February 2015, Elsevier launched the Trends module in SciVal to provide information on emerging and declining research topics, identify the most active and top-performing countries, institutions and authors, and show which research is being viewed the most. This module included usage data from both ScienceDirect and Scopus.

In last month’s release, we extended the Scopus usage data to the whole platform, so it is also available in the Overview, Benchmarking and Collaboration modules. Users can see how many times a publication has been viewed globally, giving an earlier and more complete indication of how much interest there is in a particular piece of research.

“The Scopus views data in the new release will make it very easy to see how much usage the publications from the university are getting,” said Dr. Olga Moskaleva, Advisor to the Library Director at Saint Petersburg State University and one of the customers who beta tested the new release. “The next step is to see whether we can use this data to help improve our citation counts.”

Scopus views data used for team benchmarking purposes.

Research in innovation

So usage data is an early reflection of the interest academics have in research, but how can you find out the effect research is having on innovation?

Patents give us one picture of innovation and can show how and where knowledge and business flows from research. They can be the result of academic-corporate collaboration and may indicate the commercial application of research.

When someone files a patent, they can cite research that supports it; the citation published in the granted patent is called a patent-Scholarly Output citation. Unlike standard publication-to-publication citations, these can take much longer to register because there can be a significant time delay between the filing and granting of a patent.

These patent–Scholarly Output citations are tracked by LexisNexis, part of Elsevier’s parent company RELX Group, and the new SciVal release includes this data from five patent offices: WIPO (World), USPTO (US), EPO (Europe), JPO (Japan) and IPO (UK). The data shows the number of patents citing an institution’s publications from 1996 onwards. It also shows the number of an institution’s publications that are cited in patents. This gives institutions a clearer idea of the innovative applications of their research output, helping them plan and allocate future resources.

SciVal has added four new patent-Scholarly Output citation metrics to allow users to see how many patents are citing their publications, how much of their scholarly output has been cited and how often.
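As a rough illustration of what these metrics capture, the sketch below derives similar counts from a hypothetical set of patent-to-publication citation pairs. The patent numbers, publication identifiers and output size are assumptions made for illustration only; this is not SciVal’s or LexisNexis’s actual implementation.

```python
# Hypothetical patent-to-publication citation pairs for one institution;
# identifiers and counts are invented for illustration, not taken from LexisNexis.
patent_citations = [
    ("US-9876543", "pub-001"), ("US-9876543", "pub-002"),
    ("EP-1234567", "pub-001"), ("WO-2015-000111", "pub-004"),
]
total_scholarly_output = 250  # assumed size of the institution's publication set

citing_patents = {patent for patent, _ in patent_citations}
cited_publications = {pub for _, pub in patent_citations}

print(f"Patents citing our publications:    {len(citing_patents)}")
print(f"Publications cited in patents:      {len(cited_publications)}")
print(f"Total patent-publication citations: {len(patent_citations)}")
print(f"Share of output cited in patents:   {len(cited_publications) / total_scholarly_output:.1%}")
```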

A 5-year overview of patents citing the scholarly output published by Athena University.

Michelle Hutnik, Science Analyst for the Office of the Vice President at Penn State University, commented during beta testing:

Patent-citation metrics extend SciVal beyond scholarly output – they create a link between research and economic or commercial potential. I am looking forward to seeing how this information can be used by our tech transfer office.

As institutions, funding bodies and governments are under increasing pressure to show the socio-economic impact of their research, it’s becoming more and more important to pay attention to a wider variety of metrics. With the new release of SciVal, we’re starting to incorporate such metrics to help users understand the wider effects of the research they carry out and fund.

About SciVal

SciVal provides ready-to-use tools to analyze the world of research and to help establish, execute and evaluate the best strategies for research organizations. It’s part of the Elsevier Research Intelligence portfolio of tools and services.

Elsevier Connect Contributor

Lisa Colledge, DPhil

Dr. Lisa Colledge (@LisaColledge1) is an expert in the use of research metrics. She is responsible for developing and defining Elsevier’s research metrics strategy.

She started by working with editors and learned societies to develop strategies to improve journals’ standings. She then joined the Elsevier Research Intelligence product group, which most recently launched SciVal Trends. Lisa also represents Elsevier in Snowball Metrics, in which universities agree on methods to generate metrics, from all data sources available, to support international benchmarking.

Prior to joining Elsevier, she conducted postdoctoral research at the University of Edinburgh. She holds both a DPhil and an MA from the University of Oxford.
