
How are you using usage data to measure research impact – or what’s stopping you?

During a recent webinar, Elsevier’s Research Intelligence team conducted a short survey; here are the results.


We’ve watched with great interest the evolution of television over the past 10 years. We’ve seen standard definition (SD) pictures replaced with high definition (HD), and our lowly 2D picture enhanced to 3D – although we’ve never really got used to the glasses you have to wear to get the maximum effect.

In some ways, the progress in picture clarity and in the number of dimensions available to us can be likened to the way we now measure research. The impact of research is always multi-dimensional: whether you are looking at the performance of a paper, a person, a group, an institution or yourself, you should always look at it from more than one angle. With the introduction of additional data sources for metrics, such as usage data, we are not only improving the accuracy of our picture of research impact but also adding new dimensions of measurement.

We recently published a Usage Guidebook and an article about the usage-based metrics and tools Elsevier has made available to analyze and understand the impact of research more thoroughly.


Download the full research paper based on this article:

“A ‘basket of metrics’ – the best support for understanding journal merit,” by Lisa Colledge, PhD, and Chris James, published in European Science Editing.


Still, there is much to be learned about your experience and the insights you can gain from this resource.

Here, we share the results of a survey we conducted during our recent webinar to investigate how usage metrics are currently used and perceived. (You can view the webinar below.) About 200 external (non-Elsevier) participants joined from all over the world; the most-represented countries were the United States, United Kingdom, Russia, Italy and Canada. The results shown in this article represent the input received from external participants only.

Many of the attendees had participated in the SciVal Trends launch webinars, where we introduced usage as a new data source in SciVal. They ranged from visiting professors and vice deans to librarians and research development coordinators, but all shared a common interest in learning more about the application of usage metrics in research evaluation.

In some ways, usage data have been underpinning decisions for a long time. Librarians have tracked usage data for years to manage their collections. However, the use of usage data beyond a single institution, to benchmark research, is perceived as being relatively novel.

Our first question, therefore, focused on the current use of usage metrics.

1. How often do you currently use usage metrics?

129 participants responded. Participants could select one option.

We were surprised to find that only 23% of respondents never use usage metrics and that the highest proportion of respondents uses them monthly. Meanwhile, 5% of respondents already use usage metrics on a weekly basis.

[divider]

2. For those who answered “never” or “infrequently,” why not?

46 participants responded. Participants could select one option.

We then dug deeper into the reasons why participants might never or infrequently use usage metrics. The majority answered that they did not have access to, or did not know about, usage metrics.

Usage data are anecdotally perceived as relatively easy to manipulate, despite the clear guidelines of the industry-standard COUNTER Code of Practice in this regard (see projectcounter.org/r4/COPR4.pdf, p. 25). Perhaps the most surprising outcome of this question, then, was the very low proportion of respondents who selected ease of manipulation as their reason.
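
To make that safeguard concrete, here is a minimal sketch of the kind of double-click filtering the COUNTER Code of Practice prescribes: rapid repeat requests for the same item by the same user collapse into a single countable use. The event format and the 10- and 30-second windows are illustrative assumptions, not COUNTER’s reference implementation.

```python
from datetime import datetime, timedelta

# Assumed double-click windows (Release 4 is commonly cited as using
# roughly 10 s for HTML requests and 30 s for PDF requests).
WINDOWS = {"HTML": timedelta(seconds=10), "PDF": timedelta(seconds=30)}

def filter_double_clicks(events):
    """Drop rapid repeats. Each event is (user_id, item_id, fmt, ts),
    with ts a datetime; only the first click in a burst is counted."""
    last_seen = {}  # (user_id, item_id, fmt) -> timestamp of latest click
    counted = []
    for user_id, item_id, fmt, ts in sorted(events, key=lambda e: e[3]):
        key = (user_id, item_id, fmt)
        window = WINDOWS.get(fmt, timedelta(seconds=30))
        if key in last_seen and ts - last_seen[key] < window:
            last_seen[key] = ts  # extend the burst without counting it
            continue
        last_seen[key] = ts
        counted.append((user_id, item_id, fmt, ts))
    return counted

# Two clicks on the same (hypothetical) PDF five seconds apart count once.
t0 = datetime(2015, 6, 1, 9, 0, 0)
clicks = [("u1", "pub-123", "PDF", t0),
          ("u1", "pub-123", "PDF", t0 + timedelta(seconds=5))]
assert len(filter_double_clicks(clicks)) == 1
```

Filtering of this kind is one reason crude manipulation – repeatedly clicking the same paper – inflates COUNTER-compliant usage counts far less than the anecdote suggests.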

[divider]

3. Which of these statements is the most important reason for you to use usage metrics?

125 participants responded. Participants could select up to three options.

We described five reasons usage data provide valuable input into decision making. Participants were asked to select the three reasons they found most important for incorporating usage metrics into their analyses; only the most important reason is displayed below, since opinion was evenly divided across the options for the second and third choices.

The highest proportion of respondents, 40%, stated that research is best quantified using multiple criteria.

[divider]

4. Which usage metric would be most useful to you?

122 participants responded. Participants could select one option.

Elsevier’s tools currently offer three metrics based on usage data.

  • Views Count shows you the total views a set of publications has received. Views Count is the usage equivalent of Citation Count, which counts the total citations received from publication up to the date of the data cut.
  • Views per Publication takes into account the number of publications (the output) an entity has. To calculate Views per Publication, the metric Views Count is divided by the metric Scholarly Output. This is the usage equivalent of Citations per Publication.
  • Field-Weighted Views Impact is the usage equivalent of Field-Weighted Citation Impact. It accounts for the different levels of activity associated with publications in distinct disciplines, of different types, and published at different times. For example, the views received by a short survey published in immunology in 2013 are compared with the views of similar publications – other short surveys published in immunology in 2013 – to generate a ratio. If this ratio is 1, the short survey has average views compared with similar short surveys; if it is above 1, its views are above average; if it is below 1, its views are below average. This is done for every publication, and the average of these ratios across the whole set is the Field-Weighted Views Impact (a worked sketch follows this list).
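
To make the relationship between the three metrics concrete, here is a minimal Python sketch. The publication records and the “expected views” baselines – which, in practice, are derived from comparable publications of the same field, type and publication year – are invented for illustration:

```python
# Hypothetical publication set for one entity. "expected_views" stands in
# for the average views of similar publications (same field, type and
# year); the real figure comes from the underlying database.
pubs = [
    {"views": 120, "expected_views": 100.0},  # ratio 1.2 -> above average
    {"views": 40,  "expected_views": 80.0},   # ratio 0.5 -> below average
    {"views": 90,  "expected_views": 90.0},   # ratio 1.0 -> average
]

# Views Count: total views across the set (usage analogue of Citation Count).
views_count = sum(p["views"] for p in pubs)                     # 250

# Views per Publication: Views Count divided by Scholarly Output,
# which here is simply the number of publications in the set.
views_per_publication = views_count / len(pubs)                 # ~83.3

# Field-Weighted Views Impact: the average, over all publications, of
# actual views divided by the views expected for similar publications.
fwvi = sum(p["views"] / p["expected_views"] for p in pubs) / len(pubs)
# (1.2 + 0.5 + 1.0) / 3 = 0.9 -> slightly below the average of 1

print(views_count, round(views_per_publication, 1), round(fwvi, 2))
```

In this sketch a Field-Weighted Views Impact of exactly 1 would mean the set is viewed as often as the average for comparable publications; the 0.9 here signals slightly below-average visibility.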

All of these metrics are seen as useful, each receiving a reasonable share of the votes, but Field-Weighted Views Impact was considered to give the most valuable input.

[divider]

5. If you had access to them, how likely would you be to use usage metrics in your evaluation of research?

123 participants responded. Participants could select one option.

At the close of the webinar, we asked participants whether, bearing in mind what they had heard, they would use viewing metrics in their assessments of research excellence. The vast majority of respondents (95%) stated that they would be likely or very likely to do so.

This small survey demonstrates a real appetite for extending the view of research performance through the inclusion of usage data in the measurement of research impact – an extra dimension, but without the need for the silly glasses.

[divider]

Watch the webinar on usage metrics

Download the presentation and polling results


Download the Usage Guidebook

Elsevier’s Research Intelligence group has created a Usage Guidebook with practical advice on usage data and the appropriate use of multiple metrics when analyzing research. You can download it here for free.


Elsevier Connect Contributors

Lisa Colledge, DPhil

Dr. Lisa Colledge (@LisaColledge1) is an expert in the use of research metrics. She is responsible for developing and defining Elsevier’s research metrics strategy.

She started by working with editors and learned societies to develop strategies to improve journals’ standings. She then joined the Elsevier Research Intelligence product group, which most recently launched SciVal Trends. Lisa also represents Elsevier in Snowball Metrics, in which universities agree on methods to generate metrics, from all data sources available, to support international benchmarking.

Prior to joining Elsevier, she conducted postdoctoral research at the University of Edinburgh. She holds both a DPhil and an MA from the University of Oxford.

Chris James is Marketing Manager for Elsevier Research Intelligence, responsible for marketing and communication initiatives for SciVal and Analytical Services. He joined Elsevier in Amsterdam in 2004 and worked for four years as a Customer Marketing Manager, training customers across northern Europe on products such as ScienceDirect, Scopus and SciVal. Prior to joining Elsevier, he worked at an engineering consultancy in the UK.
