To read or not to read?
New Scopus Article Metrics can help you decide
By Michael Habib and Hans Zijlstra Posted on 7 October 2015
You’ve started reading this article, which means something in the heading above attracted you to it. You weighed its potential relevance against the time you have and decided it sounded interesting enough to at least start reading. This is, in its simplest form, what the new Scopus Article Metrics are about: helping scientists read the research that is most relevant to them within the limited time they have.
In addition to applying the same rigor to altmetrics that we applied to citation metrics, the new module was designed (and user-tested) to aid Scopus users in two primary tasks: Determining which articles to read and gaining a deeper insight into how an article (possibly one’s own?) compares with similar articles.
Benefits of Scopus Article Metrics
- Percentile benchmarks give context by comparing to similar articles based on publication date, document type, and subject area
- Community based Snowball Metrics standards
- Comprehensive citation and alternative metrics
The Scopus Article Metrics module was designed in accordance with 12 principles for the responsible use of metrics (outlined here in Elsevier’s response to HEFCE in the UK) and includes new metrics based on 4 alternative metrics categories endorsed by the Snowball Metrics project:
- Scholarly Activity — Downloads and posts in common research tools such as Mendeley and CiteULike
- Social Activity — Mentions characterized by rapid, brief engagement on platforms used by the public, such as Twitter, Facebook and Google+
- Scholarly Commentary — Reviews, articles and blogs by experts and scholars, such as F1000 Prime, research blogs and Wikipedia
- Mass Media — Coverage of research output in the mass media (e.g. coverage in top-tier media)
We believe no single metric tells the whole story — you need a basket of metrics to make informed decisions. By combining citation and alternative metrics, this new Article Metrics module provides a comprehensive view of both the impact of and community engagement with an article.
Using the Article Metrics Module
On the Scopus document details (article) page, a sidebar highlights the minimal number of meaningful metrics a researcher needs to evaluate both citation impact and levels of community engagement. This can help a researcher to determine how others have received the article and, along with reading the abstract, to decide whether to read the full article.
The module displays the following (available for each article):
- Citation count and percentile benchmark
- Field-Weighted Citation Impact (FWCI)
- Mendeley readership count and benchmark
- Count of one type of scholarly commentary (e.g. blog posts, Wikipedia)
- Count and benchmark of one type of social activity (e.g. Twitter, Facebook)
- Count of coverage in the Mass Media (e.g. newspapers, magazines)
- Total count of additional metrics and link to see breakdown by source
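To make the Field-Weighted Citation Impact concrete: FWCI is commonly described as the ratio of an article's actual citations to the citations expected for similar articles (same field, document type, and publication year). The sketch below is illustrative only; the function name and data are hypothetical and do not reflect Scopus's actual computation.

```python
# Hypothetical sketch of an FWCI-style ratio: actual citations divided
# by the average citations of comparable articles (same field, document
# type, and publication year). All names and numbers are illustrative.

def fwci(article_citations: int, peer_citations: list[int]) -> float:
    """Ratio of an article's citations to the mean of its peer group."""
    expected = sum(peer_citations) / len(peer_citations)
    return article_citations / expected

# An article with 12 citations whose peers average 8 citations
# scores above the expected level:
print(fwci(12, [6, 8, 10]))  # 12 / 8.0 = 1.5
```

A value above 1.0 indicates the article is cited more than expected for its peer group; a value below 1.0 indicates the opposite.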
In addition to displaying these metrics, Scopus is introducing new percentile benchmarks to show how article citations or activity compare with the averages for similar articles, taking into account:
- Date of publication
- Document type
- Disciplines associated with its source
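The idea behind such a percentile benchmark can be sketched as follows: given a peer group of similar articles, the benchmark reports what share of those peers an article outperforms on a given metric. This is a simplified illustration, not Scopus's actual methodology, and the data are made up.

```python
# Illustrative sketch (not Scopus's actual method) of a percentile
# benchmark: the share of comparable articles an article outperforms
# on a metric such as citation count.

def percentile_benchmark(value: int, peer_values: list[int]) -> float:
    """Percentage of peer articles with a strictly lower metric value."""
    below = sum(1 for v in peer_values if v < value)
    return 100.0 * below / len(peer_values)

# Citation counts of hypothetical similar articles (same publication
# date, document type, and subject area):
peers = [0, 2, 3, 5, 8, 13, 21, 34]
print(percentile_benchmark(13, peers))  # beats 5 of 8 peers -> 62.5
```

The peer group is what gives the number its context: 13 citations may be exceptional in one discipline and unremarkable in another.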
From the sidebar, clicking "View all metrics" opens the full Article Metrics module, providing an overview of all available metrics and the underlying content for further analysis and understanding.
Our aim is to empower users to make quick read/don’t read decisions on the basis of the most authoritative research metrics available. Similarly, researchers will be able to benchmark articles (including their own) against similar works without compromising the speed of their workflows. By combining the citation-based and alternative approaches in a single, concise, embedded summary, Scopus has radically enhanced the usability of research metrics.
Michael Habib, Senior Product Manager, Scopus, is currently product lead for Elsevier’s altmetrics initiatives and the Scopus Author Feedback Wizard suite. Additionally, he co-chairs the NISO Alternative Metrics Initiative’s Working Group on Definitions and Use Cases and serves as an ORCID Ambassador. Michael’s background is in library services: he previously worked in both public and academic libraries and holds an MS in Library Science from the University of North Carolina at Chapel Hill.
Hans Zijlstra works as a Marketing Project Manager in Elsevier’s Marketing Communications & Researcher Engagement department in Amsterdam. He is responsible for projects focusing on journal and article metrics with the aim of improving our service to researchers.