Texas A&M University is in the final phase of developing an ambitious plan to become one of the 10 best public universities in the US by the year 2020. The Vision 2020 initiative aims to strengthen the development of knowledge, innovations and creative works that make an impact on the world.
The scholars and researchers there want to know that their work makes a difference – and key to that is ensuring that they can tell their story in their own way.
“Excellence at Texas A&M is the sum of all its parts,” said Dr. Bruce Herbert, Director of Scholarly Communications. “It’s everyone rowing in the same direction. You need everyone engaged.”
The right tool for the job
Dr. Herbert is one of the people charged with putting the tools and systems in place that will help the university achieve its goal. He recognized that if departments had easily accessible, up-to-date knowledge about the way the world was engaging with their research, they could make decisions that would keep Texas A&M on track to turn Vision 2020 into a reality.
However, at a university with researchers and scholars of all kinds, not everyone produces information for scientific journals. The Performance Studies department, for example, features world-renowned composers whose work will never be cited in the traditional sense. Elsewhere, other departments might see research included in clinical guidelines long before traditional citations have a chance to accrue, or find that their work may not be widely cited but makes a huge difference to local communities.
“What happens with these groups is that the impact of their works looks weaker when measured with the traditional tools,” Dr. Herbert explained. “These scholars really benefit from altmetrics because now they have data or evidence of the impact their work has on the world that doesn’t exist in more traditional databases.”
Giving every department the means to showcase their work in a way that makes sense to them is crucial to achieving the university’s vision. “When the faculty is unable to tell their story, they disengage,” he said. “You need to care about their stories for them to care about your vision.”
Reclaiming the narrative
To that end, Texas A&M partnered with Plum Analytics, one of the leading suppliers of altmetrics, which was acquired by Elsevier earlier this year. As co-founder Andrea Michalek explained: “We wanted to work with Texas A&M to realize their goals in helping scholars reclaim their own narratives and tell their own stories around their research.”
For example, for the scholars in the Performance Studies department, whose primary output is performances, Texas A&M used the PlumX products to find digital indicators of people interacting with their work. Traditional measures like citations do not represent this output at all. By measuring the usage of YouTube videos of actual performances, however, these scholars could tell the story of their work for the first time.
“Texas A&M simply added the faculty of the department to our PlumX platform, and we were able to track this non-traditional research output,” Andrea said. “That gave the department head some quantitative evidence about the engagement around the work his department was producing.”
Broadening the metrics available to researchers ensures they have the information they need to tell the story they want to tell. The PlumX Dashboards track metrics in five broad categories:
- Usage tracks how many times people have engaged with an article, such as by downloading it.
- Captures indicate where people have expressed a desire to come back to an article – such as bookmarking it or adding it to their Mendeley library.
- Mentions indicate where research is referenced in blogs, news stories, Wikipedia and other online media.
- Social Media tracks mentions in social channels.
- Citations refer to research cited in the scientific literature.
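The rollup from individual indicators into these five categories can be pictured with a minimal sketch. This is not the PlumX API; the indicator names and counts below are hypothetical, chosen only to mirror the examples in the list above.

```python
from collections import defaultdict

# Map each raw indicator to one of the five PlumX categories
# described above (indicator names are illustrative, not PlumX's own).
CATEGORY_OF = {
    "downloads": "Usage",
    "views": "Usage",
    "bookmarks": "Captures",
    "mendeley_readers": "Captures",
    "blog_posts": "Mentions",
    "news_stories": "Mentions",
    "tweets": "Social Media",
    "scopus_citations": "Citations",
}

def summarize(indicators):
    """Sum raw indicator counts into per-category totals."""
    totals = defaultdict(int)
    for name, count in indicators.items():
        totals[CATEGORY_OF[name]] += count
    return dict(totals)

# Hypothetical metrics for a single research output.
example = {"downloads": 1200, "mendeley_readers": 85, "tweets": 40,
           "news_stories": 3, "scopus_citations": 12}
print(summarize(example))
```

Grouping indicators this way is what lets a department head read engagement at a glance rather than across dozens of separate counts.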
Elevating research metrics at Elsevier
With the acquisition of Plum Analytics, Andrea has assumed the role of VP of Research Metrics, Product Management. Under her guidance, Elsevier is helping define the strategy for research metrics on many levels, including understanding and measuring societal impact, metrics for researchers and journals, as well as continuing to steer the work started at Plum Analytics.
Evaluating research for its societal impact
“We’re really focusing on telling more of this story around societal impact,” Andrea says. “So we track clinical guidelines from PubMed in the US and NICE in the UK – that’s a separate and strong indicator that says this work is important in clinical practice.”
For some papers, that can also mean looking at regional citations. By tracking local citation indexes such as SciELO, PlumX Metrics can indicate whether a paper is making an impact outside of the traditionally ranked journals. In other instances, a reference in a government policy document can be used to show the impact of a research paper. “Funding bodies are increasingly wanting evidence about the impact research has on society,” Andrea said. “We can give them metrics that help tell that story.”
In many parts of the research community the traditional metrics remain entrenched, but altmetrics have risen in prominence faster than many expected. As Andrea explained:
Plum Analytics has been around for five years, and when it started this was bleeding edge stuff. People said ‘you’re way too early’ – but things evolved very quickly. There’s no silver bullet in driving the acceptance of altmetrics, but it comes down to things like a researcher who is going up for tenure whose controversial article hasn’t been highly cited yet, but they see it as their seminal work. If they can show it’s been downloaded thousands of times, that it’s referenced in government policy or being used in clinical guidelines, that metric helps them tell the story they need to tell.
Research on science and health plays a huge role in our daily lives. Throughout May 2017, we are featuring stories that showcase the theme “from science to society,” beginning with how Texas A&M is using innovative tools and metrics to track the impact of its research and drive its Vision 2020 plan. For more stories about people and projects empowered by knowledge, we invite you to visit Empowering Knowledge.
A new set of metrics for journal impact
In response to academia’s call for metrics that provide a broader, more transparent view of an academic journal’s citation impact, Scopus recently developed CiteScore metrics, a set of eight indicators that offer complementary views to analyze the impact of all serial titles — including journals — on Scopus.
CiteScore metrics are:
- Comprehensive – CiteScore reveals the citation impact of over 22,000 titles in 330 disciplines and includes all Scopus serial types.
- Transparent – The methodology behind the calculation is straightforward, and users are able to validate any CiteScore value by clicking into the numerator (citations) and denominator (documents). Read more about this on the Scopus blog.
- Current – CiteScore Tracker shows how the current year’s CiteScore builds up each month. Furthermore, new titles can receive CiteScore metrics the year after they are first indexed by Scopus.
- Free – There is no charge to use CiteScore metrics, and you don’t need a Scopus subscription to dive deeper into the metrics or underlying data for a specific title.
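The transparency point above comes down to a simple ratio: a CiteScore value can be validated by dividing its visible numerator (citations) by its visible denominator (documents). As a rough illustration only, here is a sketch of that ratio using the methodology as originally launched (citations received in one year to items published in the prior three years, divided by the number of items published in those years); Scopus documents the exact, current calculation, and the journal figures below are hypothetical.

```python
def citescore(citations_in_year, docs_published):
    """
    Simplified sketch of a CiteScore-style ratio: citations received in
    one year to documents from the preceding three-year window, divided
    by the number of documents published in that window. Illustrative
    only; see Scopus for the authoritative methodology.
    """
    return citations_in_year / docs_published

# Hypothetical journal: 450 citations in 2016 to the 300 items it
# published in 2013-2015 gives a CiteScore-style value of 1.5.
print(citescore(450, 300))
```

Because both inputs are visible in Scopus, anyone can recompute the value, which is what the “Transparent” bullet above refers to.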
The CiteScore value, its monthly CiteScore Tracker, CiteScore Rank, CiteScore Quartile and CiteScore Percentile are part of the broader “basket of metrics.” They join SNIP (Source Normalized Impact per Paper) and SJR (SCImago Journal Rank) as metrics available for the journals indexed in Scopus. This set of metrics has complementary characteristics, providing a holistic view on journal performance. Read more about CiteScore.
Ian Rowlands, Research Information Specialist at King’s College London, said there has been much interest in CiteScore among his colleagues there:
For me, what’s really interesting about CiteScore is that you’re using a longer time window than the classic journal Impact Factor. And … that should mean that some research areas that are perhaps more slow moving in their citation rates would be seen in a more positive light. And certainly some of the interactions I’ve had with academics at King’s College London, they’ve been very interested in not just CiteScore but some of these new metrics that are coming along like the Scopus Views. They’re really starting to get engaged with this.