
New tool helps authors highlight their research methods

The Author Annotation Tool allows researchers to control the methods keywords added to their publications


Reading the scientific literature and searching papers for specific information is an essential task for researchers, but also a very time-consuming one.

When we interviewed researchers, it soon became clear that, for neuroscientists in particular, the details of how experiments are conducted are often as important as what is studied in the experiments, or their results. However, abstracts usually describe only the topic of study and the key results, and rarely the details of the experiment (e.g., animals used, experimental models).

We have been working to address this and have developed the Methods Summary, a widget that provides an at-a-glance overview of keywords extracted from the methods section of a paper. It is accompanied by the Author Annotation Tool, an interface that lets authors edit the keywords extracted for the Methods Summary widget. While these are currently available only to a pilot group of neuroscience journals (Behavioural Brain Research, Brain Research, Experimental Neurology, Journal of Neuroscience Methods, Neuron, and Pharmacology Biochemistry and Behavior), our aim is to evaluate uptake and then roll them out to other domains.

Making papers easier to evaluate

In the Methods Summary (Figure 1), keywords extracted from an article’s methods section are presented in clear categories at the top of the ScienceDirect article page for participating journals. A prototype was tested on ScienceDirect last year and confirmed that readers are interested in easily accessible, well-structured methods information.

Figure 1: The Methods Summary.

The methods-related concepts presented in the Methods Summary are indexed from the article text by applying taxonomy-driven text-mining techniques. Although the results of this automated extraction are already quite good, a lack of standardization in methods terminology and specificity, as well as references to other studies, complicate matters. We realized we could greatly improve the completeness of the taxonomy, and drive machine learning of the text-mining algorithm, by asking authors to review the automatically generated summaries.
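
To give a flavour of what taxonomy-driven extraction involves, here is a minimal sketch in Python. The taxonomy terms, category names and matching logic are invented for illustration; they are not the production taxonomy or pipeline, which is considerably more sophisticated.

```python
import re

# Toy taxonomy: category -> {lowercase term to match: display label}.
# All terms and categories here are illustrative placeholders.
TAXONOMY = {
    "Animal model": {"c57bl/6": "C57BL/6 mouse",
                     "sprague-dawley": "Sprague-Dawley rat"},
    "Behavioural test": {"morris water maze": "Morris water maze",
                         "open field test": "open field test"},
    "Technique": {"western blot": "Western blot",
                  "patch clamp": "patch clamp"},
}

def extract_method_keywords(methods_text: str) -> dict:
    """Return {category: [display labels]} for taxonomy terms found in the text."""
    text = methods_text.lower()
    summary = {}
    for category, terms in TAXONOMY.items():
        hits = [label for term, label in terms.items()
                if re.search(r"\b" + re.escape(term) + r"\b", text)]
        if hits:
            summary[category] = hits
    return summary

methods = ("Adult C57BL/6 mice were trained in the Morris water maze; "
           "protein levels were assessed by Western blot.")
print(extract_method_keywords(methods))
# {'Animal model': ['C57BL/6 mouse'], 'Behavioural test': ['Morris water maze'],
#  'Technique': ['Western blot']}
```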

To that end, we developed the Author Annotation Tool (Figure 2), through which we ask authors to validate the automatic annotations just after their paper has been accepted for publication in a participating Elsevier neuroscience journal. Authors can not only accept or reject suggested keywords but also add any terms they think are missing from the list. In a pilot study, 60 percent of authors voluntarily completed the tool for their paper. We also saw that this takes an author only eight minutes on average, and that authors find it an easy way to influence the presentation and visibility of their articles on ScienceDirect.

Figure 2: With the Author Annotation Tool, researchers will be able to accept or reject suggestions as well as add new keywords.
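
Conceptually, each author review boils down to a record of accepted, rejected and author-added terms that can feed back into the taxonomy and into training data for the extraction algorithm. A rough sketch, with field names that are assumptions for illustration rather than the tool's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class AnnotationReview:
    """One author's review of automatically extracted methods keywords.

    The structure and field names are illustrative assumptions; the
    Author Annotation Tool's real data model may differ.
    """
    article_id: str
    suggested: set                              # proposed by text mining
    accepted: set = field(default_factory=set)  # suggestions the author kept
    added: set = field(default_factory=set)     # terms the author typed in

    @property
    def rejected(self) -> set:
        """Suggestions the author declined."""
        return self.suggested - self.accepted

    def final_keywords(self) -> set:
        """Keywords that would appear in the Methods Summary."""
        return self.accepted | self.added

review = AnnotationReview(
    article_id="example-article",
    suggested={"Morris water maze", "Western blot", "patch clamp"},
)
review.accepted = {"Morris water maze", "Western blot"}
review.added = {"C57BL/6 mouse"}
print(review.final_keywords())  # accepted suggestions plus author additions
print(review.rejected)          # {'patch clamp'}
```

Rejections are as informative as acceptances: they provide negative examples that can help tune the text-mining algorithm.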

What we have learnt

One interesting observation was that the changes authors make in the Author Annotation Tool do not necessarily correspond with what is written in the methods sections of their papers. In other words, we regularly see discrepancies between what is actually in the paper and what the author considers to be the correct methods keywords for it. This discrepancy may affect the discoverability and assessment of papers. The tool could therefore also be useful in guiding the authoring process: flagging when crucial information is missing from the methods section and standardizing the way experiments are described in the literature.
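
In the spirit of that idea, flagging such gaps during authoring could be as simple as a set comparison between what the text mining finds in the manuscript and what the author says the keywords should be. The helper below is hypothetical, not a feature of the tool:

```python
def methods_section_gaps(extracted_from_text: set, author_keywords: set) -> dict:
    """Compare keywords mined from the manuscript with the author's own list.

    Hypothetical helper: terms the author added that never appear in the
    text suggest the methods section omits crucial detail; mined terms
    the author rejected suggest mis- or over-reporting.
    """
    return {
        "missing_from_text": author_keywords - extracted_from_text,
        "rejected_by_author": extracted_from_text - author_keywords,
    }

extracted = {"Morris water maze", "Western blot"}
approved = {"Morris water maze", "C57BL/6 mouse"}
print(methods_section_gaps(extracted, approved))
# {'missing_from_text': {'C57BL/6 mouse'}, 'rejected_by_author': {'Western blot'}}
```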

Next steps

We are currently running a second pilot of the Author Annotation Tool on a selection of Elsevier neuroscience journals. As discussed, our goal is to eventually expand use of the tool to other disciplines.

So if you happen to have an article accepted in one of our neuroscience journals this year, and are prompted to review the methods concepts we extracted for it, we hope you will take the time to do so! The Methods Summary will only be released once we have collected a sizeable number of annotated methods, so you will need to be a little patient to see the result of your input on ScienceDirect.

We would really be interested to hear your views in the comment section below.


Author biography

Felisa van Hasselt is a Product Manager for the ScienceDirect Verticals & Engagement team in Elsevier’s Research Applications & Platform department. She has a PhD in Neuroscience and joined Elsevier in 2013 as a subject matter expert, organizing information, enriching content and developing discovery solutions for neuroscientists on the ScienceDirect article page. She is based in Amsterdam.
