Image alteration and duplication can be an extremely serious issue, arguably more so than text plagiarism, because images are often a direct representation of research data. While alteration of images may be well-intentioned or even accidental, some image issues can signify data falsification or fabrication, or even indicate the output of so-called paper mills. Given that scientific progress comes from “standing on the shoulders of giants”, corrupted research results can lead to money being wasted on invalid leads. Furthermore, research integrity issues can undermine public trust in research, and practical application of flawed “results” can have consequences for society.
Identifying image manipulation cases before publication and assessing post-publication allegations are key to upholding the integrity of the published record and nurturing trust in research. However, detection and interpretation of problematic images is challenging and time-consuming. Editors frequently seek guidance on this topic, as is evident from the continued popularity of this earlier Editors’ Update article.
Last year, the STM Association, which works with its members to develop publishing industry standards to advance trusted research worldwide, established a Working Group on Image Alterations and Duplications. This group aims to create guidelines and training for editors, and to collaborate on automated solutions for detecting image alteration and/or duplication. It brings together experts from nine scholarly publishing organizations that believe editors, researchers, science, and society can all benefit greatly from recommendations that are aligned across journals.
The Working Group experts have now drafted a series of recommendations covering: a structured approach for applying image integrity screening routinely; a three-tier classification system for the types and severity of image-related issues; and guidance on which types of image alteration are permissible under which conditions, as well as what actions editors may take to protect or correct the scholarly record. Because image types and issues vary considerably across subject fields, the recommendations are kept deliberately broad so that they can be applied across journals and publishers. The recommendations were presented during a webinar last month, available as a recording here.
The recommendations are a living document and a final version, incorporating the community’s suggested improvements, will be published in December. They are already garnering interest in the publishing community (e.g. Publishers unite to tackle doctored images in research papers – nature.com) and have been well-received by experts on the topic of image manipulation, including Elisabeth Bik: “Fabulous document”, “very helpful guidance”: Sleuths react to recommendations for handling image integrity issues – Retraction Watch.
As members of Elsevier’s editor community, you are also strongly encouraged to provide comments to help improve the draft recommendations. These can be added by following the instructions on this page (register/log in first to use the commenting feature) before the deadline of 31 October 2021.
Catriona Fennell, Director Journal Services, who is representing Elsevier in the Working Group, said “We recognize the challenge editors face with detecting and handling allegations of image manipulation and their need for guidance. As part of Elsevier’s mission to provide editors with the best resources to establish the trustworthiness of submissions, these draft recommendations for handling image integrity issues represent a crucial next step. Elsevier is proud to be part of this effort.”
We look forward to hearing your feedback.