Demonstrating the "so what?" of research is both a professional necessity and a public duty. Discover how Elsevier's suite of tools helps funders overcome key barriers to impact storytelling — and build compelling, evidence-backed narratives at every stage.
Research impact stories typically describe how academic research findings have produced tangible benefits that influence society, the economy, public health, the environment, or the wider cultural milieu. They combine qualitative (e.g., testimonials, news coverage, policy examples) and quantitative evidence (e.g., metrics such as patent citations, policy citations, media counts) in structured narratives that highlight what has been pithily described as the “so what?” of research. Many impact stories draw on classic storytelling techniques evolved to describe very different societal challenges – notably quests or the removal of troublesome monsters – in order to evoke emotion, foster empathy and drive action. In more contemporary forms, these narratives are used by researchers in institutional assessments and funding applications, and by funders to justify public investment, promote accountability and inform future strategies.
Although for much of the twentieth century researchers and policy makers theoretically worked in separate silos, an approach formulated in the well-known Haldane Principle, the two have gradually moved toward a more directly collaborative model. This shift first appeared as a response to global conflicts and Cold War-era national security concerns, then accelerated with the emergence of the first knowledge-based economies in the 1990s. By the 2010s, the concept of mission-oriented research – where government funding efforts are focused on addressing large-scale challenges like climate change, pandemics, or food insecurity – was already gaining traction. As a result, the communication of research impact is now crucial not just as a means of evaluating agency or university performance, but as a way of authenticating societal progress and maintaining democratic transparency around state-supported academic work. At a time when the wider role of academia is under intense discussion, with some commentators calling for a new social contract for the sector, bridging the “so what?” gap has rarely been more important.
With the stakes so high, it is easy to see why impact storytelling has become a core skillset for government funding bodies. However, three types of barriers still prevent organizations from articulating the full extent of their influence on society:
Cultural issues – This category refers to entrenched organizational habits and beliefs that may be holding back progress in impact storytelling. A classic example is an enduring focus on citation-based metrics that, while valuable in purely academic terms, fail to capture the complex relationship between research and its broader outcomes. Indeed, because of their value in providing proof points for the reach of an impact (the “how”), there is often too much emphasis on quantitative measures and too little on the qualitative evidence that provides them with context and meaning (the “why”). While this is understandable – numbers often have an aura of objectivity that can belie subjective decisions about how and what to measure – the two evidence types are complementary and should be used together to support an engaging narrative and cross-verify the findings it describes.
Systemic issues – Funders are also held back by several more prosaic problems related to time and resource limitations, as well as the systems and processes through which they hope to address these issues. There may not always be the necessary bandwidth to identify and develop detailed impact stories, or efforts might be hampered by the dispersal of relevant information across multiple locations, with no data standardization and no overarching protocols to facilitate its retrieval. In this situation, problems relating to speed and administrative burden may matter less than the risk of getting things wrong – information might be missing or inaccurately recorded, eroding confidence and raising concerns of reputational damage. This leads to a conservative, safety-first approach that is reinforced by the tendency of some governments to provide money on a "little and often" basis, forcing organizations to divert their attention toward tasks like compliance and yet more administration. The bottom line: time really is money, and incoherent systems do not make for a coherent story.
Fundamental issues – The problems highlighted in this category relate to two enduring themes in the research impact space – identifying causality and dealing with time lags. The causality issue stems from the difficulty of proving that a specific study, program, or government intervention has directly caused a particular real-world outcome. This is often exacerbated in mission research where, despite the apparent unity of a program, its complex scope and the involvement of multiple stakeholders make it especially hard to track causal links between funding and change. Time lags – the fact that time-bound programs frequently support research whose impact can take many years to materialize – are another complicating factor. The most common remedy for this situation is to develop a Pathway to Impact Plan that systematically outlines how and when impact is expected to be achieved, which is itself a part of the storytelling exercise.
As a provider of advanced information and decision support, Elsevier brings competencies that can help address each of the three problem areas.
Cultural issues – Our tools include a comprehensive suite of real-world analytics, including patent-article citations, policy document mentions and PlumX metrics that track media and social media mentions. We can assist with the qualitative orientation process by helping you align your work with the United Nations Sustainable Development Goals (SDGs), or by providing authoritative overviews of the wider research landscape using responsible AI and visual mapping technologies.
Systemic issues – We can help you combine real-world analytics with your own data in a detailed knowledge graph that provides context by creating links and adding semantic metadata. The resulting evidence-backed insights then support sophisticated real-world impact tracking and strategic decision making, as well as areas like compliance. We also offer a comprehensive Research Information Management System (RIMS) that enables you to collect, manage and showcase your research portfolio, promoting an integrated, digital-first approach. In well-maintained systems, this can remove the need to gather data before beginning an impact story, leaving you free to concentrate on crafting persuasive narratives.
Fundamental issues – By consolidating and organizing information, an integrated RIMS helps address the fundamental issues of causality and time lags, making it significantly easier to accurately track and monitor real-world impact over time. Our tools also support Pathway to Impact planning by helping you map the wider landscape – identifying emerging trends, highlighting research gaps and developing impact projections. We also offer customized reports evaluating the real-world impact of funded research, helping demonstrate return on investment to stakeholders and taxpayers.
We’ve looked at how Elsevier’s capabilities can help you overcome the main obstacles to impact storytelling, but how do they align with the structure of the narrative itself? Drawing on the models described in “How can funders tell better research impact stories?”, the following table provides a chronological description of typical story stages and matches them to specific Elsevier solutions.
| Story stage | Description | How we can help |
|---|---|---|
| Context | Background of the research project and the problem it addresses. Explain how the study fits into the research landscape and delineate the real-world challenges behind it. | • InsightGraph shows how a program fits into the wider research landscape via a visual knowledge graph that integrates a range of data (journal citations, patents, grants, clinical trials). • LeapSpace, an AI workspace, highlights emerging patterns, contradictions and evidence gaps adjacent to the program. |
| Objectives/Challenge | Specific goals of the program, anticipated outcomes, and how they relate to the wider objectives. | • Scopus and LeapSpace can help identify the research gaps that your program addresses, while foundational papers can be used as milestones from which progress can be charted. • Knowledge gaps can also be indicated visually with InsightGraph. |
| Methodology | The methodology adopted, collaborations and how resources were used. Did the study involve any novel approaches? Were there any setbacks and if so, how were they resolved? | Pure, Elsevier’s Research Information Management System (RIMS), brings together disparate data—publications, projects, researcher profiles and impact activities—into a single, trusted environment. This ensures a comprehensive record of the methodologies, tools and resources used in a study or program is always readily available. |
| Results and Impact/Evidence | Presents the results of the research and ties them directly to their impact. The onus is on explaining what has changed or is liable to change because of this work. Who are the beneficiaries? The adjacent Evidence stage backs up these claims by utilizing both quantitative data (policy citations, media coverage, performance metrics) and more qualitative “human” evidence (feedback, testimonials, independent reports or case studies). | • Scopus transforms raw bibliometric data into narrative-driven insights, combining traditional citation tracking with AI-powered analysis and alternative metrics to demonstrate the real-world application of research. • SciVal, a research performance assessment tool, takes this process a step further by incorporating patent-article citations and policy document mentions, while funders can use the solution to align their research output with the United Nations Sustainable Development Goals (SDGs), showcasing contributions to global challenges. For evidence of what SciVal can do, see the report on UK research and innovation performance created for Universities UK. • InsightGraph can use this data to provide visual insights that can be added to an evidence-based narrative. • Pure includes modules designed to record engagement activities (like policy mentions, media coverage and collaborations) that occur when sharing research findings with non-academic audiences. This helps to document the "why" and "how" of research. For an example of what is possible with Pure, see this article describing the investment pilot dashboard created for the National Science Foundation’s (NSF) Directorate for Technology, Innovation and Partnerships, or TIP. |
| Lessons learned | Reflects on lessons learned from the program and provides actionable insights that can help future projects maximize their own impact. | • This stage can be informed by evidence and data held in the Pure RIMS system. • Analytical Services provides customized, data-driven research performance analysis. See, for example, the report evaluating dementia and diabetes research undertaken by the Australian National Health and Medical Research Council (NHMRC). • InsightGraph can help to visually identify emerging topics. • Niche topics can be analyzed in even greater detail via the SciVal Topic Analysis feature, while the same solution can help to identify future collaboration partners from both academia and the corporate sector. |
Demonstrating research impact is a professional necessity, an academic obligation and a public duty. Grappling with administrative burdens, legacy systems and financial uncertainty, funders are nonetheless expected to be single-minded in their pursuit of this goal. In doing so, they need to highlight their own contributions, while balancing the input and long-term needs of governments, academia, industry and society. In the face of these complex demands, what can they do but tell stories?
This may seem like a strange response to a situation that requires a pragmatic, ROI-driven approach, but storytelling is a perfect way of combining the “how” of quantitative evidence with the “why” of qualitative evidence in an engaging, human-centered way, building a bridge between research and policy. Like all stories, impact narratives often share common structures, but each one has its own unique attributes and tone. There is no definitively “right” way to showcase real-world influence, although best practice dictates that you structure your story, prioritize change, and select as your “hero” either the body that allocates or authorizes your budget, or the beneficiaries of the research. Paradoxically, to successfully champion the impact of your portfolio, you need to remember that it’s not about you – at least not directly.
Whatever approach you choose, Elsevier offers a flexible range of capabilities and tools that can help you overcome entrenched organizational obstacles, then support you at every stage of your narrative, however complex or simple. As you embark on your story, you need to be confident that you have all the context, information and data points required to build a compelling case. The task is not always an easy one – there will be risks, there will be hazards, and there may even be monsters, but rest assured, Elsevier is here for the journey.
