The long and short game of research impact

Posted in: Communication, Impact, Research

In recent years, universities have placed more and more emphasis on impact, with academics increasingly being asked to demonstrate the effectiveness of their work in the wider world. Here David Ellis considers the importance of impact and shares his tips for effecting change through research. 

Research impact typically refers to what happens beyond the expected scholarly impact of publishing papers, speaking at conferences and contributing knowledge to the academic community. The REF defines it as "an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia". 

That might mean helping to shape government policy, supporting a business to innovate or become more profitable, or producing social or cultural change by sharing new knowledge.  

Why is impact important? 

As a behavioural scientist, I’ve always been drawn to solving practical problems, and I think it helps create a sense of clarity if you can focus on why you’re doing the research while you’re developing the project. 

The idea of doing social good can be slightly abstract, but the active pursuit of impact helps take down the barriers between the work and the people whose lives it might improve. Even the most theoretical projects can benefit from the researcher taking time to consider who this work is meant to help. 

More practically, impact is important as it’s increasingly a condition of research funding. In the UK, Research Councils ask for impact statements as part of the application process, meaning you have to think about the changes that work might achieve during the lifespan of a project.  

What impact looks like for me 

In 2019 we published a paper showing a link between multiple missed GP appointments and early death. We used data from GP practices across Scotland to map what happened to people when they missed a number of appointments, and found that those with multiple comorbidities – especially those with long-term mental health conditions – were more likely to die in the near future, often from preventable causes like suicide or overdose. Significantly, we found these people were often homeless or in need of additional support.  

It showed how missed appointments could serve as something of an early warning system, identifying vulnerable people before they fall out of the system and suffer much worse health outcomes. In the past, research on missingness in healthcare has focused on the financial burden of missed appointments. However, our work showed that there are serious repercussions for the patients, and that there’s a social imperative to ensure that the health service is supporting the most vulnerable in society.  

Recently, we were pleased to discover that this work had been picked up by policy makers, and will now be informing the NHS’ approach to missingness. The National Institute for Health and Care Excellence (NICE), which sets guidelines for how the NHS operates, cited the research as part of its recommendations for how to support vulnerable patients. A report by the Scottish Government also referenced the work, advising that people experiencing homelessness shouldn’t be removed from waiting lists or GP registers prematurely (a common consequence of missed appointments). The report further stressed the need to develop a better understanding of missingness in healthcare to better serve patients from marginalised groups. 

Based on my experiences, these are some things to keep in mind if you want to do research with impact. 

  1. Embed impact into the research process from the beginning

Often this means finding project partners who aren’t academic. It’s good to be working with people who can shape the research to be useful but who are also in a position to do something with it afterwards. Sometimes this will be fairly easy as – if you’re researching applied problems – practitioners or policy makers may be involved from the start. All our work on NHS issues has been done in partnership with healthcare practitioners from around 2015 onwards. This makes it so much easier to generate impact, because they’re in the best position to effect change. Not only can they talk with authority to people in power about how to facilitate improvements, but they’re at the coal face and can even implement findings within their day-to-day job. 

Baking impact in from the start also has implications for how you do your research. If you want to effect change, then assume your research is going to end up in the media and will be subject to an incredible level of scrutiny. In that case, it needs to be watertight. This is even more of an incentive to get everything absolutely right – get people to check and double check your numbers and figures! Even if nothing comes of it, you’re still doing better research.  

  2. Think about how you’re going to reach the right people

If you want to have impact outside of academia, choosing the right place to publish may be essential. An obvious consideration is whether the journal is open access or not – you can’t rely on your audience being able to get behind the paywall. Taking advantage of pre-print servers can also get around paywalls. Beyond that, think about what people outside of academia might read. If you’re publishing something with implications for healthcare, then you might want to publish in places that healthcare professionals read on a weekly basis (e.g., the BMJ). This might mean looking beyond what’s a “prestigious” journal in your field, and thinking practically about how you can get your work in front of the right audience. Alternatively, this might mean taking findings from a variety of academic sources and distilling them down into something that someone outside of academia will find engaging. CREST’s Security Review has been extremely successful in this regard. This magazine provides a gateway to the latest behavioural and social science research into security threats. 

  3. Data is important but storytelling is essential

As scientists, we have a tendency to think the key to persuading people of something is really good quantitative data. However, what I’ve found is that qualitative outputs are equally important when it comes to driving impact. The human stories that you can tell from the research will convince non-academics to do something with the findings. This is another reason it’s so important to have external partners – they live and breathe the things that we’re researching so they can embody the story that’s being told and create a more powerful argument. 

However, in simpler terms storytelling can just mean using other media to get your message out. Press releases or blogs are a lot shorter than academic papers, which are often written in a way that the average person (or even government ministers!) can’t easily understand. Communicating in accessible language and boiling down your findings into a few key points makes your research much more digestible, and so makes it much more likely that the busy people who are able to bring about change will actually read it. While papers are our currency as academics, these other qualitative communication methods are, to me, probably the things that lead to quantifiable research impact. 

Some caveats: The dangers of focusing on short-term impact 

Despite impact being core to some of my work, it is not and should not be an essential component for everything we do. Doing research simply because it is interesting should also be protected. This is sometimes referred to as blue-sky research.  

That might appear self-indulgent, but there must be room for research that has no specific, discernible impact. Not only because universities should always be a place to indulge curiosity for curiosity’s sake and explore ideas, but also because we can’t always predict what’s going to be important or impactful in the long term. 

Many scientific advancements came about because of an accumulation of knowledge, rather than a single research finding – tiny discoveries that aren’t significant until someone else has developed them further. These discoveries wouldn’t have been considered impactful in the first instance, but without these building blocks, major innovations can’t happen. 

For example, no one wrote a grant proposal that said “I’m going to build a smartphone”. Yet the smartphone’s existence relies on a host of other technologies that are hidden within the device itself. 

Indeed, many of the core technologies that make a smartphone possible owe their funding to the state. The internet, GPS, touchscreen displays and voice-activated assistants all came from research that could never have foreseen its future impact.   

And even if the impact route is clear, it can be incredibly difficult to measure, at least in the short-term. This has been a point of concern for researchers and institutions who worry about how impact can best be encouraged, but also judged across comparatively short periods of time. 

Despite all the above, impact can often appear random as networks develop. I did some research looking at people’s perceptions of weekdays and found that Monday was viewed as more negative and Friday as more positive. We then demonstrated that patients are more likely to miss appointments at the start of the week, and that this effect was particularly pronounced in younger patients. Some time after we published the work, a student from our lab presented these findings at a hospital clinic; a practitioner took the idea and moved more appointments to later in the week, reducing the non-attendance rate by over 10%. That’s a lot of appointments when magnified over several years! Luckily the student told us and put us in touch with others who could substantiate this impact with raw data. Without that contact we would never have known or been able to quantify such a positive outcome!  


