The UK Medical Research Council and the National Institute for Health Research have updated their framework for developing and evaluating complex interventions. The new version moves away from a narrow focus on whether an intervention is effective, and confronts issues of implementation, acceptability and cost-effectiveness. This is to avoid wasting time and money evaluating interventions that are theoretically amazing, but that could never be delivered. Rob Calder summarises the key points.
What is complexity?
This itself is a complex question. Think about running an open-access drop-in service. There are so many things going on at once: referrals, signposting, tea, peer-support, brief interventions, food, needle exchange, acupuncture, shelter and ongoing therapeutic interventions. All these elements form the intervention that is ‘drop-in’. Interventions with multiple elements are considered complex.
At each phase, the framework says six core elements should be considered, framed as the following questions:
- How does the intervention interact with its context?
- What is the underpinning programme theory?
- How can diverse stakeholder perspectives be included in the research?
- What are the key uncertainties?
- How can the intervention be refined?
- What are the comparative resource and outcome consequences of the intervention?
Sure, you can isolate any single aspect of a drop-in service – peer support for example – but the extent to which that works will be influenced by the welcome, the time the drop-in opens, the tea, the availability of needle exchange, the other people present. Things like peer support, in this context, are rarely delivered in isolation. They are complex.
An intervention can also be complex because of the skills needed to deliver it, or because of the people to whom it is delivered. Most people I know could be considered ‘complex’ in one way or another. Among people who use drugs, complexity can quickly accumulate, with medical, psychological, social and physical needs all needing consideration.
Finally, something can be considered complex if it is highly adaptable. A core element of the Youth in Iceland model, for example, is that it is adapted to meet locally identified needs. This level of flexibility and personalisation is essential for the success of such interventions, but it can make them difficult to evaluate.
The updated guidance from the Medical Research Council and the National Institute for Health Research on evaluating complex interventions is particularly important for the inherently complex area of addiction.
Research perspectives
The new framework suggests that researchers look at complex interventions from four points of view: efficacy, effectiveness, theory and systems. Previous frameworks focused on efficacy and effectiveness (efficacy is whether something works in ideal or controlled settings, effectiveness is whether this translates to real-world settings). The new approach recommends asking the following questions:
- Will this effective intervention reproduce the effects found in the trial when implemented here?
- Is the intervention cost effective?
- What are the most important things we need to do that will collectively improve health outcomes?
- Given the absence of evidence from randomised trials, and the infeasibility of conducting such a trial, what does the existing evidence suggest is the best option now, and how can this be evaluated?
- What wider changes will occur as a result of this intervention?
- How are the intervention effects mediated by different settings and contexts?
The four phases of complex intervention research
These phases are not prescriptive, and the framework recommends starting wherever is most important – although who decides what is most important is less clear.
Development or identification
See the INDEX study. This phase involves documenting what the intervention is – always a useful starting point. If you are adapting an intervention from a different setting, it will probably change. For example, a relapse prevention technique developed for people who smoke tobacco will need to be adapted before it can be used with people who use cocaine. This process will help you develop your ‘programme theory’ by identifying what the intervention is, what its core concepts are and how they are supposed to work.
Feasibility
Some feasibility testing will focus on the design, and some will focus on whether the intervention is acceptable. These considerations were less well emphasised in previous guidance, and their addition is central to the new framework.
Evaluation
The framework suggests looking beyond single outcomes and considering a range of outcomes. Researchers should also consider sub-group analyses to identify people for whom this intervention might or might not work. The framework emphasises the importance of mixed-methods research and looking beyond randomised controlled trials. A purely quantitative approach will rarely be good enough.
Implementation
Could this intervention be adopted across the world? Does its delivery require little more than a set of instructions and a 5-minute meeting? Or does it require fundraising, policy change, purchase of expensive machines and a year-long training course on how to use a 500-page manualised treatment guide? Here you identify what might aid or hinder implementation.
Phase considerations
Within each of those four stages above, the framework suggests considering the following points:
Context
Context is everything. How does the context change the intervention being delivered? And does that context change over time? The framework gives the example of banning smoking in public places, whereby that intervention decreased visibility of smoking and changed norms around smoking. In doing so it altered the environment targeted by tobacco control policies.
Programme theory
This is where you make sure you know how the intervention works, which if you’re going to study it is probably not a bad thing. Which elements work, and how? What are the core concepts?
Stakeholders
If you are going to assess the real-world effectiveness of something, you need to involve those who are the target of the intervention, those who will deliver it, and those who will approve the policy. But do be aware that stakeholders are not required to be independent or free from bias. Many are passionate advocates for one approach or another.
Key uncertainties
What isn’t known? And who needs to know what? The distinction between the highly controlled setting of the randomised controlled trial and the messy complexity of implementation runs throughout this section of the framework. When studying complexity, you need to produce information that decision makers can use, even though it will carry a lot of limitations and caveats. A randomised controlled trial won’t answer all those questions; it is important to choose the right research method for the questions you have.
Intervention refinement
Make the intervention better, get feedback, see what works and what doesn’t. Pay attention to what your stakeholders said. Make notes – lots of notes. Any refinements you make can be crucial to the success of a complex intervention. And it is important to know about that kind of refinement.
Economic considerations
I’m not sure it’s necessary to describe why economic issues are important. But they are. Questions about value for money are likely to be near the top of decision makers’ question lists.
The guidance will likely continue to develop as research methods improve our ability to study ever more complex interventions. All changes to guidelines, however, raise questions. I explore some of these in an accompanying piece here.
by Rob Calder
The opinions expressed in this post reflect the views of the author(s) and do not necessarily represent the opinions or official positions of the SSA.
The SSA does not endorse or guarantee the accuracy of the information in external sources or links and accepts no responsibility or liability for any consequences arising from the use of such information.