Abstract
Background Public health interventions delivered in community settings present unique challenges to implementation and evaluation. These challenges limit our ability to evaluate intervention impact, yet the mechanisms through which they operate remain unclear. In this study, we investigated barriers to the implementation and evaluation of five different interventions from a health improvement programme. We used the Pillar Integration Process (PIP), a novel technique, to combine and analyse mixed methods data.
Methods We used a sequential explanatory study design and a ‘subtle realist’ epistemological approach. Project interventions were selected for study inclusion using explicit criteria. Ethics approval was obtained. First, we conducted before-and-after quantitative evaluations of five recognised public health improvement interventions addressing physical activity, diet, mental wellbeing and alcohol misuse in 2011/12. Second, we conducted fifteen qualitative, semi-structured interviews with staff who implemented and managed the interventions. Third, we developed PIP to analyse the extent of congruence between the barriers to implementation and the barriers to evaluation identified within the datasets. PIP was constructed using a matrix built in six stages, ‘listing’, ‘matching’, ‘expanding’, ‘clustering’, ‘re-matching’ and ‘pillar-building’, across the qualitative and quantitative datasets. The central ‘pillar’ issues that emerged were compared with theories of public health evaluation and implementation. We synthesised these findings and produced a logic model to illustrate them.
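To make the six-stage matrix concrete, the sketch below (not part of the original study) renders a PIP-style joint display as a simple data structure. The stage names follow the Methods above; the keyword-overlap matching rule and all identifiers are hypothetical stand-ins for the analysts' judgement.

```python
# Hypothetical sketch of a PIP-style joint display matrix (illustration only).
# Stage names follow the abstract; the keyword-overlap matching rule and all
# identifiers are invented and stand in for analyst judgement.

def keywords(text):
    """Crude tokeniser used as a stand-in for analytic coding."""
    return set(text.lower().split())

def pillar_matrix(qual_findings, quant_findings):
    # 'listing': each qualitative finding starts its own row (one column per dataset)
    rows = [{"qual": q, "quant": [], "pillar": None} for q in qual_findings]

    # 'matching': align quantitative findings with rows sharing at least one keyword
    unmatched = []
    for finding in quant_findings:
        hit = next((r for r in rows if keywords(r["qual"]) & keywords(finding)), None)
        if hit:
            hit["quant"].append(finding)
        else:
            unmatched.append(finding)

    # 'expanding': quantitative findings with no qualitative counterpart get new rows
    rows += [{"qual": None, "quant": [f], "pillar": None} for f in unmatched]

    # 'clustering' and 're-matching' are iterative, analyst-led steps; omitted here.

    # 'pillar-building': label the central issue each row speaks to (placeholder label)
    for r in rows:
        source = r["qual"] or r["quant"][0]
        r["pillar"] = " ".join(source.split()[:2])
    return rows

if __name__ == "__main__":
    qual = ["staff capacity limited delivery", "timing clashed with organisational history"]
    quant = ["uptake was lower where staff capacity was reduced"]
    for row in pillar_matrix(qual, quant):
        print(row)
```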
Results Three interventions generated both implementation and process evaluation data. Synthesis and integration of findings revealed personal, intersubjective, and socio-political barriers important to both the evaluation and implementation of the interventions. Contextual issues (staff capacity, team cohesion, setting compatibility, history, time, communication styles, co-production, understanding and use of evidence) facilitated implementation in some settings whilst hindering it in others. The logic model illustrated how the convergence of some contextual issues (e.g. timing, organisational history, staff capacity and beliefs about evidence) exacerbated challenges to evaluation. PIP highlighted differences in practices and beliefs which caused inter-organisational conflicts. These conflicts are likely to have contributed to barriers to evaluation fidelity, lower intervention uptake rates, and unequal provision of services.
Conclusion We found parallel barriers to the evaluation and implementation of public health interventions, which lowered their effectiveness in a real-world setting. Using a mixed methods approach, PIP, we gained greater insight into the context and mechanisms of those barriers than either qualitative or quantitative research methods alone would have afforded. PIP provides demonstrable methodological rigour and could be applied in other mixed methods research and practice contexts.
Keywords: methodology, evaluation, implementation