How do you carry out a realist synthesis of an intervention when there's 'no evidence'?
Abstract
In this presentation, we will draw on our experiences of conducting a realist synthesis of the feedback of aggregated patient reported outcome measure (PROMs) data to improve patient care, in order to address two methodological questions: (1) how do you carry out a realist synthesis of an intervention when there is 'no evidence'? and (2) how can you deal with the complexity of 'context'?
The answer to question one, of course, is that in realist synthesis it is the programme theory, not the intervention, which is the unit of analysis. Despite the relatively recent introduction of PROMs feedback to the NHS and the paucity of evidence evaluating this intervention, the underlying reasoning about how PROMs data will be mobilised is familiar and has a long and somewhat chequered history. For example, the use of aggregated PROMs data to benchmark provider performance, and the public reporting of these data to inform consumer choice, share many of the assumptions and some of the drawbacks of other 'feedback' or 'public disclosure' interventions (e.g. hospital 'star' ratings, patient experience surveys and surgical mortality report cards). These interventions all share similar programme theories, and we were therefore able to draw on the large body of evidence exploring the fate of these interventions to carry out our synthesis. In our presentation, we will discuss how we moved from the specific ideas underlying the use of PROMs data to broader middle-range theories of 'audit and feedback', 'benchmarking' and 'public disclosure' to guide our review.
During the review, we also encountered other challenges, most notably in understanding 'context', hence our second methodological question. We found that 'context' was complex and rarely occurred as 'neat single packages'. For example, feedback and public disclosure of performance data programmes varied widely in how they were implemented (e.g. whether the data were case-mix adjusted, who had mandated data collection, and how quickly the data were fed back to providers), and were often implemented alongside a whole host of other interventions (e.g. financial incentives, and varying levels of support for providers in interpreting the data). Different feedback and public reporting programmes embodied different combinations of all these contextual conditions, which were impossible to disentangle and thus isolate. Our review therefore focused on understanding how different contextual configurations shaped the mechanisms through which such programmes worked. In this presentation, we will provide an example of how we explored this when testing theories about the role of financial incentives.
Functionality and Feedback: a realist synthesis of the collation, interpretation and use of PROMs data to improve patient care was funded by the National Institute for Health Research (NIHR) Health Services and Delivery Research (HS&DR) Programme (project 12/136/31). The views and opinions expressed are those of the authors and do not necessarily reflect those of the HS&DR Programme, the NIHR, the NHS or the Department of Health.
Authors
-
Joanne Greenhalgh
(University of Leeds)
-
Sonia Dalkin
(Northumbria University)
-
Kate Gooding
(University of Liverpool)
-
Elizabeth Gibbons
(University of Oxford)
Topic Areas
Realist synthesis , Theory in Realist Approach , Debates in Realist Inquiry
Session
OS-10 » Theory and Evidence (09:45 - Wednesday, 5th October, Frobisher Room 1)