28.09.14
Paying for performance in the NHS – what can we learn from the latest evidence?
Source: National Health Executive September/October 2014
Ruth McDonald, professor of governance and public management at Warwick Business School, argues that hospital pay for performance schemes need more evaluation.
Linking financial rewards to health outcomes or proxy outcomes has become increasingly popular in the NHS in recent years. This increased adoption of ‘Pay for Performance’ (PfP) is occurring despite a scant evidence base. The enthusiasm for ‘PfP’ is not confined to the English NHS; across the globe, various health systems are experimenting with schemes that seek to align financial incentives with policy goals.
The question many people ask is: ‘do PfP schemes work?’
But this is the wrong question, since it fails to recognise that these initiatives are all different. What we need to do is to look at the features of the individual schemes – how they are designed and implemented and in what context – and assess impact to work out how and to what extent a scheme works. Independent evaluation of this sort is crucial if we are to identify lessons about particular initiatives and ‘PfP’ approaches more generally. And to be fair to the Department of Health, it has commissioned a number of such evaluations of its recent national ‘PfP’ initiatives.
‘Not exactly a resounding success’
Our evaluation of new incentive structures in contracts for GPs, pharmacists and dentists suggested these were not exactly a resounding success. Incentives are blunt instruments and, in a context of complex and multiple policy goals, they can have unintended consequences. Nevertheless, the policy of ‘Pay for Performance’ was continued and expanded, with the aim of changing the practice of other groups of staff, such as hospital doctors and nurses. Unlike the primary care professionals who received the new incentive contracts and who work in the independent sector, these staff are public sector employees. Here the approach has been a little different, with organisational (as opposed to personal) income linked to performance.
The Commissioning for Quality and Innovation Payment Framework, or CQUIN as it is known, makes a proportion of income conditional on the achievement of quality improvement and innovation goals. In the first year this proportion was 0.5%. This was increased to 1.5% in the second year of the scheme, which makes some sense as there is evidence that at around this level, the people who manage organisations really pay attention. (Though whether suddenly increasing their exposure to financial risk results in thoughtful and considered responses designed to improve quality is a moot point!) More recently the percentage was increased to 2.5% and David Nicholson, the former NHS chief executive, said before he retired that he wanted to see this rise to 4 or 5%, presumably due to fears that the 2.5% wasn’t having the desired effect.
Best Practice Tariffs
Best Practice Tariffs (BPTs) were introduced in 2010. These are designed to improve care by paying more for care which is in line with ‘best practice’ and less for care which is not. Our evaluations of CQUIN and BPTs found that the former did not appear to improve quality in the way policy makers intended. BPTs appeared to be more promising, but in both cases, our evaluations were limited by data issues, which made drawing robust conclusions difficult.
In contrast, our evaluation of the Advancing Quality PfP scheme in the NHS North West used robust data and methods. Publishing the findings from the first phase of the evaluation in the prestigious New England Journal of Medicine (NEJM) required rigorous review by subject experts, who demanded further analysis; our research paper underwent much scrutiny before it was finally accepted for publication.
Advancing Quality is a voluntary programme that provides financial incentives for improvement in the quality of care provided to patients. It has been implemented in the north west of England since 2008. The programme is based closely on a PfP project implemented in the USA.
Advancing Quality was initially designed and supported by a non-profit US organisation (Premier Inc.) and involved similar quality indicators and financial incentive structures. However, it differed from the US variant in a number of respects. Importantly, it involved universal participation of eligible providers and implementation in a different health system.
The detail of the Advancing Quality scheme
The Advancing Quality scheme involved incentives to improve care and reporting of performance on quality measures for five clinical conditions: acute myocardial infarction, heart failure, pneumonia, coronary-artery bypass grafting and hip and knee surgery. The first year was run as a pure tournament. Hospitals were ranked according to performance and those in the top half (i.e. 12 of 24 hospitals) received a bonus payment. For the next six months, financial incentives were awarded based on performance ranking, but also providers whose performance was above the median score from the first year were awarded an ‘attainment’ bonus. There were no penalties for poor performers during these first 18 months.
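The payment rule described above amounts to a simple tournament. A minimal sketch of that logic, assuming invented hospital names and scores (this is an illustration of the rule as described, not the actual Advancing Quality payment system):

```python
# Sketch of the Advancing Quality year-one tournament rule: rank hospitals
# by composite quality score; the top half receive a bonus. The phase-two
# variant adds an 'attainment' bonus for providers scoring above the first
# year's median. All names and scores below are invented.

def tournament_bonuses(scores, year1_median=None):
    """Return (tournament winners, attainment-bonus earners).

    scores: dict mapping hospital name -> composite quality score.
    year1_median: if given, providers above this score also earn an
    attainment bonus (the rule used in the second six-month period).
    """
    ranked = sorted(scores, key=scores.get, reverse=True)
    top_half = set(ranked[: len(ranked) // 2])  # tournament winners
    attainment = set()
    if year1_median is not None:
        attainment = {h for h, s in scores.items() if s > year1_median}
    return top_half, attainment

# 24 participating hospitals with invented scores.
scores = {f"hospital_{i}": 60 + i for i in range(24)}
winners, attain = tournament_bonuses(scores, year1_median=75.0)
print(len(winners))  # 12 of 24 hospitals fall in the top half
```

Note that under a pure tournament, a hospital's payment depends on its rank, not its absolute score; the later attainment bonus rewards absolute performance as well.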
In our CQUIN evaluation we found that quality indicators were often developed locally, changed annually and introduced in a hurried manner. One result of this was the large number of unique indicators, which often made benchmarking of performance impossible. Additionally, indicators were interpreted differently by different stakeholders.
In contrast, Advancing Quality involved standardised data definitions and bespoke software, underpinned by data assurance provided by the Audit Commission to ensure that comparisons (before and after, between trusts and with the rest of the country) were on a like-with-like basis.
A gruelling struggle
But aside from these technical aspects, collaborative events brought together staff from all 24 participating organisations to share their learning and work through common problems. In addition to shared learning, the development of this Advancing Quality ‘community’ appears to have been really important in providing emotional support for what was a gruelling and often uphill struggle for the staff involved.
We compared mortality in the north west for Advancing Quality incentivised conditions with the rest of England. We also chose a number of conditions for which performance improvement was not incentivised and compared these in both groups.
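The comparison just described is, in essence, a difference-in-differences design: the change in mortality in the north west is set against the change in the rest of England, for both incentivised and non-incentivised conditions. A minimal sketch of the arithmetic, using invented mortality rates purely for illustration:

```python
# Difference-in-differences arithmetic for the mortality comparison:
# (change in the north west) minus (change in the rest of England).
# All mortality rates below are invented for illustration only.

def did_estimate(before_treated, after_treated, before_control, after_control):
    """Return (treated change) - (control change)."""
    return (after_treated - before_treated) - (after_control - before_control)

# Invented 30-day mortality rates (%) before/after the scheme began.
nw_before, nw_after = 12.0, 10.0      # north west, incentivised conditions
eng_before, eng_after = 12.5, 11.5    # rest of England, incentivised

effect = did_estimate(nw_before, nw_after, eng_before, eng_after)
print(effect)  # -1.0: mortality fell one extra point in the north west
```

Using the rest of England as the comparator nets out national trends, such as general improvements in care, that would otherwise be wrongly attributed to the scheme.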
Findings
Our evaluation of what happened during the first 18 months of Advancing Quality found that it was associated with a reduction in mortality in incentivised conditions and that this reduction was significantly greater than in hospitals in the rest of England. Mortality reduced for the non-incentivised conditions in both the north west and the rest of England. But this reduction was not significantly different between the north west and the rest of England. Advancing Quality apparently saved almost 900 lives across the north west during its first 18 months. Based on the first 18 months of the scheme, Advancing Quality appeared to be a worthwhile and cost-effective intervention.
However, when we analysed results after 42 months, the picture was somewhat different. Mortality for the incentivised conditions continued to fall, but the reduction in mortality was greater in the rest of England than the north west. So does that mean that the scheme was a failure? Well that’s one way of looking at it. But it could be more complex than that. After the initial success of Advancing Quality, two other regions in England implemented a variant of the scheme. It might be that the improved performance in the rest of England was due to this.
Spillover effects
Furthermore, when we looked at the non-incentivised conditions, we found that the reduction in mortality for these conditions was significantly greater in the north west than in the rest of England. Again, we published our results in the New England Journal of Medicine after rigorous peer review, which gives us confidence in the robustness of the analyses.
So if it’s not flawed research that is to blame, how can we account for our findings? One interpretation is that there were positive spillovers from the Advancing Quality initiative. Changes to care delivery for conditions incentivised under Advancing Quality may have resulted in more general improvements in care which explains the superior performance in the north west in relation to these conditions. However, it may be that the observed improvements in mortality in the non-incentivised conditions within hospitals participating in Advancing Quality were unrelated to Advancing Quality.
During the study Advancing Quality payment rules changed following the introduction of CQUIN. But we don’t know whether, how and to what extent this impacted on mortality.
Challenging context
The changing context of the NHS presents challenges for researchers attempting to evaluate policy innovations. In our bid to funders we outlined our plan to combine quantitative data analysis with more in-depth investigation, talking to people and observing what they did during the first phase. We would then negotiate with funders and decide on where to focus our resources for the subsequent phase. Having shown an impact in the initial evaluation phase, with an accompanying explanation of how changes were made, the funders were keen for us to focus our resources on measuring and quantifying impact in the follow-up phase of the study. This meant that although we used interviews and observation during the initial period to understand how and why things happened, we did much less of this in the second phase. The result is that we don’t know whether Advancing Quality was a roaring success or a failure. What we need now is further research to help us answer that question.
The paper
The paper, entitled ‘Long-Term Effect of Hospital Pay for Performance on Mortality in England’, was authored by Søren Rud Kristensen, Rachel Meacock, Alex James Turner, Ruth Boaden and Matt Sutton of the University of Manchester; Ruth McDonald of Warwick Business School; and Martin Roland of the University of Cambridge.