2. Why do we measure hospital performance?
1. To ‘sack the Board’
Aggregate, or aggregated, performance measures
Under direct control of those being assessed
– Costs OR Process indicators
Not so interested in the magnitude of differences
– Identify general poor performance
2. To inform target areas for quality improvement
Condition-specific
Not necessarily under direct control
– Costs, outcomes, and processes
Value of improvement important
5. Not a new idea
“a programme of work not only to identify causes of variation at
specific local level, but also to prioritise those variations and causes
that have the most important impact on equity, effectiveness,
efficiency and patient health outcomes”?
Variations in health care: The good, the bad and the inexplicable, The King’s Fund 2011
6. Outline
Evidence of variation
The information: costs, outcomes, and processes
The incentives: potential strategies for using the information
7. Variation in process
RAND Corp (McGlynn et al, NEJM 2003)
– 30 acute and chronic conditions and preventive care
– 10 to 80% of participants receiving recommended care
CareTrack (Runciman et al, MJA 2012)
– 22 common conditions
– 32 to 86% compliance with appropriate care
8. Variation in costs
Duckett & Breadon, Controlling costly care: a billion-dollar hospital opportunity, Grattan Institute 2014
$ per admission (2010/11)
9. Variation in outcomes
30-Day Readmission Rates by Area Health Service of Residence
(NSW: 1 July 2005 – 30 June 2008)*
*Adjusted for age, sex, Indigenous origin, and Diagnostic Group Hierarchical Condition Category (HCC) risk score
10. Policy/Practice Relevance?
Appropriate processes in theory ≈ cost-effective care in practice?
– Costs? Timeliness?
Best outcomes?
– Meaningful outcomes? At what cost?
Lowest cost?
– With what outcomes?
Which providers are providing cost-effective care, and how are they
doing it?
11. Using comparative condition-specific health service data
Systematic review
– Feedback of comparative performance data alone does not work
Anecdotal
– Service change works best when there is common recognition of a problem
Theory
– Costs and post-discharge outcomes data demonstrate problems
Despite risk adjustment, the ‘my patients are sicker’ syndrome persists
– Process data provides
Additional rationale for the existence of a problem
Starting points for identifying solutions
12. Case study: ED chest pain presentations
Four public hospitals in South Australia
Clinical context, underlying diagnosis could be:
– ST-elevation MI
– Non-ST-elevation MI
– Unstable angina
– Non-cardiac chest pain
Aims:
– Identify benchmark performer(s) on basis of costs and outcomes
– Assess the potential value of improved performance at non-benchmark hospitals
– Inform targets for investigation – variation in clinical pathways
13. The data
Clinical data extracted from common data warehouse
– key procedures, pathology test results, movement between hospital
departments and wards, etc.
– automated linkages to population-based mortality data.
Administrative data, linked to index events, to identify other
inpatient separations (episodes) at all South Australian
hospitals
– age, gender, postcode (SEIFA), and co-morbidities
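The linkage step described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the study's code: patient IDs, dates, and diagnoses are invented, and the real linkage ran against a data warehouse rather than in-memory lists.

```python
# Hypothetical sketch: link later inpatient separations to index ED
# chest-pain events for the same patient within a follow-up window.
from datetime import date, timedelta

index_events = [  # (patient_id, index admission date) -- illustrative only
    ("p1", date(2011, 3, 1)),
    ("p2", date(2011, 6, 10)),
]
separations = [  # (patient_id, separation date, diagnosis) -- illustrative only
    ("p1", date(2011, 3, 20), "unstable angina"),
    ("p1", date(2013, 1, 5), "stroke"),  # outside the 12-month window
    ("p2", date(2011, 7, 2), "MI"),
]

def link_episodes(index_events, separations, window_days=365):
    """For each index event, collect subsequent separations for the
    same patient that fall within the follow-up window."""
    linked = {}
    for pid, idx_date in index_events:
        linked[(pid, idx_date)] = [
            (sep_date, dx)
            for sep_pid, sep_date, dx in separations
            if sep_pid == pid
            and idx_date < sep_date <= idx_date + timedelta(days=window_days)
        ]
    return linked

links = link_episodes(index_events, separations)
```

The same join-by-ID-within-window logic covers both the 30-day and 12-month outcome definitions by varying `window_days`.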
14. Missing data
ED capacity
– ED beds, personnel
– Other presentations: rates and severity
Inpatient capacity
– Bed occupancy rates
Cardiac
Non-cardiac
– Staffing ratios
15. The dependent variables
Costs: Bottom-up patient-level costs available for all inpatient
separations
Outcomes: 30 day/12 month related admission (unstable
angina, MI, or stroke) or mortality
Process (not quality) indicators
– Pr(admission)
– Time to admission
– Pr(PCI | angiogram)
– Inpatient LoS
16. The analysis
Separate multiple regression models fitted
– Cost, outcome, and process variables
Hospital interaction terms tested
– Identify patient sub-groups driving variation
Mean covariate values used to generate predicted outputs
– Bootstrapping, stratified by hospital, to represent uncertainty
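The bootstrapping step, stratified by hospital, can be sketched as below. This is an illustration only, with synthetic per-admission costs (hospital names and figures are invented); the key point is that resampling happens within each hospital, so every replicate preserves each hospital's sample size.

```python
import random
import statistics

def stratified_bootstrap(costs_by_hospital, n_reps=1000, seed=1):
    """Resample costs within each hospital (stratum), preserving
    hospital sample sizes, and return a 95% percentile interval
    for each hospital's mean cost."""
    rng = random.Random(seed)
    boot_means = {h: [] for h in costs_by_hospital}
    for _ in range(n_reps):
        for h, costs in costs_by_hospital.items():
            resample = rng.choices(costs, k=len(costs))  # with replacement
            boot_means[h].append(statistics.mean(resample))
    intervals = {}
    for h, means in boot_means.items():
        means.sort()
        intervals[h] = (means[int(0.025 * n_reps)], means[int(0.975 * n_reps)])
    return intervals

# Synthetic per-admission costs, illustrative only
gen = random.Random(0)
data = {
    "Hospital A": [gen.gauss(4200, 800) for _ in range(200)],
    "Hospital B": [gen.gauss(4800, 900) for _ in range(150)],
}
intervals = stratified_bootstrap(data)
```

In the study the quantity bootstrapped would be the model-predicted output at mean covariate values rather than the raw mean, but the resampling scheme is the same.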
24. How to compare larger numbers of providers?
Stratify by clinical process and identify best performing strata?
– Hypothesis driven?
– Process mining driven?
Group hospitals by performance and compare processes?
– Empirical stratification
25. Informing action
“Publicising the existence of unwarranted variations and their
causes does not guarantee that they will be tackled.”
“local health organisations… be required to publicly justify and
explain in a consistent way their relative position on key aspects of
health care variation.
…it may also be necessary to explore the development of harder-edged,
locally focused incentives to encourage action to deal with
unwarranted variation.”
Variations in health care: The good, the bad and the inexplicable, The King’s Fund 2011
26. The incentives
1. Sticks
a) Public Reporting
b) Mandated action plans
2. Carrots
a) Pay-for-Performance?
b) External Services Improvement Fund
Mindful of the NHS Improvement Fund…
1b + 2b
– Externally identified areas for improvement
– Externally reviewed and supported applications to fund improvement
projects
27. Prioritisation criteria
Expected Value of Removing Variation (EVRV)
– Hospital 2, 1527 patients per annum
– Costs per patient could reduce by $630
– Mortality could decrease by 1% (x $200,000?)
– Readmissions decrease by 2% (x $50,000?)
Annual EVRV = 1527 × $(630 + 2,000 + 1,000) ≈ $5.5 million
x5? x10?
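The EVRV arithmetic above, worked through with the slide's own figures for Hospital 2 (the $200,000 and $50,000 valuations are the slide's hypothetical weights per death and per readmission averted):

```python
# Expected Value of Removing Variation (EVRV) for Hospital 2
patients_per_year = 1527
cost_saving = 630                    # reduced cost per patient ($)
mortality_value = 0.01 * 200_000     # 1% fewer deaths x $200,000 each = $2,000
readmission_value = 0.02 * 50_000    # 2% fewer readmissions x $50,000 each = $1,000
annual_evrv = patients_per_year * (cost_saving + mortality_value + readmission_value)
print(round(annual_evrv))  # 5543010, i.e. ~$5.5 million per annum
```

The "x5? x10?" question then asks what the total opportunity would be if several non-benchmark hospitals showed similar gaps.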
28. Zombies and Paradoxes
• “political paradox of rationing”
• “the appeal for transparency in medical decision-making is
like a zombie, an idea that refuses to die despite its limited
utility”
• Do the benefits of open and explicit quality improvement
outweigh the:
– Financial costs?
– Political risks of unrealistic expectations?
Oberlander et al, Rationing medical care: rhetoric and reality in the Oregon Health Plan, CMAJ, 2001; 164(11):1583-7
29. Summary
Huge scope for service evaluation and improvement
– Electronic data systems
– Linkage facilities
Post-improvement evaluation: ICER estimates to inform the
– Design of future improvement processes
– Balance of spending on new technologies and existing services
30. Acknowledgements
Clarabelle Pham, Andrew Partington, Orla Caffrey, Jason Gordon,
Brenton Hordacre, David Ben-Tovim, Paul Hakendorf, Maria Crotty
Funders: SA Health, NHMRC, HCF Foundation
Editor's Notes
providing the information and creating the incentives for a self improving health care system
Should we pursue compliance with clinical guidelines at all costs? What are those costs?