07 February 2023

VAHI Analytic Fellow Dr Graeme Duke

Recently, my granddaughter brought home her first school report. Aware of her literacy and numeracy skills, I expected plenty of A’s and B’s. To my horror, I discovered D’s and plenty of E’s. I was bewildered and looking for someone to blame. Was it her parents? Her teacher? The school?

“Grampa hasn’t seen a school report in two decades,” my son whispered to his daughter. “He doesn’t understand that D means ‘developing as expected’ and E means ‘exceeded expected level’.”

Have you ever had a similar reaction to your hospital’s performance report? Despite your hospital’s skilled and dedicated staff, the report card is emblazoned with red and yellow warning lights. How did you react? Shock, denial, blame? Here is a brief guide on correctly interpreting hospital performance measures so you can better understand what is really going on.

As outlined in the December edition of VAHI news, performance indicators fall into one of four categories: activity, resource, process, and outcome. They can often be distinguished by the units of measure they employ.

Activity indicators report patient numbers (e.g. wait lists) or units of time (e.g. wait times) and primarily reflect the balance between demand and healthcare resources. High rates usually indicate insufficient resources rather than poor clinical performance. Solutions should therefore address the relevant resource deficiency.

Examples of outcome-based performance measures include mortality, complication rates, and duration of treatment. Each of these reflects the net effect of patient complexity (casemix), services, treatment, and clinical management, and each is easy to misinterpret.

Crude (unadjusted) hospital mortality is driven primarily by patient complexity: acuity, frailty, and comorbid disease. A hospital with a high number of deaths is usually caring for sicker patients; rarely is there a deficiency in the quality of care. For this reason, mortality rates must be adjusted for patient-related factors (those present before arrival) that influence survival.

Risk-adjusted hospital mortality rates, such as the hospital standardised mortality ratio (HSMR), are driven by two factors: the quality of care and the quality of the risk-adjustment methodology. An increase in HSMR means more observed deaths than the number expected (predicted).
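To take a purely hypothetical example: if the risk-adjustment model predicts 100 deaths for a hospital’s casemix and 120 deaths are actually observed, the HSMR is 120 divided by 100, or 1.2 (often reported as 120 against a benchmark of 100). Those 20 ‘excess’ deaths may reflect the care provided, or they may simply mean the model has underestimated how sick the patients were.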

This may be due to an issue in the provision of care, or to an unreliable risk-adjustor. Statistical tools are available to measure this ‘reliability’. For example, the Health Round Table HSMR is reasonably reliable, whereas the national (CHBOI) mortality risk-adjustors are less so.

The rate of hospital acquired complications (HAC) is frequently misinterpreted as a sign of poor quality of care. Clinical evidence suggests otherwise. HAC rates are driven by casemix, the treatment required, and how effectively the health service identifies clinical deterioration (Standard 8, ACHS). A hospital with a high HAC rate is likely to be providing high quality care to sicker patients who are at higher risk of clinical deterioration.

Average length of hospital stay (aLOS) is largely driven by casemix and access to healthcare resources. Rarely is aLOS an indicator of the quality of clinical care. A longer-than-expected aLOS usually indicates sicker patients who require longer recovery times. Delayed access to the next stage of care (exit block), due to a downstream resource deficiency, is another source of prolonged aLOS.

So, the next time you read your hospital report card, don’t make my mistake. All performance measures are screening tools, not diagnostic tests. They are formative tools, not summative assessments. Ask ‘what does this indicator really measure?’ and ‘what drives this indicator?’ Answering these questions will help you avoid ineffective solutions, unnecessary anxiety, and wasted effort. Effective solutions require us to identify the true cause.

Dr Graeme Duke has joined VAHI as a Data Analytic Fellow, in addition to his roles as Deputy Director of Eastern Health Intensive Care Services in Melbourne and clinical lead for intensive care research. To get in touch with Graeme about how we can collaborate to use data more effectively, contact [email protected]