04 April 2023


It’s the title of a favourite children’s story in our house. Five friends go for a ride in a rowboat. They climb in one by one until the smallest, the lightest, and the last causes the boat to sink. Monitoring your hospital’s performance can feel similar. All is going well, the service grows, until an unknown element ‘sinks the ship’. Help! My hospital has been declared an outlier.

Who? What? Why? Possible answers are numerous. Without a defined process it is easy to ignore a key element or miss a unique opportunity. Or, worse still, fix the wrong problem.

No signal of a potential outlier should be ignored. What should be ignored are excuses from doctors (like me) that their patients are sicker or more complex than anyone else’s. If a real problem exists, we cannot afford to overlook it. Even if the signal is a false alarm, diagnosing the cause may help us minimise the chance of a repeat.

A good first step in investigating a ‘signal of concern’ is to go back to the source: review the primary data and the definitions used. More than half of apparent outliers arise from missing data or methodological errors. Ask your health information or decision support team to extract the data for the same period, and an interested clinician to review them. Are the reported numbers similar? Are there missing records or data elements?
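For teams comfortable working with the raw extracts, this reconciliation can be scripted. The sketch below is purely illustrative, not a VAHI tool: it assumes two pandas DataFrames (the records behind the published report and a fresh local extract for the same period) and hypothetical column names such as `episode_id` and `principal_diagnosis`; adapt the names and checks to your own data dictionary.

```python
import pandas as pd

def reconcile_extracts(reported: pd.DataFrame, local: pd.DataFrame) -> None:
    """Compare the published extract with a fresh local extract for the same period.

    Assumes both frames carry a unique 'episode_id' plus the fields used by the
    indicator (hypothetical names shown here).
    """
    # 1. Are the reported numbers similar?
    print(f"Reported episodes: {len(reported)}, local episodes: {len(local)}")

    # 2. Are there missing records?
    missing_from_report = set(local["episode_id"]) - set(reported["episode_id"])
    extra_in_report = set(reported["episode_id"]) - set(local["episode_id"])
    print(f"In local extract but not in report: {len(missing_from_report)}")
    print(f"In report but not in local extract: {len(extra_in_report)}")

    # 3. Are there missing data elements?
    for col in ["separation_date", "principal_diagnosis", "secondary_diagnoses"]:
        if col in reported.columns:
            pct_missing = reported[col].isna().mean() * 100
            print(f"{col}: {pct_missing:.1f}% missing in reported extract")
```

Even a crude count-and-compare like this would have caught the examples below, where whole groups of records or data elements never made it into the submission.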

Some examples that have triggered false high-mortality signals: one hospital, running late for its data submission deadline, submitted data only for patients who died (because they were easier to code). Two other hospitals under-reported secondary diagnoses, creating the appearance that their patients were not so sick. Another intermittently transitioned acute care beds to the palliative care stream without informing the coding staff.

The second step is to review the performance indicator methodology - often available in the report footnotes. What inclusion/exclusion criteria were applied? Is the method statistically and clinically sound? Were major sources of bias taken into account? How was the benchmark derived? Where possible, ask local experts for their insights. If not, ask me.
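As one illustration of what ‘statistically sound’ can mean in practice, many mortality indicators report a standardised mortality ratio (observed deaths divided by the deaths expected after risk adjustment) and flag hospitals whose ratio sits outside funnel-plot control limits. The sketch below is a generic example, not the method behind any particular report: it uses a simple Poisson model to ask how surprising the observed count is given the expected count, with hypothetical figures and a threshold that loosely mirrors the outer (roughly 3-sigma) funnel limit.

```python
from scipy.stats import poisson

def smr_signal(observed_deaths: int, expected_deaths: float,
               alarm_p: float = 0.001) -> dict:
    """Flag a hospital whose standardised mortality ratio (SMR) looks unusually high.

    A generic funnel-plot-style check: under a Poisson model with mean equal to
    the risk-adjusted expected deaths, how unlikely is a count at least this large?
    """
    smr = observed_deaths / expected_deaths
    # One-sided upper-tail probability: P(X >= observed) when X ~ Poisson(expected)
    p_high = poisson.sf(observed_deaths - 1, expected_deaths)
    return {"smr": round(smr, 2), "p_value": p_high, "signal": p_high < alarm_p}

# Hypothetical figures: 68 observed deaths against 41.5 expected after risk adjustment
print(smr_signal(68, 41.5))
```

Real indicators usually add refinements - over-dispersion adjustment, exclusion criteria, control for multiple comparisons - which is exactly why reading the footnotes matters before drawing conclusions.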

The next step is to compare the signal with contemporaneous results from internal audits and other performance indicators. Do they also contain ‘signals of concern’? More than one signal warrants further investigation.

The final step is to engage in dialogue and seek insights from stakeholders, including clinicians at the coalface. Has demand increased? Has access, egress, or patient flow been compromised? Has casemix shifted? Has the model of care changed? Has service delivery or personnel changed? Have referral pathways expanded or altered?

Even identifying the cause of a false alarm may help us improve the quality of documentation, coding, and future reports.

In the absence of data or methodological errors, the most common causes of a ‘signal of concern’ and a possible outlier are a shift in casemix or a gap between demand and resources, including access to diagnostic or therapeutic interventions.

Poor clinical care is a less common cause of outlier signals. Moreover, analysis of these signals, even when the evidence supports a true outlier, may provide compelling evidence for additional resources - no one likes to be the captain of a sinking ship.

Dr Graeme Duke is VAHI’s Data Analytic Fellow, in addition to his roles as Deputy Director, Eastern Health Intensive Care Services in Melbourne, and clinical lead for intensive care research. To get in touch with Graeme about how we can collaborate to use data more effectively, contact [email protected]