Making hospital death rates public is of questionable value, researchers maintain.
In a report published online in the BMJ, the investigators said a previous study of mortality relating to congenital heart surgery had used routinely available but misleading hospital data.
The researchers, led by consultant cardiothoracic surgeon Professor Stephen Westaby, of John Radcliffe Hospital in Oxford, UK, found that the data-gathering system used in the earlier study – published in the BMJ in 2004 – had underestimated the number of infant deaths.
The 2004 study singled out the Oxford assessment unit as having significantly higher mortality than the national average for open-heart surgery on infants. Yet the current paper, using data from a different source – the Central Cardiac Audit Database – shows that Oxford’s mortality did not differ significantly from the mean for all centres (10% compared with 8% in 2000–2).
The authors looked at the 2004 report produced by the Dr Foster Unit at Imperial College, London. The unit’s stated aim is to develop methods to explain variations in mortality in hospital trusts in England. Its research is funded by Dr Foster Intelligence, a joint venture between healthcare data company Dr Foster Ltd and the NHS Information Centre.
The 2004 report was published in the wake of a high-profile inquiry into congenital heart surgery deaths in Bristol, UK, which had a profound effect on surgical practice in the UK. The inquiry used hospital episode statistics (HES) to compare mortality between cardiac surgical units across the UK.
The 2004 study, by Dr Paul Aylin of the Dr Foster Unit, drew on these HES-derived mortality statistics.
Professor Westaby and colleagues compared mortality figures from the administrative HES database with those from an alternative system, the clinically based Central Cardiac Audit Database (CCAD), for infants under 12 months undergoing cardiac operations between 1 April 2000 and 31 March 2002.
The researchers found that HES did not provide reliable patient numbers or 30-day mortality data. On average, HES recorded 20% fewer cases than the CCAD and captured only between 27% and 78% of 30-day deaths, with a median shortfall of 40%.
In Centre A, which had the largest number of operations, HES missed 38% of all patients and recorded only 27% of the total deaths. Overall, HES data underestimated mortality by 4%.
The authors of the current study say publishing inaccurate statistics undermines public confidence. “If mortality statistics are to be released their quality must be beyond reproach,” they caution.
They acknowledge that the media are keen to publish such statistics, and single out the Dr Foster Unit, which has, for example, supplied newspapers with information on heart disease in return for a fee.
They conclude: “Given the problems with data quality, the imprecision of risk stratification models and the confrontational agenda in the media, we question the value of placing mortality statistics in the public domain.”