David Paton, professor of industrial economics at Nottingham University Business School, has regularly published figures based on the actual date of death in the English health service. He reckons the figures are “reasonably complete” only five or six days afterwards. By 6 May, NHS England had reported 281 deaths taking place on 30 April, 90% of Paton’s current total of 313 – which includes one report made as late as 20 September. The government now publishes its own UK-wide “date of death” data, reporting 548 deaths on 30 April. Looking at when deaths actually occurred provides a different view of the pandemic’s spring peak.
According to both the original and revised version of the government’s data, coronavirus death reports peaked on 21 April, a Tuesday. Both Paton’s and the government’s numbers based on date of death peaked almost a fortnight earlier, on Wednesday 8 April. However, the “date of report” daily figure remains the one most used, probably because it is immediately available and doesn’t change as more reports are made.
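The gap between the two peaks comes down to how each death is bucketed: the same records produce different daily curves depending on which date is counted. A minimal sketch of the idea, using a handful of invented records rather than the real NHS England data:

```python
from collections import Counter
from datetime import date

# Hypothetical records: (date_of_death, date_of_report).
# Real reporting lags ran from a day to several months.
records = [
    (date(2020, 4, 8), date(2020, 4, 9)),
    (date(2020, 4, 8), date(2020, 4, 21)),   # reported after a long lag
    (date(2020, 4, 8), date(2020, 4, 10)),
    (date(2020, 4, 20), date(2020, 4, 21)),
    (date(2020, 4, 21), date(2020, 4, 22)),
]

# Count the same deaths two ways
by_death_date = Counter(d for d, _ in records)
by_report_date = Counter(r for _, r in records)

# Here 8 April peaks by date of death, but reporting lag shifts
# the apparent peak to 21 April in the date-of-report series
print(by_death_date.most_common(1))   # [(datetime.date(2020, 4, 8), 3)]
print(by_report_date.most_common(1))  # [(datetime.date(2020, 4, 21), 2)]
```

The trade-off in the article falls out of this: the date-of-report series is complete on the day it is published but misplaces deaths in time, while the date-of-death series is truthful about timing but keeps growing as late reports arrive.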
More data means greater accuracy
Paton advocates using a holistic range of data, including the different measures of deaths and case numbers, NHS data on hospital admissions and triage assessments, and data from a symptom-reporting app run by health science company Zoe. All have their flaws, and the more reliable ones – hospitalisations and deaths – are slower to react. They work best together.
“When all those things are moving in the same direction, that probably tells you an increase in cases is genuine, not just due to testing,” he says.
“If you rely on any one of those, you may make false policy decisions,” he adds. Nigel Marriott, an independent statistician who provides consultancy and training, says using a range of sources will be vital in tackling increasingly localised outbreaks, with urban areas outside southern England worst affected: “The effort needs to be focused on those areas,” he says. But most of the data published by the government is not localised, so Marriott looks for rising case numbers in neighbouring areas as corroborating evidence.
Using data visualisation for impact
Freeguard, who writes an email newsletter on data visualisation, says the government has made effective use of graphs to communicate data on coronavirus, but that there has been “a lack of uncertainty”.
People may prefer a simpler message – “I think there is something about we the public trying to reach for certainty in uncertain times” – but this can be misleading. It can be done: the ONS, which Freeguard praises for its work on Covid-19, illustrates uncertainty in its estimate that 103,600 people in England had the virus in the week of 13-19 September by also publishing 85,600 to 123,400, the range within which it is 95% confident the true figure lies.
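A confidence interval of this kind can be sketched with the textbook normal approximation for a sample proportion, scaled up to the population. The numbers below are invented for illustration, and this is not the ONS's actual method: its infection survey estimates come from a more sophisticated statistical model.

```python
import math

def proportion_ci(positives, sample_size, population, z=1.96):
    """95% confidence interval for a prevalence estimate, scaled to a
    population total, via the normal approximation to a proportion.

    Illustrative only: not the ONS infection survey's methodology.
    """
    p = positives / sample_size
    se = math.sqrt(p * (1 - p) / sample_size)  # standard error of p
    point = p * population
    lower = (p - z * se) * population
    upper = (p + z * se) * population
    return lower, point, upper

# Hypothetical survey: 28 positives among 15,000 swabs, population 56m
low, mid, high = proportion_ci(28, 15_000, 56_000_000)
print(f"Estimate: {mid:,.0f} (95% CI {low:,.0f} to {high:,.0f})")
```

The point of reporting all three numbers, as the ONS does, is that the central estimate alone hides how much the small sample of positives limits precision.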
In this e-guide: The public has been drenched in data during the Covid-19 pandemic. And the issue of the quality of that data has increasingly come to the fore.
Data quality is crucial to all business organisations, but its importance in healthcare has concentrated minds more widely, and the ways data has been managed and quality assured, or not, through the pandemic are illustrative for other sectors. For example, this feature discusses the classic trade-off between fast data and best data, along with the need to explain the limitations of the former, however necessary and “good enough” it might be for a particular business purpose.
And this article cites a 2019 global crisis survey by PwC showing the general importance of accurate data in crisis management. That survey found that three-quarters of the organisations that said they emerged from a crisis in a better place strongly recognised the importance of establishing facts accurately during the crisis itself.