An HSJ webinar explored how Greater Manchester’s innovative data approach is transforming patient care and system efficiency, heralding a new era in healthcare analytics
Improving population health will be key to helping the NHS moderate and meet demand over the next few years. Yet NHS organisations often struggle to make much impact in this area, not least because they do not know whom they need to target.
Even in a data-rich environment, it can be hard to realise the benefits offered by the massive amount of information the NHS collects. The Federated Data Platform, for example, does not include information from primary care.
An HSJ webinar, in association with Snowflake, looked at how data is being used in Greater Manchester to drive improvements which could have both long-term and short-term benefits for the NHS and the populations it serves.
Harnessing GP data for population health insights
The project in Greater Manchester has access to a large amount of data from providers, GPs and even organisations outside the NHS such as local hospices, explained Graham Beales, deputy director of data insights and intelligence for the integrated care system.
However, getting that data requires collaborative work. “Through the devolution journey we’ve been able to concentrate on building a lot of those direct relationships with our providers,” he said. Part of getting organisations to share data was establishing trust around how it would be used: to aid day-to-day decision-making rather than for performance management.
“We’ve worked with our acute providers to look at discharges and virtual wards with data that can be linked rather than in silos or aggregate data, and that will allow us to have a much richer picture of what’s going on in those spaces.”
Working with mental health providers had led to daily updates on where patients were placed, which enabled a Greater Manchester-wide picture of the need for outsourced beds, he said. Information from across the urgent and emergency care services, including ambulance and out-of-hours providers, gave up-to-date information on flow across the system.
“Everyone gets the same view of the system as we move forward,” he said. Data was validated and accepted by all — effectively ensuring there was “one version of the truth.”
One benefit for providers was that once data was shared, the ICS could then submit some of the data demanded by the centre on behalf of all the trusts — they did not have to do it all themselves.
However, it is the information brought in from GP practices that has made a big difference, allowing a different approach to population health. Matthew Conroy, analytical service lead for primary care for the ICS, said there was access to coded data covering every patient in the ICS area from their birth to the present day: a staggering 4 billion records.
But this data can now be sifted through extremely quickly. Previously it took over an hour to calculate all the long-term condition registers for the Quality and Outcomes Framework for just 700,000 people; it now takes 10 to 15 minutes to produce more registers for the 3.3 million people in the ICS area, he said.
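By way of illustration only, the sketch below shows the kind of calculation a long-term condition register involves: selecting patients with a qualifying coded diagnosis on or before a reference date. It assumes pandas, a hypothetical events table with patient_id, snomed_code and event_date columns, and an example SNOMED code; it is not the Greater Manchester team’s actual pipeline.

```python
# Illustrative sketch only: building a long-term condition register from coded
# GP event data, in the spirit of the QOF registers described above.
# Table and column names (patient_id, snomed_code, event_date) are hypothetical.
import pandas as pd

# Example SNOMED CT code for a hypertensive disorder; a real register would use
# a maintained cluster of codes rather than a single value.
HYPERTENSION_CODES = {"38341003"}

def hypertension_register(events: pd.DataFrame, as_of: str) -> pd.DataFrame:
    """Return one row per patient with a qualifying diagnosis on or before `as_of`."""
    coded = events[
        events["snomed_code"].isin(HYPERTENSION_CODES)
        & (events["event_date"] <= pd.Timestamp(as_of))
    ]
    # The earliest qualifying diagnosis per patient defines the register entry.
    return (
        coded.sort_values("event_date")
        .groupby("patient_id", as_index=False)
        .first()[["patient_id", "event_date"]]
        .rename(columns={"event_date": "first_diagnosis_date"})
    )

if __name__ == "__main__":
    events = pd.DataFrame(
        {
            "patient_id": [1, 1, 2, 3],
            "snomed_code": ["38341003", "271649006", "38341003", "195967001"],
            "event_date": pd.to_datetime(
                ["2018-03-01", "2024-05-02", "2023-11-20", "2020-06-15"]
            ),
        }
    )
    print(hypertension_register(events, "2024-12-31"))
```

In practice a register draws on many code clusters at once, but the shape of the calculation, filtering coded events and taking the earliest qualifying diagnosis per patient, is the same.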
“I think the approach that we’re trying to take is that the GP data can underpin every piece of analysis that we do. There’s so much richness in terms of the understanding of the diagnosis of long-term conditions, how those are managed and the impact that that has on the wider system,” he said.
Enhancing patient care through data quality
One example of this was work being done around the elective waiting list. The most up-to-date information on areas such as BMI, blood pressure and smoking status could be shared with secondary care colleagues to assess fitness for procedures — and then the patient could be encouraged to lose weight or stop smoking before they were admitted.
It also allowed for risk stratification of patients. “We are in a situation now where, for each of the different cardiovascular diseases, we can understand who are our patients at the highest risk…and how we can support those and identify those within GP practice systems with a health inequality lens,” he said.
So, for example, the system allowed the identification of Asian patients, who are three times more likely to have uncontrolled blood pressure, and the management of these patients could be tracked over time. The hope was an improvement not only in the management of all patients with high blood pressure but also a narrowing of the gap between Asian patients and those of other ethnicities, he said.
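As a rough illustration of how such a gap might be monitored, and assuming entirely hypothetical column names and group labels rather than anything the ICS actually uses, the Python sketch below computes the proportion of patients with controlled blood pressure by ethnic group and quarter, and the difference between groups.

```python
# Illustrative sketch only, not the ICS's actual tooling: tracking the share of
# patients whose blood pressure is controlled, split by ethnic group and quarter,
# so any gap between groups can be monitored over time.
import pandas as pd

def bp_control_gap(readings: pd.DataFrame) -> pd.DataFrame:
    """`readings` has one row per patient per quarter with a boolean 'controlled' flag."""
    summary = (
        readings.groupby(["quarter", "ethnic_group"])["controlled"]
        .mean()  # proportion controlled within each group and quarter
        .unstack("ethnic_group")
    )
    # Gap between the two groups being compared (hypothetical labels).
    summary["gap"] = summary["Other"] - summary["Asian"]
    return summary

if __name__ == "__main__":
    readings = pd.DataFrame(
        {
            "quarter": ["2024Q1"] * 4 + ["2024Q2"] * 4,
            "ethnic_group": ["Asian", "Asian", "Other", "Other"] * 2,
            "controlled": [False, True, True, True, True, True, True, True],
        }
    )
    print(bp_control_gap(readings))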
Practices were keen to use this data analysis to pinpoint groups of patients at particular risk, such as middle-aged patients from the Bangladeshi community who did not come forward for bowel cancer screening, he added. The Greater Manchester team could build them a dashboard looking at their specific focus areas and allowing them to see what impact they were having. They could also build flowcharts so that GPs could easily see what action they should take next with individual patients, such as a change in medication, saving them time.
“If I’m honest, we’re only really starting to scratch the surface of what’s possible with the GP data,” he said. “We’re not saying you’ve missed all these patients. We’re trying to help them identify opportunities for how they can support the patient better.”
But developing this took a lot of preliminary work on data quality, checking and coding, as well as building in changes such as new drug lists. Mr Conroy said some data was simply wrong, for example when someone’s height was recorded in centimetres rather than metres and produced a very high BMI, and that could be corrected easily. Other problems, though, required close working with providers to sort out the errors.
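As a minimal sketch of the kind of plausibility check described, assuming hypothetical column names and pandas rather than the Greater Manchester tooling itself, the snippet below flags heights that look like the wrong unit for a metres field, corrects them and recomputes BMI.

```python
# Illustrative sketch only, not the Greater Manchester pipeline: a simple
# plausibility check that spots height values that cannot be metres, assumes
# they were entered in the wrong unit, corrects them and recomputes
# BMI = weight (kg) / height (m) squared. Column names are hypothetical.
import pandas as pd

def clean_height_and_bmi(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # An adult height above 3 is implausible in metres, so treat it as centimetres.
    wrong_unit = out["height_m"] > 3
    out.loc[wrong_unit, "height_m"] = out.loc[wrong_unit, "height_m"] / 100
    out["bmi"] = out["weight_kg"] / out["height_m"] ** 2
    return out

if __name__ == "__main__":
    records = pd.DataFrame(
        {"patient_id": [1, 2], "height_m": [1.72, 172.0], "weight_kg": [80.0, 80.0]}
    )
    print(clean_height_and_bmi(records))
```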
Reducing silos, boosting efficiency
Reducing data silos was one of the key things Greater Manchester had done, said Janet Broome, account director for the NHS at Snowflake. It helped with data security and governance because data was not being moved around so much.
But it also enabled the NHS to make better use of the data it had. “One area that Snowflake excels in is the ability to handle all types of data — structured, semi-structured and unstructured. So if you think about electronic health record systems, 80 per cent of the data within them is unstructured. It’s clinical notes, it’s images. And I would go as far as to say, 80 per cent of the data in the NHS is unstructured,” she said. “So imagine being able to harness things like your clinical images, clinical notes, sentiment analysis, PDFs, emails. All of this is a game changer.”
She stressed the price of sticking with the status quo and not taking steps to reduce administrative costs and improve patient outcomes, especially at a time when the government was making some funding available for technology.
This webinar is now available on demand.
To access the recording, visit here and click play.
If you had previously registered as a viewer for the event, you will be able to view the recording immediately.
If you have not previously registered, you can do so here to get access to the recording.