Responding to Matt Hancock’s argument that the NHS will fall behind the curve if it waits to evaluate benefits before implementing new technology, Adam Steventon says evaluation is crucial and outlines four measures to ensure new technology is evaluated in a timely way
Matt Hancock has been sharply critical this week of the tendency of the NHS to “pilot everything to death”, arguing that it should just get on with implementing new technology, as long as it is safe, rather than waiting for its benefit to be proven.
In response, GPs have been emphasising the need for independent, evidence-based appraisal of technology, pointing out that it does not always produce its intended benefits. One of the technologies that Matt Hancock has championed is videoconferencing with GPs, of the type delivered by GP at Hand.
Video consultations may be convenient and could improve access to general practice, but they come with risks. Depending on how they are implemented, they might stimulate demand for additional consultations among people who are generally well. They might also divert resources away from patients with higher levels of need.
Learning from the past
There are opportunities to learn from previous experiences with technology in the NHS.
Ten years ago, we were in a similar position with telehealth, a technology designed to allow patients with long-term health conditions such as diabetes to monitor aspects of their own health. The theory was that by relaying information such as blood sugar readings to healthcare professionals working remotely, problems could be detected earlier, allowing preventive care to be offered.
Advocates claimed that telehealth would improve patient care and reduce pressures on emergency departments. The need was so clear, they said: why evaluate? A familiar refrain at the time was that “the NHS has more pilots than British Airways”.
In that instance, the Department of Health decided to conduct an evaluation – in fact the largest randomised controlled trial that it had ever directly commissioned, with a budget of £32m. The Whole Systems Demonstrator tested telehealth with 3,000 people between 2008 and 2012.
When it reported, the evaluation found that telehealth increased NHS costs without any benefit to patients. It challenged the assumptions that many had been making, and was immensely valuable in preventing the NHS from investing in an approach that might have diverted resources from elsewhere without benefit.
I was a member of the Whole Systems Demonstrator evaluation team and am proud of what we accomplished. Yet, having spent four years working on the study, it was plain to me that implementing technology in the NHS through randomised controlled trials is not practical.
One problem was that teams participating in the Whole Systems Demonstrator were not supported to adapt their approach in response to learning. Another was that progress in other parts of the country was largely put on hold for years while the evaluation results were awaited.
Matt Hancock is right to point out that waiting for an in-depth evaluation of every new technological solution before implementing it could result in the NHS staying behind the curve. The problem, however, is that the NHS still badly needs information about what impact new technology is having, as it is delivered.
Without access to such information, the NHS could not hope to learn from experience and improve patient outcomes over time. Nor could it monitor safety or provide assurance to the taxpayer that money is being well spent.
Alternative evaluation approaches
Thankfully, since the Whole Systems Demonstrator, alternative evaluation approaches have become available that take advantage of the huge growth in the availability of data sets. The NHS has also been investing in its ability to produce data analysis itself.
One example is the Improvement Analytics Unit, a partnership between NHS England and the Health Foundation.
Evaluation results using control groups are now being produced in nine months or less, rather than the four years taken by the Whole Systems Demonstrator, with the potential to speed up the process still further. Analyses can be produced regularly to help NHS teams understand the impact of technologies and improve their use over time.
The secretary of state has an opportunity to push for a truly data informed health service by insisting that the impacts of new significant technologies are tracked in close to real time, with the information made available to inform better decision making.
Steps to enable technology uptake
To turn this vision into a reality, four actions are needed.
First and foremost, a clearer articulation is needed of the intended impacts of the technology and how they are expected to materialise. This is needed both to structure a good evaluation and to ensure that the technology meets the needs of the NHS.
There seems to be a bias towards improving access to general practice care at the expense of ensuring that patients see the same GP regularly. This might not be the priority for the growing number of people living with multiple long-term health conditions, who, our research has shown, could benefit more from continuity of care with the same GP over time.
Since these patients account for nearly two-thirds of NHS spending and experience variable quality of care, they should be a priority.
Second, the right data sets must be made available to support evaluation. Currently, some of the data required are held by private companies and not shared with the NHS in the ways needed to support evaluation.
Third, the NHS must invest in its data analytics capability. Having the data is not enough: people are needed to turn it into action. The Health Foundation’s review of data analytics capability in the NHS found that the NHS is not good enough at turning data into insight, and that more initiatives like the Improvement Analytics Unit are needed.
Finally, if the NHS is going to embrace technology, then NHS teams will need the time and resources to test out new approaches, learn from experience, and embed what works.
Moreover, it is important that the benefits of new technology are not restricted to certain parts of the country, but that solutions work well everywhere, which will also help tackle health inequalities.
This requires dedicated investment, with a focus on helping local healthcare providers take on existing innovations and make them work well in their own organisations.
The NHS is right to continue to test new technologies. Rather than see evaluation as a barrier, it should embrace the new opportunities presented by data analytics to conduct rapid evaluations of approaches such as GP at Hand as they are being implemented.
Done this way, it can quickly learn and make adjustments to the way services are delivered as needed, without delaying progress.