When a patient attends an NHS service, the information surrounding that event is private, often only discussed with family and close friends. Now imagine that the hospital, without the patient’s knowledge, shares this information with an unfamiliar organisation to train an AI system. How would that patient feel?
Different patients would have different answers, but many would have questions and concerns. Yet the use of data to train AI is a growing priority for many public bodies, including the NHS. AI can help the NHS survive and improve, but it should not be built at the cost of patient trust. Operating in a safe, legal and ethical way is paramount.
To support public bodies navigating this complex terrain, our Head of AI Governance and Policy, Robin Carpenter, has identified five critical considerations:
1 - Have clarity over where the data rivers flow:
Data will start in a system, move, change, and ultimately be deleted. You need sight of all of that in a single document. Be aware of your archiving requirements early: AI datasets can be massive, so archiving them so that research can be repeated can be costly. Also ensure the flow is built with privacy and security in mind. As you map your data flows, you will probably discover that you are missing technical infrastructure; setting this up will also take resources.
There are plans to standardise most data flows with NHS Secure Data Environments, but these face a few challenges, like sustainability, so some organisations are moving forward with their own processes. It is up to your organisation what approach is appropriate.
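The data-flow map described above can be as simple as a structured record per stage. A minimal sketch in Python follows; the stage names, fields, and systems (PACS, staging server, etc) are illustrative assumptions, not an NHS standard, and a real register would live in your information governance documentation.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FlowStage:
    system: str                    # where the data sits at this stage
    action: str                    # eg "collected", "pseudonymised", "archived", "deleted"
    retention_days: Optional[int]  # None = no fixed retention at this stage

# Hypothetical flow for an imaging dataset used in model training
flow = [
    FlowStage("PACS", "collected", None),
    FlowStage("Secure staging server", "pseudonymised", 30),
    FlowStage("Training environment", "used for model training", 365),
    FlowStage("Research archive", "archived for repeatability", 3650),
    FlowStage("Research archive", "deleted", None),
]

def flow_is_complete(stages: List[FlowStage]) -> bool:
    """Per the lifecycle above, a documented flow should end with deletion."""
    return bool(stages) and stages[-1].action == "deleted"
```

Even a lightweight record like this makes gaps visible: a flow with no deletion stage, or no named archive, is a flag that the lifecycle has not been thought through.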
2 - Keep it legal:
The data will start as identifiable health data, and if you can achieve your aims whilst reducing the identifiability and size of the dataset, you should. To get there, however, you need someone who can legally see the data. Often this will be the Direct Care Team, though Health Research Authority (HRA) Confidentiality Advisory Group approval or patient consent are also valid routes.
If you have power over why and how the data is used, then you likely have responsibility to uphold data protection requirements. Most of this should be covered by your existing information governance policies, transparency statements, standard operating procedures, contract templates, etc, but you may need to update them for this work.
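To illustrate what "reducing identifiability" can look like in practice, here is a minimal sketch that drops direct identifiers from a record and replaces the NHS number with a salted hash. The field names and the NHS number are made up for illustration; whether a salted hash is sufficient pseudonymisation depends on your threat model and should be reviewed by your information governance team.

```python
import hashlib

# Assumption: in a real system the salt would be a secret managed securely,
# not a constant in source code.
SALT = "replace-with-a-secret-salt"

def pseudonymise(record: dict) -> dict:
    """Drop direct identifiers and replace the NHS number with a salted hash."""
    token = hashlib.sha256((SALT + record["nhs_number"]).encode()).hexdigest()
    keep = {k: v for k, v in record.items()
            if k not in {"name", "address", "date_of_birth", "nhs_number"}}
    return {"pseudonym": token, **keep}

# Illustrative record, not real patient data
record = {"nhs_number": "9434765919", "name": "Jane Doe",
          "address": "1 High St", "date_of_birth": "1980-01-01",
          "diagnosis_code": "J45"}
clean = pseudonymise(record)
```

Note that pseudonymised data is still personal data under UK GDPR; this step reduces risk, it does not remove your data protection obligations.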
3 - Be clear if it is research:
You are probably doing research. The lines between audit, service evaluation, and research can be blurry; the definitions resolve the question 95% of the time, but as a rule of thumb, if you are working with an external organisation to develop something new then it is research. If you are doing research on NHS data, you should seek the oversight of the HRA, with whom you will define the limits of your programme's scope. This is a lengthy process, and either you or the supporting organisation will have to go through it, so factor in time and resources.
4 - Support stakeholders in project review:
Recall the scenario at the top, where we considered who should know about a healthcare event: weighing that question is one of the functions of an oversight committee. The committee should review a project proposal both before and after the data starts flowing - and you should let the world know it is doing so.
Say a project team wants to automate the monitoring of fatigued patients: the finance team may think this is great, the clinicians may be sceptical that the relevant information can be collected, and the patients may dread losing that social contact. Some project proposals should not be pursued.
To make its decisions, the committee will need relevant information: contracts (eg for data processing and intellectual property), the story of the product (eg to understand its utility), and a Data Protection Impact Assessment (in some cases this may be covered by a system-wide assessment, but in most cases it will need to be project specific).
As an organisation you will also confront new questions in your work (like addressing data and model bias) and this group will help you find an acceptable solution.
5 - Clearly define your remit:
Finally, be aware of how your process fits into the bigger picture. If someone proposes developing a medical device, they will need a quality management system (because of this, organisations often start by supporting only exploratory research, sidestepping device regulation at first). Also, producing a product and using a product are two different things: different education, infrastructure, risk management, and regulatory needs will arise throughout the AI lifecycle. It is up to your organisation how much of the lifecycle it wants to encompass.
Final word
At Newton’s Tree, our mission is to advance healthcare by providing evidence-driven insights and support. While our expertise lies in these areas, our commitment extends beyond any single project or service. We are dedicated to fostering improvements across the entire healthcare system. If you’re part of the NHS and want to discuss AI development or any other topic, get in touch, and we’ll do our best to support your efforts, no matter the focus.