When machines take care of patients
“Technology can be better than humans at analysing big data sets and finding patterns that can detect and explain certain health conditions,” says Tom Hester, Vice President Healthcare at Lockton. “This has the potential to transform healthcare and create a vast number of new solutions, but liability insurance products will need to address the new risks that come with it,” Hester notes.
AI, for example, is helping to detect diseases such as cancer more accurately and at earlier stages than humans can, significantly improving medical assessments in breast cancer screening, to name just one application. About 10 percent of women receiving mammography are recalled for additional tests because their screening is read as positive, but only 0.5 percent of those screened are found to have cancer, according to the US-based National Cancer Institute. In other words, about 9.5 percent of women screened undergo additional tests only to learn that their result was a false positive.
Several research groups are using AI to study radiological images and assess breast cancer risk, as well as to support diagnosis and treatment, the identification of molecular targets, and drug discovery.
Another project already in the application phase is IBM’s Watson for Health, a computer system that is helping healthcare organisations apply cognitive technology to unlock health data and power diagnosis. Watson can review and store large amounts of medical information such as journal articles, symptoms, and case studies of treatment and response. It can help generate insight using data analytics, advisory services and advanced technologies such as AI in the fields of oncology, genomics, clinical trials, radiology and medication management, thus freeing up more time for healthcare professionals.
But if machines make an incorrect assessment this might no longer be covered by the medical indemnity protection of the healthcare provider.
“If a machine makes a wrong decision, the consequences may only be covered by the liability insurance policy if the decision has been reviewed and approved by a doctor,” Hester says.
“Otherwise, it could be a case of product liability, and sometimes it might be difficult to determine what caused the error – whether it was the machine or the healthcare professional,” he notes.
The issue also applies to the deployment of robots in medicine, which range from simple laboratory robots to highly complex surgical robots that can either aid a human surgeon or execute operations by themselves. In addition to surgery, robots are used in hospitals and laboratories for repetitive tasks, in rehabilitation and in physical therapy.
“Already today, surgeons perform operations via surgical robots without once physically touching the patient,” says Hester. “Through technology, highly specialised and respected surgeons can operate remotely, far away from the patient, using surgical robots,” he says.
The practice has gained traction among surgeons in performing complicated surgeries. Its use is transforming the delivery of patient care in places that severely lack access to expertise owing to geographical or time constraints.
If a physical presence is no longer required, a surgeon could perform an operation on a patient somewhere in Africa from a clinic in New York. Top surgeons would no longer waste valuable time travelling around the globe to perform operations. Similarly, patients would be spared the burden of travelling with a health condition. Such a scenario does, however, raise some fundamental questions for liability insurance providers in case something goes wrong.
“The question of liability will need to be clarified in such cases,” Hester says. “The surgeons’ liability will usually only cover the damage if they have definitely caused the negative outcome. Any other issues such as performance issues of the machine or connectivity problems will have to be addressed separately,” he explains.
“Another critical aspect could be to determine which party should be considered liable – those in the robot-operating theatre in Africa or the surgeon working in New York,” Hester adds.
Meanwhile, an increasing number of online applications offer symptom checkers that analyse a patient’s condition. They assess known symptoms and risk factors to provide medical information. While such services can be very helpful and save both patients’ and doctors’ time, the question of liability needs to be revisited here to prepare for cases in which the IT platform fails or the patient acts on the advice without speaking to a doctor.
“While innovation is absolutely welcome, it often requires adjustments to the insurance programme,” Hester says.
Depending on the insurance policy, doctors may also not be covered for employee liability outside the organisation, which could technically exclude online examinations or advice.
For further information please contact:
Tel: 0207 933 2206