
A leading MedTech expert has issued a stark warning about the risks to patient safety and data privacy posed by the widespread use of AI tools by GPs in NHS consultations.

The advice comes after news that NHS leaders ordered hospitals and GPs to stop using AI tools that could breach data protection rules and potentially harm patients.

Dr Andrew Whiteley, a former GP and founder of Lexacom – one of the UK’s longest-established developers of next-generation speech-powered products – has called for urgent action.

He has requested that ambient voice technology (AVT) tools meet NHS compliance standards before being deployed in patient care.

“Harnessing efficiency gains in primary care via AI is important, but this should not come at the cost of patient safety or data security,” he said.

“The reality is that GPs are increasingly under pressure and time poor. Despite this, there is a clear hunger to interrogate emerging solutions, and embrace new systems that could be transformative for the clinicians themselves, and for patients too. 

“So the task at hand is to offer them guidance to make choices that are compliant with NHS regulations.

“In particular, any solution that processes patient consultations through AI must, at minimum, automatically redact personal data before processing – otherwise the risk of breaches is simply too great. 

“And crucially, we believe data must be stored in the United Kingdom, where it is protected under UK law – providing greater clarity and reassurance around accountability and legal safeguards.”


Several unapproved AVT tools are currently in use, often on free trials or without formal commissioning. 

In April, NHS England promoted the benefits of ambient voice technology to doctors and outlined national minimum standards for its use.

But a letter seen by Sky News revealed that NHS leaders later warned clinicians about the risks of using unapproved tools that do not meet those standards.

The letter read: “We are now aware of a number of AVT solutions which, despite being non-compliant … are still being widely used in clinical practice.

“Several AVT suppliers are approaching NHS organisations … many of these vendors have not complied with basic NHS governance standards.

“Proceeding with non-compliant solutions risks clinical safety, data protection breaches, financial exposure, and fragmentation of broader NHS digital strategy.”

While NHS England sets baseline standards, it does not mandate which software suppliers can be used. 

Dr Whiteley added: “There’s a tension we must address. The public is open to AI in healthcare – but only if it’s used responsibly.

“Trust is fragile. We owe it to patients to show that innovation can go hand-in-hand with robust protections.”
