The challenge of AI in the doctor’s office

Published on 13/08/2025 | Written by Heather Wright



Unregulated but time-saving – and what about consent?…

Visit your doctor these days and chances are AI will be listening in on your conversation.

But while the AI scribes which transcribe and document conversations are increasingly seen as a great way to remove some of the administrative burden and free up time to focus on the patient, experts on both sides of the Tasman have issued some notes of caution.

“Prior research suggests that skipping the manual process of writing notes may lead to missed clinical insights.”

Digital scribes – aka speech recognition software – have been around for a while in the medical sector, but today’s offerings harness AI and large language models to summarise the transcribed text.

Along with transcribing consultations in real-time and generating clinical notes for a doctor to review and add to a patient’s medical record, some AI scribes can generate patient letters or summaries in plain English, or propose follow-up tasks.

The benefits are highlighted in a new University of Otago survey of 197 primary care health professionals.

Lead researcher Professor Angela Ballantyne, a bioethicist at the university’s Department of Primary Health Care and General Practice, says 40 percent of those surveyed were using AI scribes to take patient notes, with 47 percent of users estimating that using the scribes in every consultation could save between 30 minutes and two hours a day.

But a significant minority said the length of time required to edit and correct AI-generated notes meant there was no time saving.

Key advantages for those using the scribes were reduced multi-tasking, time savings, reduced cognitive load and improved rapport with patients – the research paper notes that in New Zealand 20 percent of consultation time is spent interacting with a computer, with 12 percent completely excluding the patient.

On the flip side, key concerns included compliance with legal and ethical frameworks, security of patient data, errors or omissions in clinical notes and the risk of patient data leaving New Zealand.

Health professionals noted concerns about accuracy, completeness and conciseness of the patient notes produced, with comments about the scribes missing ‘critical negative findings’ and having a ‘quite high’ rate of ‘often quite subtle’ hallucinations. Coping with Kiwi accents and te reo Māori was a challenge, and one respondent noted they paused recordings if they needed to discuss information which identified the patient.

“Most AI scribes rely on international cloud-based platforms – often privately owned and controlled – for processing and storing data, which raises questions about where data is stored, who has access to it and how it can be protected from cyber threats,” Ballantyne says.

The research paper highlights the need for resources, guidance and training to ensure the accuracy of notes, and for consent mechanisms that allow patients to opt out and still access clinical care. It found that of those using AI scribes, just 66 percent had read the terms and conditions on the use of the software, and only 59 percent had sought patient consent.

The ability to opt out is something of a sore point for Australian AI expert Kobi Leins.

She recently noted on LinkedIn that a specialist told her that if she wanted to see them they would be using an AI transcription tool – one Leins, whose roles include technical expert with ISO and Standards Australia, had reviewed ‘and would not want my kids’ data anywhere near’ – and there was no option to opt out.

She cancelled the appointment and approached the Australian Health Practitioner Regulation Agency to ask about its position on patients having no right to refuse.

“Most health AI is not regulated by the Therapeutic Goods Act,” Leins says. “I have reviewed many of these tools. Most have not previously had AI Impact Assessments, nor do they comply with ISO/IEC 42001 (the AI management standard), let alone privacy or other legal compliance.”

Of particular concern, she says, is when the tools are used on children, or by specialists where there are no other options available.

She says she can’t think of any other situation in medical care where patients can’t opt out of an optional process, noting that even Australia’s national health records scheme has an opt-out provision.

As Leins notes, unlike most medical devices, AI scribes are largely unregulated, leaving it up to individual practices to decide what tools to use, and how.

Health New Zealand approved the use of two ambient AI scribe tools – Heidi Health and iMedX – for Kiwi clinicians last month. A number of other ambient scribes, which automatically capture and transcribe patient-provider conversations and put the information into clinical notes, are currently being reviewed for approval.

Heidi Health says a pilot at Hawke’s Bay District saw average documentation time reduced from around 17 minutes to just over four minutes per patient.

The Medical Council of New Zealand is also expected to release guidance for the use of AI in health later this year. That guidance is likely to include a requirement for patients to provide consent, Ballantyne says.

Dr Saeed Akhlaghpour, associate professor of information systems at the University of Queensland’s Centre for the Business and Economics of Health, says he’s ‘cautiously optimistic’ about AI scribes in healthcare, saying they have the potential to benefit both doctors and patients – but it depends on how they are implemented by individual doctors and health clinics.

“Research has shown that AI scribes can help doctors stay more present in the consultation, listening more carefully, maintaining eye contact and focusing on the patient rather than the keyboard. This can lead to a more human, attentive and reassuring experience for patients,” Akhlaghpour says.

Australian reports suggest nearly one in four GPs are using the scribes.

But while Akhlaghpour sees plenty of benefits, he says the risks are also real.

“These tools can make mistakes, especially with strong accents, noisy rooms, or medical jargon. They are not currently regulated by the Therapeutic Goods Administration and legal responsibility for the content still lies with the doctor. Privacy is also a major concern,” he says.

If clinicians begin trusting AI-generated notes without properly reviewing them, errors may slip through, he says.

“But there’s also a deeper question here: Will outsourcing documentation change how doctors think, reflect and reason about a case?

“Prior research suggests that skipping the manual process of writing notes may lead to missed clinical insights.”

Despite the challenges, Ballantyne says the prognosis for AI scribes in the health sector is good – so long as use is coupled with appropriate training, good governance and patient consent.
