Artificial intelligence scribes designed specifically for medical settings have only been available in Australia for a short time but are gaining momentum among GP clinics.
But while these tools can help ease the administrative burdens faced by GPs, the RACGP and the AMA have called for some caution on how AI is used in healthcare settings.
It comes as new research suggests AI scribes take better notes than GPs.
Dr Darran Foo, a GP and deputy chair of the RACGP Digital Health and Innovation Specific Interest Group, said anecdotally about a third of Australian GPs were now using AI scribes.
An AI scribe captures speech during a clinical encounter and converts the audio into text.
As well as producing written transcripts of conversations, these tools provide a summary and may also assist with the preparation of clinical notes, discharge summaries, treatment summaries, and referral letters.
Dr Foo recently conducted a study which found that clinical notes compiled by AI scribes were of a higher quality than those put together by GPs.
“We were comparing the quality of the documentation of the clinical note as output by the AI scribes and comparing them in a simulated setting to what the human doctor was writing,” he said.
“What we found was, by far, when we had independent GPs look at all the different outputs, they favoured the outputs from the AI scribes and rated them as a higher quality output than what the human GPs had written.”
Dr Foo uses an AI scribe himself and said that, in his experience, the tool had allowed him to be more engaged with his patients.
“I feel like I’m more present because I’m not looking at my screen, I’m looking at them, and don’t have to type every few minutes,” he said.
“They can actually be quite powerful if someone is talking about a really complex mental health issue, and it’s quite an emotional consultation, then the dynamic changes if I’m furiously typing away versus being there, fully engaged.”

Dr Foo said it was important that GPs had informed consent from patients to use such tools.
“Informed consent is a little bit more than just saying ‘Is it okay for me to use this?’. Some clinics may have an information sheet for patients to read at the front desk, so by the time a patient talks to a doctor about it they already have the background information, and the GP can then summarise that and ask if they consent.”
He said most GP practices would have policies around the usage of AI scribes and if they did not, they should be working on them.
On due diligence, Dr Foo said it was up to GP practices to do their own checks on whether the tools they used met Australian privacy standards.
Some AI tools collect data for secondary purposes, such as for training AI models in order to improve the output of the digital scribe or to develop other AI products.
Clinicians who use an AI scribe tool should always review the notes and make changes when necessary.
“We know that these tools are not always 100% accurate, and so mistakes can occur, they can miss things, or they can incorrectly put in things that weren’t said or talked about, or it can mishear names or medication names or dosages,” Dr Foo added.
“So, it’s important that clinicians using these tools review the outputs because ultimately they’re signing off their name on that clinical record, and all the medical and legal responsibilities currently still lie with the clinician.”
The RACGP also points out that summaries compiled by AI scribes may not contain detail from other sources, such as recent hospital discharge summaries, pathology/diagnostic imaging reports and other elements of the electronic health record, nor nonverbal cues from the patient or data from medical devices.
The College warned that as these tools gain popularity, there could be potential for GPs to become over-reliant on them and pay less attention to critical clinical details, resulting in errors that could affect patient safety.
The AMA has also called for caution on the use of AI while acknowledging it has the capability to revolutionise the medical sector.
In a statement calling for expert clinical oversight of the use of AI in healthcare, the Association said it supports regulatory measures that protect patients, consumers, healthcare professionals, and their data. It insists AI must remain a complementary tool — never a replacement for clinical judgement.
AMA President Dr Danielle McMullen said: “Any use of AI in healthcare must be clinically led, ethical, safe, and patient-centred, with its sole purpose being to advance the health and wellbeing of patients and the broader community.
“AI should always serve as a supporting tool and must never compromise a medical practitioner’s clinical independence or professional autonomy.
“Accountability must remain with the clinician or clinical team and ethical considerations – such as patient privacy and surveillance, bias and discrimination, and the philosophical challenge of human judgement versus AI systems – must be thoroughly addressed.
“The final decision on patient care should always be made by a human.”