Your notes, your responsibility



The AI horse has already bolted and medical AI scribe services are now a dime a dozen – but don’t get too comfy.


As a medical professional, your notes are still your problem, no matter who – or what – is writing them, the RACGP has warned.  

Over the last year, medical AI scribe services have taken the sector by storm, popping up left, right and centre.

Most services use automatic speech recognition to transcribe consults into patient notes, with differing levels of sophistication. 

Some services are even integrated with practice management software to consolidate patient files. 

Catching up with the bolted horse, the RACGP has released guidance for healthcare professionals on using the tools to their advantage while mitigating risks. 

The recurring theme was not to leave anything to chance.

“[Doctors] should carefully review the output prepared by an AI scribe for false positives and negatives and edit the text as required (adding any missing information or omitting incorrect information),” read the guidance. 

“The [doctor] can then add their own notes and observations, and in some cases attach documents, before signing off on the documentation.” 

With no mandatory regulatory assessment of these AI tools, and no product recommendations from medical bodies or regulatory agencies, doctors and practice owners are left to decide which tools to use and how to use them.

In its guidance, the college said it would be “prudent” for practice owners to draw up policies on the use of scribes in their practices. 

When picking a product, check where the data is stored, added the college.  

Any service storing data outside Australia should be subject to privacy standards comparable to Australia's.

“The software vendor offering the AI scribe should provide assurances about how the data is encrypted, stored, and destroyed,” said the college. 

“General practices should also have their own policies and procedures for managing information security to help prevent data breaches, such as enabling multifactor authentication on AI scribes and other applications.” 

Some vendors may also sell data to third parties or use the data for training or product improvement. 

“[Doctors] should consider whether any secondary use of data specified in the user agreement is appropriate and acceptable before purchasing the product,” said the college. 

But ultimately, it remains a doctor’s responsibility to keep their notes up to scratch, and to obtain consent from patients for use of any scribe service. 

“Some medical defence organisations (MDOs) require that [doctors] obtain the written consent of the patient,” said the college. 

“Recording a private conversation without consent is a criminal offence in some Australian jurisdictions.” 

Speaking on the guidance, RACGP president Dr Nicole Higgins said the big plus would be a reduced administrative burden and less burnout.

“The administrative burden on GPs needs to be reduced urgently – our annual Health of the Nation report found GPs are increasingly reporting the administrative workload and associated stress among their greatest concerns,” she said. 

“[AI scribe] tools will allow [doctors] to focus on the patient instead of their computer during a consult, meaning happier patients.” 

“But they do need to be used with caution.” 

And don’t worry: AI will never replace a doctor.

“Everyone deserves the quality care that comes from having a GP who knows you, and your health history – AI can never replace this relationship,” assured Dr Higgins. 

“But it can help with administrative tasks, and this will help GPs focus more on our patients, which is what we want.” 

The RACGP urged doctors to use their judgment on what tools to use and declined to recommend any particular AI scribe products. 
