The use of AI tools for clinical note-taking is set to surge among doctors, according to new research from medical indemnity provider Avant Mutual, leading to a growing need for robust regulation and mandatory minimum standards.
The research, involving nearly 600 Avant members across Australia, was conducted in August this year and focused on doctors’ use and knowledge of AI across specialties and career stages.
Avant’s legal and policy adviser, Tracy Pickett, said doctors were already using AI tools in small numbers, but there was significant interest in using AI scribes for clinical note-taking in future.
Ms Pickett is speaking on the topic of AI, medico-legal risk and regulation at the Australasian Institute of Digital Health’s AI.Care 2024 conference in Melbourne next week.
Ms Pickett said Avant’s research shows that one in two doctors is either currently using or wants to use an AI scribe.
“While 11 per cent of doctors surveyed are already using an AI scribe, 39 per cent of doctors reported they were likely to use an AI scribe in future,” she said.
“Similar levels of interest were reported across career stages, with almost half of doctors practising for more than five years (47%) and six in 10 of those practising for five years or less either using or likely to use an AI scribe.”
She said there was broad interest across the specialty groups, with around half or more of each saying they were using or likely to use an AI scribe: 48 per cent of GPs, 64 per cent of surgeons and 61 per cent of physicians.
Ms Pickett said the research showed that doctors were attracted to using AI scribes by a combination of administrative efficiencies and the ability to focus on the patient.
“Doctors have indicated they are mainly using AI scribes to save time (88%) and focus more on their patients (72%), and because they were easy to use (53%),” she said.
“However, many doctors lack knowledge about AI generally – in fact, three in four doctors in our survey reported having a fair to poor knowledge of AI overall.”
Balancing clinical safety
Ms Pickett said Avant was concerned about the challenges facing doctors as they balance clinical safety and efficacy, the pace of AI developments, and limited knowledge about how AI works and its risks.
Avant is calling for proactive policy actions by government to strengthen transparency, assign accountability and establish robust regulatory oversight and governance for AI in healthcare.
Mandatory minimum standards for all AI tools used in healthcare should be developed, Avant says, so that any AI products not regulated currently would become subject to regulatory oversight.
This includes AI scribes, consumer health products, digital mental health tools and any AI tools that suggest clinical findings or make recommendations that could lead to adverse patient outcomes if inaccurate or not acted upon.
“AI holds immense potential for healthcare, however patient safety must always remain paramount,” Ms Pickett said.
“Added to this, using AI in healthcare introduces complex medico-legal and privacy risks that existing regulatory frameworks are ill-equipped to address.
“AI scribes are a key example. These tools are not currently subject to any standards, making it difficult for doctors to assess whether they are safe or fit for purpose.
“It is critical that this gap is addressed, with mandatory minimum standards for all AI tools used in healthcare so that any AI products not currently regulated become subject to regulatory oversight.
“While GPs should be aware of their professional obligations when using any AI in practice, this must be in the context of a clear and robust regulatory framework that supports innovation and safeguards against potential harms as AI in healthcare continues to advance.”
Avant has released a position paper on AI in healthcare and medico-legal risks.
Tracy Pickett is speaking on the topic of AI, medico-legal risk and regulation: an indemnity insurer’s perspective at the AI.Care 2024 conference on Wednesday 27 November 2024.