It has been some time coming, but this week there was a major breakthrough in the use of AI-assisted transcription tools in general practice, when Best Practice announced it had partnered with very youthful start-up Lyrebird Health to integrate the latter’s Lyrebird Scribe tool into Bp Premier.
Rival Medical Director was not far behind, announcing it too was in discussions with a number of tech providers to develop a transcription capability that Telstra Health plans to add to its new Smart suite of tools for MD. (Telstra Health has big plans for its Smart suite that are worth taking a close look at.)
While these sorts of cheap and freely available AI-assisted voice-to-text apps for medical transcription in primary care have been around for almost a year – a veritable lifetime in AI terms – what is different about Bp’s announcement is that it is embedding Lyrebird into its PMS. Other popular offerings such as Heidi Health do the same job in terms of AI prompts and can be used with any PMS simply by copying and pasting notes, but integrating the app into the PMS has always been the holy grail for GPs.
What is more interesting than the tech – voice-to-text medical transcription has been around for quite a few years now, although it has been massively boosted in efficiency by conversational and generative AI – is what routine use of these tools will do for the patient experience. It will be interesting to see how GPs in particular introduce patients to these apps in routine consultations, given that consent to being recorded has long been an impenetrable barrier.
These new tools, though, are able to listen in, provide real-time transcription, convert the consult to a clinical note and then wipe all trace of the audio in a matter of seconds. It will be fascinating to see whether there are immediate changes to doctor-patient interactions, with the doctor free to concentrate more on the patient in front of them than on their computer.
Transcription tools have of course been commonplace for medical specialists’ dictation and hospital notes for years, but they have never been able to penetrate the GP market. GPs have been very early adopters of this newer technology, though, probably because it is actually useful to them.
That Bp has chosen to partner with a start-up like Lyrebird, established by two uni students with no history in healthcare, would normally raise everyone’s eyebrows, but the company’s commitment to privacy and security for patients and practitioners is impressive, although we would like to know a little more about how its AI has been trained and on what. We also think it might be an idea for the GP colleges to publish some guidance on how to use these tools with patients, including gaining their consent.
We also think there is real potential here for GPs not just to use the tools for their own notes, but to generate consult notes for patients. No more in one ear and out the other? That would be a breakthrough indeed.
That brings us to our poll question for the week:
Will AI transcription tools improve the patient experience in consultations?
Vote here and leave your comments below.
Last week we asked: Is there a need for comprehensive standards for LLM-generated clinical summaries? Our readers said yes overwhelmingly: 88 per cent for, 12 per cent against.
If yes, who should be in charge of developing them? Clinicians and GPs, the medical profession, colleges and regulators, some said, while others nominated the government, Standards Australia and the Digital Health Agency. Here’s what else you said:
Fascinating that we are going to take a transcription tool with a 10% word error rate in ideal conditions (not a GP office), connect it to an LLM trained on Reddit, Wikipedia and the unwashed text of Twitter, and expect it to consistently deliver high-quality, error-free summaries ready for insertion into a medical note.
Even if it does work well, how are we happy to simply turn it on in the vast majority of General Practices (for a fee), all the while knowing that there has been zero research conducted by the medical profession and colleges on whether this is safe?
I love innovation, but we should at least be TRYING to understand the true level of risk of these technologies before allowing them to be used mainstream.
The next thing is that these summaries will be sent, with a very low likelihood of adequate proofreading, to other specialists and healthcare organisations, who might have no idea it is the creation of an untested tool. It won’t take too many word substitutions or errors of dose, omission or negation before the Swiss cheese slices align and somebody gets hurt…
I am not anti-innovation, but do the end users TRULY appreciate the risk posed by these tools?
Focus on the patient and not on the documentation during the consult
People prefer to talk to someone who is giving them their attention; having to type information takes the clinician’s attention away from the patient. In the case of Indigenous patients this can mean they get up and leave
Focused attention
Gives you back time!
It hears everything, even the comments I might have overlooked or forgotten
Diminish transcription errors by the GP. Include more of the direct patient input
Better engagement with patient. Less time spent on documentation. Increased efficiency for doctor and increased satisfaction for both patient and doctor.
No more staring at computer screen and not interacting with the patient
Able to focus on and engage with the patient and ask more relevant questions, but if the AI cannot understand accents and other health conditions, the transcription will be complete rubbish
It’s not clear from the article how these tools would be used. There needs to be a lot more discussion about their use in general practice, where the history has been not to provide any notes of the consultation. The risk is that if they get it wrong, and the patient can prove they followed the instructions and it resulted in harm, any documentation provided would be evidence in law. Might be of benefit in aged care, where the best GPs don’t do home/residential care visits and there is wide disparity in the quality of care provided, based on my experience.
Reduce patient requests being forgotten or not transcribed. Free up the doctor’s cognitive time to stay focused on problem-solving the medicine and not the IT. Better quality referral letters. Improved communication through improved accuracy
I can see both parties getting something out of adoption. Practitioners get an accurate record of the consult and patients, an understandable summary complete with useful links and information.
As the technology advances and trust is established in the recorded output, time-poor practitioners can redirect their focus towards patients, building empathy and potentially uncovering insights beyond what a patient may initially divulge, thanks to their ability to sustain eye contact and listen attentively.