
Blog: National AI in healthcare plan gets three thumbs up

24 November 2023 | 14 comments
By Kate McDonald
Former Victorian Department of Health chief digital officer Neville Board at the AI.Care 2023 conference. Image courtesy Gil Carter.

Yes, our headline is a dodgy DALL-E joke, but it’s also a reflection of the trust deficit in consumer-grade AI as the ChatGPT revolution continues, almost a year to the day since it was released, and in light of the extraordinary shenanigans that have taken place at GPT developer OpenAI over the last week.

For those out of the loop or not particularly interested, here’s a good backgrounder. For those wanting the inside scoop, here’s an interesting take. For those hankering for a good old-fashioned conspiracy, here’s an objective look. And for those wanting to know what this clique of Silicon Valley lunatics are really up to, this is probably your best bet.

What appears to be happening in the battle for OpenAI is perhaps a reflection of the wider debate over the opportunities and threats that AI poses, and of the motivations of those now ruling the roost. On one side is the not-for-profit, research-oriented, for-the-good-of-humanity-but-warning-of-global-destruction faction; on the other, the hell-for-leather, accelerationist, let’s-embrace-the-opportunities faction. Most of the people on both sides appear to belong to an emerging cult called “effective altruism” that appears to be anything but, exemplified as it is by utterly appalling people like Elon Musk, Peter Thiel and Sam Bankman-Fried.

All of this nonsense was going on in the background as the Australasian Institute of Digital Health held its inaugural AI.Care conference this week, which was a seriously good look at the serious side of AI. As was repeatedly mentioned over the two days, AI has been used in healthcare for more than 70 years, most prominently in radiology, and that discipline is still leading the way.

One of the best sessions was on the promise of AI in healthcare, which included Dimitry Tran from Harrison.AI and two clinicians who have used his technology – Monash Health’s Ronnie Ptasznik on implementing AI in radiology, and Virtus Health’s Petra Wale on using it for embryology. We’ll bring you those stories in the coming weeks.

But the conference looked not just at the potential of AI in healthcare but also at its potential pitfalls. The release of the national policy roadmap for AI in healthcare was very warmly received and, judging by the broad acceptance from the audience, seemed to be a goer. It remains to be seen whether it will be picked up by government, but it is a practical document outlining both the opportunities and challenges of AI in healthcare and how Australia can try to catch up with the rest of the world.

Along with the calls to develop a safe and ethical system, it also put a lot of emphasis on how the government can help industry to exploit the potential of AI. This is something Pulse+IT thinks is an essential and most welcome aspect of the roadmap.

One big recommendation concerned communicating the need for caution in the clinical use of generative AI, including the preparation of clinical documentation. General practice is currently taking to gen AI like a duck to water, according to the very entertaining Roy Mariathas, and so are hospital clinicians, who are being offered a somewhat safer alternative built on the same code, as Mater Health’s Bruno Braga explained. Attention was also drawn to the Victorian Department of Health’s advisory on generative AI, which is probably ripe for adoption by the other jurisdictions as everyone grapples with this brave new world.

AI may have been used in healthcare for 70 years, but the last year has been something else. This conference was a great way to begin to grapple with what is now confronting healthcare. We’ll probably need another one in another six months.

That brings us to our poll question for the week:

Do you support the AAAiH national roadmap plan?

Vote here, and leave your thoughts below.

Last week we asked: Should non-medical generative AI be regulated if used in clinical settings? 90 per cent of readers voted yes.

Here’s what you said.


14 comments on “Blog: National AI in healthcare plan gets three thumbs up”

  1. No priority area for data, e.g. standards, governance, accuracy, access to source data at the lowest possible atomic level, or the ability to make use of the maximum number of data points to best suit AI use cases.

      1. This is 100% in the remit of ACSQHC. They administer and oversee the accreditation process, just as they do for other areas of healthcare and diagnostic services. ACSQHC engage the expertise they need to assess against the accreditation requirements that they administer.

      2. It doesn’t cover the existing limitation we have of quality data. There’s a substantial risk of dangerous models based on poor quality data.

        Quality data relies on data and interoperability standards. This is not mentioned in the report. Leaders and policy makers should prioritise data standards adoption and data sharing/access agreements based on patient preferences.

          1. “It doesn’t cover the existing limitation we have of quality data.”… I totally agree with this statement – where is the quality assurance of existing non-AI based applications in healthcare? There is no requirement for traditional software applications (or those governing the use of the applications) to support data standards, interoperability standards, or sound patient identity management. Currently anything goes, users are none the wiser, and the idea of training AI tools on this poor quality data foundation has the potential to exponentially increase our digital health risk. When are we going to start at the beginning and get the basics right first?

            • Whilst the roadmap is a great initiative, I agree that we are putting the cart before the horse. Quality data is a prerequisite for good AI, and we need to solve that challenge first.

              • George Margelis
              1. Yes indeed George, quality data should come first. However, all too sadly, there is little interest in enforcing that principle as opposed to promoting the widespread adoption and deployment of new technology as fast as possible. This is the root cause of so many major IT failures in health, with the My Health Record being but one example.

                • Dr Ian Colclough
