In the week that the European parliament took some big strides towards regulating artificial intelligence through its landmark AI Act, much of the focus is now on how consumer-grade generative AI like ChatGPT will affect healthcare. The EU is recommending a risk-based approach to AI regulation, setting out harmonised rules for the development of AI systems and their placement on the market. The framework includes four risk tiers: unacceptable, high, limited and minimal. Medical devices would be deemed high risk. There’s a good explainer here.
Clinical-grade AI and machine learning have been around for quite a few years and are making great strides in medicine, particularly in diagnostic imaging and pathology, but AI is moving so fast in the non-medical-device sphere that there are increasingly strong calls for a halt.
Just two months ago, EMR vendor Epic and Microsoft announced they were jumping on the potential of the technology by launching an Azure OpenAI service integration with Epic’s EMR aimed at streamlining workflows. This is pretty much where generative AI technology is currently at – promises to free up clinicians by using AI to fast-track administrative tasks – but there are already experiments in using it to answer patient enquiries, for example.
Meanwhile, a new study in the Journal of Medical Informatics, detailed here, looked at how generative AI shows promise as an assistive tool for clinicians trying to solve complex diagnostic cases.
The ABC reported recently that a Perth doctor had been playing with ChatGPT and used it to generate a discharge summary, leading the area health service to issue a warning not to do so, and the AMA and RACGP to lumber into action to ask members to cease and desist.
It has spurred our leading experts on AI in healthcare to get more vocal. In a perspective article on the use of AI published this week in the MJA, Macquarie University’s Enrico Coiera, RMIT’s Karin Verspoor and the CSIRO’s David Hansen have jointly called for a pause on the use of non-medical grade AI in clinical practice.
“Systems such as ChatGPT are not designed for use in clinical settings and have not been tested to be used safely in any aspect of patient care,” they say. “When it comes to clinical applications, producing text or images that are convincing is not the same as producing material that is correct, safe, and grounded in scientific evidence.”
Professor Coiera led the establishment of the Australian Alliance for AI in Healthcare, which has already developed a roadmap for AI in healthcare and called for a national strategy, and it is this body that he suggests should lead a national conversation on what to do next. “It is the unintended consequences of these technologies that we are truly unprepared for,” the authors say.
“There has been a view in some quarters that all we need to do as a nation is adopt the best of what is produced internationally, and that we do not need deep sovereign capabilities. Nothing is further from the truth.”
Let us know what you think in our poll question for the week below.
Elsewhere, the South Australian government has come through with some much-needed funds to make a couple of its virtual health services permanent. The Child and Adolescent Virtual Urgent Care Service (CAVUCS) has been a pioneer for paediatric virtual care services in Australia, and the adult SA Virtual Care Service (SAVCS) has built on SA’s long history with telehealth and remote health monitoring, starting with the iCCnet cardiovascular network that still runs much of SA’s rural services.
And harking back to our whinge last week about cowboys in the telehealth sector, this week Wesfarmers’ new healthcare division picked up InstantScripts – not a cowboy – for a very cool $135m. Wesfarmers has been building its health division following the purchase last year of Australian Pharmaceutical Industries (API), which owns the Priceline and Soul Pattinson banners.
The silly money is still there, however. We reckon Wesfarmers overpaid.
That brings us to our poll question for this week.
Do we need a national strategy for AI in healthcare?
Vote here and leave your comments below.
Last week, we asked: Are there too many cowboys in the telehealth sector? 70 per cent of readers said yes. We also asked if you thought they were hurting its reputation. Here’s what you said.
Great article on the potential impact of Europe’s AI Act and ChatGPT in healthcare. It’s exciting to see the progress being made in using AI to improve patient outcomes and reduce healthcare costs. Looking forward to seeing how these developments will continue to shape the future of healthcare. Thanks for keeping us updated! -Alex Cool
“AI” means different things to different disciplines. I have written a blog on its application to NLP and the issues with the ChatGPT class of solutions. See Reining in the wonderment at ChatGPT and LLMs
https://www.jon-patrick.com/2023/05/reining-in-the-wonderment-at-chatgpt-and-llms-chapter-1/
Do we need a national strategy for AI in healthcare? An overwhelming yes from our readers: 92 per cent were in the affirmative.
We also asked: If yes, who should develop it? If no, what are your reasons?
Here’s what you said:
– Best course of action is free market competition within federal guidelines
– The Australian Alliance for AI in Healthcare is best positioned to develop given it is a consortium of AI leadership in Australia. And the strategy needs to include a clear workforce strategy for responsible use of AI
– Independent think tank
– Opportunities are too broad for a single national strategy, and it would be out of date once it is defined. There would also be overlap with non-healthcare specific providers making it untenable
– The government
– It needs to be prepared by a leading committee (oh no!) of no more than 11 people comprising individuals or NFPs involved in digital health. No professional bodies because they are biased. Then there could be sub committees which are thrown a topic to research and send back to the lead committee. These sub committees could involve professional bodies and all given a deadline of when their topic is to be returned. Lead committee members can visit a topic sub committee at any time to monitor progress. When all topics are back, the lead committee can review the topics, change as required and assemble into the final recommendations for public comment. After the public comments, the lead committee can review all comments and make any necessary changes. Give it 2 years from go to woah.
– AAAiH
– Health Information Managers because of their experience and expertise with data governance.
– patient quality/safety and informatic communities
– This needs to be a centrally lead collaboration between consumers, indigenous and vulnerable peoples such as the disabled, clinicians, subject matter experts and the technology sector.
– Federal government with extensive input from experts, stakeholders and consumers
– AAAiH
– Federal government agencies in collaboration with State governments
– ADHA
– Commonwealth with non-commercial players
– Dept of Health and Aged Care with independent/arm’s-length entity providing governance advice to government and industry.
– Federal Government
– A true integrated collaboration of consumers funders AMA and medical colleges
– Federal with State, but owned 100% federally
– CIOs in each state
– ADHA
– Base on best practice from the EU, don’t reinvent the wheel
– Australian Digital Health Agency – It’s currently a significant beat-up with a lot of gaps. A fundamental tenet is accessible, good-quality data, and in healthcare this is still a significant issue.
– Government in consultation with industry.
– Government with health experts
– Board consisting of key stakeholders – service providers, software houses, standards, governance, clinicians
– Industry leaders along with Government
– As the article in the Guardian refers to: there is the chance for democracy to be undermined if governance is not brought about to address and regulate its many applications. I also agree with the EU decision to credit and acknowledge the authors of the original works.
– The gov!
– A multidisciplinary team that includes health consumer and whānau PLUS descriptions about how the strategy might be implemented AND regulated.
– Department of Health (Federal and State)
– AIDH and ADHA
– Combination of federal govt, local gov, industry