Vendors of healthcare AI solutions must be prepared to share the original dataset on which their technology is based.
That was the message delivered by Rob Bart, chief medical information officer (CMIO) at the University of Pittsburgh Medical Center (UPMC) in the US, to last week’s Future Health Summit in Dublin.
Dr Bart said this transparency was essential to ensure that AI tools are fit for purpose and equitable.
For any AI algorithm, whether regenerative or generative, access to the original test dataset is critical to determining its value to a healthcare system, he said.
“You need to ask the vendors to give you access to that data. You need to understand the demographics of that data. You need to understand how the population that the algorithm is based on differs from or is the same as the population that you deliver care to.
“That will give you an understanding of the potential for any inherent bias either for or against the population that you care for.
“We have an expectation in the professional practice of physicians or caregivers that they are accountable for the decisions they make. We also need to make sure that the tools we use are auditable and accountable.”
Dr Bart has extensive experience in delivering digital health solutions, initially as the first CMIO for the Los Angeles County Department of Health Services, before moving on to UPMC. He has overseen several major digital initiatives at UPMC, some of which have been developed in-house and others through outside vendors.
He believes securing the necessary transparency from developers will be a challenge.
“I have found that vendors are not reluctant to share their algorithm but they are reluctant to share the dataset,” he said. “I think, as healthcare practitioners, we need to really press the vendor community to be open and transparent about that. It will give us better insight into how to utilise their tool within our care delivery.”
Dr Bart said ethical considerations must be a central issue for any healthcare system contemplating the introduction of AI tools.
“One of the guiding principles in everything we do within healthcare is grounded in ethics and we need to ensure that the tools we bring in are also respectful of ethical principles and values.
“It is part of our professional responsibilities to make sure that these tools have the highest standard of ethical considerations embedded within them.”
Other important issues to consider are patient confidentiality and data ownership. These have become increasingly challenging as developers gather data to facilitate AI learning, he said.
“Point of care ultrasounds have become popular in the US. Most of the companies that have these would like the images to go to cloud storage that they control. Those images are then used secondarily to train AI algorithms.
“We actually challenged some of these companies and we don’t allow them to extract our images into the cloud. We keep them on the premises or in a domain that we can control. We are not losing the ability to manage and maintain the privacy and security of the patients that we need to take care of.”
In an effort to meet some of those ethical responsibilities, UPMC has established its own policy and procedure manual for the acceptable use of artificial intelligence technology. The document states that “AI has the potential to greatly enhance the delivery of care and improve business processes at UPMC. However, AI does not relieve UPMC staff (including medical staff) of its obligations, including those to patients.”
“Regardless of the use, the highest standards of professional responsibility must be observed. By example, in the use of AI, a staff member remains responsible for ensuring that any results of AI are accurate and appropriate, based on standards of care,” it states.