Sharing medical imaging data with AI apps raises concerns

Opening the door to third-party AI apps is a double-edged sword for patients. The technology's potential in healthcare is promising, but privacy protection remains a work in progress.

Applying artificial intelligence to medical imaging data is one of the fastest-growing applications of AI in healthcare, but the practice is also fraught with patient privacy challenges, especially as third-party applications enter the fray.

Data privacy concerns stem from the fact that technology advancements, such as the capability to connect and share medical images with third-party AI apps, are outpacing federal healthcare privacy regulations, said Dr. Yvonne Lui, associate chair for AI at the NYU Langone Health department of radiology.

"The confluence between advances in computer vision and AI and medical imaging poses some challenges in terms of patient privacy," Lui said at the recent Radiological Society of North America (RSNA) 2020 conference.

During a session at the virtual event, Lui said sharing medical data with third-party apps is a less-than-perfect process. Patients are often subjected to lengthy consent contracts, which can introduce unintentional bias into the data set, and data anonymization practices are still flawed. But the radiology community is interested in getting the process right, as it could pave the way for developing valuable AI medical imaging tools.

Privacy implications

Patients now have greater access to their data -- including medical imaging data -- than ever before, and they can more easily share that data with third-party apps. Indeed, federal regulators have made patient access to health data a priority, requiring health systems to provide standardized APIs that can connect to consumer apps.

But ease of use can also create problems. Patients who choose to share health data with third-party AI apps may overlook the privacy implications -- and for understandable reasons. The terms and agreements patients are typically required to sign before sharing health data are lengthy, complicated documents, sometimes topping 70,000 words, Lui said.

"If you're like most people I know, you scroll to the bottom and click agree," Lui said. "This is what I call the fallacy of consent."

But if app developers introduce an even more complex data sharing consent process, they may see even less participation from patients. This could result in less varied data with fewer participants and introduce unintended biases, which Lui said strikes a particular blow to AI and machine learning -- techniques that require large, varied data sets to build reliable models.

"It's been shown that higher socioeconomic groups are more highly represented in research requiring complex consent," she said.

Another challenge facing patient privacy is that anonymizing medical imaging data isn't a foolproof process.

Each medical image carries Digital Imaging and Communications in Medicine (DICOM) metadata, pixel-level information and other embedded data. DICOM metadata, which describes the image -- its size, dimensions, equipment settings and the device used -- can include hundreds of fields per image, according to Lui. While it's clear that some fields, such as patient names, must be removed during de-identification, other fields, such as specifics about a study, are not as cut and dried. Additionally, medical device manufacturers can define their own unique data fields, so the set of fields that must be eliminated during de-identification can differ from manufacturer to manufacturer, Lui said.
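The de-identification problem Lui describes can be sketched as a rule-based filter over metadata tags. The snippet below is a hypothetical illustration, not real DICOM parsing: the sample fields and the vendor tag are invented, and a production tool would follow the DICOM standard's de-identification profiles rather than a short deny list. It does reflect one real DICOM convention -- tags in odd-numbered groups are private, vendor-defined fields whose contents can't be assumed safe.

```python
# Hypothetical sketch of rule-based de-identification of DICOM-style metadata.
# Real headers hold hundreds of tags; the fields and values here are illustrative.

# Tags whose values directly identify the patient and must always be removed.
DENY_LIST = {"PatientName", "PatientID", "PatientBirthDate", "PatientAddress"}

def deidentify(tags, drop_private=True):
    """Remove identifying and private vendor tags.

    `tags` maps (group, element) -> (name, value). In DICOM, tags in
    odd-numbered groups are private (vendor-defined), so by default they
    are dropped because their contents can't be assumed safe.
    """
    cleaned = {}
    for (group, element), (name, value) in tags.items():
        if name in DENY_LIST:
            continue  # known identifier: strip it
        if drop_private and group % 2 == 1:
            continue  # vendor-specific private tag: strip to be safe
        cleaned[(group, element)] = (name, value)
    return cleaned

sample = {
    (0x0010, 0x0010): ("PatientName", "DOE^JANE"),
    (0x0010, 0x0020): ("PatientID", "12345"),
    (0x0028, 0x0010): ("Rows", 512),
    (0x0009, 0x1001): ("AcmePrivateField", "scanner-serial-789"),  # invented vendor tag
}

print(deidentify(sample))  # only the non-identifying "Rows" tag survives
```

The hard part in practice is exactly what Lui points out: the deny list is easy for obvious identifiers, but ambiguous study-specific fields and undocumented vendor tags force a choice between over-stripping useful data and under-stripping identifying data.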

To address the privacy implications of sharing medical imaging data, Lui said federal regulators and the medical community need to revisit regulatory protections for patient data, such as the Health Insurance Portability and Accountability Act (HIPAA); healthcare systems need to work with vendors to keep identifiable patient information out of proprietary DICOM fields; and better methods of data de-identification and sharing need to be explored.

"If done well and done responsibly, the goal is to improve the lives of individuals and society who are the very sources of this data," she said. "So it does come back in a positive way."

Ethics of sharing medical imaging data for AI

Clinical data such as medical imaging data has become more valuable to researchers as the ability to mine that data for insights using AI apps has advanced, said Dr. David Larson, a radiology professor at the Stanford University School of Medicine.

"We all recognize the advancing technology of AI and its potential implications not only for radiology but the entire field of medicine and patient care," he said during his talk at the RSNA virtual conference.

While AI in medical imaging is developing rapidly, ethical, regulatory and legal frameworks for using and sharing clinical data are evolving at a slower pace, especially as healthcare systems have begun sharing medical imaging data with AI apps for research purposes and AI tool development. 

As healthcare systems share data with third-party AI apps, they have to be sensitive to patient privacy and the ethical considerations of using such data, something Larson has worked to address in his efforts at Stanford University.

Larson, for one, believes clinical data should be treated as a form of public good. Under that view, sharing medical imaging data with third-party AI apps can advance research that will benefit future patients.

But for clinical data to become a public good, all parties involved have to play their parts. Patients have to agree to the common purpose of the public good, while clinicians, payers, researchers and healthcare systems have to serve as data stewards that are legally and ethically responsible for ensuring the data is protected and used appropriately.

If third-party AI apps want access to clinical data such as medical images, they should be held to the same standards, according to Larson.

"We maintain then that it's ethical to share data with third parties as long as, first and foremost, privacy is safeguarded throughout the process," he said. "Second, that the receiving organizations accept their role as data stewards. Third, that they adhere to data use agreements and all other stipulations. Fourth, that the receiving organization will not then share the data further."
