Artificial intelligence has moved from science fiction to everyday reality in a matter of years, used for everything from online activity to driving cars and, yes, even making medical diagnoses. But that doesn't mean people are ready to let AI drive all their medical decisions.
The technology is quickly evolving to help guide clinical decision-making across a growing number of medical specialties and diagnoses, particularly when it comes to identifying anything out of the ordinary during a colonoscopy, skin cancer check, or X-ray.
New research is exploring what patients think about the use of AI in healthcare. Yale University’s Sanjay Aneja, MD, and colleagues surveyed a nationally representative group of 926 patients on their comfort with the use of the technology, what concerns they have, and on their overall opinions about AI.
Turns out, patient comfort with AI depends on its use.
For example, 12% of respondents were "very comfortable" and 43% were "somewhat comfortable" with AI reading chest X-rays. However, only 6% were very comfortable and 25% were somewhat comfortable about AI making a cancer diagnosis, according to the survey results published online May 4, 2022, in JAMA Network Open.
“Having an AI algorithm read your X-ray … that’s a very different story than if one is relying on AI to make a diagnosis about a malignancy or delivering the news that somebody has cancer,” says Sean Khozin, MD, who was not affiliated with the research.
“What’s very interesting is that … there’s a lot of optimism among patients about the role of AI in making things better. That level of optimism was great to see,” added Khozin, an oncologist, data scientist, and member of the executive committee at the Alliance for Artificial Intelligence in Healthcare (AAIH). The AAIH is a global advocacy organization located in Baltimore that focuses on responsible, ethical, and reasonable standards for incorporating AI and machine learning in healthcare.
All in Favor, Say AI
Most people had a positive overall opinion of AI in healthcare. The survey revealed that 11% of people believed AI will make healthcare “much better” and 45% “somewhat better” in the next 5 years. In contrast, only 4% thought AI will make healthcare “somewhat worse” and 2% responded “much worse.”
Most of the work in medical AI focuses on clinical areas that could benefit most, “but rarely do we ask ourselves which areas patients really want AI to impact their healthcare,” says Aneja, senior study author and assistant professor in the Department of Therapeutic Radiology at Yale School of Medicine.
Not considering the patient perspective leaves an incomplete picture.
“In many ways, I would say our work highlights a potential blind spot among AI researchers that will need to be addressed as these technologies become more common in clinical practice,” says Aneja.
It remains unclear how much patients know or realize about the role AI already plays in medicine. Aneja, who assessed AI attitudes among clinicians in previous work, says, “What became clear as we surveyed both patients and physicians is that transparency is needed regarding the specific role AI plays within a patient’s treatment course.”
The current survey shows about two-thirds (66%) of patients said it is "very important" to know when AI plays a large role in their diagnosis or treatment. Another 46% believe the information is very important when AI plays a small role in their care.
Furthermore, less than 10% of people would be “very comfortable” receiving a diagnosis from a computer program that makes a correct diagnosis more than 90% of the time but is unable to explain why.
“Patients may not be aware of the automation that has been built into a lot of our devices today,” says Khozin, pointing to ECGs, imaging software, and colonoscopy interpretation systems as examples.
Even if unaware, patients are likely benefiting from the use of AI in diagnosis. One example is a 63-year-old Hispanic man with ulcerative colitis living in Brooklyn, New York. Aasma Shaukat, MD, MPH, a gastroenterologist at NYU Langone Medical Center, performed a colonoscopy on the patient to check for disease activity and to take biopsies.
“As I was focused on taking biopsies in the cecum, I did not notice a 6 mm flat polyp on a fold in the cecum until AI alerted me to it. I removed the polyp and the histology showed low-grade dysplasia,” says Shaukat, who is also director of Outcomes Research in the Division of Gastroenterology and Hepatology at NYU.
Addressing AI Anxieties
The survey revealed a majority of people were “very concerned” or “somewhat concerned” about possible unintended consequences of AI in healthcare. A total of 92% said they would be concerned about a misdiagnosis, 71% about a privacy breach, 70% about spending less time with clinicians, and 68% about higher health care costs.
A previous study published in July 2021 by Aneja and colleagues focused on AI and medical liability. They identified discordant views between physicians and patients regarding liability when AI results in a clinical error. Although a majority of both physicians and patients believed physicians should be liable, physicians were more likely to want to hold vendors and healthcare organizations accountable as well.
Patient concerns about AI underline the importance of greater education and awareness, says Khozin, who is also chief executive officer of the non-profit health technology company CancerLinQ, LLC and executive vice president of the American Society of Clinical Oncology.
“It’s all about how AI is implemented,” he says. If it is done thoughtfully and judiciously, “I don’t think the use of AI tools and devices will lead to an increase in health care spending or health care costs.”
Khozin says patients should be informed that AI is designed to help physicians make an accurate diagnosis or select an optimal treatment, but it is not intended to replace clinicians or their experience and judgment. There’s always been a human behind medical technology connecting the dots “so there is a safety net,” he says.
EHR Rollout: A Cautionary Tale
Khozin also believes the introduction of electronic health record (EHR) systems presents valuable lessons for AI. Although EHRs were meant to streamline operations, “they have in some cases led to more inefficiencies, increases in spending, and in some cases they’ve had a significant impact on physicians and providers in terms of the time that they’re spending with [technology].”
The “tough lessons” learned from early integration of EHR systems into healthcare are well known, Khozin says. “A lot of us, including myself, believe that if AI is done right it will give us more room to build better relationships with patients.”
A potential limitation of the study is the survey was conducted in December 2019, and attitudes toward AI may have evolved as the technology enters more and more clinical settings. “I wonder if they did a survey today if there would be more optimism and an increased comfort level with AI in healthcare,” Khozin says.
Aneja and colleagues agreed, noting that “future work should examine how views evolve as patients become more familiar with AI.”
AI is not an anomaly in healthcare technology, says Khozin. “It’s just part of the evolution of automation.”
JAMA Netw Open. Published online May 4, 2022.
Damian McNamara is a staff journalist based in Miami. He covers a wide range of medical specialties, including infectious diseases, gastroenterology, and critical care. Follow Damian on Twitter: @MedReporter.