# Identifying patient needs to accept prostate cancer detection AI: a multicenter survey

# πŸ“š Citation

S.J. Fransen, T.C. Kwee, D. Rouw, S. Heijmink, Y. Ongena, M. Haan, F.F.J. Simonis, H. Huisman, D. Yakar, "Identifying patient needs to accept prostate cancer detection AI: a multicenter survey", Annual Meeting of the European Society of Radiology, 2024.

# πŸ“– Abstract

Purpose: Artificial intelligence (AI) models could transform the field of clinically significant prostate cancer (csPCa) detection. Recent studies show that AI models for csPCa detection can reach expert-level performance [1–3]. However, successful clinical deployment does not depend solely on the diagnostic performance of an AI system: patient trust in AI diagnoses and accountability for misdiagnosis must also be considered [4, 5].

The trust of patients with suspected prostate cancer in AI diagnosis is understudied. Existing studies show that the level of trust differs between patient groups and AI applications [6, 7]; hence, a targeted study of patients with suspected PCa is required. In the current care path, MRI is assessed by a single expert radiologist. AI might enhance this pathway in various ways, ranging from a second AI opinion to fully autonomous diagnosis [8]. Any AI implementation deviates from the standard of care and raises questions about responsibility for misdiagnosis.

This study investigates patients' willingness to accept prostate cancer detection AI and identifies factors that influence patient acceptance of AI-generated diagnoses.

Materials and Methods: A prospective, multicenter study was conducted between January and September 2023. Patients undergoing MRI for prostate cancer suspicion were surveyed about their opinions on hypothetical AI assessment of their MRI scans.

Patients received a paper questionnaire containing questions on trust in AI and accountability for misdiagnosis by AI. The questionnaire was preceded by general information describing the hypothetical scenario in which an AI would be implemented at the same performance level as the radiologist. Following the general information, demographic items were collected: birthdate, education level, MRI experience, and cancer history. Nine topic-specific items followed: four on hypothetical scenarios of collaboration between AI and the radiologist, two on trust in the diagnosis, and three on accountability for misdiagnosis.

Relationships between the collaboration scenarios and the independent variables were assessed using multivariate analysis, with statistical significance set at p < 0.05. Analyses were performed in R (version 4.2.2) via RStudio, using the MASS and brant packages.

Results: This study included 193 patients (mean age: 69 ± 11 years). Most patients (79%) favored a collaboration in which AI assists a radiologist who retains final judgment. Only 15% accepted an AI performing at expert-radiologist level as the sole decision-maker. Stand-alone AI garnered more acceptance if it outperformed radiologists (52% vs. 15%). Higher education correlated with greater trust in AI, and a history of cancer correlated with a preference for radiologist assessment (both p < 0.05). Respondents assigned accountability for misdiagnosis to the radiologist (70%), the program developer (52%), and the hospital (75%).

Conclusion: Patients with suspected PCa are more likely to accept a diagnosis made by AI when a radiologist is part of the process. Stand-alone AI is more accepted, especially among higher-educated patients, when it performs better than radiologists. Informing and educating patients about AI may result in greater acceptance of autonomous AI systems. These are preliminary results from an ongoing study.

The results should be interpreted with caution, as more patients are needed for conclusive findings. Moreover, these results might not generalize to other populations and settings [7].

References:

  1. Bosma JS, Saha A, Hosseinzadeh M, et al (2023) Semisupervised Learning with Report-guided Pseudo Labels for Deep Learning–based Prostate Cancer Detection Using Biparametric MRI. Radiol Artif Intell 5:. https://doi.org/10.1148/ryai.230031
  2. Saha A, Hosseinzadeh M, Huisman H (2021) End-to-end prostate cancer detection in bpMRI via 3D CNNs: Effects of attention mechanisms, clinical priori and decoupled false positive reduction. Med Image Anal 73:. https://doi.org/10.1016/j.media.2021.102155
  3. Roest C, Fransen SJ, Kwee TC, Yakar D (2022) Comparative Performance of Deep Learning and Radiologists for the Diagnosis and Localization of Clinically Significant Prostate Cancer at MRI: A Systematic Review. Life 12
  4. Rajpurkar P, Chen E, Banerjee O, Topol EJ (2022) AI in health and medicine. Nat Med 28:31–38. https://doi.org/10.1038/s41591-021-01614-0
  5. Wiens J, Saria S, Sendak M, et al (2019) Do no harm: a roadmap for responsible machine learning for health care. Nat Med 25:1337–1340. https://doi.org/10.1038/s41591-019-0548-6
  6. Vo V, Chen G, Aquino YSJ, et al (2023) Multi-stakeholder preferences for the use of artificial intelligence in healthcare: A systematic review and thematic analysis. Soc Sci Med 338
  7. Yakar D, Ongena YP, Kwee TC, Haan M (2022) Do People Favor Artificial Intelligence Over Physicians? A Survey Among the General Population and Their View on Artificial Intelligence in Medicine. Value in Health 25:374–381. https://doi.org/10.1016/j.jval.2