AI-powered prostate cancer detection: a multi-centre, multi-scanner validation study

Bibliographic Details
Published in European radiology Vol. 35; no. 8; pp. 4915-4924
Main Authors Giganti, Francesco, Moreira da Silva, Nadia, Yeung, Michael, Davies, Lucy, Frary, Amy, Ferrer Rodriguez, Mirjana, Sushentsev, Nikita, Ashley, Nicholas, Andreou, Adrian, Bradley, Alison, Wilson, Chris, Maskell, Giles, Brembilla, Giorgio, Caglic, Iztok, Suchánek, Jakub, Budd, Jobie, Arya, Zobair, Aning, Jonathan, Hayes, John, De Bono, Mark, Vasdev, Nikhil, Sanmugalingam, Nimalan, Burn, Paul, Persad, Raj, Woitek, Ramona, Hindley, Richard, Liyanage, Sidath, Squire, Sophie, Barrett, Tristan, Barwick, Steffi, Hinton, Mark, Padhani, Anwar R., Rix, Antony, Shah, Aarti, Sala, Evis
Format Journal Article
Language English
Published Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.08.2025
ISSN 0938-7994; 1432-1084
DOI 10.1007/s00330-024-11323-0

Summary:
Objectives: Multi-centre, multi-vendor validation of artificial intelligence (AI) software to detect clinically significant prostate cancer (PCa) using multiparametric magnetic resonance imaging (MRI) is lacking. We compared a new AI solution, validated on a separate dataset from different UK hospitals, to the original multidisciplinary team (MDT)-supported radiologist’s interpretations.

Materials and methods: A Conformité Européenne (CE)-marked deep-learning (DL) computer-aided detection (CAD) medical device (Pi) was trained to detect Gleason Grade Group (GG) ≥ 2 cancer using retrospective data from the PROSTATEx dataset and five UK hospitals (793 patients). Our separate validation dataset was acquired on six machines from two manufacturers across six sites (252 patients). Data included in the study were from MRI scans performed between August 2018 and October 2022. Patients with a negative MRI who did not undergo biopsy were assumed to be negative (90.4% had prostate-specific antigen density < 0.15 ng/mL²). ROC analysis was used to compare the AI software with radiologists, who used a 5-category suspicion score.

Results: GG ≥ 2 prevalence in the validation set was 31%. Evaluated per patient, Pi was non-inferior to radiologists (considering a 10% performance difference as acceptable), with an area under the curve (AUC) of 0.91 vs. 0.95. At the predetermined risk threshold of 3.5, the AI software’s sensitivity was 95% and specificity 67%, while radiologists at Prostate Imaging Reporting and Data System (PI-RADS)/Likert ≥ 3 identified GG ≥ 2 cancer with a sensitivity of 99% and specificity of 73%. The AI performed well per site (AUC ≥ 0.83) at the patient level, independent of scanner age and field strength.

Conclusion: Real-world data testing suggests that Pi matches the performance of MDT-supported radiologists in GG ≥ 2 PCa detection and generalises to multiple sites, scanner vendors, and models.

Key Points:
Question: The performance of artificial intelligence-based medical tools for prostate MRI has yet to be evaluated on multi-centre, multi-vendor data to assess generalisability.
Findings: A dedicated AI medical tool matches the performance of multidisciplinary team-supported radiologists in prostate cancer detection and generalises to multiple sites and scanners.
Clinical relevance: This software has the potential to support the MRI process for biopsy decision-making and target identification, but future prospective studies, where lesions identified by artificial intelligence are biopsied separately, are needed.

Graphical Abstract
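
The per-patient evaluation described in the abstract (an AUC from ROC analysis, plus sensitivity and specificity at a predetermined risk threshold of 3.5) can be illustrated with a short Python sketch. This is not the study’s analysis code: the synthetic labels, the 0-10 score scale, and the use of scikit-learn are assumptions made purely for illustration.

```python
# A minimal, illustrative sketch (not the study's actual analysis code):
# per-patient ROC analysis of a continuous AI risk score against a binary
# ground truth (GG >= 2 cancer), plus sensitivity and specificity at a
# fixed operating threshold. All data below are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

n_patients = 252                              # same size as the validation set
y_true = rng.integers(0, 2, size=n_patients)  # 1 = GG >= 2 cancer present
# Hypothetical AI risk scores on an arbitrary 0-10 scale; cancers tend to
# score higher than benign cases in this synthetic example.
ai_score = np.clip(3.0 * y_true + rng.normal(4.0, 2.0, size=n_patients), 0, 10)

# Per-patient area under the ROC curve.
auc = roc_auc_score(y_true, ai_score)

# Operating point at a predetermined risk threshold (3.5, mirroring the
# threshold reported in the abstract).
threshold = 3.5
pred = ai_score >= threshold
tp = np.sum(pred & (y_true == 1))
fn = np.sum(~pred & (y_true == 1))
tn = np.sum(~pred & (y_true == 0))
fp = np.sum(pred & (y_true == 0))

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

print(f"AUC: {auc:.2f}")
print(f"Sensitivity at threshold {threshold}: {sensitivity:.2f}")
print(f"Specificity at threshold {threshold}: {specificity:.2f}")
```

Note that the non-inferiority comparison reported in the study additionally requires an interval estimate for the AUC difference (for example via bootstrapping), which is omitted from this sketch for brevity.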