Automatic assessment of mammographic density using a deep transfer learning method
Mammographic breast density is one of the strongest risk factors for cancer. Density assessed by radiologists using visual analogue scales has been shown to provide better risk predictions than other methods. Our purpose is to build automated models using deep learning and train on radiologist scores to make accurate and consistent predictions.
| Published in | Journal of medical imaging (Bellingham, Wash.) Vol. 10; no. 2; p. 024502 |
|---|---|
| Main Authors | Squires, Steven; Harkness, Elaine; Gareth Evans, Dafydd; Astley, Susan M. |
| Format | Journal Article |
| Language | English |
| Published | United States: Society of Photo-Optical Instrumentation Engineers (SPIE), 01.03.2023 |
| Subjects | Cancer; Computer-Aided Diagnosis; Health aspects; Mammography; Methods; Oncology, Experimental |
| Online Access | Get full text |
| ISSN | 2329-4302; EISSN: 2329-4310 |
| DOI | 10.1117/1.JMI.10.2.024502 |
Abstract | Mammographic breast density is one of the strongest risk factors for cancer. Density assessed by radiologists using visual analogue scales has been shown to provide better risk predictions than other methods. Our purpose is to build automated models using deep learning and train on radiologist scores to make accurate and consistent predictions.
We used a dataset of almost 160,000 mammograms, each with two independent density scores made by expert medical practitioners. We used two pretrained deep networks and adapted them to produce feature vectors, which were then used for both linear and nonlinear regression to make density predictions. We also simulated an "optimal method," which allowed us to compare the quality of our results with a simulated upper bound on performance.
Our deep learning method produced estimates with a root mean squared error (RMSE) of 8.79 ± 0.21. The model estimates of cancer risk perform at a similar level to human experts, within uncertainty bounds. We made comparisons between different model variants and demonstrated the high level of consistency of the model predictions. Our modeled "optimal method" produced image predictions with a RMSE of between 7.98 and 8.90 for cranial caudal images.
We demonstrated a deep learning framework based upon a transfer learning approach to make density estimates based on radiologists' visual scores. Our approach requires modest computational resources and has the potential to be trained with limited quantities of data. |
---|---|
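The abstract describes the pipeline only at a high level: pretrained deep networks are adapted to emit feature vectors, and a linear or nonlinear regressor is trained on radiologist visual-analogue-scale (VAS) density scores, with performance reported as RMSE. The sketch below illustrates that general transfer-learning pattern under stated assumptions; the ResNet-18 backbone, ridge regressor, preprocessing, and random stand-in data are all illustrative choices, not the authors' implementation.

```python
# Minimal sketch of transfer-learning density regression (illustrative only).
import numpy as np
import torch
import torchvision
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Load a pretrained backbone and strip its classification head so it returns
# a feature vector per image. The paper adapts two pretrained networks in a
# similar spirit; the exact architectures may differ.
backbone = torchvision.models.resnet18(weights="IMAGENET1K_V1")
backbone.fc = torch.nn.Identity()
backbone.eval()

def extract_features(images: torch.Tensor) -> np.ndarray:
    """images: (N, 3, 224, 224) preprocessed mammogram crops (hypothetical input)."""
    with torch.no_grad():
        return backbone(images).numpy()

# Hypothetical stand-in data: random tensors in place of real mammograms and
# random VAS density scores in [0, 100].
images = torch.randn(64, 3, 224, 224)
vas_scores = np.random.uniform(0, 100, size=64)

features = extract_features(images)
x_train, x_test, y_train, y_test = train_test_split(
    features, vas_scores, test_size=0.25, random_state=0
)

# Linear regression head on the frozen features; ridge is used here as one
# common choice (the paper also explores a nonlinear regressor).
model = Ridge(alpha=1.0).fit(x_train, y_train)
pred = model.predict(x_test)

rmse = float(np.sqrt(np.mean((pred - y_test) ** 2)))
print(f"RMSE on held-out images: {rmse:.2f} VAS points")
```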
AbstractList | Purpose: Mammographic breast density is one of the strongest risk factors for cancer. Density assessed by radiologists using visual analogue scales has been shown to provide better risk predictions than other methods. Our purpose is to build automated models using deep learning and train on radiologist scores to make accurate and consistent predictions.
Approach: We used a dataset of almost 160,000 mammograms, each with two independent density scores made by expert medical practitioners. We used two pretrained deep networks and adapted them to produce feature vectors, which were then used for both linear and nonlinear regression to make density predictions. We also simulated an "optimal method," which allowed us to compare the quality of our results with a simulated upper bound on performance.
Results: Our deep learning method produced estimates with a root mean squared error (RMSE) of 8.79 ± 0.21. The model estimates of cancer risk perform at a similar level to human experts, within uncertainty bounds. We made comparisons between different model variants and demonstrated the high level of consistency of the model predictions. Our modeled "optimal method" produced image predictions with a RMSE of between 7.98 and 8.90 for cranial caudal images.
Conclusion: We demonstrated a deep learning framework based upon a transfer learning approach to make density estimates based on radiologists' visual scores. Our approach requires modest computational resources and has the potential to be trained with limited quantities of data. |
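The Results mention a simulated "optimal method" used as an upper bound on achievable performance. The record does not spell out that simulation, so the following is only one plausible sketch, assuming each image has a latent density and each reader adds independent noise; even a perfect predictor of the latent density then shows a nonzero RMSE against averaged reader labels.

```python
# Hedged sketch: one way a label-noise floor ("optimal method") could be simulated.
# Assumptions (not taken from the paper): latent density per image, Gaussian
# reader noise, and the evaluation target is the mean of two reader scores.
import numpy as np

rng = np.random.default_rng(0)
n_images = 100_000
reader_sigma = 12.0  # hypothetical inter-reader noise in VAS points

latent = rng.uniform(0, 100, n_images)                              # latent "true" density
reader1 = np.clip(latent + rng.normal(0, reader_sigma, n_images), 0, 100)
reader2 = np.clip(latent + rng.normal(0, reader_sigma, n_images), 0, 100)
target = (reader1 + reader2) / 2                                    # averaged reader label

# A predictor that recovers the latent density exactly still incurs RMSE
# against the noisy averaged labels, roughly reader_sigma / sqrt(2).
optimal_rmse = float(np.sqrt(np.mean((latent - target) ** 2)))
print(f"Simulated lower bound on achievable RMSE: {optimal_rmse:.2f} VAS points")
```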
Audience | Academic |
Author | Astley, Susan M.; Squires, Steven; Harkness, Elaine; Gareth Evans, Dafydd |
Author_xml | – sequence: 1 givenname: Steven surname: Squires fullname: Squires, Steven email: s.squires@exeter.ac.uk organization: University of Manchester, School of Health Sciences, Division of Imaging, Informatics and Data Sciences, Faculty of Biology, Medicine and Health, Manchester, United Kingdom – sequence: 2 givenname: Elaine orcidid: 0000-0001-6625-7739 surname: Harkness fullname: Harkness, Elaine email: elaine.harkness@manchester.ac.uk organization: University of Manchester, School of Health Sciences, Division of Imaging, Informatics and Data Sciences, Faculty of Biology, Medicine and Health, Manchester, United Kingdom – sequence: 3 givenname: Dafydd surname: Gareth Evans fullname: Gareth Evans, Dafydd email: gareth.d.evans@manchester.ac.uk organization: University of Manchester, Manchester Academic Health Science Centre, School of Biological Sciences, Division of Evolution, Infection and Genomics, Faculty of Biology, Medicine and Health, Manchester, United Kingdom – sequence: 4 givenname: Susan M. orcidid: 0000-0002-2989-2765 surname: Astley fullname: Astley, Susan M. email: sue.astley@manchester.ac.uk organization: University of Manchester, School of Health Sciences, Division of Imaging, Informatics and Data Sciences, Faculty of Biology, Medicine and Health, Manchester, United Kingdom |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/37034359$$D View this record in MEDLINE/PubMed |
CitedBy_id | crossref_primary_10_1117_1_JMI_11_4_044506 crossref_primary_10_1088_2057_1976_accaea crossref_primary_10_1088_2057_1976_ad470b |
ContentType | Journal Article |
Copyright | 2023 Society of Photo-Optical Instrumentation Engineers (SPIE); 2023 Society of Photo-Optical Instrumentation Engineers (SPIE).; COPYRIGHT 2023 SPIE; 2023 Society of Photo-Optical Instrumentation Engineers (SPIE) 2023 Society of Photo-Optical Instrumentation Engineers |
Copyright_xml | – notice: 2023 Society of Photo-Optical Instrumentation Engineers (SPIE) – notice: 2023 Society of Photo-Optical Instrumentation Engineers (SPIE). – notice: COPYRIGHT 2023 SPIE – notice: 2023 Society of Photo-Optical Instrumentation Engineers (SPIE) 2023 Society of Photo-Optical Instrumentation Engineers |
DBID | AAYXX CITATION NPM 7X8 5PM |
DOI | 10.1117/1.JMI.10.2.024502 |
DatabaseName | CrossRef PubMed MEDLINE - Academic PubMed Central (Full Participant titles) |
DatabaseTitle | CrossRef PubMed MEDLINE - Academic |
DatabaseTitleList | PubMed MEDLINE - Academic |
Database_xml | – sequence: 1 dbid: NPM name: PubMed url: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed sourceTypes: Index Database |
DeliveryMethod | fulltext_linktorsrc |
Discipline | Medicine |
EISSN | 2329-4310 |
EndPage | 024502 |
ExternalDocumentID | PMC10076241 A843488563 37034359 10_1117_1_JMI_10_2_024502 |
Genre | Journal Article |
GrantInformation_xml | – fundername: National Institute for Health Research Manchester Biomedical Research Centre grantid: IS-BRC-1215-20007 – fundername: CRUK AI-informed screening grantid: A29024 – fundername: Cancer Research UK grantid: 29024 |
GroupedDBID | 0R~ 4.4 ACGFS ALMA_UNASSIGNED_HOLDINGS EBS FQ0 O9- OK1 RPM SPBNH UT2 AAYXX ABJNI ADMLS AKROS CITATION HYE M4X EJD NPM 7X8 5PM |
ISSN | 2329-4302 |
IngestDate | Thu Aug 21 18:34:42 EDT 2025 Fri Jul 11 15:37:46 EDT 2025 Wed Jun 18 17:00:35 EDT 2025 Tue Jun 17 03:41:17 EDT 2025 Sat May 31 02:13:25 EDT 2025 Thu Apr 24 23:03:20 EDT 2025 Tue Jul 01 02:16:00 EDT 2025 Sun Apr 30 04:10:41 EDT 2023 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 2 |
Keywords | cancer risk; deep learning; transfer learning; mammography; breast density |
Language | English |
License | 2023 Society of Photo-Optical Instrumentation Engineers (SPIE). |
LinkModel | OpenURL |
Notes | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 23 |
ORCID | 0000-0002-2989-2765 0000-0001-6625-7739 |
OpenAccessLink | https://pubmed.ncbi.nlm.nih.gov/PMC10076241 |
PMID | 37034359 |
PQID | 2799176166 |
PQPubID | 23479 |
PageCount | 1 |
ParticipantIDs | gale_infotracacademiconefile_A843488563 pubmed_primary_37034359 proquest_miscellaneous_2799176166 pubmedcentral_primary_oai_pubmedcentral_nih_gov_10076241 gale_infotracmisc_A843488563 spie_journals_10_1117_1_JMI_10_2_024502 crossref_citationtrail_10_1117_1_JMI_10_2_024502 crossref_primary_10_1117_1_JMI_10_2_024502 |
ProviderPackageCode | CITATION AAYXX |
PublicationCentury | 2000 |
PublicationDate | 2023-03-01 |
PublicationDateYYYYMMDD | 2023-03-01 |
PublicationDate_xml | – month: 03 year: 2023 text: 2023-03-01 day: 01 |
PublicationDecade | 2020 |
PublicationPlace | United States |
PublicationPlace_xml | – name: United States |
PublicationTitle | Journal of medical imaging (Bellingham, Wash.) |
PublicationTitleAlternate | J. Med. Imag |
PublicationYear | 2023 |
Publisher | Society of Photo-Optical Instrumentation Engineers SPIE |
Publisher_xml | – name: Society of Photo-Optical Instrumentation Engineers – name: SPIE |
SSID | ssj0001105214 |
Score | 2.2398977 |
Snippet | Mammographic breast density is one of the strongest risk factors for cancer. Density assessed by radiologists using visual analogue scales has been shown to... |
SourceID | pubmedcentral proquest gale pubmed crossref spie |
SourceType | Open Access Repository Aggregation Database Index Database Enrichment Source Publisher |
StartPage | 024502 |
SubjectTerms | Cancer Computer-Aided Diagnosis Health aspects Mammography Methods Oncology, Experimental |
Title | Automatic assessment of mammographic density using a deep transfer learning method |
URI | http://www.dx.doi.org/10.1117/1.JMI.10.2.024502 https://www.ncbi.nlm.nih.gov/pubmed/37034359 https://www.proquest.com/docview/2799176166 https://pubmed.ncbi.nlm.nih.gov/PMC10076241 |
Volume | 10 |
hasFullText | 1 |
inHoldings | 1 |
isFullTextHit | |
isPrint | |
linkProvider | National Library of Medicine |