Deep learning prediction of sex on chest radiographs: a potential contributor to biased algorithms

Bibliographic Details
Published in Emergency radiology, Vol. 29, no. 2, pp. 365-370
Main Authors Li, David, Lin, Cheng Ting, Sulam, Jeremias, Yi, Paul H.
Format Journal Article
Language English
Published Cham: Springer International Publishing, 01.04.2022
Springer Nature B.V
ISSN 1070-3004
EISSN 1438-1435
DOI 10.1007/s10140-022-02019-3

Abstract Background: Deep convolutional neural networks (DCNNs) for diagnosis of disease on chest radiographs (CXR) have been shown to be biased against males or females if the datasets used to train them have unbalanced sex representation. Prior work has suggested that DCNNs can predict sex on CXR, which could aid forensic evaluations but could also be a source of bias.
Objective: To (1) evaluate the performance of DCNNs for predicting sex across different datasets and architectures and (2) evaluate the visual biomarkers used by DCNNs to predict sex on CXRs.
Materials and methods: Chest radiographs were obtained from the Stanford CheXPert and NIH Chest XRay14 datasets, which comprised 224,316 and 112,120 CXRs, respectively. To control for dataset size and class imbalance, random undersampling was used to reduce each dataset to 97,560 images balanced for sex. Each dataset was randomly split into training (70%), validation (10%), and test (20%) sets. Four DCNN architectures pre-trained on ImageNet were used for transfer learning. DCNNs were externally validated using a test set from the opposing dataset. Performance was evaluated using area under the receiver operating characteristic curve (AUC). Class activation mapping (CAM) was used to generate heatmaps visualizing the regions contributing to each DCNN's prediction.
Results: On the internal test sets, DCNNs achieved AUCs ranging from 0.98 to 0.99. On external validation, the models reached peak cross-dataset AUCs of 0.94 for the VGG19-Stanford model and 0.95 for the InceptionV3-NIH model. Heatmaps highlighted similar regions of attention across model architectures and datasets, localizing to the mediastinal and upper rib regions as well as to the lower chest/diaphragmatic regions.
Conclusion: DCNNs trained on two large CXR datasets accurately predicted sex on internal and external test data, with similar heatmap localizations across DCNN architectures and datasets. These findings support the notion that DCNNs can leverage imaging biomarkers to predict sex, potentially confounding the accurate prediction of disease on CXRs and contributing to biased models. On the other hand, such DCNNs could help emergency radiologists with forensic evaluations and with identifying the sex of patients whose identities are unknown, such as in acute trauma.
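The undersampling and 70/10/20 split described in the methods can be sketched as follows. This is a minimal illustration, not the authors' code: the CSV file name and the "path"/"sex" column names are hypothetical, and the random seeds are arbitrary.

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical per-image metadata table with "path" and "sex" columns.
df = pd.read_csv("chexpert_metadata.csv")

# The abstract reduces each dataset to 97,560 sex-balanced images (48,780 per class);
# cap at the rarer class if it has fewer images.
n_per_class = min(97_560 // 2, df["sex"].value_counts().min())
balanced = (
    df.groupby("sex", group_keys=False)
      .apply(lambda g: g.sample(n=n_per_class, random_state=0))
)

# 70% train, 10% validation, 20% test, stratified by sex.
train_df, rest = train_test_split(
    balanced, train_size=0.70, stratify=balanced["sex"], random_state=0
)
val_df, test_df = train_test_split(
    rest, test_size=2 / 3, stratify=rest["sex"], random_state=0
)
```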
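ImageNet transfer learning and AUC evaluation, as described above, might look roughly like the PyTorch sketch below. It is an assumption-laden illustration rather than the authors' pipeline: a ResNet-50 backbone stands in for the four architectures compared in the study (which included VGG19 and InceptionV3), the data loaders are assumed to yield (image, label) batches with sex coded as 0/1, and the optimizer and learning rate are placeholders.

```python
import torch
import torch.nn as nn
from torchvision import models
from sklearn.metrics import roc_auc_score

device = "cuda" if torch.cuda.is_available() else "cpu"

# ImageNet-pretrained backbone with its classifier replaced by a single sex logit.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, 1)
model = model.to(device)

criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # placeholder hyperparameters

def run_epoch(loader, train=True):
    """One pass over a DataLoader; returns the AUC over that pass."""
    model.train(train)
    scores, labels = [], []
    for x, y in loader:                      # assumed to yield (image, 0/1 label)
        x, y = x.to(device), y.float().to(device)
        with torch.set_grad_enabled(train):
            logits = model(x).squeeze(1)
            loss = criterion(logits, y)
            if train:
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
        scores += torch.sigmoid(logits).detach().cpu().tolist()
        labels += y.cpu().tolist()
    return roc_auc_score(labels, scores)
```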
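Class activation mapping, used in the study to visualize the regions driving each prediction, weights the last convolutional feature maps by the final fully connected layer's weights for models that end in global average pooling. The helper below is one common implementation written against the hypothetical ResNet-style model from the previous sketch; it is not the authors' exact CAM code.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def class_activation_map(model, image):
    """image: (1, 3, H, W) tensor; returns an (H, W) heatmap scaled to [0, 1]."""
    model.eval()
    # Convolutional trunk = every child module except the average pool and fc head.
    trunk = torch.nn.Sequential(*list(model.children())[:-2])
    feats = trunk(image).squeeze(0)          # (C, h, w) feature maps
    weights = model.fc.weight.squeeze(0)     # (C,) weights of the single sex logit
    cam = torch.einsum("c,chw->hw", weights, feats)
    cam = F.relu(cam)                        # keep evidence for the positive class
    cam = F.interpolate(cam[None, None], size=image.shape[-2:],
                        mode="bilinear", align_corners=False)[0, 0]
    cam = cam - cam.min()
    return (cam / cam.max().clamp(min=1e-8)).cpu().numpy()
```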
Author_xml – sequence: 1
  givenname: David
  surname: Li
  fullname: Li, David
  organization: Faculty of Medicine, University of Ottawa, University of Maryland Medical Intelligent Imaging (UM2II) Center, Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine
– sequence: 2
  givenname: Cheng Ting
  surname: Lin
  fullname: Lin, Cheng Ting
  organization: Russell H. Morgan Department of Radiology and Radiological Science, Johns Hopkins University School of Medicine
– sequence: 3
  givenname: Jeremias
  surname: Sulam
  fullname: Sulam, Jeremias
  organization: Department of Biomedical Engineering, Johns Hopkins University
– sequence: 4
  givenname: Paul H.
  surname: Yi
  fullname: Yi, Paul H.
  email: pyi@som.umaryland.edu
  organization: University of Maryland Medical Intelligent Imaging (UM2II) Center, Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine
BackLink https://www.ncbi.nlm.nih.gov/pubmed/35006495 (View this record in MEDLINE/PubMed)
CitedBy_id crossref_primary_10_1016_j_radi_2023_10_014
crossref_primary_10_3348_kjr_2023_0393
crossref_primary_10_1148_radiol_221488
crossref_primary_10_1177_20552076231191055
crossref_primary_10_1016_j_jacr_2023_06_015
crossref_primary_10_3389_fdata_2023_1120989
crossref_primary_10_3389_fmicb_2023_1332857
crossref_primary_10_1007_s00246_024_03561_2
ContentType Journal Article
Copyright American Society of Emergency Radiology 2022
2022. American Society of Emergency Radiology.
American Society of Emergency Radiology 2022.
Discipline Medicine
Public Health
EISSN 1438-1435
EndPage 370
ExternalDocumentID 35006495
10_1007_s10140_022_02019_3
Genre Journal Article
GrantInformation_xml – fundername: Johns Hopkins University
  grantid: Malone Center Seed Grant
  funderid: http://dx.doi.org/10.13039/100007880
ISSN 1070-3004
1438-1435
IsPeerReviewed true
IsScholarly true
Issue 2
Keywords Deep learning
Chest
Radiograph
Forensics
Sex prediction
Bias
Anatomy
Fairness
Language English
License 2022. American Society of Emergency Radiology.
PMID 35006495
PageCount 6
PublicationDate 2022-04-01
PublicationPlace Cham
PublicationSubtitle A Journal of Practical Imaging Official Journal of the American Society of Emergency Radiology
PublicationTitle Emergency radiology
PublicationTitleAbbrev Emerg Radiol
PublicationTitleAlternate Emerg Radiol
PublicationYear 2022
Publisher Springer International Publishing
Springer Nature B.V
StartPage 365
SubjectTerms Algorithms
Artificial neural networks
Biomarkers
Chest
Datasets
Deep Learning
Emergency Medicine
Female
Humans
Imaging
Machine learning
Male
Medicine
Medicine & Public Health
Neural Networks, Computer
Original Article
Performance evaluation
Radiographs
Radiography
Radiologists
Radiology
Sex
Test sets
URI https://link.springer.com/article/10.1007/s10140-022-02019-3
https://www.ncbi.nlm.nih.gov/pubmed/35006495
https://www.proquest.com/docview/2638173069
https://www.proquest.com/docview/2618503602
Volume 29