On the effect of training database size for MR-based synthetic CT generation in the head

Bibliographic Details
Published in: Computerized Medical Imaging and Graphics, Vol. 107, p. 102227
Main Authors: Estakhraji, Seyed Iman Zare; Pirasteh, Ali; Bradshaw, Tyler; McMillan, Alan
Format: Journal Article
Language: English
Published: United States: Elsevier Ltd, 01.07.2023
ISSN: 0895-6111
EISSN: 1879-0771
DOI: 10.1016/j.compmedimag.2023.102227


Abstract Generation of computed tomography (CT) images from magnetic resonance (MR) images using deep learning methods has recently demonstrated promise for improving MR-guided radiotherapy and PET/MR imaging. Purpose: To investigate the performance of unsupervised training using a large number of unpaired data sets, as well as the potential gain in performance after fine-tuning with supervised training using spatially registered data sets, for generation of synthetic CT (sCT) images from MR images. Materials and methods: A CycleGAN consisting of two generators (residual U-Net) and two discriminators (PatchGAN) was used for unsupervised training on unpaired T1-weighted MR and CT images (2061 sets for each modality). Five supervised models were then fine-tuned, starting from the generator of the unsupervised model, using 1, 10, 25, 50, and 100 pairs of spatially registered MR and CT images. Four supervised models were also trained from scratch on 10, 25, 50, and 100 registered pairs using only the residual U-Net generator. All models were evaluated on a holdout test set of spatially registered images from 253 patients, including 30 with significant pathology. sCT images were compared against the acquired CT images using mean absolute error (MAE), Dice coefficient, and structural similarity index (SSIM). sCT images from 60 test subjects, generated by the unsupervised model and by the most accurate of the fine-tuned and supervised models, were qualitatively evaluated by a radiologist. Results: While unsupervised training produced realistic-appearing sCT images, adding even one registered image pair improved quantitative metrics. Adding more paired data sets further improved image quality, with the best results obtained using the largest number of paired data sets (n=100).
Supervised training was superior to unsupervised training, while fine-tuned training showed no clear benefit over supervised training from scratch, regardless of the training sample size. Conclusion: Supervised learning (using either fine-tuning or full supervision) leads to significantly higher quantitative accuracy in the generation of sCT from MR images. However, fine-tuned training, which used both a large number of unpaired image sets and registered pairs, was generally no better than supervised learning using registered image sets alone, suggesting that well-registered paired data sets matter more for training than a large set of unpaired data.
Highlights
• The value and limits of unsupervised dual GANs in the head are investigated.
• The power of fine-tuning with limited paired and large unpaired data is studied.
• Twelve models were trained across unsupervised, semi-supervised, and supervised settings.
• Supervised learning (fine-tuning or full supervision) yields higher accuracy.
• Fine-tuned training with unpaired data is not better than registered data alone.
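The quantitative comparison described in the abstract (MAE in Hounsfield units and a Dice overlap between sCT and acquired CT) can be sketched as below. This is an illustrative sketch only: the 150 HU bone-mask threshold and the toy arrays are assumptions, not values taken from the paper.

```python
import numpy as np

def mae_hu(ct, sct):
    """Mean absolute error in Hounsfield units over the whole volume."""
    return float(np.mean(np.abs(ct.astype(float) - sct.astype(float))))

def bone_dice(ct, sct, threshold_hu=150.0):
    """Dice coefficient of bone masks obtained by HU thresholding.

    The 150 HU threshold is an illustrative assumption, not the paper's value.
    """
    a = ct > threshold_hu
    b = sct > threshold_hu
    denom = a.sum() + b.sum()
    return 1.0 if denom == 0 else float(2.0 * np.logical_and(a, b).sum() / denom)

# Toy 4x4 "slices": the acquired CT has two rows of bone, the sCT recovers one.
ct = np.zeros((4, 4)); ct[:2, :] = 200.0
sct = np.zeros((4, 4)); sct[:1, :] = 200.0
print(mae_hu(ct, sct))               # 50.0
print(round(bone_dice(ct, sct), 3))  # 0.667
```

In the study these metrics would be computed per patient over spatially registered volumes, which is why registration quality matters for both training and evaluation.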
ArticleNumber 102227
AuthorAffiliation a Department of Radiology, University of Wisconsin-Madison, United States of America
b Department of Medical Physics, University of Wisconsin-Madison, United States of America
c Department of Electrical and Computer Engineering, University of Wisconsin-Madison, United States of America
d Department of Biomedical Engineering, University of Wisconsin-Madison, United States of America
Author_xml – sequence: 1
  givenname: Seyed Iman Zare
  surname: Estakhraji
  fullname: Estakhraji, Seyed Iman Zare
  email: zareestakhra@wisc.edu
  organization: Department of Radiology, University of Wisconsin-Madison, United States of America
– sequence: 2
  givenname: Ali
  surname: Pirasteh
  fullname: Pirasteh, Ali
  organization: Department of Radiology, University of Wisconsin-Madison, United States of America
– sequence: 3
  givenname: Tyler
  surname: Bradshaw
  fullname: Bradshaw, Tyler
  organization: Department of Radiology, University of Wisconsin-Madison, United States of America
– sequence: 4
  givenname: Alan
  orcidid: 0000-0003-4502-6522
  surname: McMillan
  fullname: McMillan, Alan
  organization: Department of Radiology, University of Wisconsin-Madison, United States of America
ContentType Journal Article
Copyright © 2023 Elsevier Ltd. All rights reserved.
DOI 10.1016/j.compmedimag.2023.102227
Discipline Medicine
EISSN 1879-0771
EndPage 102227
Genre Journal Article
Research Support, N.I.H., Extramural
GrantInformation_xml – fundername: NIBIB NIH HHS
  grantid: R01 EB026708
– fundername: NCATS NIH HHS
  grantid: KL2 TR002374
– fundername: NCATS NIH HHS
  grantid: UL1 TR002373
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Keywords MR-guided radiotherapy
Generative adversarial networks (GAN)
Synthetic CT generation
Fine-tuning
License Copyright © 2023 Elsevier Ltd. All rights reserved.
ORCID 0000-0003-4502-6522
OpenAccessLink https://pmc.ncbi.nlm.nih.gov/articles/PMC10483321/pdf/nihms-1895731.pdf
PMID 37167815
PageCount 1
PublicationDate 2023-07-01
PublicationPlace United States
PublicationTitle Computerized medical imaging and graphics
PublicationTitleAlternate Comput Med Imaging Graph
PublicationYear 2023
Publisher Elsevier Ltd
References Chen, Qin, Zhou, Yan (b5) 2018; 45
McMillan, Bradshaw (b29) 2021; 16
Yang, Sun, Carass, Zhao, Lee, Xu, Prince (b41) 2018
He, Xia, Qin, Wang, Yu, Liu, Ma (b12) 2016; 29
Cohen, Luck, Honari (b6) 2018
Jonsson, Nyholm, Söderkvist (b17) 2019; 18
Tustison, Cook, Holbrook, Johnson, Muschelli, Devenyi, Duda, Das, Cullen, Gillen (b36) 2021; 11
Virtanen, Gommers, Oliphant, Haberland, Reddy, Cournapeau, Burovski, Peterson, Weckesser, Bright, van der Walt, Brett, Wilson, Millman, Mayorov, Nelson, Jones, Kern, Larson, Carey, Polat, Feng, Moore, VanderPlas, Laxalde, Perktold, Cimrman, Henriksen, Quintero, Harris, Archibald, Ribeiro, Pedregosa, van Mulbregt (b38) 2020; 17
Thrall, Li, Li, Cruz, Do, Dreyer, Brink (b35) 2018; 15
Jang, Liu, Zhao, Bradshaw, McMillan (b15) 2018; 45
Zhou, Xiao, Yang, Feng, He, He (b44) 2017
Maspero, Savenije, Dinkla, Seevinck, Intven, Jurgenliemk-Schulz, Kerkmeijer, van den Berg (b28) 2018; 63
Peng, Chen, Qin, Chen, Gao, Liu, Miao, Gu, Zhao, Deng (b31) 2020; 150
Wolterink, Dinkla, Savenije, Seevinck, van den Berg, Išgum (b39) 2017
Liu, Yadav, Baschnagel, McMillan (b27) 2019; 20
Abraham, Pedregosa, Eickenberg, Gervais, Mueller, Kossaifi, Gramfort, Thirion, Varoquaux (b1) 2014; 8
Edmund, Nyholm (b9) 2017; 12
Boulanger, Nunes, Chourak, Largent, Tahri, Acosta, De Crevoisier, Lafond, Barateau (b3) 2021; 89
Kerkmeijer, Maspero, Meijer, van Zyp, De Boer, van den Berg (b20) 2018; 30
Almahairi, Rajeshwar, Sordoni, Bachman, Courville (b2) 2018
Karlsson, Karlsson, Nyholm, Amies, Zackrisson (b18) 2009; 74
Price, Kim, Zheng, Chetty, Glide-Hurst (b32) 2016; 95
Consortium (b7) 2020
Chen, Guan, Li (b4) 2021; 9
Xiang, Li, Lin, Wang, Shen (b40) 2018
Li, Li, Qin, Liang, Xu, Xiong, Xie (b25) 2020; 10
Liu, Jang, Kijowski, Zhao, Bradshaw, McMillan (b26) 2018; 5
Jabbarpour, Mahdavi, Sadr, Esmaili, Shiri, Zaidi (b14) 2022; 143
Khoo, Joon (b21) 2006; 79
Spadea, Maspero, Zaffino, Seco (b34) 2021; 48
Zhu, Jun-Yan, Park, Taesung, Isola, Phillip, Efros, Alexei A, 2017. Unpaired image-to-image translation using cycle-consistent adversarial networks. In: Proceedings of the IEEE International Conference on Computer Vision. pp. 2223–2232.
Isola, Phillip, Zhu, Jun-Yan, Zhou, Tinghui, Efros, Alexei A, 2017. Image-to-image translation with conditional adversarial networks. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. pp. 1125–1134.
van der Walt, Schönberger, Nunez-Iglesias, Boulogne, Warner, Yager, Gouillart, Yu (b37) 2014; 2
Johnstone, Wyatt, Henry, Short, Sebag-Montefiore, Murray, Kelly, McCallum, Speight (b16) 2018; 100
Nie, Trullo, Lian, Wang, Petitjean, Ruan, Wang, Shen (b30) 2018; 65
Schmidt, Payne (b33) 2015; 60
Yi, Zili, Zhang, Hao, Tan, Ping, Gong, Minglun, 2017. Dualgan: Unsupervised dual learning for image-to-image translation. In: Proceedings of the IEEE International Conference on Computer Vision. pp. 2849–2857.
Kim, Cha, Kim, Lee, Kim (b22) 2017
Zhang, Pfister, Li (b43) 2019
Goodfellow, Pouget-Abadie, Mirza, Xu, Warde-Farley, Ozair, Courville, Bengio (b11) 2014; 27
Klages, Benslimane, Riyahi, Jiang, Hunt, Deasy, Veeraraghavan, Tyagi (b23) 2020; 47
Emami, Dong, Nejad-Davarani, Glide-Hurst (b10) 2018; 45
Devic (b8) 2012; 39
Kazemifar, McGuire, Timmerman, Wardak, Nguyen, Park, Jiang, Owrangi (b19) 2019; 136
Lei, Harms, Wang, Liu, Shu, Jani, Curran, Mao, Liu, Yang (b24) 2019; 46
Johnstone (10.1016/j.compmedimag.2023.102227_b16) 2018; 100
Abraham (10.1016/j.compmedimag.2023.102227_b1) 2014; 8
Thrall (10.1016/j.compmedimag.2023.102227_b35) 2018; 15
Klages (10.1016/j.compmedimag.2023.102227_b23) 2020; 47
Yang (10.1016/j.compmedimag.2023.102227_b41) 2018
10.1016/j.compmedimag.2023.102227_b45
Xiang (10.1016/j.compmedimag.2023.102227_b40) 2018
Boulanger (10.1016/j.compmedimag.2023.102227_b3) 2021; 89
Li (10.1016/j.compmedimag.2023.102227_b25) 2020; 10
Zhou (10.1016/j.compmedimag.2023.102227_b44) 2017
He (10.1016/j.compmedimag.2023.102227_b12) 2016; 29
Lei (10.1016/j.compmedimag.2023.102227_b24) 2019; 46
Zhang (10.1016/j.compmedimag.2023.102227_b43) 2019
Devic (10.1016/j.compmedimag.2023.102227_b8) 2012; 39
Emami (10.1016/j.compmedimag.2023.102227_b10) 2018; 45
van der Walt (10.1016/j.compmedimag.2023.102227_b37) 2014; 2
Jang (10.1016/j.compmedimag.2023.102227_b15) 2018; 45
10.1016/j.compmedimag.2023.102227_b42
Edmund (10.1016/j.compmedimag.2023.102227_b9) 2017; 12
Almahairi (10.1016/j.compmedimag.2023.102227_b2) 2018
Tustison (10.1016/j.compmedimag.2023.102227_b36) 2021; 11
Khoo (10.1016/j.compmedimag.2023.102227_b21) 2006; 79
Spadea (10.1016/j.compmedimag.2023.102227_b34) 2021; 48
Kazemifar (10.1016/j.compmedimag.2023.102227_b19) 2019; 136
Price (10.1016/j.compmedimag.2023.102227_b32) 2016; 95
Nie (10.1016/j.compmedimag.2023.102227_b30) 2018; 65
Karlsson (10.1016/j.compmedimag.2023.102227_b18) 2009; 74
Maspero (10.1016/j.compmedimag.2023.102227_b28) 2018; 63
10.1016/j.compmedimag.2023.102227_b13
Jonsson (10.1016/j.compmedimag.2023.102227_b17) 2019; 18
Schmidt (10.1016/j.compmedimag.2023.102227_b33) 2015; 60
Liu (10.1016/j.compmedimag.2023.102227_b27) 2019; 20
Chen (10.1016/j.compmedimag.2023.102227_b4) 2021; 9
Kim (10.1016/j.compmedimag.2023.102227_b22) 2017
McMillan (10.1016/j.compmedimag.2023.102227_b29) 2021; 16
Goodfellow (10.1016/j.compmedimag.2023.102227_b11) 2014; 27
Virtanen (10.1016/j.compmedimag.2023.102227_b38) 2020; 17
Liu (10.1016/j.compmedimag.2023.102227_b26) 2018; 5
Cohen (10.1016/j.compmedimag.2023.102227_b6) 2018
Jabbarpour (10.1016/j.compmedimag.2023.102227_b14) 2022; 143
Consortium (10.1016/j.compmedimag.2023.102227_b7) 2020
Kerkmeijer (10.1016/j.compmedimag.2023.102227_b20) 2018; 30
Chen (10.1016/j.compmedimag.2023.102227_b5) 2018; 45
Wolterink (10.1016/j.compmedimag.2023.102227_b39) 2017
Peng (10.1016/j.compmedimag.2023.102227_b31) 2020; 150
References_xml – volume: 30
  start-page: 692
  year: 2018
  end-page: 701
  ident: b20
  article-title: Magnetic resonance imaging only workflow for radiotherapy simulation and planning in prostate cancer
  publication-title: Clin. Oncol.
– volume: 45
  start-page: 5659
  year: 2018
  end-page: 5665
  ident: b5
  article-title: U-net-generated synthetic CT images for magnetic resonance imaging-only prostate intensity-modulated radiation therapy treatment planning
  publication-title: Med. Phys.
– start-page: 14
  year: 2017
  end-page: 23
  ident: b39
  article-title: Deep MR to CT synthesis using unpaired data
  publication-title: International Workshop on Simulation and Synthesis in Medical Imaging
– volume: 9
  start-page: 46776
  year: 2021
  end-page: 46787
  ident: b4
  article-title: ArCycleGAN: Improved cyclegan for style transferring of fruit images
  publication-title: IEEE Access
– volume: 12
  start-page: 1
  year: 2017
  end-page: 15
  ident: b9
  article-title: A review of substitute CT generation for MRI-only radiation therapy
  publication-title: Radiat. Oncol.
– volume: 20
  start-page: 105
  year: 2019
  end-page: 114
  ident: b27
  article-title: MR-based treatment planning in radiation therapy using a deep learning approach
  publication-title: J. Appl. Clin. Med. Phys.
– volume: 16
  start-page: 543
  year: 2021
  end-page: 552
  ident: b29
  article-title: Artificial intelligence–based data corrections for attenuation and scatter in positron emission tomography and single-photon emission computed tomography
  publication-title: PET Clin.
– volume: 27
  year: 2014
  ident: b11
  article-title: Generative adversarial nets
  publication-title: Adv. Neural Inf. Process. Syst.
– reference: Zhu, Jun-Yan, Park, Taesung, Isola, Phillip, Efros, Alexei A, 2017. Unpaired image-to-image translation using cycle-consistent adversarial networks. In: Proceedings of the IEEE International Conference on Computer Vision. pp. 2223–2232.
– volume: 15
  start-page: 504
  year: 2018
  end-page: 508
  ident: b35
  article-title: Artificial intelligence and machine learning in radiology: opportunities, challenges, pitfalls, and criteria for success
  publication-title: J. Am. College Radiol.
– volume: 150
  start-page: 217
  year: 2020
  end-page: 224
  ident: b31
  article-title: Magnetic resonance-based synthetic computed tomography images generated using generative adversarial networks for nasopharyngeal carcinoma radiotherapy treatment planning
  publication-title: Radiother. Oncol.
– volume: 143
  year: 2022
  ident: b14
  article-title: Unsupervised pseudo CT generation using heterogeneous multicentric CT/MR images and CycleGAN: Dosimetric assessment for 3D conformal radiotherapy
  publication-title: Comput. Biol. Med.
– volume: 46
  start-page: 3565
  year: 2019
  end-page: 3581
  ident: b24
  article-title: MRI-only based synthetic CT generation using dense cycle consistent generative adversarial networks
  publication-title: Med. Phys.
– volume: 95
  start-page: 1281
  year: 2016
  end-page: 1289
  ident: b32
  article-title: Image guided radiation therapy using synthetic computed tomography images in brain cancer
  publication-title: Int. J. Radiat. Oncol. Biol. Phys.
– reference: Isola, Phillip, Zhu, Jun-Yan, Zhou, Tinghui, Efros, Alexei A, 2017. Image-to-image translation with conditional adversarial networks. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. pp. 1125–1134.
– volume: 10
  start-page: 1223
  year: 2020
  ident: b25
  article-title: Magnetic resonance image (MRI) synthesis from brain computed tomography (CT) images based on deep learning methods for magnetic resonance (MR)-guided radiotherapy
  publication-title: Quant. Imaging Med. Surg.
– year: 2017
  ident: b44
  article-title: GeneGAN: Learning object transfiguration and attribute subspace from unpaired data
– volume: 100
  start-page: 199
  year: 2018
  end-page: 217
  ident: b16
  article-title: Systematic review of synthetic computed tomography generation methodologies for use in magnetic resonance imaging–only radiation therapy
  publication-title: Int. J. Radiat. Oncol. Biol. Phys.
– volume: 5
  start-page: 1
  year: 2018
  end-page: 15
  ident: b26
  article-title: A deep learning approach for 18F-FDG PET attenuation correction
  publication-title: EJNMMI Phys.
– volume: 39
  start-page: 6701
  year: 2012
  end-page: 6711
  ident: b8
  article-title: MRI simulation for radiotherapy treatment planning
  publication-title: Med. Phys.
– volume: 11
  start-page: 1
  year: 2021
  end-page: 13
  ident: b36
  article-title: The ANTsX ecosystem for quantitative biological and medical imaging
  publication-title: Sci. Rep.
– volume: 89
  start-page: 265
  year: 2021
  end-page: 281
  ident: b3
  article-title: Deep learning methods to generate synthetic CT from MRI in radiotherapy: A literature review
  publication-title: Phys. Med.
– reference: Yi, Zili, Zhang, Hao, Tan, Ping, Gong, Minglun, 2017. Dualgan: Unsupervised dual learning for image-to-image translation. In: Proceedings of the IEEE International Conference on Computer Vision. pp. 2849–2857.
– year: 2020
  ident: b7
  article-title: Project MONAI
– volume: 2
  year: 2014
  ident: b37
  article-title: Scikit-image: image processing in Python
  publication-title: PeerJ
– start-page: 529
  year: 2018
  end-page: 536
  ident: b6
  article-title: Distribution matching losses can hallucinate features in medical image translation
  publication-title: Medical Image Computing and Computer Assisted Intervention–MICCAI 2018: 21st International Conference, Granada, Spain, September 16-20, 2018, Proceedings, Part I
– volume: 48
  start-page: 6537
  year: 2021
  end-page: 6566
  ident: b34
  article-title: Deep learning based synthetic-CT generation in radiotherapy and PET: A review
  publication-title: Med. Phys.
– volume: 47
  start-page: 626
  year: 2020
  end-page: 642
  ident: b23
  article-title: Patch-based generative adversarial neural network models for head and neck MR-only planning
  publication-title: Med. Phys.
– volume: 29
  year: 2016
  ident: b12
  article-title: Dual learning for machine translation
  publication-title: Adv. Neural Inf. Process. Syst.
– volume: 74
  start-page: 644
  year: 2009
  end-page: 651
  ident: b18
  article-title: Dedicated magnetic resonance imaging in the radiotherapy clinic
  publication-title: Int. J. Radiat. Oncol. Biol. Phys.
– start-page: 174
  year: 2018
  end-page: 182
  ident: b41
  article-title: Unpaired brain MR-to-CT synthesis using a structure-constrained CycleGAN
  publication-title: Deep Learning in Medical Image Analysis and Multimodal Learning for Clinical Decision Support
– volume: 8
  start-page: 14
  year: 2014
  ident: b1
  article-title: Machine learning for neuroimaging with scikit-learn
  publication-title: Front. Neuroinform.
– volume: 79
  start-page: S2
  year: 2006
  end-page: S15
  ident: b21
  article-title: New developments in MRI for target volume delineation in radiotherapy
  publication-title: Br. J. Radiol.
– start-page: 195
  year: 2018
  end-page: 204
  ident: b2
  article-title: Augmented CycleGAN: Learning many-to-many mappings from unpaired data
  publication-title: International Conference on Machine Learning
– volume: 63
  year: 2018
  ident: b28
  article-title: Dose evaluation of fast synthetic-CT generation using a generative adversarial network for general pelvis MR-only radiotherapy
  publication-title: Phys. Med. Biol.
– start-page: 1857
  year: 2017
  end-page: 1865
  ident: b22
  article-title: Learning to discover cross-domain relations with generative adversarial networks
  publication-title: International Conference on Machine Learning
– start-page: 155
  year: 2018
  end-page: 164
  ident: b40
  article-title: Unpaired deep cross-modality synthesis with fast training
  publication-title: Deep Learning in Medical Image Analysis and Multimodal Learning for Clinical Decision Support
– year: 2019
  ident: b43
  article-title: Harmonic unpaired image-to-image translation
– volume: 136
  start-page: 56
  year: 2019
  end-page: 63
  ident: b19
  article-title: MRI-only brain radiotherapy: Assessing the dosimetric accuracy of synthetic CT images generated using a deep learning approach
  publication-title: Radiother. Oncol.
– volume: 45
  start-page: 3627
  year: 2018
  end-page: 3636
  ident: b10
  article-title: Generating synthetic CTs from magnetic resonance images using generative adversarial networks
  publication-title: Med. Phys.
– volume: 65
  start-page: 2720
  year: 2018
  end-page: 2730
  ident: b30
  article-title: Medical image synthesis with deep convolutional adversarial networks
  publication-title: IEEE Trans. Biomed. Eng.
– volume: 45
  start-page: 3697
  year: 2018
  end-page: 3704
  ident: b15
  article-title: Deep learning based MRAC using rapid ultrashort echo time imaging
  publication-title: Med. Phys.
– volume: 60
  start-page: R323
  year: 2015
  ident: b33
  article-title: Radiotherapy planning using MRI
  publication-title: Phys. Med. Biol.
– volume: 17
  start-page: 261
  year: 2020
  end-page: 272
  ident: b38
  article-title: SciPy 1.0: Fundamental algorithms for scientific computing in Python
  publication-title: Nat. Methods
– volume: 18
  start-page: 60
  year: 2019
  end-page: 65
  ident: b17
  article-title: The rationale for MR-only treatment planning for external radiotherapy
  publication-title: Clin. Transl. Radiat. Oncol.
SourceID unpaywall
pubmedcentral
proquest
pubmed
crossref
elsevier
SourceType Open Access Repository
Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 102227
SubjectTerms Fine-Tuning
Generative adversarial networks (GAN), Synthetic CT generation
Humans
Image Processing, Computer-Assisted - methods
Internal Medicine
Magnetic Resonance Imaging - methods
Magnetic Resonance Spectroscopy
MR-guided radiotherapy
Other
Radiotherapy, Image-Guided
Tomography, X-Ray Computed
Title On the effect of training database size for MR-based synthetic CT generation in the head
URI https://www.clinicalkey.com/#!/content/1-s2.0-S0895611123000459
https://www.clinicalkey.es/playcontent/1-s2.0-S0895611123000459
https://dx.doi.org/10.1016/j.compmedimag.2023.102227
https://www.ncbi.nlm.nih.gov/pubmed/37167815
https://www.proquest.com/docview/2813556008
https://pubmed.ncbi.nlm.nih.gov/PMC10483321
https://pmc.ncbi.nlm.nih.gov/articles/PMC10483321/pdf/nihms-1895731.pdf
UnpaywallVersion submittedVersion
Volume 107