Automatic Multi-Organ Segmentation on Abdominal CT With Dense V-Networks

Bibliographic Details
Published in IEEE Transactions on Medical Imaging, Vol. 37, no. 8, pp. 1822-1834
Main Authors Gibson, Eli, Giganti, Francesco, Hu, Yipeng, Bonmati, Ester, Bandula, Steve, Gurusamy, Kurinchi, Davidson, Brian, Pereira, Stephen P., Clarkson, Matthew J., Barratt, Dean C.
Format Journal Article
Language English
Published United States IEEE 01.08.2018
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subjects
ISSN 0278-0062
1558-254X
EISSN 1558-254X
DOI 10.1109/TMI.2018.2806309


Abstract Automatic segmentation of abdominal anatomy on computed tomography (CT) images can support diagnosis, treatment planning, and treatment delivery workflows. Segmentation methods using statistical models and multi-atlas label fusion (MALF) require inter-subject image registrations, which are challenging for abdominal images, but alternative methods without registration have not yet achieved higher accuracy for most abdominal organs. We present a registration-free deep-learning-based segmentation algorithm for eight organs that are relevant for navigation in endoscopic pancreatic and biliary procedures, including the pancreas, the gastrointestinal tract (esophagus, stomach, and duodenum) and surrounding organs (liver, spleen, left kidney, and gallbladder). We directly compared the segmentation accuracy of the proposed method to the existing deep learning and MALF methods in a cross-validation on a multi-centre data set with 90 subjects. The proposed method yielded significantly higher Dice scores for all organs and lower mean absolute distances for most organs, including Dice scores of 0.78 versus 0.71, 0.74, and 0.74 for the pancreas, 0.90 versus 0.85, 0.87, and 0.83 for the stomach, and 0.76 versus 0.68, 0.69, and 0.66 for the esophagus. We conclude that the deep-learning-based segmentation represents a registration-free method for multi-organ abdominal CT segmentation whose accuracy can surpass current methods, potentially supporting image-guided navigation in gastrointestinal endoscopy procedures.
AbstractList Automatic segmentation of abdominal anatomy on computed tomography (CT) images can support diagnosis, treatment planning and treatment delivery workflows. Segmentation methods using statistical models and multi-atlas label fusion (MALF) require inter-subject image registrations which are challenging for abdominal images, but alternative methods without registration have not yet achieved higher accuracy for most abdominal organs. We present a registration-free deep-learning-based segmentation algorithm for eight organs that are relevant for navigation in endoscopic pancreatic and biliary procedures, including the pancreas, the GI tract (esophagus, stomach, duodenum) and surrounding organs (liver, spleen, left kidney, gallbladder). We directly compared the segmentation accuracy of the proposed method to existing deep learning and MALF methods in a cross-validation on a multi-centre data set with 90 subjects. The proposed method yielded significantly higher Dice scores for all organs and lower mean absolute distances for most organs, including Dice scores of 0.78 vs. 0.71, 0.74, and 0.74 for the pancreas, 0.90 vs. 0.85, 0.87, and 0.83 for the stomach, and 0.76 vs. 0.68, 0.69, and 0.66 for the esophagus. We conclude that deep-learning-based segmentation represents a registration-free method for multi-organ abdominal CT segmentation whose accuracy can surpass current methods, potentially supporting image-guided navigation in gastrointestinal endoscopy procedures.
Author Bonmati, Ester
Barratt, Dean C.
Giganti, Francesco
Gibson, Eli
Clarkson, Matthew J.
Hu, Yipeng
Pereira, Stephen P.
Davidson, Brian
Bandula, Steve
Gurusamy, Kurinchi
AuthorAffiliation Department of Radiology, University College Hospital Trust, UK
Institute for Liver and Digestive Health, University College London, UK
Division of Surgery and Interventional Science, University College London, UK
UCL Centre for Medical Imaging, University College London, UK
Wellcome / EPSRC Centre for Interventional and Surgical Sciences, University College London, UK
UCL Centre for Medical Image Computing, Department of Medical Physics & Biomedical Engineering, University College London, UK
AuthorAffiliation_xml – name: Wellcome / EPSRC Centre for Interventional and Surgical Sciences, University College London, UK
– name: UCL Centre for Medical Image Computing, Department of Medical Physics & Biomedical Engineering, University College London, UK
– name: UCL Centre for Medical Imaging, University College London, UK
– name: Division of Surgery and Interventional Science, University College London, UK
– name: Institute for Liver and Digestive Health, University College London, UK
– name: Department of Radiology, University College Hospital Trust, UK
Author_xml – sequence: 1
  givenname: Eli
  orcidid: 0000-0001-9207-7280
  surname: Gibson
  fullname: Gibson, Eli
  email: eli.gibson@ucl.ac.uk
  organization: Department of Medical Physics and Biomedical Engineering, UCL Centre for Medical Image Computing, University College London, London, U.K
– sequence: 2
  givenname: Francesco
  orcidid: 0000-0001-5218-6431
  surname: Giganti
  fullname: Giganti, Francesco
  organization: Department of Radiology, University College Hospital Trust, London, U.K
– sequence: 3
  givenname: Yipeng
  orcidid: 0000-0003-4902-0486
  surname: Hu
  fullname: Hu, Yipeng
  organization: Department of Medical Physics and Biomedical Engineering, UCL Centre for Medical Image Computing, University College London, London, U.K
– sequence: 4
  givenname: Ester
  orcidid: 0000-0001-9217-5438
  surname: Bonmati
  fullname: Bonmati, Ester
  organization: Department of Medical Physics and Biomedical Engineering, UCL Centre for Medical Image Computing, University College London, London, U.K
– sequence: 5
  givenname: Steve
  orcidid: 0000-0002-4558-288X
  surname: Bandula
  fullname: Bandula, Steve
  organization: UCL Centre for Medical Imaging, University College London, London, U.K
– sequence: 6
  givenname: Kurinchi
  orcidid: 0000-0002-0313-9134
  surname: Gurusamy
  fullname: Gurusamy, Kurinchi
  organization: Division of Surgery and Interventional Science, University College London, London, U.K
– sequence: 7
  givenname: Brian
  orcidid: 0000-0002-9152-5907
  surname: Davidson
  fullname: Davidson, Brian
  organization: Division of Surgery and Interventional Science, University College London, London, U.K
– sequence: 8
  givenname: Stephen P.
  orcidid: 0000-0003-0821-1809
  surname: Pereira
  fullname: Pereira, Stephen P.
  organization: Institute for Liver and Digestive Health, University College London, London, U.K
– sequence: 9
  givenname: Matthew J.
  orcidid: 0000-0002-5565-1252
  surname: Clarkson
  fullname: Clarkson, Matthew J.
  organization: Department of Medical Physics and Biomedical Engineering, UCL Centre for Medical Image Computing, University College London, London, U.K
– sequence: 10
  givenname: Dean C.
  orcidid: 0000-0003-2916-655X
  surname: Barratt
  fullname: Barratt, Dean C.
  organization: Department of Medical Physics and Biomedical Engineering, UCL Centre for Medical Image Computing, University College London, London, U.K
BackLink https://www.ncbi.nlm.nih.gov/pubmed/29994628 (View this record in MEDLINE/PubMed)
CODEN ITMID4
CitedBy_id crossref_primary_10_1002_mp_17364
crossref_primary_10_1016_j_compbiomed_2024_109079
crossref_primary_10_1088_1361_6560_ad33b5
crossref_primary_10_1097_RLU_0000000000003789
crossref_primary_10_3390_app12146876
crossref_primary_10_1016_j_eswa_2024_123856
crossref_primary_10_3389_feart_2022_783481
crossref_primary_10_1038_s41598_022_21206_3
crossref_primary_10_1109_MCE_2020_3048254
crossref_primary_10_1186_s12880_022_00893_4
crossref_primary_10_2139_ssrn_4117316
crossref_primary_10_1016_j_compbiomed_2023_106995
crossref_primary_10_1109_TMI_2022_3162111
crossref_primary_10_36548_jiip_2020_2_006
crossref_primary_10_1016_j_ijrobp_2024_07_2149
crossref_primary_10_1016_j_media_2024_103181
crossref_primary_10_1016_j_smhl_2022_100304
crossref_primary_10_1053_j_ro_2023_01_005
crossref_primary_10_1097_RLI_0000000000000755
crossref_primary_10_1016_j_compbiomed_2022_106473
crossref_primary_10_1007_s00530_021_00776_8
crossref_primary_10_1097_BRS_0000000000004308
crossref_primary_10_1038_s41598_024_55137_y
crossref_primary_10_3390_app131910765
crossref_primary_10_1186_s12938_024_01238_8
crossref_primary_10_1016_j_media_2021_101979
crossref_primary_10_1007_s11042_024_18336_3
crossref_primary_10_1024_1661_8157_a003597
crossref_primary_10_3389_fonc_2020_618357
crossref_primary_10_1007_s00330_020_06679_y
crossref_primary_10_1109_TPAMI_2021_3100536
crossref_primary_10_3389_fnins_2021_756536
crossref_primary_10_1051_itmconf_20235401003
crossref_primary_10_3390_jimaging11010019
crossref_primary_10_1016_j_media_2019_101558
crossref_primary_10_1109_TMI_2024_3354673
crossref_primary_10_4018_IJBDAH_287605
crossref_primary_10_1016_j_ejrad_2020_109031
crossref_primary_10_1016_j_bspc_2022_103867
crossref_primary_10_1016_j_media_2022_102642
crossref_primary_10_1038_s41598_022_18173_0
crossref_primary_10_1088_2057_1976_acfb06
crossref_primary_10_1007_s11548_022_02730_z
crossref_primary_10_1016_j_ejmp_2023_102595
crossref_primary_10_1016_j_media_2023_102838
crossref_primary_10_1016_j_engappai_2022_105532
crossref_primary_10_1142_S0219467822500395
crossref_primary_10_3390_bioengineering11121255
crossref_primary_10_1016_j_compbiomed_2024_108684
crossref_primary_10_1016_j_bspc_2022_104294
crossref_primary_10_1038_s41598_022_20804_5
crossref_primary_10_1016_j_amjms_2024_01_018
crossref_primary_10_1016_j_compbiomed_2024_108326
crossref_primary_10_3389_frobt_2020_00106
crossref_primary_10_1088_1361_6560_ab59a4
crossref_primary_10_1016_j_bbe_2021_05_004
crossref_primary_10_1007_s00371_021_02328_7
crossref_primary_10_1016_j_cmpb_2022_107085
crossref_primary_10_1002_mp_14429
crossref_primary_10_1088_1361_6560_ab9b57
crossref_primary_10_1016_j_gande_2025_03_002
crossref_primary_10_1088_1361_6560_ad611d
crossref_primary_10_1186_s13014_021_01864_9
crossref_primary_10_1371_journal_pone_0293560
crossref_primary_10_1007_s11548_021_02363_8
crossref_primary_10_3390_s23073420
crossref_primary_10_1002_mp_16280
crossref_primary_10_1007_s11684_020_0770_0
crossref_primary_10_26599_IJCS_2022_9100014
crossref_primary_10_1088_1361_6560_ab7877
crossref_primary_10_1016_j_radonc_2021_04_019
crossref_primary_10_1002_ima_23039
crossref_primary_10_1002_mp_14422
crossref_primary_10_1016_j_radi_2024_02_001
crossref_primary_10_1177_0003702820929064
crossref_primary_10_1016_j_displa_2022_102223
crossref_primary_10_1002_mp_15507
crossref_primary_10_1148_radiol_2019190737
crossref_primary_10_1109_ACCESS_2021_3055803
crossref_primary_10_1016_j_liver_2024_100251
crossref_primary_10_3390_app14083275
crossref_primary_10_1371_journal_pone_0265567
crossref_primary_10_1016_j_media_2024_103156
crossref_primary_10_1016_j_neucom_2021_05_081
crossref_primary_10_2196_26601
crossref_primary_10_1007_s11042_019_08320_7
crossref_primary_10_1016_j_cmpb_2020_105447
crossref_primary_10_1109_ACCESS_2023_3264582
crossref_primary_10_1109_TRPMS_2021_3055199
crossref_primary_10_1109_TMI_2021_3137854
crossref_primary_10_1002_mp_15264
crossref_primary_10_1016_j_media_2020_101766
crossref_primary_10_1002_mp_15146
crossref_primary_10_1016_j_media_2019_04_002
crossref_primary_10_1016_j_bspc_2023_105562
crossref_primary_10_3390_diagnostics12112765
crossref_primary_10_1109_TMI_2019_2903562
crossref_primary_10_1109_JBHI_2024_3516012
crossref_primary_10_1007_s11042_022_12055_3
crossref_primary_10_1016_j_bspc_2021_102686
crossref_primary_10_1016_j_acra_2019_08_014
crossref_primary_10_1016_j_cagd_2021_101972
crossref_primary_10_1016_j_eswa_2021_116444
crossref_primary_10_1016_j_neucom_2019_03_049
crossref_primary_10_1016_j_rineng_2023_100929
crossref_primary_10_1016_j_eswa_2022_118625
crossref_primary_10_1016_S2589_7500_20_30078_9
crossref_primary_10_1016_j_compbiomed_2021_104497
crossref_primary_10_2139_ssrn_3984233
crossref_primary_10_1016_j_compbiomed_2022_106152
crossref_primary_10_3390_diagnostics12061489
crossref_primary_10_1016_j_compmedimag_2024_102434
crossref_primary_10_1155_2020_8861035
crossref_primary_10_1109_TCYB_2024_3418937
crossref_primary_10_1109_JBHI_2023_3285230
crossref_primary_10_1148_radiol_233029
crossref_primary_10_1109_TMI_2024_3398728
crossref_primary_10_1109_JPROC_2019_2950506
crossref_primary_10_18663_tjcl_1647005
crossref_primary_10_1016_j_canrad_2023_05_001
crossref_primary_10_36401_IDDB_24_1
crossref_primary_10_1016_j_media_2020_101874
crossref_primary_10_1016_j_ymeth_2020_10_004
crossref_primary_10_1109_JBHI_2021_3137603
crossref_primary_10_1002_mp_15490
crossref_primary_10_1007_s11548_021_02386_1
crossref_primary_10_1016_j_ejrad_2021_109735
crossref_primary_10_1109_ACCESS_2020_3012990
crossref_primary_10_1016_j_media_2020_101876
crossref_primary_10_3389_fonc_2022_960056
crossref_primary_10_1109_TCSVT_2023_3295062
crossref_primary_10_1109_TMI_2019_2926568
crossref_primary_10_3389_fnimg_2022_832512
crossref_primary_10_3390_healthcare10081511
crossref_primary_10_1002_mp_14835
crossref_primary_10_1186_s12880_020_00460_9
crossref_primary_10_1109_ACCESS_2020_3027685
crossref_primary_10_1109_TRPMS_2018_2876562
crossref_primary_10_1080_03772063_2021_1944335
crossref_primary_10_1007_s10278_021_00563_x
crossref_primary_10_3390_genes13030431
crossref_primary_10_1016_j_clinimag_2022_04_007
crossref_primary_10_1038_s41467_022_34257_x
crossref_primary_10_1016_j_compmedimag_2019_101664
crossref_primary_10_1109_TIP_2020_3038363
crossref_primary_10_1016_j_imed_2021_06_004
crossref_primary_10_1002_mp_14193
crossref_primary_10_1007_s11766_022_4346_4
crossref_primary_10_1007_s00521_020_05407_3
crossref_primary_10_1016_j_bspc_2024_107208
crossref_primary_10_6009_jjrt_2020_JSRT_76_11_1133
crossref_primary_10_1007_s13246_023_01295_8
crossref_primary_10_17816_DD629866
crossref_primary_10_3389_fcell_2024_1532228
crossref_primary_10_1007_s00330_020_07147_3
crossref_primary_10_1016_j_compeleceng_2023_108926
crossref_primary_10_1007_s11548_020_02212_0
crossref_primary_10_1016_j_bspc_2022_103832
crossref_primary_10_1007_s11517_024_03273_y
crossref_primary_10_1109_ACCESS_2020_2998901
crossref_primary_10_1137_21M1433782
crossref_primary_10_3390_s23125720
crossref_primary_10_1007_s11633_021_1313_0
crossref_primary_10_1093_jrr_rraa132
crossref_primary_10_1016_j_cmpb_2022_106902
crossref_primary_10_1186_s12880_024_01362_w
crossref_primary_10_2174_1573405620666230515090523
crossref_primary_10_1109_TMI_2022_3225667
crossref_primary_10_1109_TNNLS_2023_3243241
crossref_primary_10_3390_make3020026
crossref_primary_10_1016_j_compmedimag_2019_101672
crossref_primary_10_3389_fonc_2024_1415859
crossref_primary_10_55007_dufed_1181996
crossref_primary_10_1016_j_cmpb_2025_108611
crossref_primary_10_1016_j_media_2025_103499
crossref_primary_10_1007_s00500_023_07891_w
crossref_primary_10_1016_j_compmedimag_2021_101938
crossref_primary_10_1016_j_ijrobp_2023_05_034
crossref_primary_10_1016_j_media_2021_102200
crossref_primary_10_1109_TMI_2021_3060634
crossref_primary_10_1007_s00530_022_00977_9
crossref_primary_10_1109_RBME_2022_3183852
crossref_primary_10_3390_cancers14153581
crossref_primary_10_1073_pnas_1812995116
crossref_primary_10_1016_j_media_2020_101840
crossref_primary_10_1007_s10462_019_09743_2
crossref_primary_10_1007_s11063_023_11356_4
crossref_primary_10_1016_j_ibmed_2022_100055
crossref_primary_10_1007_s42452_022_04936_x
crossref_primary_10_1142_S2196888824500076
crossref_primary_10_1002_mp_14131
crossref_primary_10_1016_j_displa_2024_102650
crossref_primary_10_1148_ryai_2020190102
crossref_primary_10_3390_s24144749
crossref_primary_10_1002_mp_14134
crossref_primary_10_1016_j_cosrev_2024_100648
crossref_primary_10_1016_j_neucom_2021_01_135
crossref_primary_10_1016_j_heliyon_2024_e26414
crossref_primary_10_1109_TMI_2020_2987981
crossref_primary_10_1007_s00330_021_08036_z
crossref_primary_10_1016_j_artmed_2021_102109
crossref_primary_10_3390_electronics11203323
crossref_primary_10_1002_acm2_14233
crossref_primary_10_1016_j_compbiomed_2022_105782
crossref_primary_10_1016_j_jvcir_2021_103134
crossref_primary_10_3389_fonc_2021_618496
crossref_primary_10_1088_1361_6560_ac344d
crossref_primary_10_1136_bmjopen_2021_053204
crossref_primary_10_1007_s10278_019_00227_x
crossref_primary_10_1007_s10278_023_00857_2
crossref_primary_10_1016_j_patrec_2023_08_005
crossref_primary_10_1016_j_softx_2019_100347
crossref_primary_10_1016_j_eswa_2022_118090
crossref_primary_10_1109_TMI_2022_3213983
crossref_primary_10_1109_TMI_2021_3104460
crossref_primary_10_1186_s13244_019_0738_2
crossref_primary_10_1016_j_cmpb_2019_07_002
crossref_primary_10_1002_mp_14364
crossref_primary_10_1007_s10489_021_02197_6
crossref_primary_10_1007_s00259_019_04606_y
crossref_primary_10_7717_peerj_17005
crossref_primary_10_1109_ACCESS_2020_3020579
crossref_primary_10_1109_TMI_2022_3228316
crossref_primary_10_1148_ryai_210026
crossref_primary_10_1016_j_media_2024_103333
crossref_primary_10_1016_j_ejro_2023_100537
crossref_primary_10_1016_j_engappai_2025_110230
crossref_primary_10_3389_fnins_2022_918623
crossref_primary_10_1109_TMI_2020_3045775
crossref_primary_10_1038_s41598_020_63285_0
crossref_primary_10_1109_ACCESS_2024_3450961
crossref_primary_10_1109_TMI_2020_3001036
crossref_primary_10_1007_s00330_022_09347_5
crossref_primary_10_1109_JBHI_2021_3063080
crossref_primary_10_1109_TMI_2021_3079709
crossref_primary_10_1186_s13014_019_1392_z
crossref_primary_10_1002_mp_16330
crossref_primary_10_1002_mp_15485
crossref_primary_10_1002_mp_17300
crossref_primary_10_1109_TMI_2023_3262680
crossref_primary_10_1016_j_semradonc_2019_02_001
crossref_primary_10_1186_s40658_023_00536_9
crossref_primary_10_12677_mos_2024_133213
crossref_primary_10_1002_mp_13735
crossref_primary_10_25699_SSSB_2022_44_4_004
crossref_primary_10_1007_s10462_024_10966_1
crossref_primary_10_1097_MPA_0000000000001412
crossref_primary_10_1117_1_JMI_9_4_045001
crossref_primary_10_1016_j_ejmp_2021_05_003
crossref_primary_10_3390_jimaging10080202
crossref_primary_10_1016_j_neucom_2021_08_157
crossref_primary_10_1080_08839514_2022_2151186
crossref_primary_10_1109_TCYB_2023_3291369
crossref_primary_10_1002_mp_14141
crossref_primary_10_1007_s00371_020_02018_w
crossref_primary_10_1016_j_media_2020_101931
crossref_primary_10_1038_s41598_022_20108_8
crossref_primary_10_1002_mp_14386
crossref_primary_10_1155_2022_5797431
crossref_primary_10_1371_journal_pone_0253863
crossref_primary_10_1016_j_bspc_2020_102233
crossref_primary_10_3390_bioengineering11040319
crossref_primary_10_1016_j_bspc_2021_102652
crossref_primary_10_1016_j_bspc_2021_102894
crossref_primary_10_1002_ima_22901
crossref_primary_10_1016_j_cmpb_2021_106480
crossref_primary_10_1038_s41598_022_07111_9
crossref_primary_10_1002_mp_14814
crossref_primary_10_3390_electronics13245038
crossref_primary_10_3390_app131810521
crossref_primary_10_1007_s10462_019_09788_3
crossref_primary_10_1007_s11255_024_04082_w
crossref_primary_10_1016_j_artmed_2021_102231
crossref_primary_10_3389_fonc_2022_908903
crossref_primary_10_1007_s11548_022_02757_2
crossref_primary_10_1007_s12024_021_00431_8
crossref_primary_10_1002_mp_16072
crossref_primary_10_1109_TMI_2019_2911588
crossref_primary_10_1007_s10489_021_02221_9
crossref_primary_10_1049_ipr2_13221
crossref_primary_10_1080_0284186X_2020_1775290
crossref_primary_10_3389_fbioe_2021_636953
crossref_primary_10_1016_j_compbiomed_2021_104658
crossref_primary_10_1016_j_compbiomed_2020_103720
crossref_primary_10_1097_MNM_0000000000001436
crossref_primary_10_1186_s40644_024_00711_w
crossref_primary_10_1002_mp_15303
crossref_primary_10_1016_j_cmpb_2022_106894
crossref_primary_10_1016_j_eswa_2023_121905
crossref_primary_10_3390_bioengineering10010119
crossref_primary_10_1007_s10916_019_1476_1
crossref_primary_10_1016_j_compeleceng_2022_108345
crossref_primary_10_1109_ACCESS_2019_2899635
crossref_primary_10_1016_j_compbiomed_2021_105067
crossref_primary_10_1186_s12880_020_00509_9
crossref_primary_10_1007_s00261_023_04122_6
crossref_primary_10_1186_s13244_024_01636_5
crossref_primary_10_1177_20552076221111941
crossref_primary_10_1109_TBME_2024_3380058
crossref_primary_10_1016_j_media_2023_103024
crossref_primary_10_1016_j_media_2024_103382
crossref_primary_10_1016_j_artmed_2021_102023
crossref_primary_10_1109_TMI_2021_3060465
crossref_primary_10_1016_j_inffus_2024_102560
crossref_primary_10_1007_s40747_024_01671_1
crossref_primary_10_1038_s41598_021_95972_x
crossref_primary_10_1016_j_cmpb_2021_106547
crossref_primary_10_1016_j_media_2022_102567
crossref_primary_10_1007_s11604_021_01098_5
crossref_primary_10_1016_j_media_2023_102987
crossref_primary_10_1016_j_bspc_2024_106667
crossref_primary_10_1148_radiol_220101
crossref_primary_10_1016_j_cmpb_2022_106887
crossref_primary_10_1016_j_media_2019_02_006
crossref_primary_10_1109_JBHI_2021_3136597
crossref_primary_10_1016_j_compbiomed_2021_104424
crossref_primary_10_29109_gujsc_1210343
crossref_primary_10_1109_TMI_2023_3242980
crossref_primary_10_1007_s11042_020_09592_0
crossref_primary_10_1016_j_displa_2025_103032
crossref_primary_10_1016_j_artmed_2023_102608
crossref_primary_10_1096_fj_202200253R
crossref_primary_10_1007_s10916_023_01968_7
crossref_primary_10_1371_journal_pone_0313126
crossref_primary_10_3390_app15020837
crossref_primary_10_1088_1361_6560_ada0a0
crossref_primary_10_1016_j_morpho_2019_09_002
crossref_primary_10_1109_TRPMS_2021_3071148
crossref_primary_10_1016_j_diii_2019_05_008
crossref_primary_10_1016_j_media_2023_102984
crossref_primary_10_1016_j_media_2022_102444
crossref_primary_10_3389_fonc_2020_00675
crossref_primary_10_1016_j_compbiomed_2024_109173
crossref_primary_10_3748_wjg_v27_i27_4395
crossref_primary_10_1097_JP9_0000000000000056
crossref_primary_10_1186_s12880_022_00807_4
crossref_primary_10_1016_j_placenta_2023_02_009
crossref_primary_10_1002_ird3_101
crossref_primary_10_3748_wjg_v27_i43_7480
crossref_primary_10_1016_j_heliyon_2023_e13081
crossref_primary_10_1016_j_acra_2019_01_012
crossref_primary_10_1038_s41597_020_00715_8
crossref_primary_10_1002_mp_15322
crossref_primary_10_1038_s41574_021_00543_9
crossref_primary_10_1002_mp_16527
crossref_primary_10_1109_ACCESS_2020_2975983
crossref_primary_10_1016_j_neucom_2024_128125
crossref_primary_10_1016_j_neucom_2024_128124
crossref_primary_10_1049_ipr2_12709
crossref_primary_10_1016_j_media_2021_102155
crossref_primary_10_1016_j_media_2021_102156
crossref_primary_10_1007_s11517_018_1929_6
crossref_primary_10_1016_j_bbe_2020_07_011
crossref_primary_10_1016_j_media_2023_102873
crossref_primary_10_3390_app15052516
crossref_primary_10_1016_j_media_2021_102035
crossref_primary_10_1109_LSP_2024_3451962
crossref_primary_10_1007_s10278_022_00660_5
crossref_primary_10_1007_s12149_019_01359_4
crossref_primary_10_1007_s12265_021_10166_0
crossref_primary_10_1186_s12885_020_07595_6
crossref_primary_10_4251_wjgo_v13_i11_1599
crossref_primary_10_1186_s13014_020_01528_0
crossref_primary_10_3390_tomography7040078
crossref_primary_10_1016_j_cviu_2024_104138
crossref_primary_10_1016_j_knosys_2021_107471
crossref_primary_10_1088_1361_6560_ab9453
crossref_primary_10_1089_ten_tec_2019_0052
crossref_primary_10_1016_j_bspc_2021_103027
crossref_primary_10_3390_jimaging8120330
crossref_primary_10_1186_s12891_023_06153_y
crossref_primary_10_1038_s41598_022_19037_3
crossref_primary_10_1155_2021_5852595
Cites_doi 10.1007/978-3-319-24553-9_68
10.1016/j.media.2015.05.009
10.1109/TPAMI.2012.143
10.1148/radiol.11091710
10.1016/j.csda.2006.08.028
10.1109/TBME.2016.2574816
10.1002/jmrs.65
10.1109/CVPR.2015.7298664
10.1016/j.media.2015.04.015
10.1007/s11548-007-0135-z
10.1016/j.media.2015.06.009
10.1007/978-3-319-46493-0_38
10.1007/s10278-013-9622-7
10.1109/CVPR.2015.7298965
10.1007/978-3-319-10404-1_83
10.1016/j.media.2015.04.003
10.1007/978-3-642-33454-2_52
10.1016/S0016-5107(01)70082-X
10.9734/BJMCS/2016/20812
10.5565/rev/elcvia.206
10.1007/978-3-319-46723-8_53
10.1146/annurev.bioeng.1.1.211
10.1016/j.media.2009.05.004
10.1007/978-3-319-10470-6_62
10.1109/3DV.2016.79
10.1007/s11548-016-1501-5
10.1007/978-3-319-49644-3_11
10.1109/TMI.2017.2743464
10.1109/TMI.2013.2246577
10.1007/978-3-319-59129-2_4
10.1007/978-3-319-66182-7_83
10.1109/34.927467
10.1007/978-3-319-46976-8_12
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018
DBID 97E
RIA
RIE
AAYXX
CITATION
CGR
CUY
CVF
ECM
EIF
NPM
7QF
7QO
7QQ
7SC
7SE
7SP
7SR
7TA
7TB
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
KR7
L7M
L~C
L~D
NAPCQ
P64
7X8
5PM
ADTOC
UNPAY
DOI 10.1109/TMI.2018.2806309
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE/IET Electronic Library (IEL)
CrossRef
Medline
MEDLINE
MEDLINE (Ovid)
MEDLINE
MEDLINE
PubMed
Aluminium Industry Abstracts
Biotechnology Research Abstracts
Ceramic Abstracts
Computer and Information Systems Abstracts
Corrosion Abstracts
Electronics & Communications Abstracts
Engineered Materials Abstracts
Materials Business File
Mechanical & Transportation Engineering Abstracts
Solid State and Superconductivity Abstracts
METADEX
Technology Research Database
ANTE: Abstracts in New Technology & Engineering
Engineering Research Database
Aerospace Database
Materials Research Database
ProQuest Computer Science Collection
Civil Engineering Abstracts
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
Nursing & Allied Health Premium
Biotechnology and BioEngineering Abstracts
MEDLINE - Academic
PubMed Central (Full Participant titles)
Unpaywall for CDI: Periodical Content
Unpaywall
DatabaseTitle CrossRef
MEDLINE
Medline Complete
MEDLINE with Full Text
PubMed
MEDLINE (Ovid)
Materials Research Database
Civil Engineering Abstracts
Aluminium Industry Abstracts
Technology Research Database
Computer and Information Systems Abstracts – Academic
Mechanical & Transportation Engineering Abstracts
Electronics & Communications Abstracts
ProQuest Computer Science Collection
Computer and Information Systems Abstracts
Ceramic Abstracts
Materials Business File
METADEX
Biotechnology and BioEngineering Abstracts
Computer and Information Systems Abstracts Professional
Aerospace Database
Nursing & Allied Health Premium
Engineered Materials Abstracts
Biotechnology Research Abstracts
Solid State and Superconductivity Abstracts
Engineering Research Database
Corrosion Abstracts
Advanced Technologies Database with Aerospace
ANTE: Abstracts in New Technology & Engineering
MEDLINE - Academic
DatabaseTitleList
MEDLINE
Materials Research Database
MEDLINE - Academic

Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: EIF
  name: MEDLINE
  url: https://proxy.k.utb.cz/login?url=https://www.webofscience.com/wos/medline/basic-search
  sourceTypes: Index Database
– sequence: 3
  dbid: RIE
  name: IEEE Xplorer
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
– sequence: 4
  dbid: UNPAY
  name: Unpaywall
  url: https://proxy.k.utb.cz/login?url=https://unpaywall.org/
  sourceTypes: Open Access Repository
DeliveryMethod fulltext_linktorsrc
Discipline Medicine
Engineering
EISSN 1558-254X
EndPage 1834
ExternalDocumentID oai:pubmedcentral.nih.gov:6076994
PMC6076994
29994628
10_1109_TMI_2018_2806309
8291609
Genre orig-research
Research Support, Non-U.S. Gov't
Journal Article
GrantInformation_xml – fundername: Cancer Research UK
  grantid: C28070/A19985
  funderid: 10.13039/501100000289
– fundername: National Institute for Health Research
  funderid: 10.13039/501100000272
– fundername: Cancer Research UK
  grantid: 19985
– fundername: Department of Health
  grantid: UCLBRC-2012-1
– fundername: Department of Health
  grantid: PB-PG-0712-28114
– fundername: Cancer Research UK
  grantid: A19985
GroupedDBID ---
-DZ
-~X
.GJ
0R~
29I
4.4
53G
5GY
5RE
5VS
6IK
97E
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABQJQ
ABVLG
ACGFO
ACGFS
ACIWK
ACNCT
ACPRK
AENEX
AETIX
AFRAH
AGQYO
AGSQL
AHBIQ
AI.
AIBXA
AKJIK
AKQYR
ALLEH
ALMA_UNASSIGNED_HOLDINGS
ASUFR
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
CS3
DU5
EBS
EJD
F5P
HZ~
H~9
IBMZZ
ICLAB
IFIPE
IFJZH
IPLJI
JAVBF
LAI
M43
MS~
O9-
OCL
P2P
PQQKQ
RIA
RIE
RNS
RXW
TAE
TN5
VH1
AAYXX
CITATION
CGR
CUY
CVF
ECM
EIF
NPM
RIG
7QF
7QO
7QQ
7SC
7SE
7SP
7SR
7TA
7TB
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
KR7
L7M
L~C
L~D
NAPCQ
P64
7X8
5PM
ADTOC
UNPAY
IEDL.DBID RIE
ISSN 0278-0062
1558-254X
IngestDate Wed Oct 29 11:45:15 EDT 2025
Tue Sep 30 15:59:20 EDT 2025
Sun Sep 28 10:07:37 EDT 2025
Sun Jun 29 12:30:05 EDT 2025
Mon Jul 21 06:03:21 EDT 2025
Thu Apr 24 23:00:12 EDT 2025
Wed Oct 01 03:55:28 EDT 2025
Wed Aug 27 02:30:46 EDT 2025
IsDoiOpenAccess false
IsOpenAccess true
IsPeerReviewed false
IsScholarly true
Issue 8
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
LinkModel DirectLink
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
content type line 23
ORCID 0000-0002-9152-5907
0000-0002-5565-1252
0000-0001-5218-6431
0000-0003-4902-0486
0000-0003-2916-655X
0000-0001-9207-7280
0000-0003-0821-1809
0000-0002-4558-288X
0000-0001-9217-5438
0000-0002-0313-9134
OpenAccessLink https://proxy.k.utb.cz/login?url=https://www.ncbi.nlm.nih.gov/pmc/articles/6076994
PMID 29994628
PQID 2117113824
PQPubID 85460
PageCount 13
ParticipantIDs unpaywall_primary_10_1109_tmi_2018_2806309
pubmed_primary_29994628
ieee_primary_8291609
proquest_journals_2117113824
pubmedcentral_primary_oai_pubmedcentral_nih_gov_6076994
crossref_citationtrail_10_1109_TMI_2018_2806309
proquest_miscellaneous_2068340613
crossref_primary_10_1109_TMI_2018_2806309
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2018-08-01
PublicationDateYYYYMMDD 2018-08-01
PublicationDate_xml – month: 08
  year: 2018
  text: 2018-08-01
  day: 01
PublicationDecade 2010
PublicationPlace United States
PublicationPlace_xml – name: United States
– name: New York
PublicationTitle IEEE transactions on medical imaging
PublicationTitleAbbrev TMI
PublicationTitleAlternate IEEE Trans Med Imaging
PublicationYear 2018
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ioffe (ref38) 2015
ref13
he (ref14) 2015
ref12
ref15
ref53
çiçek (ref24) 2016
ref11
ref10
ref19
brust (ref46) 2015
ref18
landman (ref50) 2012
veit (ref43) 2016
gal (ref41) 2016; 48
ref51
zografos (ref54) 2015
litjens (ref17) 2017
landman (ref16) 2015
ref42
pereyra (ref47) 2017
odena (ref45) 2016
oda (ref52) 2011
ref49
pleiss (ref44) 2017
ref8
ref7
ref9
ref4
ref3
ref6
ref5
roth (ref33) 2016
ref40
ref35
ref34
ref37
ref30
ref32
ref2
ref1
huang (ref27) 2016
chen (ref36) 2017
yu (ref28) 2015
ref23
ref26
ref25
nair (ref39) 2010
ref20
ref22
ref21
kroon (ref48) 2010
ref29
roth (ref31) 2017
References_xml – ident: ref26
  doi: 10.1007/978-3-319-24553-9_68
– year: 2016
  ident: ref27
  publication-title: Densely Connected Convolutional Networks
– ident: ref7
  doi: 10.1016/j.media.2015.05.009
– year: 2017
  ident: ref44
  publication-title: Memory-efficient implementation of densenets
– ident: ref35
  doi: 10.1109/TPAMI.2012.143
– year: 2017
  ident: ref17
  publication-title: A survey on deep learning in medical image analysis
– year: 2012
  ident: ref50
  publication-title: MICCAI Grand Challenge and Workshop on Multi-Atlas Labeling
– volume: 48
  start-page: 1050
  year: 2016
  ident: ref41
  article-title: Dropout as a Bayesian approximation: Representing model uncertainty in deep learning
  publication-title: Proc 33rd Int Conf Mach Learn
– ident: ref1
  doi: 10.1148/radiol.11091710
– ident: ref49
  doi: 10.1016/j.csda.2006.08.028
– year: 2015
  ident: ref38
  publication-title: Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
– year: 2016
  ident: ref45
  publication-title: Deconvolution and checkerboard artifacts
– start-page: 1
  year: 2015
  ident: ref46
  article-title: Efficient convolutional patch networks for scene understanding
  publication-title: CVPR Scene Understanding Workshop
– ident: ref15
  doi: 10.1109/TBME.2016.2574816
– start-page: 18
  year: 2015
  ident: ref14
  article-title: Fully automatic multi-organ segmentation based on multi-boost learning and statistical shape model search
  publication-title: Proc VISCERAL Challenge at ISBI
– ident: ref2
  doi: 10.1002/jmrs.65
– ident: ref40
  doi: 10.1109/CVPR.2015.7298664
– ident: ref8
  doi: 10.1016/j.media.2015.04.015
– ident: ref10
  doi: 10.1007/s11548-007-0135-z
– year: 2016
  ident: ref33
  publication-title: Data from TCIA pancreas-CT
– ident: ref6
  doi: 10.1016/j.media.2015.06.009
– year: 2010
  ident: ref48
  publication-title: Smooth Triangulated Mesh MATLAB Central File Exchange
– year: 2015
  ident: ref16
  publication-title: Multi-Atlas Labeling Beyond the Cranial Vault-Workshop and Challenge
– year: 2015
  ident: ref28
  publication-title: Multi-scale context aggregation by dilated convolutions
– year: 2017
  ident: ref31
  publication-title: Hierarchical 3D fully convolutional networks for multi-organ segmentation
– ident: ref42
  doi: 10.1007/978-3-319-46493-0_38
– ident: ref32
  doi: 10.1007/s10278-013-9622-7
– ident: ref37
  doi: 10.1109/CVPR.2015.7298965
– ident: ref53
  doi: 10.1007/978-3-319-10404-1_83
– ident: ref5
  doi: 10.1016/j.media.2015.04.003
– ident: ref9
  doi: 10.1007/978-3-642-33454-2_52
– ident: ref4
  doi: 10.1016/S0016-5107(01)70082-X
– ident: ref12
  doi: 10.9734/BJMCS/2016/20812
– year: 2017
  ident: ref36
  article-title: VoxResNet: Deep voxelwise residual networks for brain segmentation from 3D MR images
  publication-title: NeuroImage
– ident: ref11
  doi: 10.5565/rev/elcvia.206
– ident: ref30
  doi: 10.1007/978-3-319-46723-8_53
– start-page: 424
  year: 2016
  ident: ref24
  article-title: 3D U-net: Learning dense volumetric segmentation from sparse annotation
  publication-title: Proc MICCAI
– ident: ref3
  doi: 10.1146/annurev.bioeng.1.1.211
– ident: ref22
  doi: 10.1016/j.media.2009.05.004
– start-page: 550
  year: 2016
  ident: ref43
  article-title: Residual networks behave like ensembles of relatively shallow networks
  publication-title: Proc NIPS
– ident: ref13
  doi: 10.1007/978-3-319-10470-6_62
– ident: ref25
  doi: 10.1109/3DV.2016.79
– start-page: 37
  year: 2015
  ident: ref54
  article-title: Hierarchical multi-organ segmentation without registration in 3D abdominal CT images
  publication-title: Proc MICCAI Workshop on Medical Computer Vision
– ident: ref19
  doi: 10.1007/s11548-016-1501-5
– start-page: 807
  year: 2010
  ident: ref39
  article-title: Rectified linear units improve restricted Boltzmann machines
  publication-title: Proc ICML
– ident: ref51
  doi: 10.1007/978-3-319-49644-3_11
– ident: ref29
  doi: 10.1109/TMI.2017.2743464
– ident: ref34
  doi: 10.1109/TMI.2013.2246577
– ident: ref20
  doi: 10.1007/978-3-319-59129-2_4
– ident: ref21
  doi: 10.1007/978-3-319-66182-7_83
– year: 2017
  ident: ref47
  publication-title: Regularizing neural networks by penalizing confident output distributions
– ident: ref23
  doi: 10.1109/34.927467
– start-page: 181
  year: 2011
  ident: ref52
  article-title: Organ segmentation from 3D abdominal CT images based on atlas selection and graph cut
  publication-title: Proc Int MICCAI Workshop Comput Clin Challenges Abdominal Imag
– ident: ref18
  doi: 10.1007/978-3-319-46976-8_12
SSID ssj0014509
Score 2.698226
Snippet Automatic segmentation of abdominal anatomy on computed tomography (CT) images can support diagnosis, treatment planning, and treatment delivery workflows....
Automatic segmentation of abdominal anatomy on computed tomography (CT) images can support diagnosis, treatment planning and treatment delivery workflows....
SourceID unpaywall
pubmedcentral
proquest
pubmed
crossref
ieee
SourceType Open Access Repository
Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 1822
SubjectTerms Abdomen
Abdominal CT
Accuracy
Algorithms
Computed tomography
Deep learning
Digestive System - diagnostic imaging
Duodenum
Endoscopy
Esophagus
Gallbladder
Gastrointestinal system
Gastrointestinal tract
Humans
Image processing
Image segmentation
Kidney
Kidney - diagnostic imaging
Kidneys
Liver
Machine learning
Mathematical models
Medical imaging
Methods
Organs
Pancreas
Radiographic Image Interpretation, Computer-Assisted - methods
Radiography, Abdominal - methods
Registration
segmentation
Spleen
Spleen - diagnostic imaging
Statistical analysis
Statistical methods
Statistical models
Stomach
Three-dimensional displays
Tomography, X-Ray Computed - methods
Title Automatic Multi-Organ Segmentation on Abdominal CT With Dense V-Networks
URI https://ieeexplore.ieee.org/document/8291609
https://www.ncbi.nlm.nih.gov/pubmed/29994628
https://www.proquest.com/docview/2117113824
https://www.proquest.com/docview/2068340613
https://pubmed.ncbi.nlm.nih.gov/PMC6076994
https://www.ncbi.nlm.nih.gov/pmc/articles/6076994
UnpaywallVersion submittedVersion
Volume 37
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
journalDatabaseRights – providerCode: PRVIEE
  databaseName: IEEE Xplorer
  customDbUrl:
  eissn: 1558-254X
  dateEnd: 99991231
  omitProxy: false
  ssIdentifier: ssj0014509
  issn: 0278-0062
  databaseCode: RIE
  dateStart: 19820101
  isFulltext: true
  titleUrlDefault: https://ieeexplore.ieee.org/
  providerName: IEEE
linkProvider IEEE