Ultrasound prostate segmentation based on multidirectional deeply supervised V‐Net
| Published in | Medical physics (Lancaster) Vol. 46; no. 7; pp. 3194 - 3206 |
|---|---|
| Main Authors | Lei, Yang; Tian, Sibo; He, Xiuxiu; Wang, Tonghe; Wang, Bo; Patel, Pretesh; Jani, Ashesh B.; Mao, Hui; Curran, Walter J.; Liu, Tian; Yang, Xiaofeng |
| Format | Journal Article |
| Language | English |
| Published | United States, 01.07.2019 |
| Subjects | Deep Learning; deeply supervised network; Humans; Image Processing, Computer-Assisted - methods; Male; Observer Variation; Prostate - diagnostic imaging; prostate segmentation; Supervised Machine Learning; transrectal ultrasound (TRUS); Ultrasonography |
| ISSN | 0094-2405; 1522-8541; 2473-4209 |
| DOI | 10.1002/mp.13577 |
| Abstract | Purpose
Transrectal ultrasound (TRUS) is a versatile and real‐time imaging modality that is commonly used in image‐guided prostate cancer interventions (e.g., biopsy and brachytherapy). Accurate segmentation of the prostate is key to biopsy needle placement, brachytherapy treatment planning, and motion management. Manual segmentation during these interventions is time‐consuming and subject to inter‐ and intraobserver variation. To address these drawbacks, we aimed to develop a deep learning‐based method which integrates deep supervision into a three‐dimensional (3D) patch‐based V‐Net for prostate segmentation.
Methods and materials
We developed a multidirectional deep learning‐based method to automatically segment the prostate for ultrasound‐guided radiation therapy. A 3D supervision mechanism is integrated into the V‐Net stages to address the optimization difficulties of training a deep network with limited training data. We combine a binary cross‐entropy (BCE) loss and a batch‐based Dice loss into a stage‐wise hybrid loss function for deep supervision training. During the segmentation stage, patches extracted from a newly acquired ultrasound image are fed to the trained network, which adaptively labels the prostate tissue. The final segmented prostate volume is reconstructed by patch fusion and further refined through contour refinement processing.
Results
Forty‐four patients' TRUS images were used to test our segmentation method. The segmentation results were compared with the manually segmented contours (ground truth). The mean prostate volume Dice similarity coefficient (DSC), Hausdorff distance (HD), mean surface distance (MSD), and residual mean surface distance (RMSD) were 0.92 ± 0.03, 3.94 ± 1.55 mm, 0.60 ± 0.23 mm, and 0.90 ± 0.38 mm, respectively.
Conclusion
We developed a novel deeply supervised deep learning‐based approach with reliable contour refinement to automatically segment the TRUS prostate, demonstrated its clinical feasibility, and validated its accuracy compared to manual segmentation. The proposed technique could be a useful tool for diagnostic and therapeutic applications in prostate cancer. |
|---|---|
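The stage-wise hybrid loss described in the Methods can be sketched as follows. This is a minimal NumPy illustration of combining BCE with a batch-based soft Dice loss under per-stage weights; the function names and weighting scheme are assumptions for illustration, not the authors' implementation (which would run inside a deep learning framework on V-Net stage outputs).

```python
import numpy as np

def bce_loss(pred, target, eps=1e-7):
    """Voxel-wise binary cross-entropy on probabilities in (0, 1)."""
    pred = np.clip(pred, eps, 1.0 - eps)
    return float(-np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred)))

def dice_loss(pred, target, eps=1e-7):
    """Soft Dice loss computed over the whole batch (batch-based Dice)."""
    inter = np.sum(pred * target)
    return float(1.0 - (2.0 * inter + eps) / (np.sum(pred) + np.sum(target) + eps))

def deep_supervision_loss(stage_preds, target, stage_weights):
    """Weighted sum of the hybrid (BCE + Dice) loss over the supervised stages.

    stage_preds: list of per-stage probability maps, each same shape as target.
    stage_weights: one scalar weight per supervised stage (hypothetical values).
    """
    return sum(w * (bce_loss(p, target) + dice_loss(p, target))
               for p, w in zip(stage_preds, stage_weights))
```

In practice each `stage_preds` entry would be an upsampled side output of a V-Net stage, so shallow layers receive a direct gradient signal rather than relying only on the final output.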
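The evaluation metrics reported in the Results can be illustrated with a small sketch. Inputs are hypothetical binary masks and surface point sets; RMSD is interpreted here as the root-mean-square surface distance, which is an assumption about the paper's exact definition.

```python
import numpy as np

def dice_coefficient(a, b):
    """DSC = 2|A ∩ B| / (|A| + |B|) for binary masks (1.0 = perfect overlap)."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def nearest_distances(pts_a, pts_b):
    """Distance from each surface point of A to its nearest point on B."""
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=-1)
    return d.min(axis=1)

def hd_msd_rmsd(pts_a, pts_b):
    """Symmetric Hausdorff, mean, and root-mean-square surface distances."""
    da = nearest_distances(pts_a, pts_b)
    db = nearest_distances(pts_b, pts_a)
    all_d = np.concatenate([da, db])
    hd = float(max(da.max(), db.max()))        # worst-case boundary error
    msd = float(all_d.mean())                  # average boundary error
    rmsd = float(np.sqrt((all_d ** 2).mean())) # penalizes large outliers more
    return hd, msd, rmsd
```

Distances are in the units of the point coordinates (mm for physical TRUS coordinates), while DSC is unitless.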
| Copyright | 2019 American Association of Physicists in Medicine 2019 American Association of Physicists in Medicine. |
| Discipline | Medicine; Physics |
| GrantInformation | National Institutes of Health: R01CA215718; Department of Defense (DoD) Prostate Cancer Research Program: W81XWH-13-1-0269, W81XWH-17-1-0438, W81XWH-17-1-0439; Dunwoody Golf Club Prostate Cancer Research Award; Winship Cancer Institute of Emory University; NVIDIA Corporation |
| Keywords | deep learning; deeply supervised network; prostate segmentation; transrectal ultrasound (TRUS) |
| OpenAccessLink | https://www.ncbi.nlm.nih.gov/pmc/articles/6625925 |
| PMID | 31074513 |
| PageCount | 13 |
| Snippet | Purpose: Transrectal ultrasound (TRUS) is a versatile and real-time imaging modality that is commonly used in image-guided prostate cancer interventions (e.g.,... |
| StartPage | 3194 |
| SubjectTerms | Deep Learning deeply supervised network Humans Image Processing, Computer-Assisted - methods Male Observer Variation Prostate - diagnostic imaging prostate segmentation Supervised Machine Learning transrectal ultrasound (TRUS) Ultrasonography |
| Title | Ultrasound prostate segmentation based on multidirectional deeply supervised V‐Net |
| URI | https://onlinelibrary.wiley.com/doi/abs/10.1002%2Fmp.13577 https://www.ncbi.nlm.nih.gov/pubmed/31074513 https://www.proquest.com/docview/2231919919 https://www.ncbi.nlm.nih.gov/pmc/articles/6625925 |
| Volume | 46 |
| Citation | Lei Y, Tian S, He X, Wang T, et al. Ultrasound prostate segmentation based on multidirectional deeply supervised V-Net. Medical Physics. 2019;46(7):3194-3206. doi:10.1002/mp.13577. PMID: 31074513. |