Automatic Brain Tumor Detection and Segmentation Using U-Net Based Fully Convolutional Networks
| Published in | Medical Image Understanding and Analysis Vol. 723; pp. 506 - 517 |
|---|---|
| Main Authors | H. Dong, G. Yang, F. Liu, et al. |
| Format | Book Chapter |
| Language | English |
| Published | Switzerland: Springer International Publishing, 2017 |
| Series | Communications in Computer and Information Science |
| ISBN | 9783319609638 3319609637 |
| ISSN | 1865-0929 1865-0937 |
| DOI | 10.1007/978-3-319-60964-5_44 |
| Summary: | A major challenge in brain tumor treatment planning and quantitative evaluation is determining the tumor extent. The noninvasive magnetic resonance imaging (MRI) technique has emerged as a front-line diagnostic tool for brain tumors because it requires no ionizing radiation. Manual segmentation of brain tumor extent from 3D MRI volumes is a very time-consuming task, and its quality relies heavily on the operator's experience. In this context, a reliable, fully automatic segmentation method is necessary for efficient measurement of the tumor extent. In this study, we propose a fully automatic method for brain tumor segmentation, developed using U-Net based deep convolutional networks. Our method was evaluated on the Multimodal Brain Tumor Image Segmentation (BRATS 2015) datasets, which contain 220 high-grade and 54 low-grade brain tumor cases. Cross-validation has shown that our method can obtain promising segmentation results efficiently. |
|---|---|
| Bibliography: | H. Dong, G. Yang and F. Liu—contributed equally to this study. |
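Segmentation quality on BRATS is conventionally reported with the Dice similarity coefficient, which measures overlap between a predicted mask and the ground truth. As an illustration only (this snippet is not taken from the chapter; the function name and the NumPy-based implementation are assumptions), a minimal Dice computation for binary masks might look like:

```python
import numpy as np

def dice_score(pred, target, eps=1e-7):
    """Dice similarity coefficient between two binary masks.

    Dice = 2 * |pred AND target| / (|pred| + |target|); eps guards
    against division by zero when both masks are empty.
    """
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Toy 2D masks standing in for one slice of a 3D MRI segmentation.
pred = np.array([[0, 1, 1],
                 [0, 1, 0],
                 [0, 0, 0]])
target = np.array([[0, 1, 1],
                   [0, 0, 0],
                   [0, 0, 0]])
print(round(dice_score(pred, target), 4))  # 2*2 / (3+2) = 0.8
```

In practice the same formula is applied per tumor sub-region (whole tumor, tumor core, enhancing tumor) over full 3D volumes rather than single slices.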