Lung segmentation based on random forest and multi-scale edge detection

Bibliographic Details
Published in: IET Image Processing, Vol. 13, No. 10, pp. 1745-1754
Main Authors: Liu, Caixia; Zhao, Ruibin; Pang, Mingyong
Format: Journal Article
Language: English
Published: The Institution of Engineering and Technology, 22.08.2019
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/iet-ipr.2019.0130

Summary: To achieve automatic and accurate segmentation of the lungs and improve the clinical efficiency of computer-aided diagnosis, the authors present a lung segmentation algorithm based on the random forest method and a multi-scale edge detection technique. The algorithm proceeds in two steps: lung region extraction followed by lung nodule segmentation. By incorporating texture information, the improved superpixel generation method better handles the initial segmentation of lung computed tomography images with inhomogeneous intensity. The lung region is then extracted by applying a random forest classifier to the superpixel features, and the lung contours are corrected with a proposed circle tracing technique. Finally, the segmentation is refined with a multi-scale edge detection technique, which allows the method to detect suspicious nodules of varying intensity and size adaptively. The effectiveness of the proposed approach is demonstrated on a group of datasets by comparison with the corresponding ground truths as well as with classical algorithms. Experimental results show that the proposed method achieves higher precision than the compared algorithms while operating in a fully automatic fashion.
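
The summary outlines a two-stage pipeline: superpixel-based lung region extraction with a random forest classifier, followed by multi-scale edge detection for nodule refinement. The Python sketch below illustrates the general building blocks using scikit-image and scikit-learn; the choice of SLIC superpixels, the local-entropy texture feature, the classifier settings, and the Canny scales are illustrative assumptions and do not reproduce the paper's improved superpixel generation, feature set, or circle tracing step.

# Minimal sketch under the assumptions noted above; not the authors' implementation.
import numpy as np
from skimage.segmentation import slic
from skimage.feature import canny
from skimage.filters import rank
from skimage.morphology import disk
from sklearn.ensemble import RandomForestClassifier


def superpixel_features(ct_slice, n_segments=400):
    """Over-segment a CT slice into superpixels and summarise each region
    with simple intensity and texture statistics (assumed feature set)."""
    labels = slic(ct_slice, n_segments=n_segments, compactness=0.1,
                  channel_axis=None)  # grayscale SLIC over-segmentation
    # Local entropy as a stand-in texture cue; rank filters require uint8 input.
    img_u8 = np.uint8(255 * (ct_slice - ct_slice.min()) /
                      (np.ptp(ct_slice) + 1e-8))
    texture = rank.entropy(img_u8, disk(5))
    ids = np.unique(labels)
    feats = np.array([[ct_slice[labels == i].mean(),
                       ct_slice[labels == i].std(),
                       texture[labels == i].mean()] for i in ids])
    return labels, ids, feats


def multiscale_edges(ct_slice, sigmas=(1.0, 2.0, 4.0)):
    """Union of Canny edge maps at several scales, so that structures of
    different sizes (e.g. small and large nodules) all produce responses."""
    edges = np.zeros(ct_slice.shape, dtype=bool)
    for sigma in sigmas:
        edges |= canny(ct_slice, sigma=sigma)
    return edges


# Usage sketch: train a random forest on labelled superpixel features
# (lung vs. non-lung), then map superpixel predictions back to a pixel mask.
# X_train and y_train would come from annotated slices (placeholders here).
# clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
# labels, ids, feats = superpixel_features(ct_slice)
# lung_ids = ids[clf.predict(feats) == 1]
# lung_mask = np.isin(labels, lung_ids)
# edge_map = multiscale_edges(ct_slice)  # used to refine nodule boundaries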