Early Experiences with Crowdsourcing Airway Annotations in Chest CT

Bibliographic Details
Published in: Deep Learning and Data Labeling for Medical Applications, Vol. 10008, pp. 209-218
Main Authors: Cheplygina, Veronika; Perez-Rovira, Adria; Kuo, Wieying; Tiddens, Harm A. W. M.; de Bruijne, Marleen
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2016
Series: Lecture Notes in Computer Science
ISBN: 9783319469751; 3319469754
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-46976-8_22

Summary:Measuring airways in chest computed tomography (CT) images is important for characterizing diseases such as cystic fibrosis, yet very time-consuming to perform manually. Machine learning algorithms offer an alternative, but need large sets of annotated data to perform well. We investigate whether crowdsourcing can be used to gather airway annotations which can serve directly for measuring the airways, or as training data for the algorithms. We generate image slices at known locations of airways and request untrained crowd workers to outline the airway lumen and airway wall. Our results show that the workers are able to interpret the images, but that the instructions are too complex, leading to many unusable annotations. After excluding unusable annotations, quantitative results show medium to high correlations with expert measurements of the airways. Based on this positive experience, we describe a number of further research directions and provide insight into the challenges of crowdsourcing in medical images from the perspective of first-time users.