Understanding the Mechanisms of Deep Transfer Learning for Medical Images

Bibliographic Details
Published in: Deep Learning and Data Labeling for Medical Applications, Vol. 10008, pp. 188-196
Main Authors: Ravishankar, Hariharan; Sudhakar, Prasad; Venkataramani, Rahul; Thiruvenkadam, Sheshadri; Annangi, Pavan; Babu, Narayanan; Vaidya, Vivek
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2016
Series: Lecture Notes in Computer Science
ISBN: 9783319469751; 3319469754
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-46976-8_20

Summary: The ability to automatically learn task-specific feature representations has led to the huge success of deep learning methods. When large training data is scarce, such as in medical imaging problems, transfer learning has been very effective. In this paper, we systematically investigate the process of transferring a Convolutional Neural Network, trained on ImageNet images to perform image classification, to the problem of kidney detection in ultrasound images. We study how the detection performance depends on the extent of transfer. We show that a transferred and tuned CNN can outperform a state-of-the-art feature-engineered pipeline, and that a hybridization of these two techniques achieves 20% higher performance. We also investigate the evolution of intermediate response images from our network. Finally, we compare these responses to state-of-the-art image processing filters in order to gain greater insight into how transfer learning is able to effectively manage widely varying imaging regimes.
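
As an informal illustration of the layer-wise transfer the summary describes, the sketch below fine-tunes an ImageNet-pretrained CNN for a two-class ultrasound classification task while freezing the first few convolutional blocks. The architecture (VGG16), the number of frozen blocks, and the training step are assumptions made for illustration, not details taken from the chapter.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a CNN pretrained on ImageNet. VGG16 is an assumed choice for
# illustration; the chapter studies transfer from an ImageNet-trained CNN.
model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)

# Control the "extent of transfer": freeze the first k convolutional blocks
# (everything up to the k-th pooling layer) and fine-tune the rest.
# k is a hypothetical knob, not a value reported in the paper.
k = 2
blocks_done = 0
for layer in model.features:
    for p in layer.parameters():
        p.requires_grad = blocks_done >= k  # frozen while inside the first k blocks
    if isinstance(layer, nn.MaxPool2d):
        blocks_done += 1

# Replace the 1000-way ImageNet head with a two-class head
# (e.g. kidney region vs. background for detection-by-classification).
model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, 2)

optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3, momentum=0.9
)
criterion = nn.CrossEntropyLoss()

def finetune_step(images, labels):
    """One fine-tuning step on a batch of ultrasound patches (hypothetical data loader)."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```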