From YouTube to the brain: Transfer learning can improve brain-imaging predictions with deep learning

Bibliographic Details
Published in: Neural Networks, Vol. 153, pp. 325–338
Main Authors: Malik, Nahiyan; Bzdok, Danilo
Format: Journal Article
Language: English
Published: United States, 01.09.2022
ISSN: 0893-6080, 1879-2782
DOI: 10.1016/j.neunet.2022.06.014

Summary: Deep learning has recently achieved best-in-class performance in several fields, including biomedical domains such as X-ray imaging. Yet data scarcity poses a strict limit on training successful deep learning systems in many, if not most, biomedical applications, including those involving brain images. In this study, we translate state-of-the-art transfer learning techniques to single-subject prediction of simpler phenotypes (sex and age) and more complex ones (number of people in household, household income, fluid intelligence and smoking behavior). We fine-tuned 2D and 3D ResNet-18 convolutional neural networks for target phenotype predictions from brain images of ∼40,000 UK Biobank participants, after pretraining on YouTube videos from the Kinetics dataset and natural images from the ImageNet dataset. Transfer learning was effective on several phenotypes, especially sex and age classification, and it outperformed deep learning models trained from scratch, particularly at smaller sample sizes. The out-of-sample performance achieved by transferring knowledge previously learned from real-world images and videos could unlock potential in many areas of imaging neuroscience where deep learning solutions are currently infeasible.
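
The record carries no code, so as a rough sketch of the fine-tuning recipe the summary describes, the PyTorch snippet below loads an ImageNet-pretrained 2D ResNet-18 from torchvision and re-heads it for a phenotype target. The target choice, layer-freezing scheme, learning rate, and tensor shapes are all illustrative assumptions, not details from the study.

import torch
import torch.nn as nn
from torchvision import models

# Start from a 2D ResNet-18 with ImageNet weights; the 3D analogue in the
# paper would instead start from a video model pretrained on Kinetics,
# e.g. torchvision.models.video.r3d_18 with its KINETICS400_V1 weights.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

# Swap the 1000-way ImageNet head for the target phenotype, e.g. a
# 2-class head for sex classification (hypothetical target choice).
model.fc = nn.Linear(model.fc.in_features, 2)

# One common fine-tuning regime: freeze the pretrained backbone and
# train only the new head; unfreezing everything is the alternative.
for p in model.parameters():
    p.requires_grad = False
for p in model.fc.parameters():
    p.requires_grad = True

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
criterion = nn.CrossEntropyLoss()

# Hypothetical training step: stand-in tensors in place of preprocessed
# brain-image slices tiled to the 3-channel 224x224 input that the
# ImageNet weights expect, plus integer phenotype labels.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()

Freezing the backbone and training only the head keeps the number of learnable parameters small, which is one plausible reason transfer learning holds up better than training from scratch at the smaller sample sizes the summary reports.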