Towards robust and generalizable super-resolution generative adversarial networks for magnetic resonance neuroimaging: a cross-population approach
Published in | bioRxiv |
---|---|
Main Authors | , , , |
Format | Paper |
Language | English |
Published | Cold Spring Harbor Laboratory, 17.06.2022 |
Edition | 1.1 |
Subjects | |
Online Access | Get full text |
ISSN | 2692-8205 |
DOI | 10.1101/2022.06.13.495858 |
Summary | Magnetic resonance imaging (MRI) is fundamental to neuroscience, where detailed structural brain scans improve clinical diagnoses and provide accurate neuroanatomical information. Besides time-consuming scanning protocols, higher image resolution can also be obtained with super-resolution algorithms. We investigated the generalization abilities of super-resolution generative adversarial networks (SRGANs) across different populations. T1-weighted scans from three large cohorts were used, spanning older subjects, newborns, and patients with brain tumor- or treatment-induced tissue changes. Upsampling quality was validated using synthetic and anatomical metrics. Models were first trained on each cohort separately, yielding high image quality and anatomical fidelity. When applied across cohorts, the SRGANs introduced no artifacts. SRGANs trained on a dataset combining all cohorts likewise introduced no population-based artifacts. We showed that SRGANs provide a prime example of robust AI, where application to unseen populations did not introduce artifacts caused by training-data bias (e.g., insertion or removal of tumor-related signals, or contrast inversion). This is an important step in the deployment of SRGANs in real-world settings. |
---|---|
Bibliography | Competing Interest Statement: The authors have declared no competing interest. |
ISSN | 2692-8205 |
DOI | 10.1101/2022.06.13.495858 |