Modality-specific multi-aware evidence fusion algorithm using CFP and OCT for fundus disease diagnosis
| Published in | Pattern Recognition Vol. 169; p. 111957 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Elsevier Ltd, 01.01.2026 |
| ISSN | 0031-3203 |
| DOI | 10.1016/j.patcog.2025.111957 |
| Summary: | Automatic diagnosis of fundus diseases is crucial for clinical decision support. Existing single-modality and single-aware methods often underutilize the rich information present in fundus images. To overcome this, we propose a novel modality-specific multi-aware evidence fusion (MSMAE) algorithm for accurate fundus disease diagnosis and grading. Our approach first identifies tissue regions by extracting keypoints from color fundus photographs (CFP) and segmenting supervoxels from optical coherence tomography (OCT) using a conditional random field supervoxel (CRF-SV) method. For each modality, tissue-aware, structure-aware, and global-aware features are extracted to comprehensively capture local and contextual information. An evidence fusion module integrates these multi-aware predictions within each modality, while an evidence knowledge fusion module leverages prior knowledge to reconcile discrepancies across modalities. Furthermore, a progressive transfer fusion curriculum learning strategy is introduced to guide the model from simple to complex grading tasks. Experimental results on benchmark datasets demonstrate that MSMAE consistently outperforms state-of-the-art methods in both disease diagnosis and grading. The code is available at https://github.com/ecustyy/msmae. |
|---|---|
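The record does not specify MSMAE's exact fusion rule, so the sketch below is only illustrative of how the abstract's evidence fusion step could work: it assumes a subjective-logic combination of per-branch Dirichlet evidence (a common formulation in evidential deep learning), fusing, say, a tissue-aware and a global-aware prediction within one modality. All names here (`opinion_from_evidence`, `combine`, the toy evidence vectors) are hypothetical and are not taken from the paper or its repository.

```python
# Illustrative sketch only: the paper's actual fusion module may differ.
# Assumes subjective-logic opinions derived from Dirichlet evidence and a
# reduced Dempster's combination rule for two branches.
import numpy as np

def opinion_from_evidence(evidence: np.ndarray):
    """Turn non-negative per-class evidence into (belief, uncertainty)."""
    K = evidence.shape[-1]
    S = evidence.sum() + K          # Dirichlet strength (alpha = evidence + 1)
    belief = evidence / S           # per-class belief mass
    u = K / S                       # residual uncertainty mass
    return belief, u

def combine(b1, u1, b2, u2):
    """Reduced Dempster's rule for two subjective-logic opinions."""
    # Conflict: total mass assigned to disagreeing class pairs (i != j).
    conflict = np.sum(np.outer(b1, b2)) - np.sum(b1 * b2)
    scale = 1.0 - conflict
    b = (b1 * b2 + b1 * u2 + b2 * u1) / scale
    u = (u1 * u2) / scale
    return b, u

# Toy example: fuse hypothetical tissue-aware and global-aware evidence
# for a 3-grade classification task.
tissue_ev = np.array([4.0, 1.0, 0.5])
global_ev = np.array([3.0, 2.0, 0.2])
b, u = combine(*opinion_from_evidence(tissue_ev),
               *opinion_from_evidence(global_ev))
print("fused belief:", b.round(3), "uncertainty:", round(float(u), 3))
```

Under this assumed rule, branches that agree reinforce each other's belief mass while the shared uncertainty shrinks, which is one plausible way the abstract's per-modality fusion of multi-aware predictions could behave.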