Explainable multimodal data fusion framework for heart failure detection: Integrating CNN and XGBoost
| Published in | Biomedical Signal Processing and Control, Vol. 100, p. 106997 |
|---|---|
| Main Authors | , , |
| Format | Journal Article |
| Language | English |
| Published | Elsevier Ltd, 01.02.2025 |
| Subjects | |
| ISSN | 1746-8094 |
| DOI | 10.1016/j.bspc.2024.106997 |
| Summary: | Heart Failure (HF) presents a significant healthcare challenge, with increasing prevalence and substantial impacts on patient quality of life. Current diagnostic methods primarily rely on clinical assessment, including electrocardiograms (ECG), echocardiograms, questionnaires, and blood tests. However, integrating information from multiple modalities could enhance diagnostic accuracy. This article proposes an explainable multimodal data fusion approach, combining electrocardiogram signals with selected blood test results, to improve heart failure detection. Specifically, a Convolutional Neural Network (CNN) model is utilized for electrocardiogram classification, while an XGBoost algorithm analyzes blood test results. The study utilizes the MIMIC-IV database, including MIMIC-IV-ECG. Pre-processing techniques are applied to ensure good data quality. Two fusion strategies, intermediate and late fusion, are explored to combine electrocardiogram and blood test information effectively. Feature importance scores are utilized to select the best blood tests for further investigation. These features are then integrated into the late fusion approach, which achieves significant improvements in heart failure detection accuracy when tested on unseen data from unseen subjects, emphasizing its generalizability. The late fusion approach yields a high accuracy of 97.46%, with strong sensitivity (97.16%) and specificity (97.67%). These results highlight the potential of multimodal fusion in enhancing heart failure detection accuracy, with implications for improving patient care and outcomes. The study contributes to the advancement of diagnostic methodologies and underscores the importance of leveraging diverse data modalities in medical diagnosis. | 
|---|---|
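The abstract reports that the late fusion strategy, which combines the CNN's ECG-based predictions with the XGBoost blood-test predictions, achieves the best detection accuracy. The exact fusion rule is not given in this record; a minimal sketch of one common late-fusion scheme, a weighted average of the two models' predicted probabilities, is shown below. The function name, the equal default weights, and the 0.5 decision threshold are all illustrative assumptions, not the paper's published method.

```python
import numpy as np

def late_fusion(ecg_probs, blood_probs, w_ecg=0.5, threshold=0.5):
    """Fuse per-patient HF probabilities from two unimodal classifiers.

    ecg_probs:   P(HF) from an ECG model (e.g. a CNN), shape (n,)
    blood_probs: P(HF) from a blood-test model (e.g. XGBoost), shape (n,)
    w_ecg:       weight on the ECG model; equal weighting here is an
                 assumption, not the paper's reported configuration
    Returns a binary HF prediction per patient.
    """
    ecg_probs = np.asarray(ecg_probs, dtype=float)
    blood_probs = np.asarray(blood_probs, dtype=float)
    # Convex combination of the two probability streams.
    fused = w_ecg * ecg_probs + (1.0 - w_ecg) * blood_probs
    return (fused >= threshold).astype(int)

# Toy example: two patients; the models disagree on the second one.
preds = late_fusion([0.9, 0.4], [0.8, 0.7], w_ecg=0.5)
print(preds.tolist())  # [1, 1] (fused probabilities 0.85 and 0.55)
```

In practice the fusion weight and threshold would be tuned on a validation split, and a learned meta-classifier over the two probability streams is an equally plausible reading of "late fusion".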