Multi-Modal Graph-Aware Transformer with Contrastive Fusion for Brain Tumor Segmentation
| Published in | Journal of Electronics, Electromedical Engineering, and Medical Informatics, Vol. 7, No. 4, pp. 1226–1239 |
|---|---|
| Main Authors | Chowdhury, Rini; Kumar, Prashant; Suganthi, R.; Ammu, V.; Evance Leethial, R.; Roopa, C. |
| Format | Journal Article |
| Language | English |
| Published | 15.10.2025 |
| ISSN | 2656-8632 |
| DOI | 10.35882/jeeemi.v7i4.993 |
| Abstract | Accurate segmentation of brain tumors in MRI images is critical for early diagnosis, surgical planning, and effective treatment strategies. Traditional deep learning models such as U-Net, Attention U-Net, and Swin-UNet have demonstrated commendable success in tumor segmentation by leveraging Convolutional Neural Networks (CNNs) and transformer-based encoders. However, these models often fall short in capturing complex inter-modality interactions and long-range spatial dependencies, particularly in tumor regions with diffuse or poorly defined boundaries. Additionally, they suffer from limited generalization and demand substantial computational resources. AIM: To overcome these limitations, a novel approach named Graph-Aware Transformer with Contrastive Fusion (GAT-CF) is introduced. This model enhances segmentation performance by integrating the spatial attention mechanisms of transformers with graph-based relational reasoning across multiple MRI modalities, namely T1, T2, FLAIR, and T1CE. The graph-aware structure models inter-slice and intra-slice relationships more effectively, promoting better structural understanding of tumor regions. Furthermore, a multi-modal contrastive learning strategy is employed to align semantic features and distinguish complementary modality-specific information, thereby improving the model's discriminative power. The fusion of these techniques facilitates improved contextual understanding and more accurate boundary delineation in complex tumor regions. When evaluated on the BraTS2021 dataset, the proposed GAT-CF model achieved a Dice score of 99.1% and an IoU of 98.4%, surpassing state-of-the-art architectures such as Swin-UNet and SegResNet. It also demonstrated superior accuracy on the enhancing tumor and tumor core regions, highlighting its robustness, precision, and potential for clinical adoption in neuroimaging applications. |
|---|---|
| Author | Rini Chowdhury (ORCID: 0009-0009-6592-4216); Prashant Kumar (ORCID: 0009-0007-3378-615X); R. Suganthi (ORCID: 0000-0002-7045-5321); V. Ammu (ORCID: 0009-0007-1970-6758); R. Evance Leethial (ORCID: 0009-0002-6636-060X); C. Roopa (ORCID: 0009-0003-1831-0680) |
| Discipline | Engineering |
| EISSN | 2656-8632 |
| EndPage | 1239 |
| ISSN | 2656-8632 |
| IsDoiOpenAccess | true |
| IsOpenAccess | true |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 4 |
| Language | English |
| License | CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0) |
| OpenAccessLink | https://jeeemi.org/index.php/jeeemi/article/download/993/332 |
| PageCount | 14 |
| PublicationDate | 2025-10-15 |
| PublicationTitle | Journal of Electronics, Electromedical Engineering, and Medical Informatics |
| PublicationYear | 2025 |
| StartPage | 1226 |
| Title | Multi-Modal Graph-Aware Transformer with Contrastive Fusion for Brain Tumor Segmentation |
| URI | https://jeeemi.org/index.php/jeeemi/article/download/993/332 |
| Volume | 7 |
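The multi-modal contrastive learning strategy summarized in the abstract aligns semantic features extracted from different MRI modalities (T1, T2, FLAIR, T1CE). As a rough, hedged illustration of that general idea only, the sketch below shows a generic InfoNCE-style two-modality alignment loss in PyTorch; the function name `modality_contrastive_loss`, the temperature value, and the feature shapes are assumptions made for this example and are not taken from the paper's implementation.

```python
# Hypothetical sketch of a two-modality contrastive alignment loss, loosely
# inspired by the contrastive fusion idea described in the abstract.
# Names, shapes, and the temperature value are illustrative assumptions,
# not the GAT-CF implementation.
import torch
import torch.nn.functional as F


def modality_contrastive_loss(feat_a: torch.Tensor,
                              feat_b: torch.Tensor,
                              temperature: float = 0.07) -> torch.Tensor:
    """InfoNCE-style loss pulling together embeddings of the same case taken
    from two MRI modalities (e.g. FLAIR and T1CE) and pushing apart
    embeddings of different cases within the batch.

    feat_a, feat_b: (batch, dim) per-case feature vectors, one per modality.
    """
    a = F.normalize(feat_a, dim=-1)
    b = F.normalize(feat_b, dim=-1)
    logits = a @ b.t() / temperature                # (batch, batch) similarities
    targets = torch.arange(a.size(0), device=a.device)
    # Symmetric cross-entropy: case i in modality A should match case i in
    # modality B, and vice versa.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))


if __name__ == "__main__":
    # Toy usage with random tensors standing in for encoder outputs.
    flair_features = torch.randn(4, 128)
    t1ce_features = torch.randn(4, 128)
    print(modality_contrastive_loss(flair_features, t1ce_features))
```

The symmetric formulation, in which each modality in turn serves as the anchor, is the common convention for two-view contrastive objectives; how GAT-CF couples such an alignment term with its graph-aware transformer fusion is described in the full text.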