Using a hybrid attention mechanism as a method to improve the efficiency of network intrusion detection systems

Bibliographic Details
Published in: Radìoelektronnì ì komp'ûternì sistemi (Online), Vol. 2025, no. 3, pp. 177-188
Main Authors: Nikitenko, Andrii; Bashkov, Yevhen
Format: Journal Article
Language: English
Published: National Aerospace University «Kharkiv Aviation Institute», 10.09.2025
ISSN: 1814-4225; 2663-2012
DOI: 10.32620/reks.2025.3.13

Summary: The subject matter of this article is a HybridAttention mechanism integrated into a deep neural architecture for Network Intrusion Detection Systems (NIDS). This study aims to develop and evaluate a HybridAttention mechanism that combines global (self-attention) and local (dynamic local attention) models to improve the quality of traffic classification in real-time NIDS. The tasks to be solved are as follows: analyzing the applicability of existing attention mechanisms in network intrusion detection; integrating various attention types into a CNN-BiGRU architecture; developing a HybridAttention mechanism based on dynamic window alignment; optimizing the model using Optuna; and experimentally evaluating its performance on benchmark datasets using standard classification metrics. The methods used are: deep learning modeling with a CNN-BiGRU architecture; integration of various attention mechanisms, including a novel HybridAttention; hyperparameter optimization using Optuna; and performance evaluation based on standard classification metrics. The results show that the proposed HybridAttention mechanism outperforms individual attention types on all key metrics. The model achieved up to 99.85% accuracy on the NSL-KDD training data and demonstrated strong generalization on the UNSW-NB15 dataset, achieving up to 98.06% accuracy in multi-class classification and up to 99.20% in binary classification. The proposed model also outperformed state-of-the-art approaches in processing imbalanced data and detecting various types of attacks. Conclusions.
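The abstract describes a hybrid of global self-attention and window-restricted local attention. The core idea can be sketched in NumPy as follows; note that the window size, the fixed weighting `alpha`, and the use of the raw features as queries, keys, and values are illustrative assumptions, not the authors' actual implementation (which uses dynamic window alignment inside a CNN-BiGRU model):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V, mask=None):
    # scaled dot-product attention; positions where mask is False are excluded
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)
    return softmax(scores) @ V

def local_mask(n, window):
    # allow each timestep to attend only within +/- window neighbours
    idx = np.arange(n)
    return np.abs(idx[:, None] - idx[None, :]) <= window

def hybrid_attention(X, window=2, alpha=0.5):
    # global branch: full self-attention over all timesteps
    g = attention(X, X, X)
    # local branch: attention restricted to a sliding window
    l = attention(X, X, X, mask=local_mask(len(X), window))
    # fixed convex combination (a simplification of dynamic fusion)
    return alpha * g + (1 - alpha) * l

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))   # 8 timesteps of a traffic sequence, 4 features
out = hybrid_attention(X)
print(out.shape)              # (8, 4)
```

When the window covers the whole sequence, the local branch coincides with the global one, so the hybrid reduces to plain self-attention; shrinking the window biases the output toward nearby packets in the sequence.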
The scientific novelty of the results obtained is as follows: a HybridAttention mechanism combining self-attention and dynamic local attention was developed to enhance sequential pattern recognition in network traffic; the CNN-BiGRU architecture was improved by integrating multiple attention modules; systematic hyperparameter optimization using Optuna improved generalization on imbalanced data; and the proposed model outperformed existing approaches on benchmark datasets in detecting both known and novel cyberattacks.